First, consider a story from an Alexa user's grandson, which I'd like to call the ALEXA PROBLEM:
"My grandmother is 94 and loves Echo. She has such a nice voice, and sometimes my mother worries that she is asking Alexa too many questions. She gets her morning news, has books read to her, looks up information my grandmother is interested in, and now they talk almost every night. And turns on and off her lights, along with waking her up in the morning."
In this description, the grandmother herself is almost never the subject of an action; the only verb attributed to her is "talk". In this context, Alexa functions as the grandmother's digital prosthetic.
Meanwhile, a humanoid robot named Sophia has just become a citizen.
Mark Goldfeder, an Atlanta-based rabbi and law professor, has reached a similar conclusion. If an entity acts human, he wrote recently, "I cannot start poking it to see if it bleeds. I have a responsibility to treat all that seem human as humans, and it is better to err on the side of caution from an ethical perspective."
The obvious conclusion is that rights ought to be accorded not on the basis of biology but on something even more fundamental: personhood.
Ryan Calo, an expert in robotics and cyber law at the University of Washington in Seattle, says our laws are unlikely to bend that far. “Our legal system reflects our basic biology,” he says. If we one day invent some sort of artificial person, “it would break everything about the law, as we understand it today.”