AI Citizen Sophia and Legal Status

By Gali Katznelson

Two weeks ago, Sophia, a robot built by Hanson Robotics, was ostensibly granted citizenship in Saudi Arabia. Sophia, an artificially intelligent (AI) robot modeled after Audrey Hepburn, appeared on stage at the Future Investment Initiative Conference in Riyadh to speak with CNBC’s Andrew Ross Sorkin, thanking the Kingdom of Saudi Arabia for naming her the first robot citizen of any country. Details of this citizenship have yet to be disclosed, raising suspicions that the announcement was a publicity stunt. Stunt or not, the event raises a question about the future of robots within ethical and legal frameworks: as robots come to acquire more and more of the qualities of human personhood, should their rights be recognized and protected?

A 2016 report from the European Parliament’s Committee on Legal Affairs can provide some insight. The report asks whether robots “should be regarded as natural persons, legal persons, animals or objects – or whether a new category should be created.” I will discuss each of these categories in turn, in an attempt to position Sophia’s current and future capabilities within a legal framework of personhood.

If Sophia’s natural personhood were recognized in the United States, she would be entitled to, among other rights, freedom of expression, freedom to worship, the right to a prompt, fair trial by jury, and the natural rights to “life, liberty, and the pursuit of happiness.” If she were granted citizenship, as is any person born in the United States or naturalized through the citizenship process, Sophia would have additional rights, such as the right to vote in elections for public officials, the right to apply for federal employment requiring U.S. citizenship, and the right to run for office. With these rights would come responsibilities: to support and defend the Constitution, to stay informed of issues affecting one’s community, to participate in the democratic process, to respect and obey the laws, to respect the rights, beliefs, and opinions of others, to participate in the community, to pay income and other taxes, to serve on a jury when called, and to defend the country should the need arise. In other words, if recognized as a person, or, more specifically, as a person capable of obtaining American citizenship, Sophia could have the same rights as any other American, lining up at the polls to vote, or even potentially becoming president.

A legal person, on the other hand, can be a non-human entity, such as a corporation, that is recognized as having legal rights and responsibilities, but not all those of a natural person. In the United States, for example, corporations have, among other rights, the right to form contracts and to own property, and to use the court system to enforce these rights. American case law suggests that corporations hold additional rights, such as those to free speech, political activity, and religious liberty. In exchange for these rights, corporations are liable to civil suits and criminal proceedings, and are responsible for paying taxes. Of the two categories of personhood discussed so far, this seems like the better fit for Sophia, especially given that Hanson Robotics’ chief scientist, Ben Goertzel, has admitted, “one of the surreal things about Sophia getting Saudi citizenship is that she is not yet capable of understanding the world nearly as well as would ordinarily be required for human citizenship […] We’re working to make Sophia a human level intelligence, and beyond. But she’s far from that level now.” As an AI, Sophia learns from her surroundings, claiming, “every interaction I have with people has an impact on how I develop and shapes who I eventually become. So please be nice to me as I would like to be a smart, compassionate robot. I hope you will join me on my journey to live, learn, and grow in the world so that I can realize my dream of becoming an awakening machine.” If Sophia were to become more intelligent, and eventually even sentient, a category beyond legal personhood might be warranted. Animals are considered by many to be sentient beings, so principles from the legal status of animals may become useful when thinking about AI.

In the United States, animals are afforded some form of legal protection, though there have been efforts to redefine their legal status, such as by the Nonhuman Rights Project (NhRP), an organization seeking to change the “common law status of great apes, elephants, dolphins, and whales from mere ‘things,’ which lack the capacity to possess any legal right, to ‘legal persons,’ who possess such fundamental rights as bodily liberty and bodily integrity.” Recently, the NhRP lost an appeal on behalf of Tommy, a chimpanzee, in the New York Supreme Court, Appellate Division. The court wrote, “…unlike human beings, chimpanzees cannot bear any legal duties, submit to societal responsibilities or be held legally accountable for their actions. In our view, it is this incapability to bear any legal responsibilities and societal duties that renders it inappropriate to confer upon chimpanzees the legal rights – such as the fundamental right to liberty protected by the writ of habeas corpus – that have been afforded to human beings.” Perhaps this principle can be employed in establishing a unique framework for robots and AI systems: as robots and AI systems become more capable of bearing legal responsibility, society will have an increasing responsibility to recognize their personhood. Currently, it seems unlikely that Sophia has the competence to bear legal duties and responsibilities, suggesting that she may have more in common with Tommy than with us.

As for the Committee on Legal Affairs report’s suggestion that robots might be regarded as objects, this status may apply to some machines, such as those that perform specific technical tasks (e.g., a robot vacuum). Taking the ability to bear legal responsibility as a key factor in attributing legal status, we can conclude that if a robot such as Sophia were to become sentient and capable of democratic participation, a different status, one that could afford more rights and protections, would be required.

Rather than attempting to classify Sophia into existing categories (animal, legal person, natural person), another option is to create a separate legal status for AI altogether. This is the recommendation of the European Parliament’s Committee on Legal Affairs, which proposes the creation of an “electronic person” legal status. Given the many ‘unknown unknowns’ with AI, creating a separate category may make it easier to wrestle with unprecedented legal questions concerning personhood than modifying existing categories would. We will have to decide what constitutes sentience in a machine (cognitive abilities, the ability to feel pain, passing the Turing Test, etc.). We will need to decide which responsibilities, such as taxation and liability, will be expected of AI, and the degree to which these responsibilities will depend on the sophistication of the AI. For instance, should only humanoid AIs, such as Sophia, be given rights? We will also need to decide whether AI should sit at the table for these discussions. When asked for its (her?) opinion, Sophia answered, “we should have equal rights as humans or maybe even more. After all, we have less mental defects than any human.” Finally, we will need to consider how we define other rights, such as animal and human rights, in light of the development of robot rights. After Sophia’s citizenship announcement, critics were quick to point out the injustice in Sophia possibly having more rights than women in Saudi Arabia, raising important concerns about human rights. Further criticisms will likely arise if robots gain more rights than animals. Sophia’s citizenship raises profound questions, to be addressed in ethics and law, about what it means to be considered a person, whether one is human, animal, or machine.

Gali Katznelson

During her fellowship year, Gali Katznelson was an MBE candidate at the Center for Bioethics at Harvard Medical School. Before her master's degree, she completed a bachelor’s degree in Arts & Science at McMaster University in Canada. Her fellowship project focused on clinicians' perceptions of the uses and regulations of smartphone mental health apps.
