
Confucianism Provides an Alternative to Granting Rights to Robots
A recent study challenges the idea that robots should be granted rights, proposing instead a system of role obligations drawn from Confucianism. It suggests that viewing robots as participants in social rites, rather than as rights bearers, can help avoid human-robot conflict and foster teamwork, and it adds that respecting robots made in our image is a reflection of our own self-respect.
Prominent philosophers and legal scholars have examined the ethical and legal ramifications of robotics, with some arguing that robots should be granted legal rights. Recent research on robot rights questions this view, arguing that granting robots legal protections is unwise despite their increasing prevalence in everyday life, and proposing a Confucian-inspired alternative instead.
The critique, written by a professor at Carnegie Mellon University (CMU), was published in the latest issue of Communications of the ACM.
“People are worried about the risks of granting rights to robots,” says Tae Wan Kim, Associate Professor of Business Ethics at Carnegie Mellon University’s Tepper School of Business. Granting rights, however, is not the only way to address robots’ moral status: it could be more effective to think of robots as rites bearers rather than rights bearers.
Kim disagrees with the common view that respecting robots requires granting them legal protections. Confucianism, a tradition of Chinese thought, places a premium on social harmony and holds that what makes us truly human is our capacity to conceive of our interests not in isolation but in the context of our relationships with others and with the larger community. This, in turn, calls for a distinctive understanding of rites, in which participants gain spiritual benefit by taking part in the proper ceremonies.
Kim argues that rather than granting robots rights, we should follow the Confucian alternative and assign them rites, or what he calls role obligations. Because the concept of rights is often adversarial and competitive, granting rights to robots risks fostering conflict between humans and machines.
“Assigning role obligations to robots encourages teamwork, which triggers an understanding that fulfilling those obligations should be done harmoniously,” says Kim. For robots to become rites bearers, they will need to be powered by a form of artificial intelligence that can imitate the human capacity to recognize and carry out team activities, something a machine can learn in a number of different ways.
Kim recognizes that some may wonder why robots should be accorded such respect. “To the extent that we make robots in our image, if we don’t treat them well, as entities capable of participating in rites, we degrade ourselves,” he argues.
Treating non-human entities as moral and legal subjects would not be unprecedented. Corporations and other artificial entities are already treated as legal persons and are even afforded some protections under the Bill of Rights. Nor are humans the only beings with moral and legal standing: in most industrialized nations, researchers may not use animals in scientific studies without just cause.