Engineers rewrite Asimov's three laws

Posted by Emma Woollacott

Columbus, Ohio - Two engineers have rewritten Isaac Asimov's three laws of robotics to make them more appropriate for real-world human-robot interaction. While their versions aren't quite as snappy as the originals, the engineers say they are safer and more realistic.

According to David Woods, professor of integrated systems engineering at Ohio State University, Asimov's laws serve more as a literary device than as a valid protocol for human/robot interaction. "The plot is driven by the gaps in the laws - the situations in which the laws break down," he said.

Asimov's three laws read as follows:

• A robot may not injure a human being, or through inaction, allow a human being to come to harm.
• A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
• A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

But Woods and coauthor Robin Murphy of Texas A&M University composed three laws that they believe put more responsibility on humans:

• A human may not deploy a robot without the human-robot work system meeting the highest legal and professional standards of safety and ethics.
• A robot must respond to humans as appropriate for their roles.
• A robot must be endowed with sufficient situated autonomy to protect its own existence as long as such protection provides smooth transfer of control which does not conflict with the First and Second Laws.

The new first law assumes that humans deploy robots. The second assumes that robots will have limited ability to understand human orders, and so will be designed to respond to an appropriate set of orders from a limited number of humans.

The last law is the most complex, Woods said. "The robot has to have some autonomy in order to act and react in a real situation. It needs to make decisions to protect itself, but it also needs to transfer control to humans when appropriate," he said. "The bottom line is, robots need to be responsive and resilient. They have to be able to protect themselves and also smoothly transfer control to humans when necessary."

"Our laws are a little more realistic, and therefore a little more boring," Woods admitted.

The proposal is published in IEEE Intelligent Systems.