In the current issue of the journal IEEE Intelligent Systems, two engineers propose alternative laws to rewrite our future with robots.
The future they foresee is at once safer and more realistic.
“When you think about it, our cultural view of robots has always been anti-people, pro-robot,” explained David Woods, professor of integrated systems engineering at Ohio State University. “The philosophy has been, ‘sure, people make mistakes, but robots will be better -- a perfect version of ourselves.’ We wanted to write three new laws to get people thinking about the human-robot relationship in more realistic, grounded ways.”
Asimov’s laws are iconic not only among engineers and science fiction enthusiasts, but the general public as well. The laws often serve as a starting point for discussions about the relationship between humans and robots.
But while evidence suggests that Asimov thought long and hard about his laws when he wrote them, Woods believes that the author did not intend for engineers to create robots that followed those laws to the letter.
“Go back to the original context of the stories,” Woods said, referring to Asimov’s I, Robot among others. “He’s using the three laws as a literary device. The plot is driven by the gaps in the laws -- the situations in which the laws break down. For those laws to be meaningful, robots have to possess a degree of social intelligence and moral intelligence, and Asimov examines what would happen when that intelligence isn’t there.”
“His stories are so compelling because they focus on the gap between our aspirations about robots and our actual capabilities. And that’s the irony, isn’t it? When we envision our future with robots, we focus on our hopes and desires and aspirations about robots -- not reality.”
In reality, engineers are still struggling to give robots basic vision and language skills. These efforts are hindered in part by our limited understanding of how the human brain manages those skills. We are far from being able to teach robots a moral code or a sense of responsibility.
Woods and his coauthor, Robin Murphy of Texas A&M University, composed three laws that put the responsibility back on humans.
Woods directs the Cognitive Systems Engineering Laboratory at Ohio State, and is an expert in automation safety. Murphy is the Raytheon Professor of Computer Science and Engineering at Texas A&M, and is an expert in both rescue robotics and human-robot interaction.
Their laws focus on the human organizations that develop and deploy robots, and on ways to hold those organizations to high safety standards.
Here are Asimov’s original three laws:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
And here are the three new laws that Woods and Murphy propose:
1. A human may not deploy a robot without the human-robot work system meeting the highest legal and professional standards of safety and ethics.
2. A robot must respond to humans as appropriate for their roles.
3. A robot must be endowed with sufficient situated autonomy to protect its own existence, as long as such protection provides for a smooth transfer of control that does not conflict with the First and Second Laws.
The new first law assumes the reality that humans deploy robots. The second assumes that robots will have limited ability to understand human orders, and so they will be designed to respond to an appropriate set of orders from a limited number of humans.
The last law is the most complex, Woods said.
“Robots exist in an open world where you can’t predict everything that’s going to happen. The robot has to have some autonomy in order to act and react in a real situation. It needs to make decisions to protect itself, but it also needs to transfer control to humans when appropriate. You don’t want a robot to drive off a ledge, for instance -- unless a human needs the robot to drive off the ledge. When those situations happen, you need to have smooth transfer of control from the robot to the appropriate human,” Woods said.
“The bottom line is, robots need to be responsive and resilient. They have to be able to protect themselves and also smoothly transfer control to humans when necessary.”
Woods admits that one thing is missing from the new laws: the romance of Asimov’s fiction -- the idea of a perfect, moral robot that sets engineers’ hearts fluttering.
“Our laws are a little more realistic, and therefore a little more boring,” he laughed.
David Woods | Newswise Science News