“We have developed algorithms that allow a robot to determine whether it should deceive a human or other intelligent machine and we have designed techniques that help the robot select the best deceptive strategy to reduce its chance of being discovered,” said Ronald Arkin, a Regents professor in the Georgia Tech School of Interactive Computing.
The results of robot experiments and theoretical and cognitive deception modeling were published online on September 3 in the International Journal of Social Robotics. Because the researchers explored the phenomena of robot deception from a general perspective, the study’s results apply to robot-robot and human-robot interactions. This research was funded by the Office of Naval Research.
In the future, robots capable of deception may be valuable in several areas, including military and search-and-rescue operations. A search-and-rescue robot may need to deceive in order to calm a panicked victim or gain that person's cooperation. On the battlefield, robots capable of deception could hide from and mislead the enemy to keep themselves and valuable information safe.
“Most social robots will probably rarely use deception, but it’s still an important tool in the robot’s interactive arsenal because robots that recognize the need for deception have advantages in terms of outcome compared to robots that do not recognize the need for deception,” said the study’s co-author, Alan Wagner, a research engineer at the Georgia Tech Research Institute.
For this study, the researchers developed programs that produce deceptive behavior by focusing on the actions, beliefs and communications of a robot attempting to hide from another robot. Their first step was to teach the deceiving robot how to recognize a situation that warranted the use of deception. Wagner and Arkin used interdependence theory and game theory to develop algorithms that tested the value of deception in a specific situation. A situation had to satisfy two key conditions to warrant deception: there must be conflict between the deceiving robot and the seeker, and the deceiver must benefit from the deception.
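The two-condition test can be illustrated with a minimal sketch. This is not the authors' code; the payoff-tuple representation and the function name `deception_warranted` are assumptions chosen to make the conflict and benefit conditions concrete.

```python
def deception_warranted(truthful, deceive):
    """Decide whether deception is warranted under the study's two conditions.

    Each argument is a (deceiver_payoff, seeker_payoff) tuple describing
    the outcome of one course of action for the deceiving robot.
    """
    # Condition 1: conflict -- the course of action the deceiver prefers
    # is not the one the seeker prefers.
    deceiver_prefers_deceiving = deceive[0] > truthful[0]
    seeker_prefers_deceiving = deceive[1] > truthful[1]
    conflict = deceiver_prefers_deceiving != seeker_prefers_deceiving

    # Condition 2: benefit -- deceiving improves the deceiver's own outcome.
    benefit = deceive[0] > truthful[0]

    return conflict and benefit

# Hide-and-seek: being found is bad for the hider and good for the seeker,
# so hiding deceptively satisfies both conditions.
print(deception_warranted(truthful=(0, 1), deceive=(1, 0)))  # True
```

In a richer interdependence-theory model the payoffs would come from a full outcome matrix over both robots' action choices, but the same two checks apply.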
Once a situation was deemed to warrant deception, the robot carried out a deceptive act by providing a false communication to benefit itself. The technique developed by the Georgia Tech researchers based a robot’s deceptive action selection on its understanding of the individual robot it was attempting to deceive.
To test their algorithms, the researchers ran 20 hide-and-seek experiments with two autonomous robots. Colored markers were lined up along three potential pathways to locations where the robot could hide. The hider robot randomly selected a hiding location from the three location choices and moved toward that location, knocking down colored markers along the way. Once it reached a point past the markers, the robot changed course and hid in one of the other two locations. The presence or absence of standing markers indicated the hider’s location to the seeker robot.
“The hider’s set of false communications was defined by selecting a pattern of knocked over markers that indicated a false hiding position in an attempt to say, for example, that it was going to the right and then actually go to the left,” explained Wagner.
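A hedged sketch of how the hider might select its false communication in this marker setup follows. The path names, the `PATH_MARKERS` mapping and the function are illustrative assumptions, not the researchers' implementation; the idea is simply that the hider commits to a true hiding spot and then knocks over the markers that signal a different one.

```python
import random

# Hypothetical mapping from each hiding location to the markers that,
# when knocked over, signal travel toward that location.
PATH_MARKERS = {
    "left": ["L1", "L2"],
    "middle": ["M1", "M2"],
    "right": ["R1", "R2"],
}

def choose_false_signal(true_spot):
    """Pick a decoy location (never the true one) and return it together
    with the markers to knock over so the seeker infers the decoy."""
    decoys = [spot for spot in PATH_MARKERS if spot != true_spot]
    false_spot = random.choice(decoys)
    return false_spot, PATH_MARKERS[false_spot]

# The hider actually goes left but signals one of the other two paths.
false_spot, markers_to_knock = choose_false_signal("left")
```

The 25 percent failure rate reported below corresponds to the physical step this sketch omits: actually knocking over exactly the chosen markers in a noisy environment.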
The hider robots were able to deceive the seeker robots in 75 percent of the trials, with the failed experiments resulting from the hiding robot’s inability to knock over the correct markers to produce the desired deceptive communication.
“The experimental results weren’t perfect, but they demonstrated the learning and use of deception signals by real robots in a noisy environment,” said Wagner. “The results were also a preliminary indication that the techniques and algorithms described in the paper could be used to successfully produce deceptive behavior in a robot.”
While there may be advantages to creating robots with the capacity for deception, there are also ethical implications that need to be considered to ensure that these creations are consistent with the overall expectations and well-being of society, according to the researchers.
“We have been concerned from the very beginning with the ethical implications related to the creation of robots capable of deception and we understand that there are beneficial and deleterious aspects,” explained Arkin. “We strongly encourage discussion about the appropriateness of deceptive robots to determine what, if any, regulations or guidelines should constrain the development of these systems.”
This work was funded by Grant No. N00014-08-1-0696 from the Office of Naval Research (ONR). The content is solely the responsibility of the principal investigator and does not necessarily represent the official view of ONR.
Abby Vogel Robinson | Source: Newswise Science News
Further information: www.gatech.edu