Teleoperating robots with virtual reality

05.10.2017

VR system from Computer Science and Artificial Intelligence Laboratory could make it easier for factory workers to telecommute

Many manufacturing jobs require a physical presence to operate machinery. But what if such jobs could be done remotely? This week researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) presented a virtual-reality (VR) system that lets you teleoperate a robot using an Oculus Rift headset.


Image credit: Jason Dorfman, MIT CSAIL

The system embeds the user in a VR control room with multiple sensor displays, making it feel like they are inside the robot's head. By using gestures, users can match their movements to the robot's to complete various tasks.

"A system like this could eventually help humans supervise robots from a distance," says CSAIL postdoctoral associate Jeffrey Lipton, who was lead author on a related paper about the system. "By teleoperating robots from home, blue-collar workers would be able to tele-commute and benefit from the IT revolution just as white-collars workers do now."

The researchers even imagine that such a system could help employ increasing numbers of jobless video-gamers by "game-ifying" manufacturing positions.

The team demonstrated their VR control approach with the Baxter humanoid robot from Rethink Robotics, but said that the approach can work on other robot platforms and is also compatible with the HTC Vive headset.

Lipton co-wrote the paper with CSAIL director Daniela Rus and researcher Aidan Fay. They presented the paper this week at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in Vancouver.

How it works

There have traditionally been two main approaches to using VR for teleoperation.

In a "direct" model, the user's vision is directly coupled to the robot's state. With these systems, a delayed signal could lead to nausea and headaches, and the user's viewpoint is limited to one perspective.

In the "cyber-physical" model, the user is separate from the robot and interacts with a virtual copy of the robot and its environment. This requires much more data and specialized spaces.

The CSAIL team's system sits halfway between these two methods. It solves the delay problem, since the user constantly receives visual feedback from the virtual world. It also solves the cyber-physical issue of being distinct from the robot: once a user puts on the headset and logs into the system, they will feel as if they are inside Baxter's head.
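To illustrate the decoupling at the heart of this hybrid approach, here is a minimal Python sketch. The names and rates (VirtualControlRoom, a roughly one-in-three chance of a new frame per tick) are made up for illustration and are not taken from the paper: the headset view is redrawn from the local virtual scene on every tick, while robot camera frames are swapped in only as they happen to arrive.

```python
import random

class VirtualControlRoom:
    """Hypothetical stand-in for the virtual control room the user sits in."""

    def __init__(self):
        self.latest_robot_frame = None

    def maybe_receive_robot_frame(self, tick):
        # Stand-in for the network link: pretend a new camera frame
        # arrives on roughly a third of the render ticks.
        if random.random() < 0.3:
            self.latest_robot_frame = f"camera frame received at tick {tick}"

    def render(self, tick):
        # The local scene is drawn every tick regardless of the robot,
        # so a slow or stalled feed never freezes the user's headset view.
        view = self.latest_robot_frame or "last known frame / placeholder"
        print(f"tick {tick}: control room rendered, robot display shows: {view}")

room = VirtualControlRoom()
for tick in range(6):   # stand-in for the headset's fixed-rate render loop
    room.maybe_receive_robot_frame(tick)
    room.render(tick)
```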

The system mimics the "homunculus model of mind": the idea that there's a small human inside our brains controlling our actions, viewing the images we see and understanding them for us. While it's a peculiar idea for humans, for robots it fits: "inside" the robot is a human in a control room, seeing through its eyes and controlling its actions.

Using Oculus' controllers, users can interact with controls that appear in the virtual space to open and close the hand grippers to pick up, move, and retrieve items. A user can plan movements based on the distance between the arm's location marker and their hand while looking at the live display of the arm.
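As a rough illustration of that control loop, the sketch below uses hypothetical names (Pose, control_step, a 2 cm deadband) rather than the actual CSAIL or Rethink Robotics interfaces: it steers the arm toward the user's hand whenever the gap exceeds a small threshold and sets the gripper from the trigger state.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A 3-D position in the shared virtual frame (metres)."""
    x: float
    y: float
    z: float

def distance(a: Pose, b: Pose) -> float:
    # Straight-line distance between the arm's location marker and the hand.
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2) ** 0.5

def control_step(hand: Pose, arm_marker: Pose, trigger_pressed: bool,
                 deadband: float = 0.02):
    """One teleoperation tick: steer the arm toward the hand and set the
    gripper from the trigger state (illustrative logic only)."""
    commands = []
    if distance(hand, arm_marker) > deadband:
        commands.append(("move_arm_toward", (hand.x, hand.y, hand.z)))
    commands.append(("close_gripper",) if trigger_pressed else ("open_gripper",))
    return commands

# Example tick: the hand is 10 cm above the arm's marker with the trigger held.
print(control_step(Pose(0.5, 0.0, 0.4), Pose(0.5, 0.0, 0.3), True))
```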

To make these movements possible, the human's space is mapped into the virtual space, and the virtual space is then mapped into the robot space to provide a sense of co-location.
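One common way to realize such a chain of mappings is with homogeneous transforms. The Python sketch below is a generic illustration with made-up calibration offsets, not the transforms used in the paper: a point measured in the user's space is carried into the virtual space and then into the robot's space.

```python
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Illustrative calibration values (identity rotations, invented offsets):
# the user's tracking frame expressed in the shared virtual frame, and the
# virtual frame expressed in the robot's base frame.
T_virtual_from_human = make_transform(np.eye(3), np.array([0.0, 0.0, -1.2]))
T_robot_from_virtual = make_transform(np.eye(3), np.array([0.6, 0.0, 0.0]))

def human_to_robot(point_in_human_frame: np.ndarray) -> np.ndarray:
    """Chain the two mappings: human space -> virtual space -> robot space."""
    p = np.append(point_in_human_frame, 1.0)   # homogeneous coordinates
    return (T_robot_from_virtual @ T_virtual_from_human @ p)[:3]

# A hand position measured in the user's own space becomes a target
# expressed in the robot's space:
print(human_to_robot(np.array([0.3, 0.1, 1.4])))
```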

The system is also more flexible than previous systems, which require many computing resources. Other systems might extract 2-D information from each camera, build out a full 3-D model of the environment, and then process and redisplay the data.

In contrast, the CSAIL team's approach bypasses all of that by taking the 2-D images that are displayed to each eye. (The human brain does the rest by automatically inferring the 3-D information.)
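A minimal sketch of that pass-through idea, with stand-in functions in place of the real camera driver and VR runtime (neither is the actual CSAIL code), might look like this:

```python
import numpy as np

def fetch_stereo_frames():
    """Stand-in for grabbing the robot's left and right camera images
    (here just two random 480x640 RGB arrays)."""
    left = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
    right = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
    return left, right

def submit_to_headset(left_eye: np.ndarray, right_eye: np.ndarray):
    """Stand-in for handing one texture per eye to the VR runtime."""
    print(f"left eye gets {left_eye.shape}, right eye gets {right_eye.shape}")

def passthrough_frame():
    # No point clouds or meshes are built; the raw 2-D frames are routed
    # straight to the matching eyes, and the viewer's brain fuses the depth.
    left, right = fetch_stereo_frames()
    submit_to_headset(left, right)

passthrough_frame()
```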

To test the system, the team first teleoperated Baxter to do simple tasks like picking up screws or stapling wires. They then had the test users teleoperate the robot to pick up and stack blocks.

Users successfully completed the tasks at a much higher rate compared to the "direct" model. Unsurprisingly, users with gaming experience had much more ease with the system.

Tested against state-of-the-art systems, CSAIL's system was better at grasping objects 95 percent of the time and 57 percent faster at doing tasks. The team also showed that the system could pilot the robot from hundreds of miles away, testing it on a hotel's wireless network in Washington, DC to control Baxter at MIT.

"This contribution represents a major milestone in the effort to connect the user with the robot's space in an intuitive, natural, and effective manner." says Oussama Khatib, a computer science professor at Stanford University who was not involved in the paper.

The team eventually wants to focus on making the system more scalable, supporting many users and different types of robots that are compatible with current automation technologies.

###

The project was funded in part by the Boeing Company and the National Science Foundation.

Media Contact

Adam Conner-Simons
aconner@csail.mit.edu
617-324-9135

 @mit_csail

http://www.csail.mit.edu/ 

Adam Conner-Simons | EurekAlert!
