Teleoperating robots with virtual reality

05.10.2017

VR system from Computer Science and Artificial Intelligence Laboratory could make it easier for factory workers to telecommute

Many manufacturing jobs require a physical presence to operate machinery. But what if such jobs could be done remotely? This week researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) presented a virtual-reality (VR) system that lets you teleoperate a robot using an Oculus Rift headset.

Image: VR system from the Computer Science and Artificial Intelligence Laboratory could make it easier for factory workers to telecommute. Credit: Jason Dorfman, MIT CSAIL

The system embeds the user in a VR control room with multiple sensor displays, making it feel like they are inside the robot's head. By using gestures, users can match their movements to the robot's to complete various tasks.

"A system like this could eventually help humans supervise robots from a distance," says CSAIL postdoctoral associate Jeffrey Lipton, who was lead author on a related paper about the system. "By teleoperating robots from home, blue-collar workers would be able to tele-commute and benefit from the IT revolution just as white-collars workers do now."

The researchers even imagine that such a system could help employ increasing numbers of jobless video-gamers by "game-ifying" manufacturing positions.

The team demonstrated their VR control approach with the Baxter humanoid robot from Rethink Robotics, but said that the approach can work on other robot platforms and is also compatible with the HTC Vive headset.

Lipton co-wrote the paper with CSAIL director Daniela Rus and researcher Aidan Fay. They presented the paper this week at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in Vancouver.

How it works

There have traditionally been two main approaches to using VR for teleoperation.

In a "direct" model, the user's vision is directly coupled to the robot's state. With these systems, a delayed signal could lead to nausea and headaches, and the user's viewpoint is limited to one perspective.

In the "cyber-physical" model, the user is separate from the robot. The user interacts with a virtual copy of the robot and the environment. This requires much more data, and specialized spaces.

The CSAIL team's system is halfway between these two methods. It solves the delay problem, since the user is constantly receiving visual feedback from the virtual world. It also solves the cyber-physical issue of being distinct from the robot: once a user puts on the headset and logs into the system, they will feel as if they are inside Baxter's head.

The system mimics the "homunculus model of mind" - the idea that there's a small human inside our brains controlling our actions, viewing the images we see and understanding them for us. While it's a peculiar idea for humans, for robots it fits: "inside" the robot is a human in a control room, seeing through its eyes and controlling its actions.

Using Oculus' controllers, users can interact with controls that appear in the virtual space to open and close the hand grippers to pick up, move, and retrieve items. A user can plan movements based on the distance between the arm's location marker and their hand while looking at the live display of the arm.
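As a rough illustration of that control step (a sketch, not the team's actual code), the snippet below maps a hand controller's trigger value to open and close commands for a gripper; the helpers read_trigger and send_gripper_command are hypothetical stand-ins for whatever controller SDK and robot interface are actually used.

    # Hypothetical sketch: close the gripper while the controller trigger is held.
    # read_trigger() and send_gripper_command() are placeholder callables, not a real API.
    GRIP_THRESHOLD = 0.5  # trigger assumed to report 0.0 (released) to 1.0 (fully pressed)

    def update_gripper(read_trigger, send_gripper_command):
        """Issue a gripper command based on the current trigger value."""
        trigger_value = read_trigger()
        if trigger_value > GRIP_THRESHOLD:
            send_gripper_command("close")  # grasp: pick up or hold an item
        else:
            send_gripper_command("open")   # release the item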

To make these movements possible, the human's space is mapped into the virtual space, and the virtual space is then mapped into the robot space to provide a sense of co-location.
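A minimal sketch of that chain of mappings, assuming the two calibration transforms (human tracking frame to virtual control room, and virtual control room to robot base) are known 4x4 homogeneous matrices; the rotations and offsets below are made up purely for illustration.

    # Sketch of mapping a tracked hand position: human space -> virtual space -> robot space.
    # The calibration transforms below are illustrative values, not measured ones.
    import numpy as np

    def make_transform(rotation, translation):
        """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
        T = np.eye(4)
        T[:3, :3] = rotation
        T[:3, 3] = translation
        return T

    # Hypothetical calibration results (identity rotations, fixed offsets in meters).
    T_virtual_from_human = make_transform(np.eye(3), np.array([0.0, 0.0, 1.2]))
    T_robot_from_virtual = make_transform(np.eye(3), np.array([0.5, 0.0, -0.3]))

    def hand_to_robot(hand_position_human):
        """Map a hand position from the human's tracking frame into the robot's base frame."""
        p = np.append(hand_position_human, 1.0)      # homogeneous coordinates
        p_virtual = T_virtual_from_human @ p         # human space -> virtual space
        p_robot = T_robot_from_virtual @ p_virtual   # virtual space -> robot space
        return p_robot[:3]

    print(hand_to_robot(np.array([0.2, -0.1, 0.9])))  # -> [0.7, -0.1, 1.8]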

The system is also more flexible than previous approaches, which require many resources. Other systems might extract 2-D information from each camera, build out a full 3-D model of the environment, and then process and redisplay the data.

In contrast, the CSAIL team's approach bypasses all of that by taking the 2-D images that are displayed to each eye. (The human brain does the rest by automatically inferring the 3-D information.)
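To make the contrast concrete, here is a bare-bones sketch of that stereo pass-through loop, with hypothetical helpers capture_frame (one 2-D image per robot head camera) and draw_eye (one render target per eye); the key point is that no 3-D model is built anywhere in the loop.

    # Sketch of stereo pass-through: forward each camera image to the matching eye.
    # capture_frame() and draw_eye() are hypothetical helpers, not a specific SDK.
    def stream_stereo(capture_frame, draw_eye, keep_running):
        while keep_running():
            left_image = capture_frame("left_camera")    # 2-D frame from the left camera
            right_image = capture_frame("right_camera")  # 2-D frame from the right camera
            draw_eye("left", left_image)                 # shown to the user's left eye
            draw_eye("right", right_image)               # shown to the user's right eye
            # No 3-D reconstruction: the viewer's brain fuses the stereo pair.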

To test the system, the team first teleoperated Baxter to do simple tasks like picking up screws or stapling wires. They then had the test users teleoperate the robot to pick up and stack blocks.

Users successfully completed the tasks at a much higher rate compared to the "direct" model. Unsurprisingly, users with gaming experience found the system much easier to use.

Tested against state-of-the-art systems, CSAIL's system was better at grasping objects 95 percent of the time and 57 percent faster at doing tasks. The team also showed that the system could pilot the robot from hundreds of miles away, testing it on a hotel's wireless network in Washington, DC, to control Baxter at MIT.

"This contribution represents a major milestone in the effort to connect the user with the robot's space in an intuitive, natural, and effective manner." says Oussama Khatib, a computer science professor at Stanford University who was not involved in the paper.

The team eventually wants to focus on making the system more scalable, so that it can support many users and different types of robots while remaining compatible with current automation technologies.

###

The project was funded in part by the Boeing Company and the National Science Foundation.

Media Contact

Adam Conner-Simons
aconner@csail.mit.edu
617-324-9135

 @mit_csail

http://www.csail.mit.edu/ 

Adam Conner-Simons | EurekAlert!
