Intuitive Visual Control Provides Faster Robot Operation

27.09.2012
Using a novel method of integrating video technology and familiar control devices, a research team from Georgia Tech and the Georgia Tech Research Institute (GTRI) is developing a technique to simplify remote control of robotic devices.

The researchers' aim is to enhance a human operator's ability to perform precise tasks using a multi-jointed robotic device such as an articulated mechanical arm. The new approach has been shown to be easier and faster than older methods, especially when the robot is controlled by an operator who is watching it on a video monitor.

Known as Uncalibrated Visual Servoing for Intuitive Human Guidance of Robots, the new method uses a special implementation of an existing vision-guided control method called visual servoing (VS). By applying visual-servoing technology in innovative ways, the researchers have constructed a robotic system that responds to human commands more directly and intuitively than older techniques.

"Our approach exploits 3-D video technology to let an operator guide a robotic device in ways that are more natural and time-saving, yet are still very precise," said Ai-Ping Hu, a GTRI senior research engineer who is leading the effort. "This capability could have numerous applications – especially in situations where directly observing the robot's operation is hazardous or not possible – including bomb disposal, handling of hazardous materials and search-and-rescue missions."

A paper on this technology was presented at the 2012 IEEE International Conference on Robotics and Automation held in St. Paul, Minn.

For decades articulated robots have been used by industry to perform precision tasks such as welding vehicle seams or assembling electronics, Hu explained. The user develops a software program that enables the device to cycle through the required series of motions, using feedback from sensors built into the robot.

But such programming can be complex and time-consuming. The robot must typically be maneuvered joint by joint through the numerous actions required to complete a task. Moreover, such technology works only in a structured and unchanging environment, such as a factory assembly line, where spatial relationships are constant.

The Human Operator

In recent years, new techniques have enabled human operators to freely guide remote robots through unstructured and unfamiliar environments, to perform such challenging tasks as bomb disposal, Hu said. Operators have controlled the device in one of two ways: by "line of sight" – direct user observation – or by means of a conventional, two-dimensional camera that is mounted on the robot to send back an image of both the robot and its target.

But humans guiding robots via either method face some of the same complexities that challenge those who program industrial robots, he added. Manipulating a remote robot into place is generally slow and laborious.

That's especially true when the operator must depend on the imprecise images provided by 2-D video feedback. Manipulating separate controls for each of the robot's multiple joint axes, users have only limited visual information to help them and must maneuver to the target by trial and error.

"Essentially, the user is trying to visualize and reconstruct a 3-D scenario from flat 2-D camera images," Hu said. "The process can become particularly confusing when operators are facing in a different direction from the robot and must mentally reorient themselves to try to distinguish right from left. It's somewhat similar to backing up a vehicle with an attached trailer – you have to turn the steering wheel to the left to get the trailer to move right, which is decidedly non-intuitive."

The Visual Servoing Advantage

To simplify user control, the Georgia Tech team turned to visual servoing, in which visual feedback from a camera is used to control a robot's motion. Visual servoing has been studied for years as a way to use video cameras to help robots re-orient themselves within a structured environment such as an assembly line.

Traditional visual servoing is calibrated, meaning that position information generated by a video camera can be transformed into data meaningful to the robot. Using these data, the robot can adjust itself to stay in a correct spatial relationship with target objects.

"Say a conveyor line is accidentally moved a few millimeters," Hu said. "A robot with a calibrated visual servoing capability can automatically detect the movement using the video image and a fixed reference point, and then readjust to compensate."
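The compensation Hu describes can be sketched as a minimal image-based visual-servoing step: the robot measures where a fixed reference feature appears in the camera image and commands a velocity that drives the error back to zero. The function name, the proportional gain, and the pixel values below are illustrative assumptions, not the team's implementation:

```python
# Minimal image-based visual-servoing step (illustrative sketch).
# The robot nudges itself so an observed feature returns to a fixed
# reference point in the camera image; names and gain are assumptions.

def servo_step(observed_px, reference_px, gain=0.5):
    """Return a 2-D corrective velocity from the pixel-space error."""
    err_x = reference_px[0] - observed_px[0]
    err_y = reference_px[1] - observed_px[1]
    # Proportional control: command velocity toward the reference.
    return (gain * err_x, gain * err_y)

# Conveyor line shifts a few millimeters -> the feature drifts in the image.
vx, vy = servo_step(observed_px=(324, 240), reference_px=(320, 240))
print(vx, vy)  # -2.0 0.0: move to cancel the 4-pixel drift
```

Run in a loop, each step shrinks the image error until the robot is back in the correct spatial relationship with the target.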

But visual servoing offers additional possibilities. The research team – which includes Hu, associate professor Harvey Lipkin of the School of Mechanical Engineering, graduate student Matthew Marshall, GTRI research engineer Michael Matthews and GTRI principal research engineer Gary McMurray – has adapted visual-servoing technology in ways that facilitate human control of remote robots.

The new technique takes advantage of both calibrated and uncalibrated techniques. A calibrated 3-D "time of flight" camera is mounted on the robot – typically at the end of a robotic arm, in a gripping device called an end-effector. This approach is sometimes called an eye-in-hand system, because of the camera's location in the robot's "hand."

The camera utilizes an active sensor that detects depth data, allowing it to send back 3-D coordinates that pinpoint the end-effector's spatial location. At the same time, the eye-in-hand camera also supplies a standard, uncalibrated 2-D grayscale video image to the operator's monitor.
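Turning the camera's per-pixel depth readings into the 3-D coordinates mentioned above is conventionally done with pinhole-model back-projection. The focal lengths and principal point below are made-up calibration values for illustration, not parameters of the team's time-of-flight camera:

```python
def pixel_to_3d(u, v, depth, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Back-project image pixel (u, v) with a measured depth (meters)
    into camera-frame 3-D coordinates using a pinhole camera model.
    fx, fy: focal lengths in pixels; cx, cy: principal point (assumed)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A point 1.0 m away, 50 pixels right of the principal point:
print(pixel_to_3d(370, 240, 1.0))  # (0.1, 0.0, 1.0)
```

Applied to the end-effector's position in the image, this kind of mapping yields the spatial coordinates the control loop needs, while the raw 2-D grayscale stream goes to the operator unchanged.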

The result is that the operator, without seeing the robot, now has a robot's-eye view of the target. Watching this image in a monitor, an operator can visually guide the robot using a gamepad, in a manner somewhat reminiscent of a first-person 3-D video game.

In addition, visual-servoing technology now automatically actuates all the joints needed to complete whatever action the user indicates on the gamepad – rather than the user having to manipulate those joints one by one. In the background, the Georgia Tech system performs the complex computation needed to coordinate the monitor image, the 3-D camera information, the robot's spatial position and the user's gamepad commands.
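In outline, that background computation maps the user's Cartesian gamepad command through the robot's Jacobian into velocities for every joint at once. The sketch below uses a hypothetical two-joint planar arm and a damped-least-squares solve; it illustrates the idea of "one command actuates all joints" and is not the Georgia Tech system's code:

```python
import math

def planar_jacobian(q1, q2, l1=1.0, l2=1.0):
    """Jacobian of a 2-link planar arm: end-effector (x, y) velocity
    per unit joint velocity. Link lengths are illustrative."""
    s1, s12 = math.sin(q1), math.sin(q1 + q2)
    c1, c12 = math.cos(q1), math.cos(q1 + q2)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def joints_from_gamepad(q, v_cmd, damping=0.01):
    """Map a Cartesian gamepad command (vx, vy) to joint velocities,
    moving all joints together: qdot = J^T (J J^T + d*I)^-1 v_cmd."""
    (a, b), (c, d) = planar_jacobian(*q)
    # M = J J^T + damping * I  (2x2, symmetric)
    m11 = a * a + b * b + damping
    m12 = a * c + b * d
    m22 = c * c + d * d + damping
    det = m11 * m22 - m12 * m12
    # Solve M y = v_cmd by Cramer's rule.
    y1 = (v_cmd[0] * m22 - m12 * v_cmd[1]) / det
    y2 = (m11 * v_cmd[1] - m12 * v_cmd[0]) / det
    # qdot = J^T y: every joint receives a coordinated velocity.
    return (a * y1 + c * y2, b * y1 + d * y2)

# Press 'left' on the gamepad (negative x): both joints respond together.
qdot = joints_from_gamepad((0.3, 0.8), (-0.1, 0.0))
```

The damping term keeps the solve well behaved near singular arm configurations; a real six-degree-of-freedom arm uses the same structure with a 6-column Jacobian.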

Testing System Usability

"The guidance process is now very intuitive – pressing 'left' on the gamepad will actuate all the requisite robot joints to effect a leftward displacement," Hu said. "What's more, the robot could be upside down and the controls will still respond in the same intuitive way – left is still left and right is still right."

To judge system usability, the Georgia Tech research team recently conducted trials to test whether the visual-servoing approach enabled faster task-completion times. Using a gamepad that controls an articulated-arm robot with six degrees of freedom, subjects performed four tests: they used visual-servoing guidance as well as conventional joint-based guidance, in both line-of-sight and camera-view modes.

In the line-of-sight test, volunteer participants using visual-servoing guidance averaged task-completion times that were 15 percent faster than when they used joint-based guidance. However, in camera-view mode, participants using visual-servoing guidance averaged 227 percent faster results than with the joint-based technique.

Hu noted that the visual-servoing system used in this test scenario was only one of numerous possible applications of the technology. The research team's plans include testing a mobile platform with a VS-guided robotic arm mounted on it. Also underway is a proof-of-concept effort that incorporates visual-servoing control into a low-cost, consumer-level robot.

"Our ultimate goal is to develop a generic, uncalibrated control framework that is able to use image data to guide many different kinds of robots," he said.

Research News & Publications Office
Georgia Institute of Technology
75 Fifth Street, N.W., Suite 309
Atlanta, Georgia 30308 USA
Media Relations Contact: John Toon (404-894-6986) (jtoon@gatech.edu)
Writer: Rick Robinson

John Toon | Newswise Science News
Further information:
http://www.gatech.edu
