New wrist-mounted device augments the human hand with two robotic fingers
Twisting a screwdriver, removing a bottle cap, and peeling a banana are just a few simple tasks that are tricky to pull off single-handedly. Now a new wrist-mounted robot can provide a helping hand — or rather, fingers.
Researchers at MIT have developed a robot that enhances the grasping motion of the human hand. The device, worn around one's wrist, works essentially like two extra fingers adjacent to the pinky and thumb. A novel control algorithm enables it to move in sync with the wearer's fingers to grasp objects of various shapes and sizes. Wearing the robot, a user could use one hand to, for instance, hold the base of a bottle while twisting off its cap.
"This is a completely intuitive and natural way to move your robotic fingers," says Harry Asada, the Ford Professor of Engineering in MIT's Department of Mechanical Engineering. "You do not need to command the robot, but simply move your fingers naturally. Then the robotic fingers react and assist your fingers."
Ultimately, Asada says, with some training people may come to perceive the robotic fingers as part of their body — "like a tool you have been using for a long time, you feel the robot as an extension of your hand." He hopes that the two-fingered robot may assist people with limited dexterity in performing routine household tasks, such as opening jars and lifting heavy objects. He and graduate student Faye Wu presented a paper on the robot this week at the Robotics: Science and Systems conference in Berkeley, Calif.
The robot, which the researchers have dubbed "supernumerary robotic fingers," consists of actuators linked together to exert forces as strong as those of human fingers during a grasping motion.
To develop an algorithm to coordinate the robotic fingers with a human hand, the researchers first looked to the physiology of hand gestures, learning that a hand's five fingers are highly coordinated. While a hand may reach out and grab an orange in a different way than, say, a mug, just two general patterns of motion are used to grasp objects: bringing the fingers together, and twisting them inwards. A grasp of any object can be explained through a combination of these two patterns.
The researchers hypothesized that a similar "biomechanical synergy" may exist not just among the five human fingers, but across all seven — five human plus two robotic. To test the hypothesis, Wu wore a glove outfitted with multiple position-recording sensors, with the robot attached to her wrist via a light brace. She then scavenged the lab for common objects, such as a box of cookies, a soda bottle, and a football.
Wu grasped each object with her hand, then manually positioned the robotic fingers to support the object. She recorded both hand and robotic joint angles multiple times with various objects, then analyzed the data, and found that every grasp could be explained by a combination of two or three general patterns among all seven fingers.
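The paper itself does not publish code, but the finding — that every recorded grasp can be explained by two or three general patterns across all seven fingers — is the kind of result a principal component analysis of the joint-angle recordings would produce. The sketch below is a minimal, hypothetical reconstruction of that analysis: the data are simulated, and the sample and joint counts are assumptions, not values from the study.

```python
import numpy as np

# Hypothetical recording: each row is one grasp sample, each column a joint
# angle across the seven fingers (five human plus two robotic).
rng = np.random.default_rng(0)
n_samples, n_joints = 200, 21

# Simulate data dominated by two underlying "synergy" patterns plus noise.
synergies = rng.normal(size=(2, n_joints))
weights = rng.normal(size=(n_samples, 2))
angles = weights @ synergies + 0.05 * rng.normal(size=(n_samples, n_joints))

# PCA: center the data, then use the SVD to rank the variance captured
# by each principal component.
centered = angles - angles.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / np.sum(s ** 2)

# When a few synergies dominate, the first two or three components
# account for nearly all of the variance in the recorded postures.
print(explained[:3])
```

On real hand data the leading components would not be this clean, but the same diagnostic — how much variance the first few components explain — is how a small number of shared grasping patterns would reveal itself.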
The researchers used this information to develop a control algorithm to correlate the postures of the two robotic fingers with those of the five human fingers. Asada explains that the algorithm essentially "teaches" the robot to assume a certain posture that the human expects the robot to take.
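The article describes the control algorithm only at a high level — it correlates the robotic fingers' postures with the human fingers' postures learned from the paired recordings. One simple, hypothetical way to realize such a correlation is a linear map fit by least squares; the sketch below assumes that approach, with made-up joint counts and simulated data, and is not the authors' actual controller.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 150
n_human, n_robot = 15, 6  # hypothetical joint counts

# Paired recordings: human joint angles and the manually positioned
# robotic-finger angles for the same grasps.
human = rng.normal(size=(n_samples, n_human))
true_map = rng.normal(size=(n_human, n_robot))
robot = human @ true_map + 0.01 * rng.normal(size=(n_samples, n_robot))

# Fit a linear posture map with ordinary least squares: robot ≈ human @ W.
W, *_ = np.linalg.lstsq(human, robot, rcond=None)

# At run time, a newly sensed hand posture is mapped to the robot posture
# the wearer would expect the extra fingers to take.
new_posture = rng.normal(size=n_human)
robot_command = new_posture @ W
```

A linear map is the simplest choice consistent with the synergy idea; a real controller would also have to handle sensor noise, joint limits, and the per-user preferences discussed below.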
Bringing robots closer to humans
For now, the robot mimics the grasping of a hand, closing in and spreading apart in response to a human's fingers. But Wu would like to take the robot one step further, controlling not just position, but also force.
"Right now we're looking at posture, but it's not the whole story," Wu says. "There are other things that make a good, stable grasp. With an object that looks small but is heavy, or is slippery, the posture would be the same, but the force would be different, so how would it adapt to that? That's the next thing we'll look at."
Wu also notes that certain gestures — such as grabbing an apple — may differ slightly from person to person, and ultimately, a robotic aid may have to account for personal grasping preferences. To that end, she envisions developing a library of human and robotic gesture correlations. As a user works with the robot, it could learn to adapt to match his or her preferences, discarding others from the library. She likens this machine learning to that of voice-command systems, like Apple's Siri.
"After you've been using it for a while, it gets used to your pronunciation so it can tune to your particular accent," Wu says. "Long-term, our technology can be similar, where the robot can adjust and adapt to you."
Down the road, Asada says the robot may also be scaled down to a less bulky form.
"This is a prototype, but we can shrink it down to one-third its size, and make it foldable," Asada says. "We could make this into a watch or a bracelet where the fingers pop up, and when the job is done, they come back into the watch. Wearable robots are a way to bring the robot closer to our daily life."
Written by Jennifer Chu, MIT News Office
Sarah McDonnell | EurekAlert!