Forum for Science, Industry and Business

Robot can be programmed by casually talking to it

23.06.2014

Robots are getting smarter, but they still need step-by-step instructions for tasks they haven't performed before.

Before you can tell your household robot "Make me a bowl of ramen noodles," you'll have to teach it how to do that. Since we're not all computer programmers, we'd prefer to give those instructions in English, just as we'd lay out a task for a child.

But human language can be ambiguous, and some instructors forget to mention important details. Suppose you told your household robot how to prepare ramen noodles, but forgot to mention heating the water or tell it where the stove is.

In his Robot Learning Lab, Ashutosh Saxena, assistant professor of computer science at Cornell University, is teaching robots to understand instructions in natural language from various speakers, account for missing information, and adapt to the environment at hand.

Saxena and graduate students Dipendra K. Misra and Jaeyong Sung will describe their methods at the Robotics: Science and Systems conference at the University of California, Berkeley, July 12-16.

Video and abstract available at http://tellmedave.cs.cornell.edu

The robot may have a built-in programming language with commands like find(pan); grasp(pan); carry(pan, water tap); fill(pan, water); carry(pan, stove) and so on. Saxena's software translates human instructions – such as "Fill a pan with water, put it on the stove, heat the water. When it's boiling, add the noodles." – into that robot language. Notice that you didn't say, "Turn on the stove." The robot has to be smart enough to fill in that missing step.
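To make the idea concrete, here is a toy sketch of such a translation step. All the names and rules below are invented for illustration – the article does not describe Cornell's actual software – but the sketch shows the key behavior: a natural-language instruction becomes a sequence of primitive commands, with an unmentioned step (turning on the stove) filled in.

```python
# Hypothetical sketch only: translating a natural-language instruction into
# a sequence of primitive robot commands, inserting a step the speaker
# never mentioned. The rules and command names are illustrative, not
# Cornell's actual system.

def translate(sentence):
    """Toy rule-based translation for the ramen example."""
    s = sentence.lower()
    plan = []
    if "fill a pan with water" in s:
        plan += [("find", "pan"), ("grasp", "pan"),
                 ("carry", "pan", "water_tap"), ("fill", "pan", "water")]
    if "put it on the stove" in s:
        plan += [("carry", "pan", "stove"), ("place", "pan", "stove")]
    if "heat the water" in s:
        # Inferred step: the speaker never said "turn on the stove".
        plan += [("turn_on", "stove")]
    if "boiling" in s and "noodles" in s:
        plan += [("wait_until", "water", "boiling"), ("add", "noodles", "pan")]
    return plan

plan = translate("Fill a pan with water, put it on the stove, heat the water. "
                 "When it's boiling, add the noodles.")
for step in plan:
    print(step)
```

A real system learns this mapping from data rather than hand-written rules, but the output – an executable plan that includes inferred steps – is the same kind of object.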

Saxena's robot, equipped with a 3-D camera, scans its environment and identifies the objects in it, using computer vision software previously developed in Saxena's lab. The robot has been trained to associate objects with their capabilities: A pan can be poured into or poured from; stoves can have other objects set on them, and can heat things.

So the robot can identify the pan, locate the water faucet and stove and incorporate that information into its procedure. If you tell it to "heat water" it can use the stove or the microwave, depending on which is available. And it can carry out the same actions tomorrow if you've moved the pan, or even moved the robot to a different kitchen.
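A minimal sketch of this object-affordance idea, with invented names (the article does not give Cornell's actual data structures): each recognized object is linked to its capabilities, and an underspecified step like "heat water" is grounded to whichever capable object is actually in the scene.

```python
# Illustrative sketch only: objects mapped to affordances, as described in
# the article. An abstract capability like "heats" is resolved to whichever
# visible object affords it. All names here are invented.

AFFORDANCES = {
    "pan":       {"graspable", "pour_into", "pour_from", "placeable"},
    "stove":     {"supports_objects", "heats"},
    "microwave": {"contains_objects", "heats"},
    "water_tap": {"dispenses_water"},
}

def ground(capability, visible_objects):
    """Return the visible objects that afford the requested capability."""
    return [o for o in visible_objects if capability in AFFORDANCES.get(o, set())]

# "Heat water" uses the stove or the microwave, whichever is available:
print(ground("heats", ["pan", "stove", "water_tap"]))   # ['stove']
print(ground("heats", ["pan", "microwave"]))            # ['microwave']
```

Because the plan refers to capabilities rather than fixed objects, the same instruction still works after the pan is moved or the robot is placed in a different kitchen.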

Other workers have attacked these problems by giving a robot a set of templates for common actions and chewing up sentences one word at a time. Saxena's research group uses techniques computer scientists call "machine learning" to train the robot's computer brain to associate entire commands with flexibly defined actions. The computer is fed animated video simulations of the action – created by humans in a process similar to playing a video game – accompanied by recorded voice commands from several different speakers.

The computer stores the combination of many similar commands as a flexible pattern that can match many variations, so when it hears "Take the pot to the stove," "Carry the pot to the stove," "Put the pot on the stove," "Go to the stove and heat the pot" and so on, it calculates the probability of a match with what it has heard before, and if the probability is high enough, it declares a match. A similarly fuzzy version of the video simulation supplies a plan for the action: Wherever the sink and the stove are, the path can be matched to the recorded action of carrying the pot of water from one to the other.
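The fuzzy matching described above can be sketched with a deliberately simple stand-in: store several phrasings of the same action and score a new utterance by word overlap, accepting a match only above a threshold. The real system uses learned probabilistic models, not this Jaccard-overlap toy, and all names below are invented.

```python
# Toy illustration of fuzzy command matching: many stored phrasings of one
# action, a similarity score for a new utterance, and a threshold below
# which no match is declared. A stand-in for the learned models described
# in the article, not the actual method.

def words(s):
    return set(s.lower().replace(",", "").replace(".", "").split())

TRAINED = {
    "carry_pot_to_stove": [
        "take the pot to the stove",
        "carry the pot to the stove",
        "put the pot on the stove",
        "go to the stove and heat the pot",
    ],
}

def best_match(utterance, threshold=0.5):
    """Return the stored action most similar to the utterance, if any."""
    u = words(utterance)
    best, score = None, 0.0
    for action, phrasings in TRAINED.items():
        for p in phrasings:
            pw = words(p)
            s = len(u & pw) / len(u | pw)  # Jaccard similarity
            if s > score:
                best, score = action, s
    return best if score >= threshold else None

print(best_match("carry the pot to the stove please"))  # carry_pot_to_stove
print(best_match("open the fridge"))                    # None
```

The threshold captures the article's point: the robot declares a match only when the probability (here, the overlap score) is high enough.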

Of course the robot still doesn't get it right all the time. To test, the researchers gave instructions for preparing ramen noodles and for making affogato – an Italian dessert combining coffee and ice cream: "Take some coffee in a cup. Add ice cream of your choice. Finally, add raspberry syrup to the mixture."

The robot performed correctly up to 64 percent of the time even when the commands were varied or the environment was changed, and it was able to fill in missing steps. That was three to four times better than previous methods, the researchers reported, but "there is still room for improvement."

You can teach a simulated robot to perform a kitchen task at the "Tell Me Dave" website, and your input there will become part of a crowdsourced library of instructions for the Cornell robots. Aditya Jami, visiting researcher at Cornell, is helping Tell Me Dave scale the library to millions of examples. "With crowdsourcing at such a scale, robots will learn at a much faster rate," Saxena said.

###

Cornell University has television, ISDN and dedicated Skype/Google+ Hangout studios available for media interviews.

Syl Kacapyr | EurekAlert!
