Robot learns to smile and frown

13.07.2009
UC San Diego computer scientists present machine learning first

A hyper-realistic Einstein robot at the University of California, San Diego has learned to smile and make facial expressions through a process of self-guided learning. The UC San Diego researchers used machine learning to "empower" their robot to learn to make realistic facial expressions.

"As far as we know, no other research group has used machine learning to teach a robot to make realistic facial expressions," said Tingfan Wu, the computer science Ph.D. student from the UC San Diego Jacobs School of Engineering who presented this advance on June 6 at the IEEE International Conference on Development and Learning.

The faces of robots are increasingly realistic, and the number of artificial muscles that control them is rising. In light of this trend, UC San Diego researchers from the Machine Perception Laboratory are studying the face and head of their robotic Einstein in order to find ways to automate the process of teaching robots to make lifelike facial expressions.

The Einstein robot head has about 30 facial muscles, each moved by a tiny servo motor connected to the muscle by a string. Today, a highly trained person must manually set up these kinds of realistic robots so that the servos pull in the right combinations to make specific facial expressions. To begin to automate this process, the UCSD researchers looked to both developmental psychology and machine learning.
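
In practice, a manual setup of this kind amounts to storing, for each desired expression, a fixed set of servo activations chosen by hand. The sketch below is purely illustrative; the servo names and activation values are invented and are not taken from the Hanson Robotics head.

```python
# Purely illustrative: what a hand-tuned expression table might look like.
# Servo names and activation values (0.0 = relaxed, 1.0 = fully pulled) are invented.
HAND_TUNED_EXPRESSIONS = {
    "smile": {"cheek_left": 0.7, "cheek_right": 0.7,
              "lip_corner_left": 0.9, "lip_corner_right": 0.9},
    "frown": {"brow_lower_left": 0.8, "brow_lower_right": 0.8,
              "lip_corner_left": 0.0, "lip_corner_right": 0.0},
}
```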

Developmental psychologists speculate that infants learn to control their bodies through systematic exploratory movements, including babbling to learn to speak. Initially, these movements appear to be executed in a random manner as infants learn to control their bodies and reach for objects.

"We applied this same idea to the problem of a robot learning to make realistic facial expressions," said Javier Movellan, the senior author on the paper presented at ICDL 2009 and the director of UCSD's Machine Perception Laboratory, housed in Calit2, the California Institute for Telecommunications and Information Technology.

Although their preliminary results are promising, the researchers note that some of the learned facial expressions are still awkward. One potential explanation is that their model may be too simple to describe the coupled interactions between facial muscles and skin.

To begin the learning process, the UC San Diego researchers directed the Einstein robot head (Hanson Robotics' Einstein Head) to twist and turn its face in all directions, a process called "body babbling." During this period the robot could see itself in a mirror and analyze its own expression using facial expression detection software created at UC San Diego called CERT (Computer Expression Recognition Toolbox). This provided the data necessary for machine learning algorithms to learn a mapping between facial expressions and the movements of the muscle motors.
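
The press release does not spell out the exact learning machinery, but a minimal sketch of how such a mapping might be learned could look like the following. It assumes the detector returns a vector of expression feature scores and that a regularized linear model relates servo activations to those scores; the hardware and detector interfaces (send_servo_commands, read_expression_features) are hypothetical stand-ins, not the UCSD code.

```python
# Sketch of "body babbling" data collection and of learning a map from
# servo activations to measured expression features. Hypothetical illustration.
import numpy as np

N_SERVOS = 30      # the Einstein head has roughly 30 servo-driven facial "muscles"
N_FEATURES = 12    # assumed length of the expression feature vector (e.g., action-unit scores)
N_SAMPLES = 500    # number of random babbling poses to try

def send_servo_commands(u):
    """Placeholder: drive the servos to activation vector u (values in [0, 1])."""
    pass

def read_expression_features():
    """Placeholder: return the expression features the detector sees in the mirror."""
    return np.zeros(N_FEATURES)

# 1. Body babbling: try random servo activations and record what the face looks like.
U = np.random.rand(N_SAMPLES, N_SERVOS)       # servo commands tried
Y = np.zeros((N_SAMPLES, N_FEATURES))         # expression features observed
for i, u in enumerate(U):
    send_servo_commands(u)
    Y[i] = read_expression_features()

# 2. Fit a regularized linear map W so that y ≈ W @ u.
#    Ridge regression keeps the fit stable when servos have correlated effects.
lam = 1e-2
W = np.linalg.solve(U.T @ U + lam * np.eye(N_SERVOS), U.T @ Y).T   # shape (N_FEATURES, N_SERVOS)
```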

Once the robot had learned the relationship between facial expressions and the muscle movements required to make them, it was able to produce facial expressions it had never encountered.

For example, the robot learned eyebrow narrowing, which requires the inner eyebrows to move together and the upper eyelids to close a bit to narrow the eye aperture.
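
Generating a new expression like this can be cast as inverting the learned forward map: choose the servo activations whose predicted expression features come closest to the desired target. The sketch below continues the hypothetical linear model from the previous snippet (W is a random stand-in here so the snippet runs on its own, and the target feature indices are invented).

```python
# Sketch: invert a learned servo-to-expression map to hit a target expression.
import numpy as np
from scipy.optimize import lsq_linear

N_SERVOS, N_FEATURES = 30, 12
rng = np.random.default_rng(0)
W = rng.random((N_FEATURES, N_SERVOS))        # stand-in for the map learned during body babbling

def servos_for_expression(W, y_target):
    """Bounded least squares: find u in [0, 1]^n with W @ u as close as possible to y_target."""
    return lsq_linear(W, y_target, bounds=(0.0, 1.0)).x

# Target: raise the (hypothetical) "brow lowerer" and "lid tightener" features,
# roughly the components involved in narrowing the eyes and eyebrows.
y_target = np.zeros(N_FEATURES)
y_target[[3, 5]] = 1.0
u = servos_for_expression(W, y_target)        # servo activations to send to the head
```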

"During the experiment, one of the servos burned out due to misconfiguration. We therefore ran the experiment without that servo. We discovered that the model learned to automatically compensate for the missing servo by activating a combination of nearby servos," the authors wrote in the paper presented at the 2009 IEEE International Conference on Development and Learning.

"Currently, we are working on a more accurate facial expression generation model as well as systematic way to explore the model space efficiently," said Wu, the computer science PhD student. Wu also noted that the "body babbling" approach he and his colleagues described in their paper may not be the most efficient way to explore the model of the face.

While the primary goal of this work was to solve the engineering problem of how to approximate the appearance of human facial muscle movements with motors, the researchers say this kind of work could also lead to insights into how humans learn and develop facial expressions.

This research was supported by the National Science Foundation.

"Learning to Make Facial Expressions," by Tingfan Wu, Nicholas J. Butko, Paul Ruvulo, Marian S. Bartlett, Javier R. Movellan from Machine Perception Laboratory, University of California San Diego. Presented on June 6 at the 2009 IEEE 8TH INTERNATIONAL CONFERENCE ON DEVELOPMENT AND LEARNING.

Download the paper at: http://mplab.ucsd.edu/wp-content/uploads/wu_icdl20091.pdf

Watch the robot make faces in this short YouTube video: http://cse-ece-ucsd.blogspot.com/2009/07/robot-learns-to-smile.html

Daniel Kane | EurekAlert!
Further information:
http://www.ucsd.edu
