Face science meets robot science

The team will showcase their work as part of the annual exhibition, which runs from 5 – 10 July 2011. Visitors will be able to see how the brain understands faces, what their own faces look like when they switch gender, how motion can be transferred from one person's face to another, and state-of-the-art computer vision systems that can recognise facial expressions.

Professor Peter McOwan, from the School of Electronic Engineering and Computer Science at Queen Mary, University of London, explains: “We will be showing some of the latest research from the EU funded LIREC project, which aims to create socially aware companion robots and graphical characters. There will be the opportunity for those attending to see if our computer vision system can detect their smiles, watch the most recent videos of our robots in action and talk to us about the project.”

Understanding how facial movement breaks down into elementary facial actions, and how those actions vary between people, will help computer scientists both analyse facial movement and build realistic motion into avatars, making avatars more acceptable to people as channels of communication.
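The idea of elementary facial actions can be illustrated with a simple decomposition. The sketch below is purely illustrative and is not the project's actual method: it assumes facial motion has already been reduced to per-frame landmark displacement vectors (here filled with random numbers), then factors them with an SVD so that each frame becomes a weighted mix of a few orthogonal motion components.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: 200 video frames, each a flattened vector of (x, y)
# displacements for 68 tracked facial landmarks. A real system would
# obtain these from a facial landmark tracker, not random noise.
n_frames, n_landmarks = 200, 68
frames = rng.normal(size=(n_frames, 2 * n_landmarks))

# Centre the data and factor it with an SVD: the right singular
# vectors act as orthogonal "elementary" motion components, and each
# frame is described by its weights on those components.
mean_motion = frames.mean(axis=0)
centred = frames - mean_motion
U, S, Vt = np.linalg.svd(centred, full_matrices=False)

components = Vt[:10]               # ten strongest motion components
weights = centred @ components.T   # per-frame activation of each one

# Reconstructing (or transferring) motion amounts to re-applying the
# per-frame weights to a component basis.
reconstructed = weights @ components + mean_motion
print(components.shape, weights.shape)
```

In this picture, transferring an expression to another face or to an average avatar means keeping the per-frame weights but applying them to the target face's own basis, which is why motion can be studied in isolation from the form of the face.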

Professor McOwan adds: “Robots are going to increasingly form part of our daily lives – for instance robotic aids used in hospitals or much later down the road sophisticated machines that we will have working in our homes. Our research aims to develop software, based on biology, that will allow robots to interact with humans in the most natural way possible – understanding the things we take for granted like personal space or reacting to an overt emotion such as happiness.”

Co-researcher Professor Alan Johnston, from the UCL Division of Psychology and Language Sciences, adds: “A picture of a face is just a frozen sample drawn from a highly dynamic sequence of movements. Facial motion transfer onto other faces or average avatars provides an extremely important tool for studying dynamic face perception in humans, as it allows experimenters to study facial motion in isolation from the form of the face.”

Co-researcher Professor Cecilia Heyes, from All Souls College, University of Oxford, points out: “This technology has all kinds of great spin-offs. We're using it to find out how people imitate facial expressions, which is very important for rapport and cooperation, and why people are better at recognizing their own facial movements than those of their friends – even though they see their friends' faces much more often than their own.”

Media Contact

Sian Halkyard, EurekAlert!

More Information:

http://www.qmul.ac.uk
