Face science meets robot science

The team will showcase their work as part of the annual exhibition, which runs from 5–10 July 2011. Visitors will be able to see how the brain understands faces, what their own faces look like when they switch gender, and how motion can be transferred from one person's face to another, as well as state-of-the-art computer vision systems that can recognise facial expressions.

Professor Peter McOwan, from the School of Electronic Engineering and Computer Science at Queen Mary, University of London, explains: “We will be showing some of the latest research from the EU funded LIREC project, which aims to create socially aware companion robots and graphical characters. There will be the opportunity for those attending to see if our computer vision system can detect their smiles, watch the most recent videos of our robots in action and talk to us about the project.”

Understanding how facial movement breaks down into elementary facial actions, and how those actions vary between people, will help computer scientists both analyse facial movement and build realistic motion into avatars, making avatars more acceptable to people as channels of communication.
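The decomposition described above can be sketched as a blendshape-style linear model, in which an expression frame is a weighted sum of elementary action motions. This is a minimal illustrative sketch, not the project's actual method; the action names and displacement values are hypothetical.

```python
# Hypothetical illustration: each elementary facial action is modelled as a
# displacement of 3 tracked face points (x, y); an expression is a weighted
# combination of these elementary actions.
ACTION_BASES = {
    "smile":      [(0.0, 0.0), (1.0, -0.5), (-1.0, -0.5)],
    "brow_raise": [(0.0, 1.0), (0.0, 0.0), (0.0, 0.0)],
}

def compose_expression(weights):
    """Combine weighted elementary actions into per-point displacements."""
    n_points = 3
    frame = [(0.0, 0.0)] * n_points
    for action, w in weights.items():
        basis = ACTION_BASES[action]
        frame = [(fx + w * bx, fy + w * by)
                 for (fx, fy), (bx, by) in zip(frame, basis)]
    return frame

# A half-strength smile with a slight brow raise:
print(compose_expression({"smile": 0.5, "brow_raise": 0.2}))
# → [(0.0, 0.2), (0.5, -0.25), (-0.5, -0.25)]
```

Because the model is linear, the same action weights can be applied to a different face's basis motions, which is one simple way to think about transferring motion from one face to another.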

Professor McOwan adds: “Robots are going to increasingly form part of our daily lives – for instance robotic aids used in hospitals or much later down the road sophisticated machines that we will have working in our homes. Our research aims to develop software, based on biology, that will allow robots to interact with humans in the most natural way possible – understanding the things we take for granted like personal space or reacting to an overt emotion such as happiness.”

Co-researcher Professor Alan Johnston, from the UCL Division of Psychology and Language Sciences, added: “A picture of a face is just a frozen sample drawn from a highly dynamic sequence of movements. Facial motion transfer onto other faces or average avatars provides an extremely important tool for studying dynamic face perception in humans as it allows experimenters to study facial motion in isolation from the form of the face.”

Co-researcher Professor Cecilia Heyes, from All Souls College, University of Oxford, points out: “This technology has all kinds of great spin-offs. We're using it to find out how people imitate facial expressions, which is very important for rapport and cooperation, and why people are better at recognizing their own facial movements than those of their friends – even though they see their friends' faces much more often than their own.”

Media Contact

Sian Halkyard, EurekAlert!

More Information:

http://www.qmul.ac.uk
