Forum for Science, Industry and Business



OUCH! Computer system spots fake expressions of pain better than people


The system may also be used to detect deceptive actions in the realms of security and job screening

A joint study by researchers at the University of California, San Diego; the University at Buffalo; and the University of Toronto has found that a computer-vision system can distinguish between real and faked expressions of pain more accurately than humans can.

The researchers used the computer expression recognition toolbox (CERT), an end-to-end system for fully automated facial-expression recognition that operates in real time. UB's Mark Frank was one of CERT's developers.

This ability has obvious uses for uncovering pain malingering — fabricating or exaggerating the symptoms of pain for a variety of motives — but the system also could be used to detect deceptive actions in the realms of security, psychopathology, job screening, medicine and law. 

The study, “Automatic Decoding of Deceptive Pain Expressions,” is published in the latest issue of Current Biology.

The authors are Marian Bartlett, PhD, research professor, Institute for Neural Computation, University of California, San Diego; Gwen C. Littlewort, PhD, co-director of the institute’s Machine Perception Laboratory; Mark G. Frank, PhD, professor of communication, University at Buffalo; and Kang Lee, PhD, Dr. Eric Jackman Institute of Child Study, University of Toronto.

The study comprised two experiments with a total of 205 human observers, who were asked to judge the veracity of pain expressions in video clips. Some of the people in the clips were undergoing the cold pressor test, in which a hand is immersed in ice water to measure pain tolerance; others were faking their pained expressions.

“Human subjects could not discriminate real from faked expressions of pain more frequently than would be expected by chance,” Frank says. “Even after training, they were accurate only 55 percent of the time. The computer system, however, was accurate 85 percent of the time.”

Bartlett noted that the computer system “managed to detect distinctive, dynamic features of facial expressions that people missed. Human observers just aren’t very good at telling real from faked expressions of pain.”

CERT, developed by Bartlett, Littlewort, Frank and others, allowed the researchers to compare the accuracy of machine and human vision directly.

They found that machine vision was able to automatically distinguish deceptive facial signals from genuine facial signals by extracting information from spatiotemporal facial-expression signals that humans either cannot or do not extract.
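As an illustration only (CERT's actual features are not described here), a minimal sketch of the kind of spatiotemporal summary such a system might compute from a per-frame facial-action intensity signal, assuming a fixed frame rate:

```python
from statistics import mean, stdev

def temporal_features(signal, fps=30.0):
    """Summarize the dynamics of a per-frame intensity signal.

    `signal` is a list of facial-action intensities, one per video frame.
    Returns simple spatiotemporal statistics: overall level, variability,
    and frame-to-frame velocity (how abruptly the expression changes).
    """
    velocity = [(b - a) * fps for a, b in zip(signal, signal[1:])]
    return {
        "mean_intensity": mean(signal),
        "intensity_sd": stdev(signal),
        "mean_abs_velocity": mean(abs(v) for v in velocity),
        "peak_abs_velocity": max(abs(v) for v in velocity),
    }

# Two toy onsets with the same start and end intensity:
# a smooth ramp (gradual onset) vs. an abrupt step (sudden onset).
smooth = [i / 10 for i in range(11)]
abrupt = [0.0] * 5 + [1.0] * 6

print(temporal_features(smooth)["peak_abs_velocity"])  # ~3 units/s
print(temporal_features(abrupt)["peak_abs_velocity"])  # ~30 units/s
```

Both toy clips rise from zero to full intensity, so their average movement is identical; only the temporal profile differs, which is exactly the kind of dynamic cue the study says human observers fail to pick up.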

“In highly social species such as humans,” says Lee, “faces have evolved to convey rich information, including expressions of emotion and pain. And, because of the way our brains are built, people can simulate emotions they’re not actually experiencing so successfully that they fool other people. The computer is much better at spotting the subtle differences between involuntary and voluntary facial movements.”

Frank adds, “Our findings demonstrate that automated systems like CERT may analyze the dynamics of facial behavior at temporal resolutions previously not feasible using manual coding methods.”

Bartlett says this approach illuminates basic questions about the many social situations in which the behavioral fingerprint of neural control systems may be relevant.

“As with causes of pain, these scenarios also generate strong emotions, along with attempts to minimize, mask and fake such emotions, which may involve ‘dual control’ of the face,” Bartlett says.  

“Dual control of the face means that the signals for our spontaneous, felt emotional expressions originate in different areas of the brain than our deliberately posed expressions,” Frank explains, “and they proceed through different motor systems, which accounts for subtle differences in appearance and, in the case of this study, in dynamic movement.”

The computer-vision system, Bartlett says, “can be applied to detect states in which the human face may provide important clues as to health, physiology, emotion or thought, such as drivers’ expressions of sleepiness, students’ expressions of attention and comprehension of lectures, or responses to treatment of affective disorders.”

The single most predictive feature of falsified expressions, the study showed, is how and when the mouth opens and closes. Fakers’ mouths open with less variation and too regularly. The researchers say further investigations will explore whether such over-regularity is a general feature of fake expressions.
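That over-regularity can be made concrete with a toy calculation (the onset times below are invented for illustration, not data from the study): score a clip by the coefficient of variation of the intervals between successive mouth openings, where a low score means suspiciously metronome-like timing.

```python
from statistics import mean, stdev

def opening_regularity(onset_times):
    """Coefficient of variation (CV) of the intervals between successive
    mouth-opening onsets. A low CV means near-constant, overly regular
    spacing; genuine pain expressions tend to vary more."""
    intervals = [b - a for a, b in zip(onset_times, onset_times[1:])]
    return stdev(intervals) / mean(intervals)

# Invented onset times (in seconds) for two hypothetical clips.
genuine = [0.0, 0.8, 2.1, 2.7, 4.4]  # irregular spacing
faked = [0.0, 1.0, 2.0, 3.1, 4.0]    # near-constant spacing

print(f"genuine CV: {opening_regularity(genuine):.2f}")  # ~0.45
print(f"faked CV:   {opening_regularity(faked):.2f}")    # ~0.08
```

A real system would of course combine many such dynamic features rather than threshold a single statistic, but the contrast shows how "too regular" becomes something a machine can measure.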

Media Contact Information
Patricia Donovan
Senior Editor, Arts, Humanities, Public Health, Social Sciences
Tel: 716-645-4602

Patricia Donovan | EurekAlert!
