Forum for Science, Industry and Business

Tiny Motions Bring Digital Doubles to Life

04.12.2014

Researchers at the Max Planck Institute for Intelligent Systems unveil new technology for motion and shape capture (MoSh) that helps animators jump the “Uncanny Valley” by turning a few moving dots into detailed body shapes that jiggle and deform like real humans.

Researchers at the Max Planck Institute for Intelligent Systems in Tübingen, Germany, announced today that their Motion and Shape Capture (MoSh) study, which appeared in the journal ACM Transactions on Graphics, will be presented at SIGGRAPH Asia (http://sa2014.siggraph.org/en/) in Shenzhen on December 6, 2014.


MoSh needs only sparse mocap marker data to create animations (purple) with a level of realism that is difficult to achieve with standard skeleton-based mocap methods; the green figures are 3D scans.

Picture: Perceiving Systems Department

Devised by a team of researchers under the direction of Dr. Michael J. Black, Director of the Perceiving Systems department, MoSh is a method that allows animators to record the three-dimensional (3D) motion and shape of a real human and digitally “retarget” it to a new body shape. With MoSh, realistic virtual humans can populate games, the Internet, and virtual reality, while reducing animation costs for the special effects industry.

Current motion capture (mocap) technology uses dozens of high-speed cameras to capture 3D position and motion data from a few reflective markers attached to a person’s body and face. This marker data is then converted into a moving skeleton that controls a digital character, much like a puppeteer controls a puppet. Mocap is widely criticized because this process can produce eerily lifeless animations; consequently, mocap typically serves only as a starting point for the time-consuming and expensive hand animation by experts, who put life into the characters’ movements.
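To make the standard pipeline concrete, the skeleton stage boils down to forward kinematics: joint angles recovered from markers drive a rigid chain of bones. The minimal 2D sketch below is purely illustrative (a hypothetical two-bone "arm", not any studio's actual rig).

```python
import numpy as np

def fk(angles, bone_lengths):
    """Forward kinematics for a planar bone chain: joint angles in, joint positions out."""
    pos, total = [np.zeros(2)], 0.0
    for theta, length in zip(angles, bone_lengths):
        total += theta  # each joint's angle accumulates down the chain
        pos.append(pos[-1] + length * np.array([np.cos(total), np.sin(total)]))
    return np.array(pos)

# e.g. shoulder and elbow angles (radians) animating two rigid bones
joints = fk([np.pi / 4, -np.pi / 6], [0.3, 0.25])
```

Because only the angles and fixed bone lengths survive this conversion, everything the markers recorded about surface motion (jiggle, breathing, flexing) is discarded at this step.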

MoSh changes this labor-intensive approach by using a sophisticated mathematical model of human shape and pose, which is used to compute body shape and motion directly from the 3D marker positions. The MoSh approach lets mocap data be transferred to any new virtual shape automatically. For example, the researchers captured the motion of an elegant female salsa dancer and then changed the body shape to that of a giant ogre, making him look light on his feet. “We can take the motions of one body and simply transfer them to another body resulting in a realistic animation,” says Matthew Loper, the lead author of the study.
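The core idea — solve for body shape and pose parameters whose predicted marker positions match the observed markers — can be sketched as a least-squares fit. The toy model below (a random linear blendshape basis and a single rotation angle as "pose") is entirely invented for illustration; the actual paper uses a learned statistical body model, not this.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Hypothetical toy body model: template vertices, linear shape blendshapes,
# and one global rotation standing in for "pose". Not the MoSh model.
N_VERTS, N_SHAPE = 60, 4
template = rng.normal(size=(N_VERTS, 3))
blendshapes = 0.1 * rng.normal(size=(N_SHAPE, N_VERTS, 3))
marker_idx = np.arange(0, N_VERTS, 6)  # 10 sparse markers out of 60 vertices

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def markers(beta, theta):
    """Predicted marker positions for shape coefficients beta and pose angle theta."""
    verts = template + np.tensordot(beta, blendshapes, axes=1)
    return (rot_z(theta) @ verts.T).T[marker_idx]

# Simulate one observed mocap frame from unknown shape + pose (plus marker noise)
beta_true, theta_true = np.array([0.5, -1.0, 0.3, 0.8]), 0.7
observed = markers(beta_true, theta_true) + 1e-3 * rng.normal(size=(len(marker_idx), 3))

# Recover shape and pose jointly, directly from the sparse markers
def residuals(params):
    return (markers(params[:N_SHAPE], params[N_SHAPE]) - observed).ravel()

fit = least_squares(residuals, x0=np.zeros(N_SHAPE + 1))
beta_fit, theta_fit = fit.x[:N_SHAPE], fit.x[N_SHAPE]

# "Retargeting": keep the recovered pose, swap in a different body shape
beta_ogre = np.array([2.0, 1.5, -0.5, 0.0])
retargeted = markers(beta_ogre, theta_fit)
```

Because shape and pose are separate parameters, retargeting is just re-evaluating the model with new shape coefficients while replaying the recovered pose — the salsa-dancer-to-ogre transfer in miniature.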

And, because MoSh does not rely on a skeletal approach to animation, the details of body shape – such as breathing, muscle flexing, fat jiggling – are retained from the mocap marker data. Current methods throw such important details away and rely on manual animation techniques to apply them after the fact.

“Everybody jiggles,” says Black. “We were surprised by how much information is present in so few markers. This means that existing motion capture data may contain a treasure trove of realistic performances that MoSh can bring to life.”

Naureen Mahmood, one of the co-authors of the study, noted: “Realistically rigging and animating a 3D body requires expertise. MoSh will let anyone use motion capture data to achieve results approaching professional animation quality.” This means that realistic digital humans may be coming to video games, training videos, and new virtual-reality headsets.

Opening up realistic human animation to new markets, Max Planck has licensed its technology to Body Labs (http://www.bodylabs.com), a technology company that transforms the human body into a digital platform from which goods and services can be designed, created, and sold. “MoSh has a host of applications,” says William O’Farrell, co-founder and CEO of Body Labs. “The obvious application is enhancing the quality and reducing the cost of animations from mocap; but, we also see extensive uses in apparel. MoSh makes high-end effects accessible to the clothing industry and finally allows clothing designers and customers to easily visualize garments on realistic moving bodies.”

Original Paper
Loper, M.M., Mahmood, N. and Black, M.J., MoSh: Motion and Shape Capture from Sparse Markers, ACM Transactions on Graphics, (Proc. SIGGRAPH Asia), 33(6): 220:1-220:13, November 2014. doi: http://dx.doi.org/10.1145/2661229.2661273


Further information:

https://www.youtube.com/watch?v=Uidbr2fQor0&feature=youtu.be
http://www.is.tuebingen.mpg.de/#news
http://dl.acm.org/citation.cfm?id=2661273

Claudia Däfler | Max-Planck-Institut für Intelligente Systeme

