Forum for Science, Industry and Business


Computer program looks five minutes into the future

13.06.2018

Computer scientists from the University of Bonn have developed software that can look a few minutes into the future: the program first learns the typical sequence of actions, such as those in cooking, from video recordings. Based on this knowledge, it can then accurately predict in new situations what the chef will do and when. The researchers will present their findings at the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), the world's largest computer vision conference, held June 19-21 in Salt Lake City, USA.

The perfect butler, as every fan of British social drama knows, has a special ability: he senses his employer's wishes before they have even been uttered. The working group of Prof. Dr. Jürgen Gall wants to teach computers something similar: “We want to predict the timing and duration of activities - minutes or even hours before they happen,” he explains.


When will you do what? Prof. Jürgen Gall (right) and Yazan Abu Farha from the Institute of Computer Science at the University of Bonn.

© Photo: Barbara Frommann/Uni Bonn

A kitchen robot, for example, could then pass the ingredients as soon as they are needed, pre-heat the oven in time - and in the meantime warn the chef if he is about to forget a preparation step. The automatic vacuum cleaner meanwhile knows that it has no business in the kitchen at that time, and instead takes care of the living room.

We humans are very good at anticipating the actions of others. For computers, however, this discipline is still in its infancy. The researchers at the Institute of Computer Science at the University of Bonn can now report a first success: they have developed self-learning software that estimates the timing and duration of future activities with astonishing accuracy over periods of several minutes.

Training data: four hours of salad videos

The scientists' training data consisted of 40 videos in which performers prepare various salads. Each recording was around six minutes long and contained an average of 20 different actions. The videos were also annotated with the exact start time and duration of each action.

The computer “watched” these salad videos, totaling around four hours. In this way, the algorithm learned which actions typically follow one another during this task and how long they last. This is by no means trivial: after all, every chef has his or her own approach, and the sequence may vary depending on the recipe.
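As a rough illustration of what such a model learns, consider the sketch below. It is only a first-order Markov stand-in with invented action names - the Bonn group's actual system uses neural networks - but it shows the same two ingredients: which action typically follows which, and how long each action usually takes.

```python
from collections import defaultdict

def fit_action_model(videos):
    """Estimate most-likely next actions and mean durations from
    annotated videos. Each video is a list of (action, duration_sec)
    segments, mirroring the annotations described above."""
    transitions = defaultdict(lambda: defaultdict(int))
    durations = defaultdict(list)
    for video in videos:
        # Count which action follows which across consecutive segments.
        for (action, dur), (nxt, _) in zip(video, video[1:]):
            transitions[action][nxt] += 1
            durations[action].append(dur)
        # The last segment contributes a duration but no transition.
        last_action, last_dur = video[-1]
        durations[last_action].append(last_dur)
    mean_dur = {a: sum(ds) / len(ds) for a, ds in durations.items()}
    most_likely_next = {a: max(nxts, key=nxts.get)
                        for a, nxts in transitions.items()}
    return most_likely_next, mean_dur

def predict_future(last_action, horizon_sec, most_likely_next, mean_dur):
    """Roll the model forward: repeatedly append the most likely next
    action with its mean duration until the horizon is filled."""
    future, t, action = [], 0.0, last_action
    while t < horizon_sec and action in most_likely_next:
        action = most_likely_next[action]
        dur = mean_dur.get(action, 1.0)
        future.append((action, dur))
        t += dur
    return future
```

Fed two toy salad videos, such a model predicts that cutting tomatoes is typically followed by adding dressing and then mixing, along with the average time each step takes.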

“Then we tested how successful the learning process was,” explains Gall. “For this we confronted the software with videos that it had not seen before.” The new clips did at least fit the context: they also showed the preparation of a salad. For the test, the computer was told what happens in the first 20 or 30 percent of one of the new videos. On this basis it then had to predict what would happen during the rest of the film.

That worked amazingly well. Gall: “Accuracy was over 40 percent for short forecast periods, but dropped the further the algorithm had to look into the future.” For activities more than three minutes in the future, the computer was still right in 15 percent of cases. A prognosis was only considered correct, however, if both the activity and its timing were predicted correctly.
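The scoring idea - a prediction only counts where both the action label and its timing line up with the annotation - can be sketched as a frame-wise accuracy over the unobserved part of the video. The helper names below are illustrative, and the paper reports a mean-over-classes variant rather than this plain version:

```python
def segments_to_frames(segments, fps=1):
    """Expand (action, duration_sec) segments into one label per frame,
    so label and timing are checked jointly."""
    frames = []
    for action, dur in segments:
        frames.extend([action] * int(round(dur * fps)))
    return frames

def framewise_accuracy(predicted, ground_truth):
    """Fraction of future frames whose predicted action matches the
    annotation; a frame is correct only if the right action is
    predicted at the right time."""
    assert len(predicted) == len(ground_truth)
    correct = sum(p == g for p, g in zip(predicted, ground_truth))
    return correct / len(ground_truth)
```

If the model predicts “mix” for two seconds then “serve” for three, but the ground truth mixes for three seconds and serves for two, only the overlapping frames count, illustrating why accuracy drops as the forecast horizon grows.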

Gall and his colleagues want the study to be understood only as a first step into the new field of activity prediction, especially since the algorithm performs noticeably worse if it has to recognize on its own what happens in the first part of the video instead of being told. Such an analysis is never 100 percent correct - Gall speaks of “noisy” data. “Our process does work with it,” he says. “But unfortunately nowhere near as well.”

The study was developed as part of a research group dedicated to the prediction of human behavior and financially supported by the German Research Foundation (DFG).

Publication: Yazan Abu Farha, Alexander Richard and Jürgen Gall: When will you do what? - Anticipating Temporal Occurrences of Activities. IEEE Conference on Computer Vision and Pattern Recognition 2018; http://pages.iai.uni-bonn.de/gall_juergen/download/jgall_anticipation_cvpr18.pdf

Sample test videos and predictions derived from them are available at https://www.youtube.com/watch?v=xMNYRcVH_oI

Contact:

Prof. Dr. Jürgen Gall
Institute of Computer Science
University of Bonn
Tel. +49(0)228/7369600
E-mail: gall@informatik.uni-bonn.de

Johannes Seiler | idw - Informationsdienst Wissenschaft
Further information:
http://www.uni-bonn.de/
