Forum for Science, Industry and Business

Piecing together the next generation of cognitive robots

05.05.2008
Building robots with anything akin to human intelligence remains a far-off vision, but European researchers are making progress in piecing together a new generation of machines that are more aware of their environment and better able to interact with humans.

Making robots more responsive would allow them to be used in a greater variety of sophisticated tasks in the manufacturing and service sectors. Such robots could be used as home helpers and caregivers, for example.

As research into artificial cognitive systems (ACS) has progressed in recent years, the field has become highly fragmented. Some teams have concentrated on machine vision, others on spatial cognition or human-robot interaction, among many other disciplines.

All have made progress, but, as the EU-funded project CoSy (Cognitive Systems for Cognitive Assistants) has shown, the researchers can advance the field even further by working together.

“We have brought together one of the broadest and most varied teams of researchers in this field,” says Geert-Jan Kruijff, the CoSy project manager at the German Research Centre for Artificial Intelligence. “This has resulted in an ACS architecture that integrates multiple cognitive functions to create robots that are more self-aware, understand their environment and can better interact with humans.”

The CoSy ACS is indeed greater than the sum of its parts. It incorporates a range of technologies, from a cognitive architecture design, spatial cognition, human-robot interaction and situated dialogue processing to developmental models of visual processing.

“We have learnt how to put the pieces of ACS together, rather than just studying them separately,” adds Jeremy Wyatt, one of the project managers at the UK’s University of Birmingham.

The researchers have made the ACS architecture toolkit they developed available under an open-source license to encourage further research. The toolkit has already sparked several spin-off initiatives.

Overcoming the integration challenge
“The integration of different components in an ACS is one of the greatest challenges in robotics,” Kruijff says. “Getting robots to understand their environment from visual inputs and to interact with humans from spoken commands and relate what is said to their environment is enormously complex.”

Because of this complexity, most robots developed to date have been reactive: they simply react to their environment rather than act in it autonomously. Like a beetle that scuttles away when prodded, many mobile robots back off when they collide with an object, but they have little self-awareness or understanding of the space around them and what they can do there.
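The contrast can be illustrated with a minimal sketch. The sensor names and action strings below are invented for illustration; the point is that a purely reactive controller maps each sensor reading directly to an action, with no internal map, memory or goals.

```python
def reactive_step(bumper_hit: bool, distance_cm: float) -> str:
    """Purely reactive control: each sensor reading maps directly to an
    action. There is no map, no memory, no goal -- just stimulus-response,
    like the beetle that scuttles away when prodded."""
    if bumper_hit:
        return "back_off"      # collided with something: retreat
    if distance_cm < 20.0:
        return "turn"          # obstacle ahead: steer away
    return "forward"           # otherwise keep moving

# A more deliberative robot would consult an internal model of its
# surroundings before choosing an action; this controller has no such state.
print(reactive_step(True, 50.0))   # back_off
print(reactive_step(False, 10.0))  # turn
```

A cognitive system like the one CoSy pursued sits at the other end of this spectrum, maintaining an explicit representation of space and its own capabilities.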

In comparison, a demonstrator called the Explorer developed by the CoSy team has a more human-like understanding of its environment. Explorer can even talk about its surroundings with a human.

Instead of using only geometric data to build a map of its surroundings, the Explorer also incorporates qualitative, topological information. Through interaction with humans it can then learn to recognise objects, spaces and their uses. For example, if it sees a coffee machine it may reason that it is in a kitchen; if it sees a sofa, it may conclude it is in a living room.
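The kind of inference described here can be sketched as a simple vote over object-to-room evidence. The evidence table and function below are hypothetical illustrations; CoSy's actual reasoning is far richer, combining probabilistic mapping, vision and situated dialogue.

```python
from collections import Counter

# Invented evidence table: which room category each recognised object suggests.
ROOM_EVIDENCE = {
    "coffee machine": "kitchen",
    "stove": "kitchen",
    "sofa": "living room",
    "television": "living room",
    "bed": "bedroom",
}

def infer_room(detected_objects):
    """Return the room category with the most supporting object evidence."""
    votes = Counter(
        ROOM_EVIDENCE[obj] for obj in detected_objects if obj in ROOM_EVIDENCE
    )
    if not votes:
        return "unknown"
    return votes.most_common(1)[0][0]

print(infer_room(["coffee machine", "stove"]))  # kitchen
print(infer_room(["sofa"]))                     # living room
```

Grounding these category labels in human dialogue, rather than hard-coding them, is what lets a robot like the Explorer talk about its surroundings.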

“The robot sees a room much as humans see it because it has a conceptual understanding of space,” Kruijff notes.

Another demonstrator, called the PlayMate, applied machine vision and spatial recognition in a different context. PlayMate uses a robotic arm to manipulate objects in response to human instructions.

In Wyatt’s view the development of machine vision and its integration with other ACS components is still a big obstacle to creating more advanced robots, especially if the goal is to replicate human sight and awareness.

“Don’t underestimate how sophisticated we are…,” he says. “We don’t realise how agile our brains are at interpreting what we see. You can pick out colours from a scene, look at a bottle of water, a packet of cornflakes, or a coffee mug and know what activities each of them allows. You recognise them, see where to grasp them, and how to manipulate them, and you do it all seamlessly. We are still so very, very far from doing that with robots.”

Robotic ‘gofers’
Fortunately, replicating human-like intelligence and awareness, if it is indeed possible, is not necessary when creating robots that are useful to humans.

Kruijff foresees robots akin to those developed in the CoSy project becoming an everyday sight over the coming years in what he describes as 'gofer scenarios'. Some robots with a lower level of intelligence are already being used to bring medicines to patients in hospitals, and could also transport documents around office buildings.

Robotic vacuum cleaners are becoming increasingly popular in homes, as are toys that incorporate artificial intelligence. And the creation of robots that are able to interact with people opens the door to robotic home helpers and caregivers.

“In the future people may all be waited on by robots in their old age,” Wyatt says.

Ahmed ElAmin | alfa
Further information:
http://cordis.europa.eu/ictresults/index.cfm/section/news/tpl/article/BrowsingType/Features/ID/89704
