Autonomous robots that find their way through unfamiliar terrain? That future is not so distant.
The robot in the arena. A small camera films the objects and transmits the information to the neural network via Wi-Fi. The network processes the data and controls the robot's direction of movement.
Martin Paul Nawrot
Researchers at the Bernstein Fokus Neuronal Basis of Learning, the Bernstein Center Berlin and the Freie Universität Berlin have developed a robot that perceives environmental stimuli and learns to react to them.
The scientists used the relatively simple nervous system of the honeybee as a model for its working principles. To this end, they installed a camera on a small robotic vehicle and connected it to a computer. The computer program replicated in a simplified way the sensorimotor network of the insect brain.
The input data came from the camera that—akin to an eye—received and projected visual information. The neural network, in turn, operated the motors of the robot wheels—and could thus control its motion direction.
The outstanding feature of this artificial mini brain is its ability to learn according to simple principles. “The network-controlled robot is able to link certain external stimuli with behavioral rules,” says Professor Martin Paul Nawrot, head of the research team and member of the sub-project “Insect inspired robots: towards an understanding of memory in decision making” of the Bernstein Focus. “Much like honeybees learn to associate certain flower colors with tasty nectar, the robot learns to approach certain colored objects and to avoid others.”
In the learning experiment, the scientists placed the network-controlled robot in the center of a small arena whose walls carried red and blue objects. Once the robot’s camera focused on an object of the desired color—red, for instance—the scientists triggered a light flash. This signal activated a so-called reward sensor nerve cell in the artificial network. The simultaneous processing of the red color and the reward led to specific changes in those parts of the network that control the robot’s wheels. As a consequence, when the robot “saw” another red object, it started to move toward it. Blue objects, in contrast, made it move backwards. “Within just seconds, the robot accomplishes the task of finding an object of the desired color and approaching it,” explains Nawrot. “Only a single learning trial is needed, similar to experimental observations in honeybees.”
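The one-trial, reward-gated learning described above can be illustrated with a minimal sketch. This is a hypothetical simplification for illustration, not the researchers' actual network: the coincidence of a color stimulus and the reward signal changes the weight from that color's input to the motor output in a single step, after which the color alone determines the movement direction.

```python
# Hypothetical sketch of one-trial, reward-modulated associative learning.
# Not the authors' code: names and values are illustrative assumptions.

COLORS = ["red", "blue"]

# Synaptic weights from each color input to the motor output.
# Positive -> drive forward (approach), negative -> drive backward (avoid).
weights = {color: 0.0 for color in COLORS}

def learn(color, reward):
    """Single learning trial: the coincidence of a color stimulus and the
    reward sensor signal updates the color->motor weight in one step."""
    weights[color] += 1.0 if reward else -1.0

def motor_command(color):
    """Map the color currently seen by the camera to a movement direction."""
    if weights[color] > 0:
        return "forward"   # approach the object
    if weights[color] < 0:
        return "backward"  # avoid the object
    return "stop"          # no association learned yet

# One learning trial per color: flash the reward while "seeing" red.
learn("red", reward=True)
learn("blue", reward=False)

print(motor_command("red"))   # forward
print(motor_command("blue"))  # backward
```

After the single paired trial, red reliably triggers forward motion and blue triggers backward motion, mirroring the one-trial association reported for the robot and for honeybees.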
The current study was carried out as an interdisciplinary collaboration between Professor Martin Paul Nawrot’s research group “Neuroinformatics” (Institute of Biology) and the group “Intelligent Systems and Robotics” (Institute of Computer Science) headed by Raúl Rojas at Freie Universität Berlin. The scientists are now planning to expand their neural network by adding further learning principles. Thus, the mini brain will become even more powerful—and the robot more autonomous.
The Bernstein Focus Neuronal Basis of Learning, the sub-project “Insect inspired robots: towards an understanding of memory in decision making” and the Bernstein Center Berlin are part of the National Bernstein Network Computational Neuroscience in Germany. With this funding initiative, the German Federal Ministry of Education and Research (BMBF) has supported the new discipline of Computational Neuroscience since 2004 with more than 170 million euros. The network is named after the German physiologist Julius Bernstein (1835–1917).

Contact:
Mareike Kardinal | idw