The way insects see and track their prey is being applied to a new robot under development at the University of Adelaide, in the hopes of improving robot visual systems.
The project, which crosses the boundaries of neuroscience, mechanical engineering and computer science, builds on years of research into insect vision at the University.
In a new paper published today in the Journal of The Royal Society Interface, the researchers describe how insights from both insect and human vision can be applied in a virtual reality simulation, enabling an artificial intelligence system to 'pursue' an object.
"Detecting and tracking small objects against complex backgrounds is a highly challenging task," says the lead author of the paper, Mechanical Engineering PhD student Zahra Bagheri.
"Consider a cricket or baseball player trying to take a match-winning catch in the outfield. They have seconds or less to spot the ball, track it and predict its path as it comes down against the brightly coloured backdrop of excited fans in the crowd - all while running or even diving towards the point where they predict it will fall!
"Robotics engineers still dream of providing robots with the combination of sharp eyes, quick reflexes and flexible muscles that allow a budding champion to master this skill," she says.
Research conducted in the lab of University of Adelaide neuroscientist Dr Steven Wiederman (School of Medical Sciences) has shown that flying insects, such as dragonflies, display remarkable visually guided behaviour. This includes chasing mates or prey even in the presence of distractions, such as swarms of other insects.
"They perform this task despite their low visual acuity and a tiny brain, around the size of a grain of rice. The dragonfly chases prey at speeds of up to 60 km/h, capturing it with a success rate of over 97%," Ms Bagheri says.
The team of engineers and neuroscientists has developed an unusual algorithm to help emulate this visual tracking. "Instead of just trying to keep the target perfectly centred on its field of view, our system locks on to the background and lets the target move against it," Ms Bagheri says. "This reduces distractions from the background and gives time for underlying brain-like motion processing to work. It then makes small movements of its gaze and rotates towards the target to keep the target roughly frontal."
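The strategy described above can be sketched in a few lines of code. The snippet below is a hypothetical, simplified illustration, not the team's actual algorithm: the gaze stays locked to the background between movements, so the target slips against a stable scene, and only when the target drifts past a threshold does the system make a small rotation toward it. All names and parameter values here are illustrative assumptions.

```python
def active_vision_pursuit(target_path, gaze=0.0,
                          saccade_threshold=0.3, saccade_gain=0.6):
    """Hypothetical sketch of an 'active vision' pursuit loop (1-D).

    Rather than continuously centring the target, the gaze is held
    fixed on the background; a small corrective saccade toward the
    target is made only when its offset exceeds a threshold.
    """
    offsets = []
    for target in target_path:
        offset = target - gaze               # target position relative to gaze
        if abs(offset) > saccade_threshold:  # target has drifted too far
            gaze += saccade_gain * offset    # small rotation toward the target
        offsets.append(target - gaze)        # residual offset after any saccade
    return gaze, offsets

# Example: a target drifting steadily across the visual field.
final_gaze, offsets = active_vision_pursuit([i / 10 for i in range(11)])
```

Because corrections are sparse and small, the background appears stationary for most of the time, which is what gives the slower, brain-like motion-processing stage a stable scene to work against.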
This bio-inspired "active vision" system has been tested in virtual reality worlds composed of various natural scenes. The Adelaide team has found that it performs just as robustly as the state-of-the-art engineering target tracking algorithms, while running up to 20 times faster.
"This type of performance can allow for real-time applications using quite simple processors," says Dr Wiederman, who is leading the project, and who developed the original motion sensing mechanism after recording the responses of neurons in the dragonfly brain.
"We are currently transferring the algorithm to a hardware platform, a bio-inspired, autonomous robot."
School of Mechanical Engineering
The University of Adelaide
Dr Steven Wiederman
ARC Discovery Early Career Researcher
School of Medical Sciences
The University of Adelaide
Phone: +61 8 8313 8067