The way insects see and track their prey is being applied to a new robot under development at the University of Adelaide, in the hopes of improving robot visual systems.
The project - which crosses the boundaries of neuroscience, mechanical engineering and computer science - builds on years of research into insect vision at the University.
In a new paper published today in the Journal of the Royal Society Interface, the researchers describe how insights from both insect and human vision can be applied in a virtual reality simulation, enabling an artificial intelligence system to 'pursue' an object.
"Detecting and tracking small objects against complex backgrounds is a highly challenging task," says the lead author of the paper, Mechanical Engineering PhD student Zahra Bagheri.
"Consider a cricket or baseball player trying to take a match-winning catch in the outfield. They have seconds or less to spot the ball, track it and predict its path as it comes down against the brightly coloured backdrop of excited fans in the crowd - all while running or even diving towards the point where they predict it will fall!
"Robotics engineers still dream of providing robots with the combination of sharp eyes, quick reflexes and flexible muscles that allow a budding champion to master this skill," she says.
Research conducted in the lab of University of Adelaide neuroscientist Dr Steven Wiederman (School of Medical Sciences) has shown that flying insects, such as dragonflies, show remarkable visually guided behaviour. This includes chasing mates or prey, even in the presence of distractions, like swarms of insects.
"They perform this task despite their low visual acuity and a tiny brain, around the size of a grain of rice. The dragonfly chases prey at speeds up to 60 km/h, capturing them with a success rate over 97%," Ms Bagheri says.
The team of engineers and neuroscientists has developed an unusual algorithm to help emulate this visual tracking. "Instead of just trying to keep the target perfectly centred on its field of view, our system locks on to the background and lets the target move against it," Ms Bagheri says. "This reduces distractions from the background and gives time for underlying brain-like motion processing to work. It then makes small gaze movements, rotating towards the target to keep it roughly frontal."
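The control strategy described above can be sketched as a toy pursuit loop. This is a minimal one-dimensional illustration only, not the published model: the function `pursue` and its threshold and gain values are invented here for clarity, and the actual system operates on full virtual-reality imagery with insect-inspired motion processing rather than on scalar angles.

```python
# Illustrative sketch of "background-locked" pursuit with small
# corrective gaze rotations. All names and numbers are hypothetical.

def pursue(target_path, saccade_threshold=5.0, saccade_gain=0.6):
    """Track a moving target (angles in degrees) with discrete saccades.

    Between saccades the gaze is held fixed relative to the background,
    so the target's retinal drift is exactly its motion against the
    stationary scene. Only when the offset grows past a threshold does
    the gaze rotate part-way towards the target, keeping it roughly
    frontal rather than perfectly centred.
    """
    gaze_angle = 0.0
    history = []
    for target_angle in target_path:
        retinal_offset = target_angle - gaze_angle  # target vs. gaze
        if abs(retinal_offset) > saccade_threshold:
            # Small corrective rotation toward the target.
            gaze_angle += saccade_gain * retinal_offset
        history.append((gaze_angle, retinal_offset))
    return history

# A target drifting steadily across the visual field.
track = pursue([t * 1.5 for t in range(20)])
final_gaze, final_offset = track[-1]
```

Because each saccade only partially closes the gap, the target stays offset from the gaze centre between corrections, which is the point of the strategy: the background remains stable on the "retina" for most of the time, simplifying target-versus-background discrimination.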
This bio-inspired "active vision" system has been tested in virtual reality worlds composed of various natural scenes. The Adelaide team found that it performs as robustly as state-of-the-art target-tracking algorithms, while running up to 20 times faster.
"This type of performance can allow for real-time applications using quite simple processors," says Dr Wiederman, who is leading the project, and who developed the original motion sensing mechanism after recording the responses of neurons in the dragonfly brain.
"We are currently transferring the algorithm to a hardware platform, a bio-inspired, autonomous robot."
School of Mechanical Engineering
The University of Adelaide
Dr Steven Wiederman
ARC Discovery Early Career Researcher
School of Medical Sciences
The University of Adelaide
Phone: +61 8 8313 8067
Dr. Steven Wiederman | EurekAlert!