The way insects see and track their prey is being applied to a new robot under development at the University of Adelaide, in the hopes of improving robot visual systems.
The project, which crosses the boundaries of neuroscience, mechanical engineering and computer science, builds on years of research into insect vision at the University.
In a new paper published today in the Journal of The Royal Society Interface, researchers describe how lessons from both insect and human vision can be applied in a model virtual reality simulation, enabling an artificial intelligence system to 'pursue' an object.
"Detecting and tracking small objects against complex backgrounds is a highly challenging task," says the lead author of the paper, Mechanical Engineering PhD student Zahra Bagheri.
"Consider a cricket or baseball player trying to take a match-winning catch in the outfield. They have seconds or less to spot the ball, track it and predict its path as it comes down against the brightly coloured backdrop of excited fans in the crowd - all while running or even diving towards the point where they predict it will fall!
"Robotics engineers still dream of providing robots with the combination of sharp eyes, quick reflexes and flexible muscles that allow a budding champion to master this skill," she says.
Research conducted in the lab of University of Adelaide neuroscientist Dr Steven Wiederman (School of Medical Sciences) has shown that flying insects, such as dragonflies, show remarkable visually guided behaviour. This includes chasing mates or prey, even in the presence of distractions, like swarms of insects.
"They perform this task despite their low visual acuity and a tiny brain, around the size of a grain of rice. The dragonfly chases prey at speeds up to 60 km/h, capturing them with a success rate over 97%," Ms Bagheri says.
The team of engineers and neuroscientists has developed an unusual algorithm to help emulate this visual tracking. "Instead of just trying to keep the target perfectly centred on its field of view, our system locks on to the background and lets the target move against it," Ms Bagheri says. "This reduces distractions from the background and gives time for underlying brain-like motion processing to work. It then makes small gaze rotations towards the target to keep it roughly frontal."
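The gaze-control loop Ms Bagheri describes, stabilise against the background first, then nudge the gaze toward the target rather than centring it, can be sketched in a few lines. The following is a minimal illustration under assumed simplifications (a 2-D gaze angle, a known background drift, and a hypothetical `saccade_gain` parameter), not the team's published model:

```python
def track_step(gaze, target, background_shift, saccade_gain=0.3):
    """One step of a background-locked pursuit sketch.

    gaze, target: (x, y) gaze/target directions in degrees.
    background_shift: apparent drift of the whole scene this step
    (e.g. from the pursuer's own motion). All values hypothetical.
    """
    # Lock on to the background: counter the estimated global motion
    # so the scene appears stable and only the target moves against it.
    gx = gaze[0] - background_shift[0]
    gy = gaze[1] - background_shift[1]

    # Measure the target's offset against the stabilised background.
    ex = target[0] - gx
    ey = target[1] - gy

    # Make a small gaze rotation toward the target (partial, not full
    # centring), keeping it roughly frontal.
    gx += saccade_gain * ex
    gy += saccade_gain * ey
    return (gx, gy)


# Example: repeated steps move the gaze toward a stationary target.
gaze = (0.0, 0.0)
for _ in range(20):
    gaze = track_step(gaze, target=(10.0, -4.0), background_shift=(0.0, 0.0))
```

Because each step only partially closes the gap, the gaze converges smoothly on the target rather than jumping to centre it, which is the behaviour the quote attributes to the system.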
This bio-inspired "active vision" system has been tested in virtual reality worlds composed of various natural scenes. The Adelaide team has found that it performs just as robustly as state-of-the-art engineered target-tracking algorithms, while running up to 20 times faster.
"This type of performance can allow for real-time applications using quite simple processors," says Dr Wiederman, who is leading the project, and who developed the original motion sensing mechanism after recording the responses of neurons in the dragonfly brain.
"We are currently transferring the algorithm to a hardware platform, a bio-inspired, autonomous robot."
School of Mechanical Engineering
The University of Adelaide
Dr Steven Wiederman
ARC Discovery Early Career Researcher
School of Medical Sciences
The University of Adelaide
Phone: +61 8 8313 8067