Not only does the brain find a way to compensate for our constantly flickering gaze, but researchers at the Salk Institute for Biological Studies have found that it actually turns the tables and relies on eye movements to recognize partially hidden or moving objects. Their findings will be published in a forthcoming issue of Nature Neuroscience.
"You might expect that if you move your eyes, your perception of objects might get degraded," explains senior author Richard Krauzlis, Ph.D., an associate professor in the Systems Neurobiology Laboratory at the Salk Institute. "The striking thing is that moving your eyes can actually help resolve ambiguous visual inputs."
Our eyes move all the time, whether to follow a moving object or to scan our surroundings. On average, our eyes move several times a second – in fact, in a lifetime, our eyes move more often than our heart beats. "Nevertheless, you don't have the sense that the world has just swept across or rotated around you. You sense that the world is stable," says Krauzlis.
Just like high-end video cameras, the brain relies on an internal image stabilization system to prevent our perception of the world from turning into a blurry mess. "Obviously, the brain has found a solution," explains lead author Ziad Hafed, Ph.D. "In addition to the jumpy video stream, the visual system constantly receives feedback about the eye movements that the brain is generating."
Hafed and Krauzlis took the question of how the brain is able to maintain perception under less than optimal circumstances one step further. "If you think of the video stream as a bunch of pixels coming in from the eyes, the real challenge for the visual system is to decide which pixels belong to which objects. We wondered whether information about eye movements is used by the brain to solve this difficult problem," says Hafed, who is an NSERC (Canada) and Sloan-Swartz post-doctoral researcher at the Salk Institute.
Krauzlis explains that the human brain recognizes objects in everyday circumstances because it is very good at filling in missing visual information. "When we see a deer partially hidden by tree trunks in a forest, we can still segment the visual scene and properly interpret the individual features and group them together into objects," he says.
However, even though recognizing that deer is effortless for us, it is not a trivial accomplishment for the brain. Teaching computers to recognize objects in real-life situations has proven to be an almost insurmountable problem. Artificial intelligence researchers have spent much time and effort trying to design robots that can recognize objects in unconstrained situations, but so far, their success has been limited.
To determine whether eye movements actually help the brain recognize objects, Hafed and Krauzlis asked whether people perceived an object better when they actively moved their eyes or when they stared at a given point in space. Human subjects watched a short video that allowed them to glimpse a partially hidden chevron shape that moved in a circle.
When they kept their eyes still by fixating on a stationary spot, observers perceived only random lines moving up and down. But when they moved their eyes to track the motion, leaving the video input itself unaltered, viewers easily recognized the lines as a circling chevron.
"It turns out that eye movements not only help with image stabilization, but that this additional input also plays a fairly important role for the perception of objects in the face of all the challenges that real-life visual scenes pose – that objects are obscured or are moving, and so on," says Hafed.
Gina Kirchweger | EurekAlert!