Optic Flow: A Step in the Right Direction

The motion we perceive with our eyes plays a critical role in guiding our feet as we move through the world, Brown University research shows.

The work was conducted in Brown’s Virtual Environment Navigation Lab, or VENLab, one of the largest virtual reality labs in the country. It appears online in Current Biology and will be featured on the journal’s Dec. 4, 2007, cover. The findings shed important new light on the phenomenon of optic flow, the perceived motion of visual objects that helps in judging distance and avoiding obstacles while walking, driving and even landing a plane.
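For readers who want the underlying geometry, here is a minimal sketch (standard pinhole-camera optics, not code or data from the study; the function name and values are illustrative) of why optic flow carries heading information: under pure forward translation, every image point streams radially away from a focus of expansion that lies exactly in the direction of travel.

```python
# Optic flow under pure forward translation (textbook geometry; an
# illustration only, not the study's method). A scene point at image
# position (x, y) and depth Z produces image motion
# u = x*Tz/Z, v = y*Tz/Z, which vanishes at the focus of expansion.

def flow_for_translation(x, y, depth, tz=1.0):
    """Image-plane flow for an observer moving forward at speed tz.

    x, y  : image coordinates of a scene point (focal length = 1)
    depth : distance Z of the point from the observer (assumed known)
    """
    return (x * tz / depth, y * tz / depth)

# Flow is zero at the image center - the focus of expansion - and
# grows for points farther from the heading direction or closer in
# depth, so the pattern as a whole specifies where you are going.
for x, y, z in [(0.0, 0.0, 5.0), (0.2, 0.0, 5.0), (0.5, 0.3, 2.0)]:
    print((x, y), "->", flow_for_translation(x, y, z))
```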

Perception in action, or the intersection of how we see and how we move, is the focus of research in the laboratory of William Warren, chairman of Brown’s Department of Cognitive and Linguistic Sciences. In the VENLab, Warren and his team have studied optic flow by having human subjects wear virtual reality goggles and navigate virtual obstacles in a darkened room.

In 2001, the Warren lab showed for the first time that people use optic flow to help them steer toward a target as they walk. But what happens when you are standing still? If you are not walking and not getting the constant stream of visual information that optic flow provides, how do you begin moving toward a target? How do you know which way to direct your first step?

To answer this question, the team created a virtual display that simulates a prism, bending light so that the visual scene shifts to one side. The target – in this case, a virtual doorway that subjects were told to walk through – appeared to be farther to the right than it actually was. A total of 40 subjects each completed about 40 trials, trying to walk through the virtual doorway while wearing the simulated prism. Half of the subjects had optic flow – a steady stream of visual information – available to them; the other half did not.
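To make the manipulation concrete, here is a toy sketch of how a virtual prism might be implemented (an assumption on our part; the paper does not publish code, and the 10-degree shift and function names are invented for illustration): the whole rendered scene is rotated about the observer, so a doorway straight ahead appears displaced to one side.

```python
import math

# Toy "virtual prism": rotate every scene point about the observer
# so the rendered world appears shifted to the right. The shift
# angle is an assumed value for illustration, not the study's.

PRISM_SHIFT_DEG = 10.0

def apply_virtual_prism(point, observer, shift_deg=PRISM_SHIFT_DEG):
    """Rotate a ground-plane point (x = lateral, z = forward) about
    the observer, displacing the apparent scene to the right."""
    theta = math.radians(shift_deg)
    dx = point[0] - observer[0]
    dz = point[1] - observer[1]
    # Clockwise rotation in the ground plane shifts the scene rightward.
    rx = dx * math.cos(theta) + dz * math.sin(theta)
    rz = -dx * math.sin(theta) + dz * math.cos(theta)
    return (observer[0] + rx, observer[1] + rz)

# A doorway 5 m straight ahead is rendered almost a meter to the right.
print(apply_virtual_prism((0.0, 5.0), (0.0, 0.0)))
# -> (about 0.87, about 4.92)
```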

The researchers found that, on average, all 40 subjects initially headed about five feet wide of the doorway within their first few steps. But after repeated trials, subjects adapted and were able to walk straight toward the doorway. Then the simulated prism was removed, and subjects were again asked to walk to the virtual doorway. Surprisingly, they all missed on the opposite side because their brains and bodies had adapted to the prism. After a few more trials, subjects readjusted and once again walked straight to the doorway.

Hugo Bruggeman, a postdoctoral research fellow in the Warren lab who led the research, said the kicker came when they compared data from subjects who had optic flow available during the trials with data from those who did not. When subjects had optic flow, they took a straight path toward the doorway and made it, on average, in just three tries. When optic flow was eliminated, and subjects had only a lone target to aim for, it took an average of 20 tries before they walked straight to the target.

“With optic flow, it is easier to walk in the right direction,” Bruggeman said. “Subjects adapted seven times faster. This suggests that with a continuous flow of visual information, your brain allows you to rapidly and accurately adapt your direction of walking. So we’re constantly recalibrating our movements and our actions based on information such as optic flow.”
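As a back-of-the-envelope illustration of that sevenfold difference (a toy model, not the authors’ analysis; the learning rates and criterion below are assumptions chosen to reproduce the reported 3- and 20-trial figures), a simple error-driven recalibration scheme shows how a richer per-trial error signal speeds adaptation:

```python
# Toy recalibration model: the walker's heading estimate starts with
# a prism-induced bias, and each trial cancels a fixed fraction of
# the remaining bias. All numbers here are illustrative assumptions.

PRISM_ERROR_DEG = 10.0  # assumed initial heading bias from the prism
THRESHOLD_DEG = 1.0     # assumed criterion for "walking straight"

def trials_to_adapt(learning_rate, error=PRISM_ERROR_DEG,
                    threshold=THRESHOLD_DEG):
    """Count trials until the residual heading bias drops below threshold."""
    trials = 0
    while abs(error) > threshold:
        error *= 1.0 - learning_rate  # error-driven correction
        trials += 1
    return trials

# A continuous optic-flow error signal supports a large per-trial
# correction; a lone visible target supports only a weak one.
print(trials_to_adapt(learning_rate=0.55))  # 3 trials  (flow available)
print(trials_to_adapt(learning_rate=0.11))  # 20 trials (flow removed)
```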

This finding could have practical applications, particularly in robotics, where it could help engineers build machines with more accurate guidance systems. But Warren said the real value of the work rests with the deeper scientific principle it points up.

“We tend to think that the structures of the brain dictate how we perceive and act in the world,” Warren said. “But this work shows that the world also influences the structures of the brain. It’s a loop. The organization in the brain must reflect the organization and information in the world. And this is a universal principle. All animals that move use optic flow to navigate. So whether you’re a fly landing on a leaf or a pilot landing on a runway, you may be using the same source of sensory information to carry out your task.”

Wendy Zosh, a graduate student in the Department of Cognitive and Linguistic Sciences, also assisted in the research.

The National Eye Institute funded the work.

Media Contact

Wendy Lawton, EurekAlert!

More information:

http://www.brown.edu
