Forum for Science, Industry and Business

How we see objects in depth: The brain's code for 3-D structure

29.10.2008
A team of Johns Hopkins University neuroscientists has discovered patterns of brain activity that may underlie our remarkable ability to see and understand the three-dimensional structure of objects.

Computers can beat us at math and chess, but humans are the experts at object vision. (That is why some websites use object-recognition tasks to verify that their users are human.)

It seems trivial to us to describe a teapot as having a C-shaped handle on one side, an S-shaped spout on the other and a disk-shaped lid on top. But sifting this three-dimensional information from the constantly changing, two-dimensional images coming in through our eyes is one of the most difficult tasks the brain performs. Even sophisticated computer vision systems have not been able to match this feat using two-dimensional camera images.

The Johns Hopkins research suggests that higher-level visual regions of the brain represent objects as spatial configurations of surface fragments, something like a structural drawing. Individual neurons are tuned to respond to surface fragment substructures. For instance, one neuron from the study responded to the combination of a forward-facing ridge near the front and an upward-facing concavity near the top. Multiple neurons with different tuning sensitivities could combine like a three-dimensional mosaic to encode the entire object surface. An article describing these findings appears in the November issue of Nature Neuroscience, available online here: http://www.nature.com/neuro/journal/vaop/ncurrent/full/nn.2202.html.
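The idea of neurons tuned to surface-fragment substructures, whose combined activity tiles an object like a mosaic, can be illustrated with a toy population-code model. Everything below is a hypothetical sketch, not the researchers' actual model: the fragment representation, tuning widths, and the specific neurons are invented for illustration.

```python
import math

# Hypothetical sketch of a surface-fragment population code.
# A "fragment" is a surface patch with a 3-D position and a surface
# orientation (an angle in radians, for simplicity).

def gaussian(x, mu, sigma):
    """Gaussian tuning curve centered on a preferred value."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

class FragmentNeuron:
    """Model neuron tuned to a fragment at a preferred position
    with a preferred surface orientation."""
    def __init__(self, pref_pos, pref_orient, sigma=0.5):
        self.pref_pos = pref_pos        # (x, y, z)
        self.pref_orient = pref_orient  # orientation angle, radians
        self.sigma = sigma

    def respond(self, fragments):
        # Respond to the best-matching fragment on the object surface.
        best = 0.0
        for pos, orient in fragments:
            dist = math.dist(pos, self.pref_pos)
            match = (gaussian(dist, 0.0, self.sigma)
                     * gaussian(orient, self.pref_orient, self.sigma))
            best = max(best, match)
        return best

# A toy "object" with two fragments, loosely echoing the article's
# example: a forward-facing patch near the front and an upward-facing
# patch near the top.
toy_object = [((0.0, 0.0, 1.0), 0.0),
              ((0.0, 1.0, 0.0), 1.57)]

# A small population; each neuron covers a different fragment type.
population = [FragmentNeuron((0.0, 0.0, 1.0), 0.0),
              FragmentNeuron((0.0, 1.0, 0.0), 1.57),
              FragmentNeuron((1.0, 0.0, 0.0), 3.14)]

responses = [n.respond(toy_object) for n in population]
# The first two neurons respond strongly; the third, tuned to a
# fragment this object lacks, stays near zero.
```

Reading out which neurons fire, and how strongly, recovers which surface fragments the object contains — the mosaic-like code the study describes.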

"Human beings are keenly aware of object structure, and that may be due to this clear structural representation in the brain," explains Charles E. Connor, associate professor in the Zanvyl Krieger Mind-Brain Institute at The Johns Hopkins University.

In the study, Connor and a postdoctoral fellow, Yukako Yamane, trained two rhesus monkeys to look at a computer monitor while 3-D pictures of objects were flashed on the screen. At the same time, the researchers recorded electrical responses of individual neurons in higher-level visual regions of the brain. A computer algorithm was used to guide the experiment gradually toward object shapes that evoked stronger responses.

This evolutionary stimulus strategy let the experimenters pinpoint the exact 3-D shape information that drove a given cell to respond.
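The evolutionary stimulus strategy can be sketched as a simple genetic loop: present shapes, rank them by the neural response they evoke, keep the best, and mutate them into the next generation. The code below is a minimal illustration under invented assumptions — shapes are reduced to three abstract parameters, and the recorded neuron is replaced by a mock response function with a hidden preferred shape.

```python
import random

# Hypothetical sketch of an evolutionary stimulus loop. In the actual
# experiment the fitness signal comes from a recorded neuron and the
# stimuli are rendered 3-D objects; here both are stand-ins.

random.seed(1)

def neuron_response(shape):
    """Mock neuron: responds most to shapes near a hidden preferred
    parameter vector (higher, i.e. less negative, is stronger)."""
    preferred = [0.8, -0.3, 0.5]
    return -sum((s - p) ** 2 for s, p in zip(shape, preferred))

def mutate(shape, scale=0.1):
    """Jitter each shape parameter slightly."""
    return [s + random.gauss(0, scale) for s in shape]

# Start from random shapes (3 parameters each).
population = [[random.uniform(-1, 1) for _ in range(3)]
              for _ in range(20)]

for generation in range(30):
    # Rank shapes by the response they evoke.
    population.sort(key=neuron_response, reverse=True)
    survivors = population[:5]
    # Next generation: keep the best shapes, add mutated offspring.
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(15)]

best = max(population, key=neuron_response)
# Over generations, `best` drifts toward the shape the "neuron"
# prefers, revealing its tuning.
```

Because successive generations concentrate on shapes that drive the cell hardest, the surviving stimuli converge on the 3-D structure the neuron is tuned to — which is how the method pinpoints a cell's preferred shape information.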

These findings and other research on object coding in the brain have implications for treating patients with perceptual disorders. In addition, they could inform new approaches to computer vision. Connor also believes that understanding neural codes could help explain why visual experience feels the way it does, perhaps even why some things seem beautiful and others displeasing.

"In a sense, artists are neuroscientists, experimenting with shape and color, trying to evoke unique, powerful responses from the visual brain," Connor said.

As a first step toward this neuroaesthetic question, the Connor laboratory plans to collaborate with the Walters Art Museum in Baltimore to study human responses to sculptural shape. Gary Vikan, the Walters' director, is a strong believer in the power of neuroscience to inform the interpretation of art.

"My interest is in finding out what happens between a visitor's brain and a work of art," said Vikan. "Knowing what effect art has on patrons' brains will contribute to techniques of display -- lighting and color and arrangement -- that will enhance their experiences when they come into the museum."

The plan is to let museum patrons view a series of computer-generated 3-D shapes and rate them aesthetically. The same computer algorithm will then guide the evolution of these shapes, in this case based on aesthetic preference rather than neural response.

If this experiment can identify artistically powerful structural motifs, the next step would be to study how those motifs are represented at the neural level.

"Some researchers speculate that evolution determines what kinds of shapes and such our brains find pleasing," Vikan said. "In other words, perhaps we are hard-wired to prefer certain things. This collaboration with the Mind-Brain Institute at Johns Hopkins could help us begin to understand that in more depth."

Lisa DeNike | EurekAlert!
Further information:
http://www.jhu.edu
