New image sensor will show what the eyes see, and a camera cannot

13.01.2005


Software behind the technology already finding its way into photo editing



Researchers are developing new technologies that may give robots the visual-sensing edge they need to monitor dimly lit airports, pilot vehicles in extreme weather and direct unmanned combat vehicles.

The researchers intend to create an imaging chip that defeats the harmful effects of arbitrary illumination, allowing robotic vision to leave the controlled lighting of a laboratory and enter the erratic lighting of the natural world. As a first step, the researchers have developed software that simulates the chip’s circuitry; on its own, the program can already uncover hidden detail in existing images.


Designed by robot-vision expert Vladimir Brajovic and his colleagues at Intrigue Technologies, Inc., a spin-off of the team’s Carnegie Mellon University research, the new optical device will work more like a retina than a standard imaging sensor.

Just as neurons in the eye process information before sending signals to the brain, the pixels of the new device will "talk" to each other about what they see. The pixels will use the information to modify their behavior and adapt to lighting, ultimately gathering visual information even under adverse conditions.
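The article does not describe the chip’s circuitry, but the idea of pixels "talking" to their neighbors can be illustrated with a simple software analogy: each pixel builds a local brightness estimate by repeatedly averaging with its neighbors, then scales its own response by that estimate. The sketch below, in Python with NumPy, is only an illustration of this kind of neighborhood adaptation; the diffusion depth and rescaling are assumptions, not Brajovic’s design.

```python
import numpy as np

def local_brightness(image, iterations=20):
    """Estimate slowly varying local brightness by letting each pixel
    repeatedly average with its four neighbors (a crude diffusion process
    standing in for pixels 'talking' to each other)."""
    est = image.astype(float)
    for _ in range(iterations):
        est = 0.25 * (np.roll(est, 1, axis=0) + np.roll(est, -1, axis=0)
                      + np.roll(est, 1, axis=1) + np.roll(est, -1, axis=1))
    return est

def adapt(image, eps=1e-6):
    """Divide each pixel by its local brightness estimate so dark and bright
    regions end up in a comparable range, then rescale for display."""
    img = image.astype(float)
    adapted = img / (local_brightness(img) + eps)
    adapted = 255.0 * (adapted - adapted.min()) / (adapted.max() - adapted.min() + eps)
    return adapted.astype(np.uint8)
```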

Through an online demonstration, the simulator software plug-in, dubbed Shadow Illuminator, has processed more than 80,000 pictures from around the world. By balancing exposure across images, clearing away "noise" and improving contrast, the software revealed missing textures, exposed concealed individuals and even uncovered obscured features in medical X-ray film.

This new approach counters a persistent problem for computer-vision cameras: when capturing naturally lit scenes, a camera can be as much of an obstacle as it is a tool. Despite careful attention to shutter speeds and other settings, the brightly illuminated parts of an image are often washed out, while the shadowy parts are completely black. The mathematical processing behind the new sensor will allow its pixels to "perceive" reflectance, the surface property that determines how much of the incoming light bounces off an object and reaches the camera.
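One common way to formalize this idea, which may or may not match the team’s method, is the intrinsic-image model: each observed pixel value is roughly the product of scene illumination and surface reflectance. If a heavily smoothed copy of the image is taken as the illumination estimate, dividing it out (or subtracting it in the log domain) leaves an approximation of reflectance. The Python sketch below, which assumes a Gaussian blur from SciPy as the illumination estimator, illustrates that decomposition; it is not the Shadow Illuminator algorithm itself.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def estimate_reflectance(image, sigma=15.0, eps=1e-6):
    """Approximate reflectance under the model image ~ illumination * reflectance.
    Working in the log domain turns the division into a subtraction and keeps
    the computation numerically stable in deep shadows."""
    img = image.astype(float) + eps
    log_img = np.log(img)
    log_illum = gaussian_filter(log_img, sigma=sigma)  # slowly varying illumination estimate
    refl = np.exp(log_img - log_illum)                 # back out of the log domain
    # Normalize to 0-255 for display.
    refl = 255.0 * (refl - refl.min()) / (refl.max() - refl.min() + eps)
    return refl.astype(np.uint8)
```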

Light illuminating an object helps reveal reflectance to a camera or an eye. However, illumination is a necessary evil, says Brajovic. "Most of the problems in robotic imaging can be traced back to having too much light in some parts of the image and too little in others," he says, "and yet we need light to reveal the objects in a field of view."

To produce images that appear uniformly illuminated, the researchers created a system that widens the range of light intensities a sensor can accommodate. According to Brajovic, limitations in standard imaging sensors have hindered many vision applications, such as security and surveillance, intelligent transportation systems, and defense systems – not to mention ruining a few cherished family photos.

The researchers hope the new technology will yield high-quality image data regardless of natural lighting conditions, ultimately improving the reliability of machine-vision systems such as those used for biometric identification, enhanced X-ray diagnostics and space-exploration imaging.

Additional comments from the researcher:

"The washed out and underexposed images captured by today’s digital cameras are simply too confusing for machines to interpret, ultimately leading to failure of their vision systems in many critical applications." – Vladimir Brajovic, Carnegie Mellon University and Intrigue Technologies, Inc.

"Often, when we take a picture with a digital or film camera, we are disappointed that many details we remember seeing appear in the image buried in deep shadows or washed out in overexposed regions. This is because our eyes have a built-in mechanism to adapt to local illumination conditions, while our cameras don’t. Because of this camera deficiency, robot vision often fails." – Vladimir Brajovic

Josh Chamot | EurekAlert!
Further information:
http://www.nsf.gov
