A new way to help computers recognize patterns

25.01.2006


Researchers at Ohio State University have found a way to speed the development of pattern recognition software by taking a different approach from the one most experts in the field use.



This work may impact research in areas as diverse as genetics, economics, climate modeling, and neuroscience.

Aleix Martinez, assistant professor of electrical and computer engineering at Ohio State, explained what all these areas of research have in common: pattern recognition.


He designs computer algorithms to replicate human vision, so he studies the patterns in shape and color that help us recognize objects, from apples to friendly faces. But much of today’s research in other areas comes down to finding patterns in data -- identifying the common factors among people who develop a certain disease, for example.

In fact, the majority of pattern recognition algorithms in science and engineering today are derived from the same basic equation and employ the same methods, collectively called linear feature extraction, Martinez said.
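In practice, linear feature extraction means projecting each high-dimensional sample onto a small set of directions learned from the data; classic methods such as principal component analysis (PCA) and linear discriminant analysis share this same projection step. The sketch below is only an illustration of that shared idea, using PCA rather than the specific algorithms Martinez studied:

```python
# A minimal sketch of linear feature extraction, assuming PCA as the
# example method (not the particular algorithms compared in the study):
# every linear method reduces to projecting samples onto a matrix W.
import numpy as np

def pca_project(X, n_components):
    """Project the rows of X onto the top principal components."""
    X_centered = X - X.mean(axis=0)
    # Rows of Vt are the principal directions (eigenvectors of the
    # sample covariance), ordered by decreasing variance.
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    W = Vt[:n_components].T            # d x k projection matrix
    return X_centered @ W              # n x k extracted features

# Compress 10-dimensional samples down to 2 linear features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
print(pca_project(X, n_components=2).shape)   # (100, 2)
```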

But the typical methods don’t always give researchers the answers they want. That’s why Martinez has developed a fast and easy test to find out in advance which algorithms are best in a particular circumstance.

"You can spend hours or weeks exploring a particular method, just to find out that it doesn’t work," he said. "Or you could use our test and find out right away if you shouldn’t waste your time with a particular approach."

The research grew out of the frustration that Martinez and his colleagues felt in the university’s Computational Biology and Cognitive Science Laboratory, when linear algorithms worked well in some applications, but not others.

In the journal IEEE Transactions on Pattern Analysis and Machine Intelligence, he and doctoral student Manli Zhu described the test they developed, which rates how well a particular pattern recognition algorithm will work for a given application.

Along the way, they discovered what happens when researchers apply a less-than-ideal algorithm to scientific data: they don’t necessarily get the wrong answer, but they do get extraneous information along with it, and that extra information can distort how future data are classified.

He gave an example.

"Let’s say you are trying to understand why some patients have a disease. And you have certain variables, which could be the type of food they eat, what they drink, amount of exercise they take, and where they live. And you want to find out which variables are most important to their developing that disease. You may run an algorithm and find that two variables -- say, the amount of exercise and where they live -- most influence whether they get the disease. But it may turn out that one of those variables is not necessary. So your answer isn’t totally wrong, but a smaller set of variables would have worked better," he said. "The problem is that such errors may contribute to the incorrect classification of future observations."

Martinez and Zhu tested machine vision algorithms using two databases, one of objects such as apples and pears, and another database of faces with different expressions. The two tasks -- sorting objects and identifying expressions -- are sufficiently different that an algorithm could potentially be good at doing one but not at the other.

The test rates algorithms on a scale from zero to one. The closer the score is to zero, the better the algorithm.
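The article does not give the formula behind the score, so the following sketch is only a loose illustration of how a zero-to-one measure of class overlap can be built, not the authors’ actual test: the Bhattacharyya coefficient between two Gaussian class models approaches zero when the classes are well separated and one when they coincide, matching the "closer to zero is better" convention described here.

```python
# A hypothetical zero-to-one score (the article does not state the
# paper's formula): Bhattacharyya coefficient between Gaussian models.
import numpy as np

def bhattacharyya_coefficient(X0, X1):
    """0 = perfectly separated classes, 1 = identical classes."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S0 = np.cov(X0, rowvar=False)
    S1 = np.cov(X1, rowvar=False)
    S = (S0 + S1) / 2
    diff = m1 - m0
    dist = (diff @ np.linalg.solve(S, diff)) / 8 \
         + 0.5 * np.log(np.linalg.det(S) /
                        np.sqrt(np.linalg.det(S0) * np.linalg.det(S1)))
    return float(np.exp(-dist))

rng = np.random.default_rng(2)
close = bhattacharyya_coefficient(rng.normal(0, 1, (200, 2)),
                                  rng.normal(0.5, 1, (200, 2)))
far = bhattacharyya_coefficient(rng.normal(0, 1, (200, 2)),
                                rng.normal(5, 1, (200, 2)))
print(close, far)   # the well-separated pair scores much closer to zero
```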

The test worked: An algorithm that received a score of 0.2 for sorting faces was right 98 percent of the time. That same algorithm scored 0.34 for sorting objects, and was right only 70 percent of the time when performing that task. Another algorithm scored 0.68 and sorted objects correctly only 33 percent of the time.

"So a score like 0.68 means ’don’t waste your time,’" Martinez said. "You don’t have to go to the trouble to run it and find out that it’s wrong two-thirds of the time."

He hopes that researchers across a broad range of disciplines will try out this new test. His team has already started using it to optimize the algorithms they use to study language and cancer genetics.

This work was sponsored by the National Institutes of Health.

Aleix Martinez | EurekAlert!
Further information:
http://www.osu.edu
