New tool enables powerful data analysis

12.01.2009
A powerful yet compact algorithm has been developed that can be used on laptop computers to extract features and patterns from huge and complex data sets

A powerful computing tool that allows scientists to extract features and patterns from enormously large and complex sets of raw data has been developed by scientists at the University of California, Davis, and Lawrence Livermore National Laboratory. The tool – a set of problem-solving calculations known as an algorithm – is compact enough to run on computers with as little as two gigabytes of memory.

The team that developed this algorithm has already used it to probe a slew of phenomena represented by billions of data points, including analyzing and creating images of flame surfaces; searching for clusters and voids in a virtual universe experiment; and identifying and tracking pockets of fluid in a simulated mixing of two fluids.

"What we've developed is a workable system of handling any data in any dimension," said Attila Gyulassy, who led the five-year development effort while pursuing a PhD in computer science at UC Davis. "We expect this algorithm will become an integral part of a scientist's toolbox to answer questions about data."

A paper describing the new algorithm was published in the November-December issue of IEEE Transactions on Visualization and Computer Graphics.

Computers are widely used to perform simulations of real-world phenomena and to capture results of physical experiments and observations, storing this information as collections of numbers. But as the size of these data sets has burgeoned, hand-in-hand with computer capacity, analysis has grown increasingly difficult.

A mathematical tool to extract and visualize useful features from data sets has existed for nearly 40 years – in theory. Called the Morse-Smale complex, it partitions a data set into regions of similar features and encodes them in mathematical terms. But working with the Morse-Smale complex is not easy. "It's a powerful language. But a cost of that is that using it meaningfully for practical applications is very difficult," Gyulassy said.
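For readers unfamiliar with the concept, the toy sketch below (in Python, and not the published algorithm) illustrates the basic idea on a small two-dimensional grid: each grid point is labelled by the local maximum it reaches by steepest ascent and the local minimum it reaches by steepest descent, and points sharing both labels form one Morse-Smale-like cell. The Gaussian test field and all function names are illustrative assumptions, not taken from the paper.

    # Toy Morse-Smale-style segmentation of a small 2-D scalar field.
    # Illustrative only; the published algorithm is far more sophisticated.
    import numpy as np

    def extremum_labels(field, ascend=True):
        """Label each grid point with the local extremum it reaches by
        repeatedly stepping to its steepest 8-neighbour."""
        rows, cols = field.shape
        labels = np.empty(field.shape, dtype=int)

        def steepest_neighbor(r, c):
            best, best_val = (r, c), field[r, c]
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        v = field[rr, cc]
                        if (ascend and v > best_val) or (not ascend and v < best_val):
                            best, best_val = (rr, cc), v
            return best

        for r in range(rows):
            for c in range(cols):
                rr, cc = r, c
                while True:  # follow the discrete gradient to an extremum
                    nr, nc = steepest_neighbor(rr, cc)
                    if (nr, nc) == (rr, cc):
                        break
                    rr, cc = nr, nc
                labels[r, c] = rr * cols + cc  # id of the extremum reached
        return labels

    # Toy field with two bumps; the ascending/descending label pairs
    # partition the grid into Morse-Smale-like cells.
    y, x = np.mgrid[0:64, 0:64]
    field = np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 80.0) \
          + np.exp(-((x - 45) ** 2 + (y - 40) ** 2) / 80.0)
    cells = extremum_labels(field, True) * 10**6 + extremum_labels(field, False)
    print("distinct cells:", len(np.unique(cells)))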

Gyulassy's algorithm divides a data set into parcels of cells, then analyzes each parcel separately using the Morse-Smale complex. The results of those computations are then merged together. As new parcels are created from merged parcels, they are analyzed and merged yet again. At each step, data that no longer need to be kept in memory are discarded, drastically reducing the memory and computing power required to run the calculations.
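The paper's actual data structures for merging partial Morse-Smale complexes are considerably more involved, but the divide, merge, and discard pattern itself can be sketched in a few lines. The Python below substitutes a deliberately simple stand-in analysis (a per-parcel summary that is folded together parcel by parcel), and every name in it is hypothetical; it only shows why the full data set never has to sit in memory at once.

    # Sketch of the parcel/merge/discard pattern described above, with a
    # simple stand-in analysis in place of the real Morse-Smale computation.
    import numpy as np

    PARCEL = 1_000_000  # values analysed per parcel (hypothetical size)

    def analyze_parcel(values, offset):
        """Stand-in per-parcel analysis: reduce the raw values to a tiny
        summary so the parcel itself can be discarded afterwards."""
        i = int(np.argmax(values))
        return {"max_value": float(values[i]), "max_index": offset + i}

    def merge(a, b):
        """Stand-in merge step: combine two summaries without ever touching
        the raw data they came from."""
        return a if a["max_value"] >= b["max_value"] else b

    def analyze_stream(chunks):
        """Analyse parcels one at a time and fold their summaries together;
        memory use is bounded by one parcel plus one small summary."""
        result, offset = None, 0
        for chunk in chunks:
            summary = analyze_parcel(chunk, offset)
            result = summary if result is None else merge(result, summary)
            offset += len(chunk)
            # The raw chunk goes out of scope here and is freed.
        return result

    # Example: stream ten random parcels through the pipeline.
    rng = np.random.default_rng(0)
    print(analyze_stream(rng.standard_normal(PARCEL) for _ in range(10)))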

One of Gyulassy's tests of the algorithm was to use it to analyze and track the formation and movement of pockets of fluid in the simulated mixing of two fluids: one dense, one light. This data set is so vast – it consists of more than one billion data points on a three-dimensional grid – that it challenges even supercomputers, Gyulassy said. Yet the new algorithm, with its streamlining features, was able to perform the analysis on a laptop computer with just two gigabytes of memory. Although Gyulassy had to wait nearly 24 hours for the little machine to complete its calculations, at the end of this process he could pull up images in mere seconds to illustrate phenomena he was interested in, such as the branching of fluid pockets in the mixture.
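A rough back-of-the-envelope figure (assuming single-precision storage, not a number from the paper) shows why this is striking: a billion grid values at four bytes each already occupy about 4 gigabytes, roughly twice the laptop's entire memory, so the raw field alone cannot be held in RAM at once, let alone the intermediate structures a conventional analysis would build.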

Two main factors are driving the need for analysis of large data sets, said co-author Bernd Hamann: a surge in the use of powerful computers that can produce huge amounts of data, and an upswing in affordability and availability of sensing devices that researchers deploy in the field and lab to collect a profusion of data.

"Our data files are becoming larger and larger, while the scientist has less and less time to understand them," said Hamann, a professor of computer science and associate vice chancellor for research at UC Davis. "But what are the data good for if we don't have the means of applying mathematically sound and computationally efficient computer analysis tools to look for what is captured in them?"

Gyulassy is currently developing software that will allow others to put the algorithm to use. He expects the learning curve to be steep for this open-source product, "but if you just learn the minimal amount about what a Morse-Smale complex is," he said, "it will be pretty intuitive."

Liese Greensfelder | EurekAlert!
Further information:
http://www.ucdavis.edu
