New Supercomputer Rankings for "Big Data" Machines Released

The new Graph500 rankings were released Tuesday in Seattle at SC2011, the international conference for high-performance computing.

NNSA/SC Blue Gene/Q Prototype II has risen to the top spot on the list and is the first National Nuclear Security Administration winner. Sandia’s Ultraviolet platform placed 10th using custom software, the Sandia Red Sky supercomputer dropped from 8th to 13th, and Dingus and Wingus, the insouciantly named Sandia prototype (Convey-based Field-Programmable Gate Array, or FPGA) platforms, placed 23rd and 24th, respectively.

The Graph500 stresses supercomputer performance on “big data” scaling problems rather than on the purely arithmetic computations measured by the Linpack Top500 and similar benchmarks. Graph500 machines are tested on their ability to traverse large, random-appearing graphs, rather than simply on raw arithmetic speed.

Such graph-based problems are found in the medical world, where large numbers of medical entries must be correlated; in the analysis of social networks, with their enormous numbers of electronically related participants; and in international security, where, for example, huge numbers of containers on ships roaming the world’s ports of call must be tracked.

“Companies are interested in doing well on the Graph500 because large-scale data analytics are an increasingly important problem area and could eclipse traditional high-performance computing (HPC) in overall importance to society,” said Sandia researcher Richard Murphy, whose committee receives input from 30 international researchers. Changes to the benchmark are implemented by Sandia, the Georgia Institute of Technology, the University of Illinois at Urbana-Champaign, Indiana University and others.

Big-data problems are solved by creating large, complex graphs with vertices that represent the data points — say, people on Facebook — and edges that represent relations between the data points — say, friends on Facebook. These problems stress the ability of computing systems to store and communicate large amounts of data in irregular, fast-changing communication patterns, rather than the ability to perform many arithmetic operations in succession. The Graph500 benchmarks indicate how well supercomputers handle such complex problems.
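The core Graph500 kernel is a breadth-first search across such a graph. A minimal sketch in Python of the idea (the vertex names and tiny edge list are illustrative; the real benchmark generates enormous synthetic graphs and times the traversal):

```python
from collections import deque

# Toy "social network": vertices are people, edges are friendships.
# (Illustrative data only; Graph500 uses huge randomly generated graphs.)
edges = [("alice", "bob"), ("bob", "carol"), ("carol", "dave"),
         ("alice", "eve"), ("eve", "dave")]

# Build an undirected adjacency list.
graph = {}
for u, v in edges:
    graph.setdefault(u, []).append(v)
    graph.setdefault(v, []).append(u)

def bfs(graph, source):
    """Breadth-first search, the traversal the Graph500 benchmark times."""
    parent = {source: None}          # BFS tree: each vertex's discoverer
    frontier = deque([source])
    while frontier:
        u = frontier.popleft()
        for v in graph[u]:
            if v not in parent:      # first visit: record the tree edge
                parent[v] = u
                frontier.append(v)
    return parent

tree = bfs(graph, "alice")
print(sorted(tree))  # → ['alice', 'bob', 'carol', 'dave', 'eve']
```

The access pattern here is the point: each step chases pointers into arbitrary parts of the graph, so performance depends on memory and communication bandwidth rather than on floating-point throughput, which is why the benchmark is reported in traversed edges per second rather than FLOPS.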

The complete list of rankings is available at http://www.graph500.org/nov2011.html

Media Contact

Neal Singer, Newswise Science News

More Information:

http://www.sandia.gov
