SDSC researchers submitted an entry for Gordon of 218 trillion floating-point operations per second (Tflop/s) using 12,608 cores – about 75 percent of the system. Built by SDSC researchers and Appro, a leading developer of supercomputing solutions, Gordon, the next-generation Appro Xtreme-X™ supercomputer, is currently undergoing acceptance testing. When it is made available to the research community on January 1, 2012, it will have 16,384 cores and achieve over 280 Tflop/s.
The latest Top500 rankings were announced during the SC11 (Supercomputing 2011) conference in Seattle this week. In related acceptance runs, Gordon was tested for its ability to perform I/O (input/output) operations and achieved an unprecedented 35 million IOPS (input/output operations per second), making it the most powerful supercomputer ever commissioned by the National Science Foundation (NSF) for I/O.
IOPS is an important measure for data-intensive computing since it indicates the ability of a storage system to perform I/O operations on small transfer sizes of randomly organized data – something prevalent in database and data mining applications.
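The relationship between IOPS and sustained bandwidth is a simple multiplication by the transfer size. As a rough sketch (the 4 KiB transfer size below is a common benchmark convention, not a figure from the Gordon acceptance runs):

```python
def iops_to_bandwidth_gbs(iops, transfer_bytes):
    """Convert an IOPS figure to sustained bandwidth in GB/s."""
    return iops * transfer_bytes / 1e9

# Hypothetical: 35 million IOPS at a 4 KiB random transfer size
bandwidth = iops_to_bandwidth_gbs(35_000_000, 4096)
print(f"{bandwidth:.2f} GB/s")  # 143.36 GB/s
```

This illustrates why IOPS, not raw bandwidth, is the relevant metric for random small-block workloads: the same storage system can post very different bandwidth numbers depending on transfer size.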
Gordon’s Top500 result is notable in that the Tflop/s ranking was achieved using about half the number of cores of most other systems. That’s because Gordon is among the first systems – and the first one commissioned by the NSF – to use the Intel® Xeon® processor E5 Family, which performs twice as many floating-point operations per clock cycle (eight versus four) as the processors in systems currently in use.
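The arithmetic behind that claim is straightforward: theoretical peak performance is cores × FLOPs per clock × clock rate. A minimal sketch, where the 2.6 GHz clock frequency is an illustrative assumption (the article does not state Gordon's clock rate):

```python
def peak_tflops(cores, flops_per_clock, clock_ghz):
    """Theoretical peak = cores x FLOPs/clock x clock rate, in Tflop/s."""
    return cores * flops_per_clock * clock_ghz * 1e9 / 1e12

# Gordon's full configuration: 16,384 cores, 8 FLOPs per clock (E5 Family).
# Clock frequency is assumed for illustration only.
print(peak_tflops(16_384, 8, 2.6))   # ~341 Tflop/s theoretical peak

# A same-size system at 4 FLOPs per clock would peak at half that,
# which is why Gordon needs roughly half the cores of older systems.
print(peak_tflops(16_384, 4, 2.6))   # ~170 Tflop/s
```

Measured Linpack figures (like the 218 Tflop/s submission) always fall below this theoretical peak.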
“These are truly remarkable results not only for us, but for countless researchers who are moving along with us into an era of data-intensive computing, where supercomputers using innovative new architectures will become a vital tool for them in accelerating new discoveries across a wide range of disciplines,” said Michael Norman, director of SDSC and a co-principal investigator of the Gordon project, the result of a five-year, $20 million NSF award.
“Taken together, these two results mean that Gordon will be the defacto leader in NSF systems in terms of ‘big data’ analysis computing,” said Allan Snavely, SDSC’s associate director and a co-PI for the Gordon system. “Gordon was designed from the outset to be a balanced system between compute and I/O to take on some of the most data-intensive challenges in the real world, not just to score well in terms of FLOPS.”
Gordon is the first high-performance supercomputer to use large amounts of flash-based SSD (solid state drive) memory – think of it as the world’s largest thumb drive. Flash memory is common in smaller devices such as mobile phones and laptop computers, but rare in supercomputers, which generally use slower spinning-disk technology. Gordon uses 1,024 of Intel’s latest high-endurance SSDs, providing the significantly higher performance and endurance required for data-intensive applications.
“Gordon, the Appro Xtreme-X Supercomputer, rises to the challenge of the San Diego Supercomputer Center’s computational demands with delivery of a data-intensive computing solution that blends cutting-edge technologies for compute, network, and storage,” said Steve Lyness, vice president of Appro HPC Solutions Group. “The Appro Xtreme-X system based on the future Intel Xeon processor E5 Family and the new Intel SSD 710 Series will maximize I/O capabilities and flash memory support to analyze large data sets used by SDSC to answer modern science’s critical problems.”
Gordon is capable of handling massive databases while providing speeds up to 100 times faster than hard disk drive systems for some queries. With almost 5 TB (terabytes) of high-performance flash memory in each I/O node, Gordon will provide a new level of performance and capacity for solving the most challenging data-intensive problems, such as large graph problems, data mining operations, de novo genome assembly, database applications, and quantum chemistry. Gordon’s unique architecture includes:
‧ 1,024 dual-socket compute nodes, each with two 8-core Intel Xeon E5 Family processors and 64 GB (gigabytes) of DDR3-1333 memory
‧ High-performance parallel file system with over 4 PB (petabytes) of capacity, and sustained rates of 100 GB/s (gigabytes per second)
The Top500 list of the world’s most powerful supercomputers, now in its 19th year, has been joined by another supercomputer ranking called the Graph500. Developers of the new ranking say it complements the Top500: while the Top500 uses LINPACK software to determine sheer speed – how fast a supercomputer can perform linear algebraic calculations – the Graph500 seeks to quantify how much work a supercomputer can do based on its ability to analyze very large graph-based datasets that have millions of disparate points.
Trestles, another new but smaller supercomputer using flash-based memory and launched earlier this year by SDSC, ranked XX on the latest Graph500 list, also announced this week at SC11. The ranking was based on runs using less than half of Trestles’ overall compute capability.
With 10,368 processor cores, 324 nodes, and a peak speed of 100 Tflop/s, Trestles has already been used for more than 200 separate research projects since its launch last January, in research areas ranging from astrophysics to molecular dynamics. Gordon and Trestles are available to users of the NSF’s new XSEDE (Extreme Science and Engineering Discovery Environment) program, which integrates 16 supercomputers and high-end visualization and data analysis resources across the country to provide the most comprehensive collection of advanced digital services in the world. The new program replaced the NSF’s TeraGrid program, which ran for about 10 years.
Jan Zverina | Newswise Science News