SDSC researchers submitted a Top500 entry for Gordon that achieved 218 teraflops (Tflop/s) using 12,608 cores – about 75 percent of the system. Built by SDSC researchers and Appro, a leading developer of supercomputing solutions, Gordon – the next-generation Appro Xtreme-X™ supercomputer – is currently undergoing acceptance testing. When it is made available to the research community on January 1, 2012, it will have 16,384 cores and is expected to exceed 280 Tflop/s.
The latest Top500 rankings were announced this week during the SC11 (Supercomputing 2011) conference in Seattle. In related acceptance runs, Gordon was tested on its ability to perform I/O (input/output) operations, and achieved an unprecedented 35 million IOPS (input/output operations per second), making it the most powerful supercomputer ever commissioned by the National Science Foundation (NSF) for I/O performance.
IOPS is an important measure for data-intensive computing since it indicates the ability of a storage system to perform I/O operations on small transfer sizes of randomly organized data – a pattern prevalent in database and data mining applications.
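The random small-transfer workload that IOPS captures can be illustrated with a minimal micro-benchmark. This is a hedged sketch, not the actual acceptance test used on Gordon; the file name, file size, and operation count are illustrative assumptions, and on an ordinary machine most reads will be served from the OS page cache rather than the storage device.

```python
import os
import random
import time

BLOCK = 4096    # small transfer size (4 KiB), typical of IOPS benchmarks
N_OPS = 10_000  # number of random reads to time

def estimate_iops(path, file_size):
    """Time N_OPS random 4 KiB reads within a file and return ops/second."""
    blocks = file_size // BLOCK
    with open(path, "rb") as f:
        start = time.perf_counter()
        for _ in range(N_OPS):
            f.seek(random.randrange(blocks) * BLOCK)  # random offset
            f.read(BLOCK)                             # small transfer
        elapsed = time.perf_counter() - start
    return N_OPS / elapsed

if __name__ == "__main__":
    size = 16 * 1024 * 1024  # 16 MiB scratch file (illustrative)
    with open("scratch.bin", "wb") as f:
        f.write(os.urandom(size))
    print(f"~{estimate_iops('scratch.bin', size):,.0f} IOPS")
    os.remove("scratch.bin")
```

Because each operation moves only 4 KiB to a random location, throughput in bytes per second is low even when the operation rate is high – which is exactly why IOPS, not bandwidth, is the relevant metric for this access pattern.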
Gordon’s Top500 result is notable in that its Tflop/s ranking was achieved using about half the number of cores of most comparable systems. That’s because Gordon is among the first systems – and the first commissioned by the NSF – to use the Intel® Xeon® processor E5 family, which performs twice as many operations per clock cycle (eight versus four) as the processors in most systems currently in use.
“These are truly remarkable results not only for us, but for countless researchers who are moving along with us into an era of data-intensive computing, where supercomputers using innovative new architectures will become a vital tool for them in accelerating new discoveries across a wide range of disciplines,” said Michael Norman, director of SDSC and a co-principal investigator of the Gordon project, the result of a five-year, $20 million NSF award.
“Taken together, these two results mean that Gordon will be the de facto leader in NSF systems in terms of ‘big data’ analysis computing,” said Allan Snavely, SDSC’s associate director and a co-PI for the Gordon system. “Gordon was designed from the outset to be a balanced system between compute and I/O to take on some of the most data-intensive challenges in the real world, not just to score well in terms of FLOPS.”
Gordon is the first high-performance supercomputer to use large amounts of flash-based SSD (solid state drive) memory – think of it as the world’s largest thumb drive. Flash memory is common in smaller devices such as mobile phones and laptop computers, but rare in supercomputers, which generally rely on slower spinning-disk technology. Gordon uses 1,024 of Intel’s latest high-endurance SSDs, providing the significantly higher performance and endurance required for data-intensive applications.
“Gordon, the Appro Xtreme-X Supercomputer, rises to the challenge of the San Diego Supercomputer Center’s computational demands with delivery of a data-intensive computing solution that blends cutting-edge technology for compute, network, and storage,” said Steve Lyness, vice president of Appro HPC Solutions Group. “The Appro Xtreme-X system based on the future Intel Xeon processor E5 family and the new Intel SSD 710 Series will maximize I/O capabilities and flash memory support to analyze large data sets used by SDSC to answer modern science’s critical problems.”
Gordon is capable of handling massive databases while providing speeds up to 100 times faster than hard-disk-drive systems for some queries. With almost 5 TB (terabytes) of high-performance flash memory in each I/O node, Gordon will provide a new level of performance and capacity for solving the most challenging data-intensive problems, such as large graph problems, data mining operations, de novo genome assembly, database applications, and quantum chemistry. Gordon’s unique architecture includes:
‧ 1,024 dual-socket compute nodes, each with two 8-core Intel Xeon E5 family processors and 64 GB (gigabytes) of DDR3-1333 memory
‧ A high-performance parallel file system with over 4 PB (petabytes) of capacity and sustained rates of 100 GB/s (gigabytes per second)
The Top500 list of the world’s most powerful supercomputers, now in its 19th year, has been joined by another supercomputer ranking called the Graph500. Developers of the new ranking say it complements the Top500: while the Top500 uses LINPACK software to determine sheer speed – how fast a supercomputer can perform linear algebraic calculations – the Graph500 seeks to quantify how much work a supercomputer can do by measuring its ability to analyze very large graph-based datasets that have millions of disparate points.
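The kind of workload the Graph500 times can be sketched in miniature. The benchmark’s core kernel is a breadth-first search (BFS) over a large graph, with performance reported in traversed edges per second (TEPS); the tiny edge list below is purely illustrative, not Graph500’s actual scale-free input.

```python
from collections import defaultdict, deque

def bfs(edges, source):
    """Breadth-first search over an undirected edge list.

    Returns a parent map for every reachable vertex and the number of
    edges traversed - the quantity a TEPS rate is based on.
    """
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    parent = {source: source}
    queue = deque([source])
    traversed = 0
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            traversed += 1          # every edge inspection counts
            if v not in parent:     # first visit: record BFS tree parent
                parent[v] = u
                queue.append(v)
    return parent, traversed

edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]
parents, nedges = bfs(edges, 0)
print(sorted(parents))  # vertices reachable from source 0
```

Unlike LINPACK’s dense, regular arithmetic, this traversal is dominated by irregular, pointer-chasing memory accesses with almost no floating-point work – which is why a machine can rank very differently on the two lists.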
Trestles, another new but smaller supercomputer using flash-based memory, launched earlier this year by SDSC, ranked XX on the latest Graph500 list, also announced this week at SC11. The ranking was based on runs using less than half of Trestles’ overall compute capability.
With 10,368 processor cores, 324 nodes, and a peak speed of 100 teraflops (Tflop/s), Trestles has already been used for more than 200 separate research projects since its launch last January, in research areas ranging from astrophysics to molecular dynamics. Gordon and Trestles are available to users of the NSF’s new XSEDE (Extreme Science and Engineering Discovery Environment) program, which integrates 16 supercomputers and high-end visualization and data analysis resources across the country to provide the most comprehensive collection of advanced digital services in the world. XSEDE replaced the NSF’s TeraGrid program after about 10 years.
Jan Zverina | Newswise Science News