HLRS Supercomputer Successfully Executed Extreme-Scale Simulation Projects
Hornet, the supercomputer of the High Performance Computing Center Stuttgart (HLRS), is ready for extreme-scale computing challenges. The newly installed high-performance computing (HPC) system successfully completed extensive simulation projects that far exceeded the scale of any simulation runs previously performed at HLRS:
The HLRS supercomputer Hornet, a Cray XC40 system, delivers a peak performance of 3.8 petaflops (3.8 quadrillion floating-point operations per second).
(c) Boris Lehner for HLRS/Universität Stuttgart
Six so-called XXL-Projects from computationally demanding scientific fields such as planetary research, climatology, environmental chemistry, aerospace, and scientific engineering were recently run on the HLRS supercomputer. With applications scaling up to all of Hornet’s 94,646 available compute cores, the machine was put through a demanding endurance test. The results more than satisfied both the HLRS HPC experts and the scientific users: Hornet lived up to the challenge and passed these simulation “burn-in runs” with flying colors.
The new HLRS supercomputer Hornet, a Cray XC40 system which in its current configuration delivers a peak performance of 3.8 PetaFlops (1 PetaFlops = 1 quadrillion floating point operations per second), was declared “up and running” in late 2014.
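For a rough sense of scale, the quoted figures imply a theoretical peak of roughly 40 GFLOPS per core, obtained simply by dividing 3.8 PFLOPS by 94,646 cores (the machine's actual per-core peak depends on clock frequency and vector width, which the article does not state). A minimal sketch:

```python
# Figures quoted for Hornet (Cray XC40):
peak_pflops = 3.8   # system peak performance, in PFLOPS
cores = 94_646      # total available compute cores

# Implied theoretical peak per core, converted to GFLOPS
per_core_gflops = peak_pflops * 1e15 / cores / 1e9
print(f"~{per_core_gflops:.1f} GFLOPS per core")
# -> ~40.1 GFLOPS per core
```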
In the early installation phase, before the machine was made available for general use, HLRS invited scientists and researchers from various fields across Germany to run large-scale simulation projects on Hornet. The goal was to demonstrate that all of the HPC hardware and software components required to run highly complex, extreme-scale compute jobs smoothly were ready for top-notch challenges. Six well-suited XXL-Projects were identified and implemented on the HLRS supercomputer:
(1) “Convection Permitting Channel Simulation”, Institute of Physics and Meteorology, Universität Hohenheim
(84,000 compute cores, 84 compute hours, 330 TB of data + 120 TB for pre-processing)
Objective: Run a latitude belt simulation around the Earth at a resolution of a few km for a time period long enough to cover various extreme events on the Northern hemisphere and to study the model performance.
(2) “Direct Numerical Simulation of a Spatially-Developing Turbulent Boundary Along a Flat Plate”, Institute of Aerodynamics and Gas Dynamics (IAG), Universität Stuttgart
(93,840 compute cores, 70 machine hours, 30 TB of data)
Objective: To conduct a direct numerical simulation of the complete transition of a boundary layer flow to fully-developed turbulence along a flat plate up to high Reynolds numbers.
(3) “Prediction of the Turbulent Flow Field Around a Ducted Axial Fan”, Institute of Aerodynamics, RWTH Aachen University
(92,000 compute cores, 110 machine hours, 80 TB of data)
Objective: To better understand the development of vortical flow structures and the turbulence intensity in the tip-gap of a ducted axial fan.
(4) “Large-Eddy Simulation of a Helicopter Engine Jet”, Institute of Aerodynamics, RWTH Aachen University
(94,646 compute cores, 300 machine hours, 120 TB of data)
Objective: Analysis of the impact of internal perturbations due to geometric variations on the flow field and the acoustic field of a helicopter engine jet.
(5) “Ion Transport by Convection and Diffusion“, Institute of Simulation Techniques and Scientific Computing, Universität Siegen
(94,080 compute cores, 5 machine hours, 1.1 TB of data)
Objective: To better understand and optimize the electrodialysis desalination process.
(6) “Large Scale Numerical Simulation of Planetary Interiors”, German Aerospace Center/Technische Universität Berlin
(54,000 compute cores, 3 machine hours, 2 TB of data)
Objective: To study the effect of heat driven convection within planets on the evolution of a planet (how is the surface influenced, how are conditions for life maintained, how do plate-tectonics work, and how quickly can a planet cool).
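Taken together, the resource figures quoted for the six XXL-Projects can be tallied into rough totals. A small sketch, using only the numbers listed above (output data only, excluding the 120 TB of pre-processing data for project 1, and treating "machine hours" and "compute hours" alike as wall-clock hours on the stated core counts):

```python
# Resource figures for the six XXL-Projects as quoted above:
# (compute cores, wall-clock hours, output data in TB)
projects = {
    "Channel Simulation (Hohenheim)":      (84_000,  84, 330),
    "Flat-Plate DNS (IAG Stuttgart)":      (93_840,  70,  30),
    "Ducted Axial Fan (RWTH Aachen)":      (92_000, 110,  80),
    "Helicopter Engine Jet (RWTH Aachen)": (94_646, 300, 120),
    "Ion Transport (Siegen)":              (94_080,   5,   1.1),
    "Planetary Interiors (DLR/TU Berlin)": (54_000,   3,   2),
}

# Core-hours = cores x wall-clock hours, summed over all projects
core_hours = sum(cores * hours for cores, hours, _ in projects.values())
data_tb = sum(tb for _, _, tb in projects.values())

print(f"Total: {core_hours:,} core-hours, {data_tb:.1f} TB of output data")
# -> Total: 52,771,000 core-hours, 563.1 TB of output data
```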
Demand for High Performance Computing on the Rise
Demand for high performance computing remains unbroken. Scientists continue to crave ever-increasing computing power, eagerly awaiting faster systems and better-scaling software that will let them tackle the most challenging scientific and engineering problems. “Supply generates demand,” states Prof. Dr.-Ing. Michael M. Resch, Director of HLRS. “With the abilities of ultra-fast machines like Hornet, both industry and researchers are quickly realizing that fully leveraging the vast capabilities of such a supercomputer opens unprecedented opportunities and helps them deliver results previously impossible to obtain. We are positive that our HPC infrastructure will be leveraged to its full extent. Hornet will be an invaluable tool in supporting researchers in their pursuit of answers to the most pressing questions of our time, leading to scientific findings and knowledge of great and enduring value,” adds Professor Resch.
Following its ambitious technology roadmap, HLRS is currently implementing a planned system expansion scheduled for completion by the end of 2015. The HLRS supercomputing infrastructure will then deliver a peak performance of more than seven petaflops and feature an additional 2.3 petabytes of file system storage.
More information about the HLRS XXL-Projects can be found at http://www.gauss-centre.eu/gauss-centre/EN/Projects/XXL_Projects_Hornet/XXL_Proj...
About HLRS: The High Performance Computing Center Stuttgart (HLRS) of the University of Stuttgart is one of the three German supercomputer institutions forming the national Gauss Centre for Supercomputing. HLRS supports German and pan-European researchers as well as industrial users with leading-edge supercomputing technology, HPC training, and support.
About GCS: The Gauss Centre for Supercomputing (GCS) combines the three national supercomputing centres HLRS (High Performance Computing Center Stuttgart), JSC (Jülich Supercomputing Centre), and LRZ (Leibniz Supercomputing Centre, Garching near Munich) into Germany’s Tier-0 supercomputing institution. Together, the three centres provide the largest and most powerful supercomputing infrastructure in all of Europe, serving a wide range of industrial and research activities in various disciplines. They also provide top-class training and education for the national as well as the European High Performance Computing (HPC) community. GCS is the German member of PRACE (Partnership for Advanced Computing in Europe), an international non-profit association consisting of 25 member countries, whose representative organizations create a pan-European supercomputing infrastructure, providing access to computing and data management resources and services for large-scale scientific and engineering applications at the highest performance level.
GCS has its headquarters in Berlin/Germany.
Andrea Mayer-Grenu | idw - Informationsdienst Wissenschaft