Tromp is an example of the new users in today’s uncertain world who require immediate access to supercomputing resources. To meet this need, SDSC has introduced OnDemand, a new supercomputing resource that will support event-driven science.
“This is the first time that an allocated National Science Foundation (NSF) TeraGrid supercomputing resource will support on-demand users for urgent science applications,” said Anke Kamrath, director of User Services at SDSC. “In opening this new computing paradigm we’ve had to develop novel ways of handling this type of allocation as well as scheduling and job handling procedures.”
The system is already in operation, and formal allocations of time for the OnDemand system will begin in October, with proposals due July 13. In addition to supporting important research now, this system will serve as a model for developing on-demand capabilities on additional TeraGrid systems in the future. TeraGrid is an NSF-funded computing grid linking some of the nation’s largest supercomputer centers, including SDSC.
Urgent applications that will make use of OnDemand range from making movies of Southern California earthquakes to giving near real-time warnings based on the predicted paths of tornadoes or hurricanes, or forecasting the most likely direction of a toxic plume released by an industrial accident or terrorist incident.
When an earthquake greater than magnitude 3.5 strikes Southern California, typically once or twice a month, Tromp expects that his simulation code will need to use 144 processors of the OnDemand system for about 28 minutes. Shortly after the earthquake strikes, a job will automatically be submitted and immediately allowed to run. The code will launch, and any “normal” jobs running at the time will be interrupted to make way for the on-demand job.
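The trigger-and-preempt logic described above can be sketched as follows. The magnitude threshold, processor count, and runtime come from the article; the function and field names are invented for illustration and do not reflect SDSC's actual scheduler code.

```python
# Hypothetical sketch of the event-driven submission logic on OnDemand.
MAG_THRESHOLD = 3.5      # minimum magnitude that triggers a simulation
PROCS_NEEDED = 144       # processors the simulation code requests
EXPECTED_MINUTES = 28    # approximate runtime on the OnDemand system

def handle_event(magnitude, running_jobs):
    """Submit an urgent job for a qualifying quake, preempting normal work."""
    if magnitude <= MAG_THRESHOLD:
        return None  # minor event: no simulation needed
    # Preempt the "normal" jobs running at the time to free processors.
    preempted = [j["id"] for j in running_jobs if j["priority"] == "normal"]
    return {
        "processors": PROCS_NEEDED,
        "walltime_min": EXPECTED_MINUTES,
        "priority": "urgent",
        "preempted": preempted,
    }

job = handle_event(4.2, [{"id": "job-17", "priority": "normal"}])
```

In the real system this submission happens automatically through the cluster's scheduler (Sun Grid Engine, per the article) rather than in application code.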
“SDSC’s new OnDemand system is an important step forward for our event-driven earthquake science,” said Tromp. “We’re getting good performance that will let us cut the time to deliver earthquake movies from about 45 to 30 minutes or less, and every minute is important.”
The movies that result from the computations are made available as part of the ShakeMovie project in Caltech's Near Real-Time Simulation of Southern California Seismic Events Portal. But behind the scenes of these dramatic earthquake movies, a great deal of coordinated activity is rapidly taking place in a complex, automated workflow.
The system springs to life every time an earthquake occurs in Southern California. When an event takes place, thousands of seismograms, or ground motion measurements, are recorded at hundreds of stations across the region, and the earthquake’s epicenter, or location, as well as its depth and intensity are determined.
The waiting ShakeMovie system at Caltech collects these seismic recordings automatically over the Internet. Then, for events greater than magnitude 3.5, the scientists use the recorded data to guide a computer model that creates a “virtual earthquake,” filling in the gaps between the actual ground motion recorded at specific stations and giving an overall view of the ground motion throughout the region.
The animations rely on the SPECFEM3D_BASIN software, which simulates seismic wave propagation in sedimentary basins. The software computes the motion of the earth in 3-D based on the actual earthquake recordings and what is known about the subsurface structure of the region, which greatly affects the wave motion: bending, speeding or slowing, and reflecting energy in complex ways.
After the full 3-D wave simulation is run on the OnDemand system at SDSC and a system at Caltech for redundancy, data that captures the surface motion (displacement, velocity, and acceleration) are collected and mapped onto the topography of Southern California, and rendered into movies. The movies are then automatically published via the portal, and an email is sent to subscribers, including the news media and the public.
In between the urgent jobs that use SDSC’s OnDemand resource, other users will run on the system in the normal way. The system has Star-P installed, a parallel computing platform that provides a high-performance backend to desktop packages such as MATLAB.
OnDemand is a Dell cluster with 64 Intel dual-socket, dual-core compute nodes, for a total of 256 processors. The 2.33 GHz, 4-way nodes have 8 GB of memory each. The system, which has a nominal theoretical peak performance of 2.4 Tflops, runs the SDSC-developed Rocks open-source Linux cluster operating software and uses the IBRIX parallel file system. Jobs are scheduled by the Sun Grid Engine.
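The quoted peak figure is straightforward to check from the stated specs. The node count, cores per node, and clock speed come from the article; the figure of 4 floating-point operations per cycle per core is an assumption (typical for Intel cores of that generation), not something the article states.

```python
# Checking the nominal peak: 64 dual-socket, dual-core nodes at 2.33 GHz.
nodes = 64
cores_per_node = 4        # dual-socket x dual-core
clock_ghz = 2.33
flops_per_cycle = 4       # assumption: typical for these Intel cores

total_cores = nodes * cores_per_node                          # 256 processors
peak_tflops = total_cores * clock_ghz * flops_per_cycle / 1000
print(total_cores, round(peak_tflops, 2))                     # 256 2.39
```

This lands at roughly 2.39 Tflops, consistent with the article's "nominal theoretical peak performance of 2.4 Tflops."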
Paul Tooby | EurekAlert!