Supercomputer Unleashes Virtual 9.0 Megaquake in Pacific Northwest

28.02.2008
Simulation May Help Big Cities Develop Early Warning Systems

On January 26, 1700, at about 9 p.m. local time, the Juan de Fuca plate beneath the ocean in the Pacific Northwest suddenly slipped some 60 feet eastward beneath the North American plate in a monster quake of approximately magnitude 9. The rupture set in motion large tsunamis that struck the coast of North America and traveled all the way to the shores of Japan.


Image caption: Virtual 9.0 earthquake shakes the Pacific Northwest. Scientists used a supercomputer-driven “virtual earthquake” to explore likely ground shaking in a magnitude 9.0 megathrust earthquake in the Pacific Northwest. Peak ground velocities are displayed in yellow and red; the legend gives speed in meters per second (m/s), with red equaling 2.3 m/s. Although the largest ground motions occur offshore near the fault and decrease eastward, sedimentary basins lying beneath some cities amplify the shaking in Seattle, Tacoma, Olympia, and Vancouver, increasing the risk of damage. Credit: Kim Olsen, SDSU.

Since then, the earth beneath the region, which includes the cities of Vancouver, Seattle, and Portland, has been relatively quiet. But scientists believe that earthquakes with magnitudes greater than 8, so-called “megathrust events,” occur along this fault on average every 400 to 500 years.

To help prepare for the next megathrust earthquake, a team of researchers led by seismologist Kim Olsen of San Diego State University (SDSU) used a supercomputer-powered “virtual earthquake” program to calculate for the first time realistic three-dimensional simulations that describe the possible impacts of megathrust quakes on the Pacific Northwest region. Also participating in the study were researchers from the San Diego Supercomputer Center at UC San Diego and the U.S. Geological Survey.

What the scientists learned from this simulation, reported in the Journal of Seismology, is not reassuring, particularly for residents of downtown Seattle.

With a rupture scenario beginning in the north and propagating toward the south along the 600-mile-long Cascadia Subduction Zone, the ground moved about 1.5 feet per second in Seattle; nearly 6 inches per second in Tacoma, Olympia, and Vancouver; and about 3 inches per second in Portland, Oregon. Additional simulations, especially of earthquakes that begin in the southern part of the rupture zone, suggest that the ground motion under some conditions can be up to twice as large.
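
For comparison with the meters-per-second legend in the image caption, these reported velocities convert directly. A minimal Python sketch (conversion factors only; the values are taken from the paragraph above):

# Convert the reported peak ground velocities to meters per second.
FT_TO_M = 0.3048   # 1 foot in meters
IN_TO_M = 0.0254   # 1 inch in meters

reported = {
    "Seattle": 1.5 * FT_TO_M,                  # ~1.5 feet per second
    "Tacoma/Olympia/Vancouver": 6 * IN_TO_M,   # ~6 inches per second
    "Portland": 3 * IN_TO_M,                   # ~3 inches per second
}

for city, v in reported.items():
    print(f"{city}: {v:.2f} m/s")
# -> Seattle: 0.46 m/s, Tacoma/Olympia/Vancouver: 0.15 m/s, Portland: 0.08 m/s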

“We also found that these high ground velocities were accompanied by significant low-frequency shaking, like what you feel in a roller coaster, that lasted as long as five minutes – and that’s a long time,” said Olsen.

The long-duration shaking, combined with high ground velocities, raises the possibility that such an earthquake could inflict major damage on metropolitan areas, especially on high-rise buildings in downtown Seattle. Compounding the risk, Seattle, Tacoma, and Olympia, like Los Angeles to the south, sit atop sediment-filled geological basins that can greatly amplify the waves generated by major earthquakes.

“One thing these studies will hopefully do is to raise awareness of the possibility of megathrust earthquakes happening at any given time in the Pacific Northwest,” said Olsen. “Because these events will tend to occur several hundred kilometers from major cities, the study also implies that the region could benefit from an early warning system that can allow time for protective actions before the brunt of the shaking starts.” Depending on how far the earthquake is from a city, early warning systems could give from a few seconds to a few tens of seconds to implement measures, such as automatically stopping trains and elevators.
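
The available warning time follows from the gap between the fast P wave, which can trigger an alert, and the slower, more damaging S wave. A back-of-envelope Python sketch, assuming typical crustal wave speeds of about 6.5 km/s (P) and 3.5 km/s (S); these speeds are illustrative assumptions, not values from the study:

# Rough early-warning budget: the fast but weak P wave triggers the alert,
# while the damaging S wave arrives later.
V_P = 6.5  # km/s, assumed P-wave speed
V_S = 3.5  # km/s, assumed S-wave speed

def warning_time_s(distance_km: float) -> float:
    """Seconds between P-wave detection and S-wave arrival at the city."""
    return distance_km / V_S - distance_km / V_P

for d in (50, 150, 300):  # km from rupture to city
    print(f"{d:3d} km -> ~{warning_time_s(d):.0f} s of warning")
# -> ~7 s at 50 km, ~20 s at 150 km, ~40 s at 300 km:
#    "a few seconds to a few tens of seconds", as the article says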

Added Olsen, “The information from these simulations can also play a role in research into the hazards posed by large tsunamis, which can originate from megathrust earthquakes such as the 2004 Sumatra-Andaman earthquake in Indonesia.” One of the largest earthquakes ever recorded, the magnitude 9.2 Sumatra-Andaman event was felt as far away as Bangladesh, India, and Malaysia, and triggered devastating tsunamis that killed more than 200,000 people.

In addition to increasing scientific understanding of these massive earthquakes, the results of the simulations can be used to guide emergency planners, improve building codes, and help engineers design safer structures, potentially saving lives and property in this region of some 9 million people.

Even with the large supercomputing and data resources at SDSC, creating “virtual earthquakes” is a daunting task. The computations to prepare initial conditions were carried out on SDSC’s DataStar supercomputer, and the resulting information was then transferred for the main simulations to the center’s Blue Gene Data supercomputer via GPFS-WAN, SDSC’s advanced wide-area virtual file system, which makes data seamlessly available on different, sometimes distant, supercomputers.

Coordinating the simulations required a complex choreography of moving information into and out of the supercomputer as Olsen’s sophisticated “Anelastic Wave Model” simulation code was running. Completing just one of several simulations, running on 2,000 supercomputer processors, required some 80,000 processor hours – equal to running one program continuously on your PC for more than 9 years!
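
The PC comparison is simple arithmetic; a quick Python check of the numbers quoted above:

# How 80,000 processor-hours translates to wall-clock and single-PC time.
processor_hours = 80_000
processors = 2_000

wall_clock_hours = processor_hours / processors   # hours per run on 2,000 CPUs
single_pc_years = processor_hours / (24 * 365)    # one processor, running nonstop

print(f"Wall-clock time on 2,000 processors: {wall_clock_hours:.0f} hours")
print(f"Equivalent single-PC runtime: {single_pc_years:.1f} years")
# -> 40 hours of wall-clock time, and roughly 9.1 years on a single machine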

“To solve the new challenges that arise when researchers need to run their codes at the largest scales, and data sets grow to great size, we worked closely with the earthquake scientists through several years of code optimization and modifications,” said SDSC computational scientist Yifeng Cui, who contributed numerous refinements to allow the computer model to “scale up” to capture a magnitude 9 earthquake over such a vast area.

In order to run the simulations, the scientists must recreate in their model the components that encompass all the important aspects of the earthquake. One component is an accurate representation of the earth’s subsurface layering, and how its structure will bend, reflect, and change the size and direction of the traveling earthquake waves. Co-author William Stephenson of the USGS worked with Olsen and Andreas Geisselmeyer, from Ulm University in Germany, to create the first unified “velocity model” of the layering for this entire region, extending from British Columbia to Northern California.
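
Conceptually, a velocity model is a lookup from position in the earth to the material properties that control how seismic waves bend and slow. The Python sketch below is a toy one-dimensional illustration of the idea; the layer boundaries and values are invented placeholders, not numbers from the Cascadia model, which also varies laterally:

# Toy 1-D layered velocity model: depth -> (Vp, Vs, density).
# Real 3-D models capture lateral variation as well, such as the
# sediment-filled basins beneath Seattle that amplify shaking.
LAYERS = [
    # (top_depth_km, Vp_km_s, Vs_km_s, density_g_cm3) -- placeholder values
    (0.0,  2.5, 1.2, 2.1),   # soft sediments (slow; amplify shaking)
    (3.0,  5.5, 3.2, 2.6),   # upper crust
    (20.0, 6.8, 3.9, 2.9),   # lower crust
    (40.0, 8.0, 4.5, 3.3),   # upper mantle
]

def properties_at(depth_km: float):
    """Return (Vp, Vs, density) for the layer containing this depth."""
    for top, vp, vs, rho in reversed(LAYERS):
        if depth_km >= top:
            return vp, vs, rho
    return LAYERS[0][1:]

print(properties_at(1.0))   # -> (2.5, 1.2, 2.1): slow sediments
print(properties_at(25.0))  # -> (6.8, 3.9, 2.9): lower crust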

Another component is a model of the earthquake source from the slipping of the Juan de Fuca plate underneath the North American plate. Making use of the extensive measurements of the massive 2004 Sumatra-Andaman earthquake in Indonesia, the scientists developed a model of the earthquake source for similar megathrust earthquakes in the Pacific Northwest.

The sheer physical size of the region in the study was also challenging. The scientists included in their virtual model an immense slab of the earth more than 650 miles long by 340 miles wide by 30 miles deep, some 6.6 million cubic miles, and used a computer mesh spacing of 250 meters to divide the volume into some 2 billion cubes. This mesh size allows the simulations to model frequencies up to 0.5 Hertz, which especially affect tall buildings.
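
The mesh spacing and the 0.5 Hertz ceiling are tied together by a standard finite-difference rule of thumb: the highest frequency a grid can resolve is roughly the slowest wave speed divided by (points per wavelength × grid spacing). A Python sketch of the arithmetic; the 625 m/s minimum shear-wave speed and 5 points per wavelength are assumed values, chosen here to be consistent with the published 0.5 Hz figure:

# Back-of-envelope check on the mesh size and frequency limit.
MILE_M = 1609.34
dx = 250.0                       # grid spacing in meters (from the article)

nx = 650 * MILE_M / dx           # ~4,184 cells along strike
ny = 340 * MILE_M / dx           # ~2,189 cells across
nz = 30 * MILE_M / dx            # ~193 cells in depth
print(f"Total cells: {nx * ny * nz:.2e}")   # ~1.8e9, i.e. "some 2 billion"

# Resolution rule: f_max = v_min / (points_per_wavelength * dx)
v_min = 625.0                    # m/s, assumed slowest shear-wave speed
points_per_wavelength = 5        # assumed; typical for finite differences
f_max = v_min / (points_per_wavelength * dx)
print(f"Max resolvable frequency: {f_max} Hz")   # -> 0.5 Hz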

“One of the strengths of an earthquake simulation model is that it lets us run scenarios of different earthquakes to explore how they may affect ground motion,” said Olsen. Because the accumulated stresses or “slip deficit” can be released in either one large event or several smaller events, the scientists ran scenarios for earthquakes of different sizes.

“We found that the magnitude 9 scenarios generate peak ground velocities five to 10 times larger than those from the smaller magnitude 8.5 quakes.”
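
Half a magnitude unit is a bigger jump than it sounds, because magnitude is logarithmic in seismic moment. A quick Python check using the standard Hanks-Kanamori relation (general seismology, not specific to this study):

# Seismic moment (N*m) from moment magnitude, Hanks-Kanamori relation:
# log10(M0) = 1.5 * Mw + 9.1
def seismic_moment(mw: float) -> float:
    return 10 ** (1.5 * mw + 9.1)

ratio = seismic_moment(9.0) / seismic_moment(8.5)
print(f"An M9.0 releases ~{ratio:.1f}x the moment of an M8.5")
# -> ~5.6x; the additive constant cancels, so only the 1.5 slope matters,
#    which is broadly consistent with the 5-10x larger peak velocities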

The researchers are planning to conduct additional simulations to explore the range of impacts that depend on where the earthquake starts, the direction of travel of the rupture along the fault, and other factors that can vary.

This research was supported by the National Science Foundation, the U.S. Geological Survey, the Southern California Earthquake Center, and computing time on an NSF supercomputer at SDSC.

Paul Tooby | EurekAlert!
Further information:
http://www.sdsc.edu
