"This is the most realistic model to date," said Kaj Johnson, assistant professor of geological sciences at Indiana University, who worked on the modeling project several years ago when he was a Stanford graduate student. "This is something people have been asking for, for years now. It's the next step."
Johnson and Stanford geophysics Professor Paul Segall will present their new probability model at 11:35 a.m. PT on Dec. 14, at the annual meeting of the American Geophysical Union in San Francisco during a talk titled "Distribution of Slip on San Francisco Bay Area Faults" in Room 307, Moscone Center South.
An important component of earthquake-probability assessment is determining how fast a fault moves. One technique involves the use of GPS, which allows seismologists to measure the movement of various points on the surface of the Earth, then use these data to extrapolate underground fault movement. Another way to determine fault slip rates is to dig a trench across the fault and find the signatures of past earthquakes, a method called paleoseismology.
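The GPS approach can be illustrated with the classic elastic screw-dislocation model of interseismic deformation (Savage and Burford, 1973), in which surface velocity across a locked strike-slip fault follows v(x) = (s/π)·arctan(x/D), where s is the deep slip rate and D the locking depth. This is a minimal textbook sketch, not the authors' new model; the station spacing, noise level, and parameter values below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def arctan_velocity(x, s, D):
    """Savage-Burford interseismic velocity (mm/yr) at distance x (km)
    from a strike-slip fault slipping at rate s below locking depth D."""
    return (s / np.pi) * np.arctan(x / D)

# Synthetic "GPS" velocities: true slip rate 24 mm/yr, locking depth 15 km
rng = np.random.default_rng(0)
x_obs = np.linspace(-80.0, 80.0, 25)              # station distances (km)
v_obs = arctan_velocity(x_obs, 24.0, 15.0)
v_obs += rng.normal(0.0, 0.5, x_obs.size)         # 0.5 mm/yr measurement noise

# Invert the surface velocities for slip rate and locking depth
(s_fit, D_fit), _ = curve_fit(arctan_velocity, x_obs, v_obs, p0=[20.0, 10.0])
print(f"estimated slip rate: {s_fit:.1f} mm/yr, locking depth: {D_fit:.1f} km")
```

The inversion step, fitting surface velocities to recover an underground slip rate, is the extrapolation the article describes; real studies use many faults and full 3-D geometry, but the logic is the same.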
"People say, let's compare rates of fault movement from GPS to rates of fault movement from geologic studies," Segall said. "But it's as if you're measuring different parts of the same thing with different tools. The discrepancy can be quite big."
To bridge the gap, Segall and Johnson created a new model that weaves together everything known about how a fault moves. The idea for the model came when Segall was asked to speak at a conference on the "rate debate," which is how geophysicists refer to the GPS-paleoseismology discrepancy. That's when he realized that the standard model doesn't take into account that fault-slippage rates vary over time.
This time dependence is important, because GPS doesn't measure fault slippage directly. Rather, it measures how quickly points on the surface of the Earth are moving. Then scientists try to fit these data into mathematical models to estimate the rate of slip. "Because of the time-dependent rate, your estimate depends on where you are in the earthquake cycle," Segall said. "So if you use a model that doesn't take that into account, you will get a slip rate that's different."
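Segall's point can be demonstrated with a toy calculation: if the deep slip rate decays through the earthquake cycle (as viscoelastic models predict), then fitting the steady-state profile to velocities observed early versus late in the cycle returns different apparent slip rates, even though the long-term geologic rate is fixed. The exponential decay, recurrence interval, and relaxation time below are illustrative assumptions, not the authors' model:

```python
import numpy as np

D = 15.0               # locking depth (km), held fixed in the inversion
s_mean = 24.0          # long-term (geologic) slip rate, mm/yr
T, tau = 200.0, 60.0   # recurrence interval and relaxation time (yr)

def steady_profile(x):
    """Unit-rate Savage-Burford profile (slip rate factored out)."""
    return np.arctan(x / D) / np.pi

def time_dependent_velocity(x, t):
    """Toy cycle model: the deep slip rate decays exponentially after each
    earthquake, normalized so its average over the cycle equals s_mean."""
    norm = (tau / T) * (1.0 - np.exp(-T / tau))
    rate = s_mean * np.exp(-t / tau) / norm
    return rate * steady_profile(x)

x = np.linspace(-80.0, 80.0, 25)
g = steady_profile(x)
results = {}
for t in (10.0, 180.0):                     # early vs late in the cycle
    v = time_dependent_velocity(x, t)
    # Least-squares fit of the steady model: apparent rate = <v,g>/<g,g>
    results[t] = np.dot(v, g) / np.dot(g, g)
    print(f"t = {t:5.1f} yr after quake: apparent rate {results[t]:.1f} mm/yr")
```

Early in the cycle the steady model overestimates the long-term rate, late in the cycle it underestimates it, which is exactly why an estimate "depends on where you are in the earthquake cycle."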
The scientists hope that their updated model can give a more accurate picture of slip rates and reconcile the two kinds of fault data.
California and Asia
With the new model, the team confirmed that the slip rates from GPS and from the geological record for the San Francisco Bay Area are relatively consistent. "Along the San Andreas system, the numbers tend to come out in reasonable agreement," Segall said.
The next step for the scientists is to use their time-dependent model to scrutinize faults in other tectonically active regions, such as China, where there is a large disparity between contemporary GPS data and the paleoseismological record. "We want to take the same philosophy and procedure and apply it to different places in the world where the discrepancy can be quite big," Segall noted. "We're developing a strategy for how to move forward. We're still very much in the progress phase."
Johnson is working on applying the new model to faults in Taiwan and Tibet, where the earthquake hazard is great. "This can help inform people who make the forecasts," Johnson said. "These new time-dependent models are going to become the norm, I think."
Mark Shwartz | EurekAlert!