New study casts doubt on validity of standard earthquake-prediction model

19.09.2002


A new study by Stanford University geophysicists is raising serious questions about a fundamental technique used to make long-range earthquake predictions.

Writing in the journal Nature, geophysicists Jessica Murray and Paul Segall show how a widely used earthquake model failed to predict when a long-anticipated magnitude 6 quake would strike the San Andreas Fault in Central California.

In their Sept. 19 Nature study, Murray and Segall analyzed the "time-predictable recurrence model" – a technique for estimating the time when an earthquake will occur. This model is used to calculate the probability of future earthquakes.


Developed by Japanese geophysicists K. Shimazaki and T. Nakata in 1980, the time-predictable model has become a standard tool for hazard prediction in many earthquake-prone regions – including the United States, Japan and New Zealand.

Strain build-up

The time-predictable model is based on the theory that earthquakes in fault zones are caused by the constant build-up and release of strain in the Earth’s crust.

"With a plate boundary like the San Andreas, you have the North American plate on one side and the Pacific plate on the other," explained Segall, a professor of geophysics. "The two plates are moving at a very steady rate with respect to one another, so strain is being put into the system at an essentially constant rate."

When an earthquake occurs on the fault, a certain amount of accumulated strain is released, added Murray, a geophysics graduate student.

"Following the quake, strain builds up again because of the continuous grinding of the tectonic plates," she noted. "According to the time-predictable model, if you know the size of the most recent earthquake and the rate of strain accumulation afterwards, you should be able to forecast the time that the next event will happen simply by dividing the strain released by the strain-accumulation rate."
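The forecast Murray describes reduces to a single division. The sketch below illustrates the idea with made-up numbers; the function name and the slip and loading-rate values are illustrative assumptions, not the actual Parkfield figures from the Nature study:

```python
# Minimal sketch of the time-predictable recurrence model.
# The slip and loading-rate numbers are illustrative only, not the
# measured Parkfield values used by Murray and Segall.

def next_event_time(t_last, slip_released_mm, loading_rate_mm_per_yr):
    """Forecast the next earthquake: the last event's date plus the
    years needed for steady plate motion to rebuild the strain
    (slip deficit) that the last event released."""
    return t_last + slip_released_mm / loading_rate_mm_per_yr

# Hypothetical example: a 1966 event released 700 mm of slip, and the
# plates load the fault at 25 mm per year -> 28 years to recharge.
print(next_event_time(1966, 700, 25))  # -> 1994.0
```

The model's weakness, as the Parkfield test shows, is that real faults do not release and reaccumulate strain this regularly.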

Parkfield, Calif.

Although the model makes sense on paper, Murray and Segall wanted to put it to the test using long-term data collected in an ideal setting. Their choice was Parkfield – a tiny town in Central California midway between San Francisco and Los Angeles. Perched along the San Andreas Fault, Parkfield has been rocked by a magnitude 6 earthquake every 22 years on average since 1857. The last one struck in 1966, and geologists have been collecting earthquake data there ever since.

"Parkfield is a good place to test the model because we have measurements of surface ground motion during the 1966 earthquake and of the strain that’s been accumulating since," Murray noted. "It’s also located in a fairly simple part of the San Andreas system because it’s on the main strand of the fault and doesn’t have other parallel faults running nearby."

When Murray and Segall applied the time-predictable model to the Parkfield data, they came up with a forecast of when the next earthquake would occur.

"According to the model, a magnitude 6 earthquake should have taken place between 1973 and 1987 – but it didn’t," Murray said. "In fact, 15 years have gone by. Our results show, with 95 percent confidence, that it should definitely have happened before now, and it hasn’t, so that shows that the model doesn’t work – at least in this location."

Could the time-predictable method work in other parts of the fault, including the densely populated metropolitan areas of Northern and Southern California? The researchers have their doubts.

"We used the model at Parkfield where things are fairly simple," Murray observed, "but when you come to the Bay Area or Los Angeles, there are a lot more fault interactions, so it’s probably even less likely to work in those places."

Segall agreed: "I have to say, in my heart, I believe this model is too simplistic. It’s really not likely to work elsewhere, either, but we still should test it at other sites. Lots of people do these kinds of calculations. What Jessica has done, however, is to be extremely careful. She really bent over backwards to try to understand what the uncertainties of these kinds of calculations are – consulting with our colleagues in the Stanford Statistics Department just to make sure that this was done as carefully and precisely as anybody knows how to do. So we feel quite confident that there’s no way to fudge out of this by saying there are uncertainties in the data or in the method."

Use with caution

Segall pointed out that government agencies in a number of Pacific Rim countries routinely use this technique for long-range hazard assessments.

For example, the U.S. Geological Survey (USGS) relied on the time-predictable model and two other models in its widely publicized 1999 report projecting a 70-percent probability of a large quake striking the San Francisco Bay Area by 2030.

"We’re in a tough situation, because agencies like the USGS – which have the responsibility for issuing forecasts so that city planners and builders can use the best available knowledge – have to do the best they can with what information they have," Segall observed. "The message I would send to my geophysical colleagues about this model is, ’Use with caution.’"

Technological advances in earthquake science could make long-range forecasting a reality one day, added Murray, pointing to the recently launched San Andreas Fault drilling experiment in Parkfield, conducted under the aegis of the USGS and Stanford.

In the meantime, people living in earthquake-prone regions should plan for the inevitable.

"I always tell people to prepare," Segall concluded. "We know big earthquakes have happened in the past, we know they will happen again. We just don’t know when."

Mark Shwartz | EurekAlert!
Further information:
http://kilauea.Stanford.EDU/paul/
http://quake.wr.usgs.gov/research/parkfield/index.html
http://geopubs.wr.usgs.gov/open-file/of99-517/
