New study casts doubt on validity of standard earthquake-prediction model

19.09.2002


A new study by Stanford University geophysicists is raising serious questions about a fundamental technique used to make long-range earthquake predictions.



Writing in the journal Nature, geophysicists Jessica Murray and Paul Segall show how a widely used earthquake model failed to predict when a long-anticipated magnitude 6 quake would strike the San Andreas Fault in Central California.

In their Sept. 19 Nature study, Murray and Segall analyzed the "time-predictable recurrence model" – a technique for estimating the time when an earthquake will occur. This model is used to calculate the probability of future earthquakes.


Developed by Japanese geophysicists K. Shimazaki and T. Nakata in 1980, the time-predictable model has become a standard tool for hazard prediction in many earthquake-prone regions – including the United States, Japan and New Zealand.

Strain build-up

The time-predictable model is based on the theory that earthquakes in fault zones are caused by the constant build-up and release of strain in the Earth’s crust.

"With a plate boundary like the San Andreas, you have the North American plate on one side and the Pacific plate on the other," explained Segall, a professor of geophysics. "The two plates are moving at a very steady rate with respect to one another, so strain is being put into the system at an essentially constant rate."

When an earthquake occurs on the fault, a certain amount of accumulated strain is released, added Murray, a geophysics graduate student.

"Following the quake, strain builds up again because of the continuous grinding of the tectonic plates," she noted. "According to the time-predictable model, if you know the size of the most recent earthquake and the rate of strain accumulation afterwards, you should be able to forecast the time that the next event will happen simply by dividing the strain released by the strain-accumulation rate."

Parkfield, Calif.

Although the model makes sense on paper, Murray and Segall wanted to put it to the test using long-term data collected in an ideal setting. Their choice was Parkfield – a tiny town in Central California midway between San Francisco and Los Angeles. Perched along the San Andreas Fault, Parkfield has been rocked by a magnitude 6 earthquake every 22 years on average since 1857. The last one struck in 1966, and geologists have been collecting earthquake data there ever since.
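The 22-year figure is simply the mean spacing of the historical Parkfield shocks. The snippet below reproduces that average using the commonly reported event years; the article itself cites only the 1857 starting point and the 1966 event, so the intermediate dates here should be treated as an assumption.

```python
# Commonly reported Parkfield M~6 event years (the intermediate dates are an
# assumption; the article itself mentions only 1857 and 1966).
event_years = [1857, 1881, 1901, 1922, 1934, 1966]
intervals = [b - a for a, b in zip(event_years, event_years[1:])]
print(sum(intervals) / len(intervals))  # 21.8 years, i.e. roughly 22
```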

"Parkfield is a good place to test the model because we have measurements of surface ground motion during the 1966 earthquake and of the strain that’s been accumulating since," Murray noted. "It’s also located in a fairly simple part of the San Andreas system because it’s on the main strand of the fault and doesn’t have other parallel faults running nearby."

When Murray and Segall applied the time-predictable model to the Parkfield data, they came up with a forecast of when the next earthquake would occur.

"According to the model, a magnitude 6 earthquake should have taken place between 1973 and 1987 – but it didn’t," Murray said. "In fact, 15 years have gone by. Our results show, with 95 percent confidence, that it should definitely have happened before now, and it hasn’t, so that shows that the model doesn’t work – at least in this location."

Could the time-predictable method work in other parts of the fault, including the densely populated metropolitan areas of Northern and Southern California? The researchers have their doubts.

"We used the model at Parkfield where things are fairly simple," Murray observed, "but when you come to the Bay Area or Los Angeles, there are a lot more fault interactions, so it’s probably even less likely to work in those places."

Segall agreed: "I have to say, in my heart, I believe this model is too simplistic. It’s really not likely to work elsewhere, either, but we still should test it at other sites. Lots of people do these kinds of calculations. What Jessica has done, however, is to be extremely careful. She really bent over backwards to try to understand what the uncertainties of these kinds of calculations are – consulting with our colleagues in the Stanford Statistics Department just to make sure that this was done as carefully and precisely as anybody knows how to do. So we feel quite confident that there’s no way to fudge out of this by saying there are uncertainties in the data or in the method."

Use with caution

Segall pointed out that government agencies in a number of Pacific Rim countries routinely use this technique for long-range hazard assessments.

For example, the U.S. Geological Survey (USGS) relied on the time-predictable model and two other models in its widely publicized 1999 report projecting a 70-percent probability of a large quake striking the San Francisco Bay Area by 2030.

"We’re in a tough situation, because agencies like the USGS – which have the responsibility for issuing forecasts so that city planners and builders can use the best available knowledge – have to do the best they can with what information they have." Segall observed. "The message I would send to my geophysical colleagues about this model is, ’Use with caution.’"

Technological advances in earthquake science could make long-range forecasting a reality one day, added Murray, pointing to the recently launched San Andreas Fault drilling experiment in Parkfield under the aegis of USGS and Stanford.

In the meantime, people living in earthquake-prone regions should plan for the inevitable.

"I always tell people to prepare," Segall concluded. "We know big earthquakes have happened in the past, we know they will happen again. We just don’t know when."

Mark Shwartz | EurekAlert!
Further information:
http://kilauea.Stanford.EDU/paul/
http://quake.wr.usgs.gov/research/parkfield/index.html
http://geopubs.wr.usgs.gov/open-file/of99-517/
