“This is exactly what we’ve been projecting to happen, both in short-term fire forecasts for this year and the longer term patterns that can be linked to global climate change,” said Ronald Neilson, a professor at Oregon State University and bioclimatologist with the USDA Forest Service.
“You can’t look at one event such as this and say with certainty that it was caused by a changing climate,” said Neilson, who was also a contributor to publications of the Intergovernmental Panel on Climate Change, which earlier this month was named a co-recipient of the 2007 Nobel Peace Prize.
“But things just like this are consistent with what the latest modeling shows,” Neilson said, “and may be another piece of evidence that climate change is a reality, one with serious effects.”
The latest models, Neilson said, suggest that parts of the United States may be shifting toward longer-term precipitation patterns – less year-to-year variability, and instead several wet years in a row followed by several that are drier than normal.
“As the planet warms, more water is getting evaporated from the oceans and all that water has to come down somewhere as precipitation,” said Neilson. “That can lead, at times, to heavier vegetation loads popping up and creation of a tremendous fuel load. But the warmth and other climatic forces are also going to create periodic droughts. If you get an ignition source during these periods, the fires can just become explosive.”
The problems can be compounded, Neilson said, by El Niño or La Niña events. A La Niña episode that’s currently under way is probably amplifying the Southern California drought, he said. But when rains return for a period of years, the burned vegetation will inevitably re-grow to very dense levels.
“In the future, catastrophic fires such as those going on now in California may simply be a normal part of the landscape,” said Neilson.
Fire forecast models developed by Neilson’s research group at OSU and the Forest Service rely on several global climate models. When combined, they accurately predicted both the Southern California fires that are happening and the drought that has recently hit parts of the Southeast, including Georgia and Florida, causing crippling water shortages.
In studies released five years ago, Neilson and other OSU researchers predicted that the American West could become both warmer and wetter in the coming century, conditions that would lead to repeated, catastrophic fires larger than any in recent history.
At that time, the scientists suggested that periodic increases in precipitation, in combination with higher temperatures and rising carbon dioxide levels, would spur vegetation growth and add even further to existing fuel loads caused by decades of fire suppression.
Droughts or heat waves, the researchers said in 2002, would then lead to levels of wildfire larger than most observed since European settlement. The projections were based on various “general circulation” models that showed both global warming and precipitation increases during the 21st century.
Ronald Neilson | EurekAlert!