Research by environmental scientists at the Harvard School of Engineering and Applied Sciences (SEAS) brings bad news to the western United States, where firefighters are currently battling dozens of fires in at least 11 states.
Undergrowth burns at a campground. (Photo courtesy of the National Park Service, 2007.)
The Harvard team’s study suggests wildfire seasons by 2050 will be about three weeks longer, up to twice as smoky, and will burn a wider area in the western states. The findings are based on a set of internationally recognized climate scenarios, decades of historical meteorological data, and records of past fire activity.
The results will be published in the October 2013 issue of Atmospheric Environment and are available in advance online.
Awareness is building that gradual climate change may contribute in the coming years to increases in significant, disruptive events like severe storms and floods. Loretta J. Mickley, a senior research fellow in atmospheric chemistry at Harvard SEAS and coauthor of the new study, is thinking one step further, to secondary effects like forest fires and air quality that rely heavily on meteorological factors.
“We weren’t altogether certain what we would find when we started this project,” Mickley says. “In the future atmosphere we expect warmer temperatures, which are conducive to fires, but it’s not apparent what the rainfall or relative humidity will do. Warmer air can hold more water vapor, for instance, but what does this mean for fires?”
“It turns out that, for the western United States, the biggest driver for fires in the future is temperature, and that result appears robust across models,” Mickley adds. “When you get a large temperature increase over time, as we are seeing, and little change in rainfall, fires will increase in size.”
Reaching that conclusion with statistical confidence required months of analysis, because at the local level, wildfires are very difficult to predict.
“Wildfires are triggered by one set of influences—mainly human activity and lightning—but they grow and spread according to a completely different range of influences that are heavily dependent on the weather,” says lead author Xu Yue. “Of course, when all the factors come together just right—whoosh, there’s a big fire.”
By examining records of past weather conditions and wildfires, the team found that the main factors influencing the spread of fires vary from region to region. In the Rocky Mountain Forest, for example, the best predictor of wildfire area in a given year is the amount of moisture in the forest floor, which depends on the temperature, rainfall, and relative humidity that season. In the Great Basin region, different factors apply. There, the area burned is influenced by the relative humidity in the previous year, which promotes fuel growth. Yue, who was a postdoctoral fellow at Harvard SEAS and is now at Yale University, created mathematical models that closely link these types of variables—seasonal temperatures, relative humidity, the amount of dry fuel and so forth—with the observed wildfire outcomes for six "ecoregions" in the West.
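The paper does not spell out the exact form of these regressions, but the idea of linking seasonal meteorology to burned area can be sketched as a simple least-squares fit. All numbers below are illustrative assumptions (constructed so the fit is exact), not the study's data or its actual model:

```python
import numpy as np

# Hypothetical seasonal predictors for one ecoregion:
# mean temperature (deg C), total precipitation (mm), relative humidity (%).
X = np.array([
    [18.2, 120.0, 35.0],
    [19.1,  95.0, 30.0],
    [17.5, 140.0, 40.0],
    [20.3,  80.0, 28.0],
    [18.8, 110.0, 33.0],
])
# Illustrative burned area (km^2), constructed as an exact linear function
# of the predictors so the example is deterministic.
area_burned = np.array([184.0, 237.0, 140.0, 280.0, 210.0])

# Fit a linear model: area = b0 + b1*T + b2*P + b3*RH
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, area_burned, rcond=None)

# Predict burned area for a hypothetical future season (T=21, P=100, RH=30)
future = np.array([1.0, 21.0, 100.0, 30.0])
print(round(float(future @ coeffs), 1))  # → 270.0
```

In the real study each of the six ecoregions would get its own set of predictors (for example, previous-year humidity in the Great Basin), reflecting the region-to-region differences the team found.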
After developing those models, the team replaced the historical observations with data based on the conclusions of the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC), which uses socioeconomic scenarios to project possible future atmospheric and climatological conditions. For this study, the Harvard group followed the A1B scenario, which considers the climatological effect of a fast-growing global economy relying on a mixture of fossil fuels and renewable energy sources. By running the IPCC's climate data for the year 2050 through their own fire prediction models, the Harvard team was able to calculate the area burned for each ecoregion at midcentury. The calculations point to wildfire seasons in the western United States that, compared with present-day conditions, run about three weeks longer, produce up to twice as much smoke, and burn a wider area by 2050.
Air quality is also projected to suffer as a result of these larger, longer-lasting wildfires. Smoke from wildfires is composed of organic and black carbon particles and can impede visibility and cause respiratory problems. The U.S. Forest Service keeps a record of the amount of fuel (biomass) available across the entire United States, and another set of databases known as the Landscape Fire and Resource Management Planning Tools tracks specific types of vegetation for each square kilometer of land. Based on this information and known emission factors for combustion, the researchers predict that smoke will increase 20–100% by the 2050s, depending on the region and the type of particle.
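A bottom-up smoke estimate of this kind typically multiplies burned area by available fuel, the fraction of fuel consumed, and a per-species emission factor. The sketch below uses that standard structure with made-up numbers; none of the values are from the study or the Forest Service databases:

```python
def smoke_emissions_kg(burned_area_km2, fuel_load_kg_per_m2,
                       combustion_fraction, emission_factor_g_per_kg):
    """Emitted particle mass (kg) = area * fuel load * fraction burned * EF."""
    area_m2 = burned_area_km2 * 1e6
    fuel_burned_kg = area_m2 * fuel_load_kg_per_m2 * combustion_fraction
    return fuel_burned_kg * emission_factor_g_per_kg / 1000.0

# Present day vs. a hypothetical scenario with 50% more area burned,
# holding fuel load and emission factor fixed:
present = smoke_emissions_kg(1000.0, 2.0, 0.4, 10.0)
future = smoke_emissions_kg(1500.0, 2.0, 0.4, 10.0)
print(present, future)  # smoke rises in proportion to area burned
```

With area as the only change, emissions scale linearly, which is why projected increases in burned area translate directly into the 20–100% smoke increases the researchers report (the range reflecting region and particle type).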
The main innovation of the new study is its reliance on an ensemble of climate models, rather than just one or two. One of the greatest uncertainties in the science of climate change is the sensitivity of surface temperatures to rising levels of greenhouse gases.
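The value of an ensemble is that the mean projection can be reported together with the across-model spread, which exposes how sensitive the result is to the choice of climate model. A minimal sketch, with entirely hypothetical model outputs:

```python
import statistics

# Hypothetical 2050 burned-area increases (%) from several climate models
# for one ecoregion. Values are illustrative only.
model_predictions = [65.0, 80.0, 72.0, 90.0, 58.0]

ensemble_mean = statistics.mean(model_predictions)
ensemble_spread = statistics.stdev(model_predictions)

print(f"projected increase: {ensemble_mean:.1f}% ± {ensemble_spread:.1f}%")
```

A result that stays positive across every model in the ensemble, as the temperature-driven fire increase did here, is far more trustworthy than the output of any single model.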
“Our use of a multi-model ensemble increases confidence in our results,” says principal investigator Jennifer A. Logan, a recently retired Senior Research Fellow at Harvard SEAS.
The fire prediction model developed by the team performed least well in central and southern California, where the rugged topography results in a patchwork of ecoregions, each with a different fire response to changing meteorology. The authors have been investigating the unusual factors at play in that state and expect to release their findings shortly.
For Harvard's atmospheric scientists, the ultimate goal of this project was to see how air quality could be affected by climate change, given that smoke from wildfires is a major source of particulate matter in the atmosphere.
Air quality has vastly improved over much of the United States in the past 40 years, as a result of government efforts to regulate emissions. Mickley warns that increasing wildfires may erase some of the progress.
“I think what people need to realize is that embedded in those curves showing the tiny temperature increases year after year are more extreme events that can be quite serious,” she says. “It doesn't bode well.”
Mickley, Logan, and Yue collaborated on this research with coauthor Jed O. Kaplan, a professor at École Polytechnique Fédérale de Lausanne in Switzerland. The project was supported by grants from the U.S. Environmental Protection Agency (R834282), the NASA Air Quality Applied Science Team (NNX11AI4OG), and the National Institutes of Health (1R21ES021427, 5R21ES020194). The researchers are grateful for their access to numerous climate models, including the WCRP CMIP3 dataset, the creation of which was supported by the Office of Science, U.S. Department of Energy.
Caroline Perry | EurekAlert!