Authors of the study, which was funded by the National Science Foundation and published online this week in the journal Science, say that global warming is real and that increases in atmospheric CO2 will have multiple serious impacts.
However, the most extreme projections of temperature increase from a doubling of CO2 are unlikely.
"Many previous climate sensitivity studies have looked at the past only from 1850 through today, and have not fully integrated paleoclimate data, especially on a global scale," said Andreas Schmittner, an Oregon State University researcher and lead author on the Science article. "When you reconstruct sea and land surface temperatures from the peak of the last Ice Age 21,000 years ago – which is referred to as the Last Glacial Maximum – and compare it with climate model simulations of that period, you get a much different picture.
"If these paleoclimatic constraints apply to the future, as predicted by our model, the results imply less probability of extreme climatic change than previously thought," Schmittner added.
Scientists have struggled for years trying to quantify "climate sensitivity" – which is how the Earth will respond to projected increases of atmospheric carbon dioxide. The 2007 IPCC report estimated that the air near the surface of the Earth would warm on average by 2 to 4.5 degrees (Celsius) with a doubling of atmospheric CO2 from pre-industrial levels. The mean, or "expected value," increase in the IPCC estimates was 3.0 degrees; most climate model studies use the doubling of CO2 as a basic index.
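The relationship the article describes – a fixed number of degrees of warming per doubling of CO2 – can be sketched with the standard logarithmic forcing approximation. This is an illustration of the concept, not the study's model; the function name and the pre-industrial value of 280 ppm are assumptions for the example.

```python
import math

def equilibrium_warming(co2_ppm, sensitivity_per_doubling, co2_preindustrial=280.0):
    """Equilibrium warming (deg C) relative to pre-industrial for a given CO2 level.

    CO2 radiative forcing is approximately logarithmic in concentration, so
    warming scales with the number of doublings times the climate sensitivity.
    """
    doublings = math.log2(co2_ppm / co2_preindustrial)
    return sensitivity_per_doubling * doublings

# A full doubling (280 -> 560 ppm) yields exactly the sensitivity value:
print(equilibrium_warming(560, 2.4))  # 2.4 - the study's best estimate
print(equilibrium_warming(560, 3.0))  # 3.0 - the IPCC 2007 mean estimate
```

Because the forcing is logarithmic, each successive doubling contributes the same increment of warming, which is why climate sensitivity is conventionally quoted per doubling rather than per ppm.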
Some previous studies have claimed the impacts could be much more severe – as much as 10 degrees or higher with a doubling of CO2 – although these projections come with an acknowledged low probability. Studies based on data going back only to 1850 are affected by large uncertainties in the effects of dust and other small particles in the air that reflect sunlight and can influence clouds, known as "aerosol forcing," or by the absorption of heat by the oceans, the researchers say.
To lower the degree of uncertainty, Schmittner and his colleagues used a climate model with more data and found that there are constraints that preclude very high levels of climate sensitivity.
The researchers compiled land and ocean surface temperature reconstructions from the Last Glacial Maximum and created a global map of those temperatures. During this time, atmospheric CO2 was about a third less than before the Industrial Revolution, and levels of methane and nitrous oxide were much lower. Because much of the northern latitudes were covered in ice and snow, sea levels were lower, the climate was drier (less precipitation), and there was more dust in the air.
All these factors, which contributed to cooling the Earth's surface, were included in their climate model simulations.
The new data changed the assessment of climate models in many ways, said Schmittner, an associate professor in OSU's College of Earth, Ocean, and Atmospheric Sciences. The researchers' reconstruction of temperatures has greater spatial coverage and showed less cooling during the Ice Age than most previous studies.
High-sensitivity climate models – those implying more than 6 degrees of warming per doubling – suggest that the low levels of atmospheric CO2 during the Last Glacial Maximum would have produced a "runaway effect" that left the Earth completely ice-covered.
"Clearly, that didn't happen," Schmittner said. "Though the Earth then was covered by much more ice and snow than it is today, the ice sheets didn't extend beyond latitudes of about 40 degrees, and the tropics and subtropics were largely ice-free – except at high altitudes. These high-sensitivity models overestimate cooling."
On the other hand, models with low climate sensitivity – less than 1.3 degrees – underestimate the cooling almost everywhere at the Last Glacial Maximum, the researchers say. The closest match, with a much lower degree of uncertainty than most other studies, suggests climate sensitivity is about 2.4 degrees.
However, uncertainty levels may be underestimated because the model simulations did not take into account uncertainties arising from how cloud changes reflect sunlight, Schmittner said.
Reconstructing sea and land surface temperatures from 21,000 years ago is a complex task involving the examination of ice cores, boreholes, fossils of marine and terrestrial organisms, seafloor sediments and other records. Sediment cores, for example, contain different biological assemblages found in different temperature regimes and can be used to infer past temperatures based on analogs in modern ocean conditions.
"When we first looked at the paleoclimatic data, I was struck by the small cooling of the ocean," Schmittner said. "On average, the ocean was only about two degrees (Celsius) cooler than it is today, yet the planet was completely different – huge ice sheets over North America and northern Europe, more sea ice and snow, different vegetation, lower sea levels and more dust in the air.
"It shows that even very small changes in the ocean's surface temperature can have an enormous impact elsewhere, particularly over land areas at mid- to high-latitudes," he added.
Schmittner said continued unabated fossil fuel use could lead to sea-surface warming similar to what the reconstruction shows occurred between the Last Glacial Maximum and today.
"Hence, drastic changes over land can be expected," he said. "However, our study implies that we still have time to prevent that from happening, if we make a concerted effort to change course soon."
Other authors on the study include Peter Clark and Alan Mix of OSU; Nathan Urban, Princeton University; Jeremy Shakun, Harvard University; Natalie Mahowald, Cornell University; Patrick Bartlein, University of Oregon; and Antoni Rosell-Mele, University of Barcelona.
Andreas Schmittner | EurekAlert!