“We want to know what will happen in the future, especially if the climate will change abruptly,” says Zhengyu Liu, a UW-Madison professor of atmospheric and oceanic sciences and director of the Center for Climatic Research in the Nelson Institute for Environmental Studies. “The problem is, you don’t know if your model is right for this kind of change. The important thing is validating your model.”
Starting with the last glacial maximum about 21,000 years ago, Liu’s team simulated atmospheric and oceanic conditions through what scientists call the Bølling-Allerød warming, the Earth’s last major temperature hike, which occurred about 14,500 years ago. The simulation fell in close agreement with conditions — temperatures, sea levels and glacial coverage — collected from fossil and geologic records.
“It’s our most serious attempt to simulate this last major global warming event, and it’s a validation of the model itself, as well,” Liu says.
The results of the new climate modeling experiments are presented today (July 17) in the journal Science.
The group’s simulations were executed on “Phoenix” and “Jaguar,” a pair of Cray supercomputers at Oak Ridge National Laboratory in Oak Ridge, Tenn., and helped pin down the contributions of three environmental factors as drivers of the Bølling-Allerød warming: an increase in atmospheric carbon dioxide, the jump-start of stalled heat-moving ocean currents and a large buildup of subsurface heat in the ocean while those currents were dormant.
The climate dominoes began to fall during that period after glaciers reached their maximum coverage, blanketing most of North America, Liu explains. As glaciers melted, massive quantities of water poured into the North Atlantic, lowering the ocean salinity that helps power a major convection current that acts like a conveyor belt to carry warm tropical surface water north and cooler, heavier subsurface water south.
As a result, according to the model, ocean circulation stopped. Without warm tropical water streaming north, the North Atlantic cooled and heat backed up in southern waters. Eventually, glacial melt slowed or stopped as well, salinity recovered, and the overturning current restarted, now with a much larger reserve of heat to haul north.
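The stall-and-release mechanism described above can be illustrated with a toy two-box energy budget, far simpler than the coupled model the team actually ran. Everything here (the box structure, the `heating` and `transport` parameters, the 50-step stall) is an illustrative assumption, not taken from the study.

```python
# Toy two-box sketch: while the overturning circulation is stalled,
# heat accumulates in the southern/subsurface box; once the current
# restarts, the stored surplus is exported north in a burst.
# Parameter values are illustrative only.

def run_toy_overturning(steps, circulation_on, heating=1.0, transport=0.5):
    """Return (south_hist, north_hist) for a simple two-box model."""
    south, north = 0.0, 0.0
    south_hist, north_hist = [], []
    for t in range(steps):
        south += heating                # tropics absorb heat each step
        if circulation_on(t):
            moved = transport * south   # conveyor carries heat north
            south -= moved
            north += moved
        south_hist.append(south)
        north_hist.append(north)
    return south_hist, north_hist

# Circulation stalled for the first 50 steps, then restarts.
south, north = run_toy_overturning(100, lambda t: t >= 50)
```

During the stall the southern box warms steadily while the north receives nothing; at the restart, half the accumulated reserve moves north in a single step, a crude analogue of the decades-long "volcano" of stored heat Liu describes.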
“All that stored heat is released like a volcano, and poured out over decades,” Liu explains. “That warmed up Greenland and melted (Arctic) sea ice.”
The model showed a 15-degree Celsius increase in average temperatures in Greenland and a 5-meter increase in sea level over just a few centuries, findings that squared neatly with the climate of the period as represented in the physical record.
“Being able to successfully simulate thousands of years of past climate for the first time with a comprehensive climate model is a major scientific achievement,” notes Bette Otto-Bliesner, an atmospheric scientist and climate modeler at the National Center for Atmospheric Research (NCAR) and co-author of the Science report. “This is an important step toward better understanding how the world’s climate could change abruptly over the coming centuries with increasing melting of the ice caps.”
The rate of ice melt during the Bølling-Allerød warming is still at issue, but its consequences are not, Liu says. The modelers simulated both a slow decrease in melt and a sudden end to melt run-off. In both cases, the result was a 15-degree warming.
“That happened in the past,” Liu says. “The question is, in the future, if you have a global warming and Greenland melts, will it happen again?”
Time — both actual and computing — will tell. In 2008, the group simulated about one-third of the last 21,000 years. With another 4 million processor hours to go, the simulations being conducted by the Wisconsin group will eventually run up to the present and 200 years into the future.
Traditional climate modeling approaches were limited by computer time and capabilities, Liu explains.
“They did slides, like snapshots,” Liu says. “You simulate 100 years, and then you run another 100 years, but those centuries may be 2,000 years apart (in the model). To look at abrupt change, there is no shortcut.”
Using the interactions between land, water, atmosphere and ice in the Community Climate System Model developed at NCAR, the researchers have been able to create a much more detailed and closely spaced book of snapshots, “giving us more of a motion picture of the climate” over millennia, Liu said.
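The difference between the older "snapshot" strategy Liu describes and a continuous run can be sketched in a few lines. The window and spacing values below follow the numbers in Liu's quote (100-year runs, roughly 2,000 years apart); the exact scheduling is an assumption for illustration.

```python
# Sketch contrasting the two sampling strategies: "snapshot" runs
# cover isolated centuries with long gaps, while a continuous run
# covers every year, so abrupt transitions anywhere in the interval
# are captured. Numbers are illustrative, not from the study.

def snapshot_years(start, end, window=100, spacing=2000):
    """Years covered by 100-year snapshot runs spaced 2,000 years apart."""
    covered = []
    run_start = start
    while run_start < end:
        covered.extend(range(run_start, min(run_start + window, end)))
        run_start += spacing
    return covered

def continuous_years(start, end):
    """Years covered by one uninterrupted run (no gaps)."""
    return list(range(start, end))

# Simulating from 21,000 years before present up to the present:
snap = snapshot_years(-21000, 0)
full = continuous_years(-21000, 0)
```

The snapshot schedule touches only about 5 percent of the interval; everything between the windows, including any abrupt shift, is invisible to it, which is the shortcut Liu says does not exist.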
He stressed the importance of drawing together specialists in computing, oceanography, atmospheric science and glaciology, including John Kutzbach, a UW-Madison climate modeler, and UW-Madison doctoral student Feng He, who was responsible for modeling the glacial melt. All were key to attaining the detail necessary to recreate historical climate conditions, Liu says.
“All this data, it’s from chemical proxies and bugs in the sediment,” Liu said. “You really need a very interdisciplinary team: people on deep ocean, people on geology, people who know those bugs. It is a huge — and very successful — collaboration.”
The new study was funded by the U.S. National Science Foundation, with additional support from the U.S. Department of Energy.
Chris Barncard | Newswise Science News