A University of Michigan researcher is developing a unique way to reconcile these crucial data.
"If we're going to adapt to climate change, we need to be able to predict what the climate will be," said Anna Michalak, assistant professor in the Department of Civil and Environmental Engineering and the Department of Atmospheric, Oceanic and Space Sciences. "We want to know how the sources and sinks of carbon will evolve in the future, and the only way we can manage climate change is with scientific information."
Michalak is discussing the work Sunday at the American Association for the Advancement of Science annual meeting here, in a talk titled "Improving Understanding of Carbon Flux Variability Using Atmospheric Inverse Modeling." The talk is part of the session "The Carbon Budget: Can We Reconcile Flux Estimates?" which she co-organized with Joyce Penner, a professor in the Department of Atmospheric, Oceanic and Space Sciences.
For some 50 years, scientists have measured the amount of carbon dioxide in the air, both at an increasing number of monitoring stations sprinkled across the globe and with flux towers that sample very small areas. Together with inventories of fossil fuel use, those measurements give a good account of how much carbon is being pumped into the atmosphere: currently about 8 billion tons a year.
It's also known that about half of that stays in the atmosphere. The rest is absorbed by the oceans and the land, much of it gobbled up by plants during photosynthesis.
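A quick back-of-envelope calculation shows what those round numbers imply for the concentrations the monitoring network actually records. This is an illustrative sketch using the article's figures; the 2.13 GtC-per-ppm conversion factor is a standard approximation, not something from the article.

```python
# Back-of-envelope carbon budget based on the article's round numbers.
emissions = 8.0          # fossil-fuel emissions, billion tons of carbon (GtC) per year
airborne_fraction = 0.5  # roughly half stays in the atmosphere

airborne = emissions * airborne_fraction  # ~4 GtC/yr accumulating in the air
ppm_rise = airborne / 2.13                # 1 ppm of atmospheric CO2 ~ 2.13 GtC

print(f"~{airborne:.0f} GtC/yr stays airborne -> ~{ppm_rise:.1f} ppm CO2 rise per year")
```

That works out to roughly 2 ppm of CO2 per year, broadly consistent with the rise the global monitoring stations observe.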
Beyond those well-constrained totals, though, the data gets harder to come by, and scientists have had to make some assumptions. The flux towers cover only a few places on Earth, and collecting small-area data everywhere is too cumbersome. Even a powerful new tool Michalak will be using, NASA's Orbiting Carbon Observatory (OCO), a satellite designed to monitor atmospheric carbon, does not paint a perfect picture: she compares the thin strips of data it harvests to wrapping a basketball with floss.
The problem, Michalak said, is that the data takes such a big-picture view that it is difficult to isolate carbon being emitted or taken up in specific regions, or even specific countries. Scientists are left with an understanding of carbon sources that isn't detailed enough to capture the variability, or to support confident predictions of the future.
Michalak has developed a robust way to use the available data to understand this variability, called geostatistical inverse modeling. The method breaks the globe into small regions and works out how much CO2 each region must have emitted or absorbed to produce the concentrations measured at the atmospheric sampling points. It also lets her and her collaborators supplement the atmospheric monitoring network with information from existing satellites that observe the Earth's surface. Eventually, the aim is to trace the carbon levels at each sampling point back to particular sources and sinks on the surface.
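To make the idea concrete, here is a minimal toy sketch of the kind of linear Bayesian inversion that underlies such methods. The dimensions, covariances, and random transport matrix H are all illustrative assumptions, not Michalak's actual model; in a real inversion H would come from an atmospheric transport model.

```python
import numpy as np

# Toy linear inversion: recover regional surface fluxes s from atmospheric
# measurements z, given a transport relationship z = H @ s + noise.
# All sizes and numbers here are illustrative assumptions.

rng = np.random.default_rng(0)
n_regions, n_obs = 20, 50

# In practice H encodes how a flux in each region influences each
# measurement; here it is random purely for demonstration.
H = rng.random((n_obs, n_regions))

s_true = rng.normal(0.0, 1.0, n_regions)      # "true" regional fluxes
z = H @ s_true + rng.normal(0.0, 0.1, n_obs)  # noisy concentration data

Q = 1.0 * np.eye(n_regions)   # prior flux covariance (spatially correlated in practice)
R = 0.1**2 * np.eye(n_obs)    # measurement-error covariance

# Posterior mean flux estimate with a zero prior mean:
#   s_hat = Q H^T (H Q H^T + R)^{-1} z
s_hat = Q @ H.T @ np.linalg.solve(H @ Q @ H.T + R, z)

print("RMS error of recovered fluxes:", np.sqrt(np.mean((s_hat - s_true) ** 2)))
```

Roughly speaking, the geostatistical variant replaces the fixed prior with a spatial trend estimated from auxiliary surface data, which is one way the additional satellite observations of the Earth's surface can enter the estimate.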
The technique, Michalak says, is like figuring out where the cream was originally poured in a cup of half-stirred coffee.
"Winds and weather patterns mix CO2 in the atmosphere just like stirring mixes cream in a cup of coffee," she said. "As soon as you start stirring, you lose some information about where and when the cream was originally added to the cup. With careful measurements and models, however, much of this information can be recovered."
"One of our big questions is how carbon sources and sinks evolve," Michalak said. "This is all with an eye on prediction and management."
Sue Nichols | EurekAlert!