A University of Michigan researcher is developing a unique way to reconcile these crucial data.
"If we're going to adapt to climate change, we need to be able to predict what the climate will be," said Anna Michalak, assistant professor in the Department of Civil and Environmental Engineering and the Department of Atmospheric, Oceanic and Space Sciences. "We want to know how the sources and sinks of carbon will evolve in the future, and the only way we can manage climate change is with scientific information."
Michalak is discussing the work at the symposium "Improving Understanding of Carbon Flux Variability Using Atmospheric Inverse Modeling" Sunday at the American Association for the Advancement of Science annual meeting here. She co-organized the session, "The Carbon Budget: Can We Reconcile Flux Estimates?" with Joyce Penner, a professor in the Department of Atmospheric, Oceanic and Space Sciences.
For some 50 years, scientists have measured the amount of carbon dioxide in the air on a large scale, at an increasing number of locations sprinkled across the globe, and by sampling very small areas. Together with inventories of fossil fuel use, that's given good data about how much carbon is being pumped into the atmosphere---currently approximately 8 billion tons a year.
It's also known that roughly half of that stays in the atmosphere. The rest is absorbed by the oceans and the land, or gobbled up by plants during photosynthesis.
But beyond that point, the data get harder to come by, and scientists have had to make assumptions. Flux towers, which measure carbon exchange directly, cover only a few spots on Earth, and collecting such data over small areas everywhere is too cumbersome. Even a powerful new tool Michalak will be using, NASA's Orbiting Carbon Observatory (OCO), a satellite designed to monitor atmospheric carbon, does not paint a perfect picture. She compares the thin strips of data it harvests to wrapping a basketball with floss.
The problem, Michalak said, is that the data are so coarse that it is difficult to isolate carbon being emitted or taken up in specific regions, or even countries. Scientists are left with an understanding of carbon sources that is not detailed enough to capture the variability, or to support confident predictions of the future.
Michalak has developed a robust method, called "geostatistical inverse modeling," that uses available data to understand this variability. It breaks the globe into small regions and asks how much CO2 each region must have emitted or absorbed to produce the concentrations measured at atmospheric sample points. The approach also lets her and her collaborators supplement the atmospheric monitoring network with information from existing satellites that observe the Earth's surface. Eventually, the method aims to trace the carbon levels at each sample point back to a particular source or sink on the surface.
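The core of any such inversion can be sketched in a few lines. The toy example below (all numbers and matrix sizes are hypothetical, and simple ridge regularization stands in for the geostatistical prior of the real method) shows the basic idea: observed concentrations relate linearly to unknown regional fluxes through a transport matrix, and the fluxes are recovered by regularized least squares.

```python
import numpy as np

# Toy atmospheric inverse problem (illustrative only, not Michalak's model).
# Observations y relate to unknown regional surface fluxes s via a
# transport matrix H:  y = H @ s + noise.  The inversion recovers s.

rng = np.random.default_rng(0)

n_regions = 6   # surface regions with unknown fluxes
n_obs = 12      # atmospheric sample points

# Hypothetical transport matrix: how strongly each region's flux
# influences each observation site (in practice this comes from an
# atmospheric transport simulation, not random numbers).
H = rng.uniform(0.0, 1.0, size=(n_obs, n_regions))

s_true = np.array([2.0, -1.0, 0.5, 3.0, -0.5, 1.0])  # "true" fluxes
y = H @ s_true + rng.normal(0.0, 0.05, size=n_obs)   # noisy observations

# Regularized least-squares inversion (Tikhonov/ridge), a simple
# stand-in for the geostatistical prior used in the real method:
lam = 1e-3
s_hat = np.linalg.solve(H.T @ H + lam * np.eye(n_regions), H.T @ y)

print(np.round(s_hat, 2))  # estimates close to s_true
```

With more observations than unknowns and modest noise, the recovered fluxes land close to the true ones; the real scientific difficulty lies in building an accurate transport matrix and a defensible prior, not in the algebra itself.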
The technique, Michalak says, is like figuring out where the cream was originally poured in a cup of half-stirred coffee.
"Winds and weather patterns mix CO2 in the atmosphere just like stirring mixes cream in a cup of coffee," she said. "As soon as you start stirring, you lose some information about where and when the cream was originally added to the cup. With careful measurements and models, however, much of this information can be recovered."
"One of our big questions is how carbon sources and sinks evolve," Michalak said. "This is all with an eye on prediction and management."
Sue Nichols | EurekAlert!