A University of Michigan researcher is developing a unique way to reconcile these crucial carbon flux estimates.
"If we're going to adapt to climate change, we need to be able to predict what the climate will be," said Anna Michalak, assistant professor in the Department of Civil and Environmental Engineering and the Department of Atmospheric, Oceanic and Space Sciences. "We want to know how the sources and sinks of carbon will evolve in the future, and the only way we can manage climate change is with scientific information."
Michalak is presenting the work in the talk "Improving Understanding of Carbon Flux Variability Using Atmospheric Inverse Modeling" on Sunday at the American Association for the Advancement of Science annual meeting. She co-organized the session "The Carbon Budget: Can We Reconcile Flux Estimates?" with Joyce Penner, a professor in the Department of Atmospheric, Oceanic and Space Sciences.
For some 50 years, scientists have measured the amount of carbon dioxide in the air, both at a growing number of monitoring stations sprinkled across the globe and over very small areas using instruments mounted on flux towers. Together with inventories of fossil fuel use, those measurements give a good picture of how much carbon is being pumped into the atmosphere: currently about 8 billion tons a year.
It's also known that about half of that carbon stays in the atmosphere. The rest is taken up by the oceans and the land, much of it gobbled up by plants during photosynthesis.
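Those round numbers already pin down the split. A quick back-of-the-envelope check, using nothing more than the article's approximate figures:

```python
# Back-of-the-envelope carbon budget from the article's round numbers
# (illustrative only, not a formal inventory).
emissions = 8.0                  # billion tons of carbon per year
airborne_fraction = 0.5          # roughly half stays in the atmosphere

stays_airborne = emissions * airborne_fraction   # ~4 billion tons/year
absorbed_by_sinks = emissions - stays_airborne   # oceans, land, plants

print(f"Remains in atmosphere: {stays_airborne:.1f} billion tons/year")
print(f"Absorbed by sinks:     {absorbed_by_sinks:.1f} billion tons/year")
```

The hard part, as the article goes on to explain, is not this top-line split but pinning down where on the surface the absorbed carbon actually goes.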
But beyond those measurements, the data get harder to come by, and scientists have had to make assumptions. Flux towers cover only a few places on Earth, and sampling small areas one site at a time is too cumbersome to scale up. Even a powerful new tool Michalak will be using, NASA's Orbiting Carbon Observatory (OCO), a satellite designed to monitor atmospheric carbon, does not paint a complete picture. She compares the thin strips of data it harvests to wrapping a basketball with dental floss.
The problem: the data, Michalak said, are so big-picture that it is difficult to isolate the carbon being emitted or taken up in specific regions, or even specific countries. Scientists are left with an understanding of carbon sources and sinks that isn't detailed enough to capture their variability, or to support confident predictions of the future.
Michalak has developed a robust way to use the available data to untangle this variability, a method called "geostatistical inverse modeling." It breaks the globe into small regions and works out how much CO2 each region must have emitted or absorbed to produce the concentrations measured at atmospheric sampling points. The method also allows her and her collaborators to fold in information from other existing satellites that observe the Earth's surface, supplementing the atmospheric monitoring network. Eventually, the aim is to trace the carbon levels at each sampling point back to particular sources and sinks on the surface.
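Conceptually, this is a linear inverse problem: measured concentrations z are related to unknown surface fluxes s through an atmospheric transport operator H, and instead of a fixed prior flux guess, the fluxes get a spatial covariance plus a trend that is estimated from the data itself. Below is a minimal numerical sketch of that idea in Python; the toy "transport" matrix, covariances, and dimensions are all invented for illustration and merely stand in for a real transport model and real observations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes: m surface regions, n atmospheric observations,
# p trend terms (here a single constant mean).
m, n, p = 20, 12, 1

# H maps regional fluxes to observed concentrations. In reality this
# comes from an atmospheric transport model; here it is random.
H = rng.random((n, m)) / m

# Synthetic "true" fluxes with a smooth spatial pattern.
x = np.linspace(0.0, 1.0, m)
s_true = np.sin(2.0 * np.pi * x) + 1.0

# Simulated observations with measurement noise.
R = 0.01 * np.eye(n)                       # observation-error covariance
z = H @ s_true + rng.multivariate_normal(np.zeros(n), R)

# Prior spatial covariance of the fluxes: nearby regions co-vary.
Q = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.3)

# Trend model: the mean flux is an unknown constant, estimated from data.
X = np.ones((m, p))

# Geostatistical inversion: solve the kriging system
#   [ H Q H' + R   H X ] [ L' ]   [ H Q ]
#   [ (H X)'        0  ] [ M  ] = [ X'  ]
# and recover the fluxes as s_hat = L z.
Psi = H @ Q @ H.T + R
HX = H @ X
K = np.block([[Psi, HX], [HX.T, np.zeros((p, p))]])
rhs = np.vstack([H @ Q, X.T])
sol = np.linalg.solve(K, rhs)
L = sol[:n, :].T                           # kriging weights, shape (m, n)
s_hat = L @ z

print("true fluxes:     ", np.round(s_true, 2))
print("recovered fluxes:", np.round(s_hat, 2))
```

The distinguishing design choice, relative to a classic Bayesian inversion, is the trend matrix X: the mean flux pattern is estimated alongside the fluxes themselves rather than prescribed in advance, which is what lets the data, rather than prior assumptions, drive the regional estimates.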
The technique, Michalak says, is like figuring out where the cream was originally poured in a cup of half-stirred coffee.
"Winds and weather patterns mix CO2 in the atmosphere just like stirring mixes cream in a cup of coffee," she said. "As soon as you start stirring, you lose some information about where and when the cream was originally added to the cup. With careful measurements and models, however, much of this information can be recovered."
"One of our big questions is how carbon sources and sinks evolve," Michalak said. "This is all with an eye on prediction and management."
Sue Nichols | EurekAlert!