Monitoring Earth's rising greenhouse gas levels will require a global data collection network 10 times larger than the one currently in place in order to quantify regional progress in emission reductions, according to a new research commentary by University of Colorado and NOAA researchers appearing in the April 25 issue of Science.
The authors, CU-Boulder Research Associate Melinda Marquis and National Oceanic and Atmospheric Administration scientist Pieter Tans, said with atmospheric carbon dioxide concentrations now at 385 parts per million and rising, the need for improved regional greenhouse gas measurements is critical. While the current observation network can measure CO2 fluxes on a continental scale, charting regional emissions where significant mitigation efforts are underway -- like California, New England and European countries -- requires a more densely populated network, they said.
"The question is whether scientists in the United States and around the world have what they need to monitor regional fluxes in atmospheric carbon dioxide," said Marquis, a scientist at the Cooperative Institute for Research in Environmental Sciences, a joint institute of CU-Boulder and NOAA. "Right now, they don't."
While CO2 levels are climbing by 2 parts per million annually -- a rate expected to increase as China and India continue to industrialize -- effective regional CO2 monitoring strategies are virtually nonexistent, she said. Scientists are limited in their ability to distinguish between distant and nearby carbon sources and "sinks," or storage areas, for example, by the accuracy of atmospheric transport models that reflect details of terrain, winds and the mixing of gases near observation sites.
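To put the 2-parts-per-million annual growth rate in perspective, a back-of-the-envelope linear extrapolation can be sketched as follows. This is our illustration, not the authors' analysis, and it assumes the current rate stays constant, which the article itself says is unlikely as China and India continue to industrialize:

```python
# Simple linear projection from the figures cited in the article:
# 385 ppm today, rising ~2 ppm per year (assumed constant here).

START_PPM = 385.0           # concentration cited in the article
GROWTH_PPM_PER_YEAR = 2.0   # current annual growth rate

def years_to_reach(target_ppm, start_ppm=START_PPM, rate=GROWTH_PPM_PER_YEAR):
    """Years until a target concentration under constant linear growth."""
    return (target_ppm - start_ppm) / rate

# Under this (optimistic) assumption, 450 ppm -- a level often discussed
# as a stabilization threshold -- would be reached in:
print(years_to_reach(450))  # 32.5 years
```

Because the growth rate is expected to accelerate, this figure is best read as an upper bound on the time available.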
"We are in uncharted territory as far as knowing how safe these high CO2 levels are for the Earth," she said. "Instead of tackling a very complex challenge with the equivalent of Magellan's maps, we need to use the equivalent of Google Earth."
Marquis and Tans propose increasing the number of global carbon measurement sites from about 100 to 1,000, which would decrease the uncertainty in computer models and help scientists better quantify changes. "With existing tools we could gather large amounts of additional CO2 data for a relatively small investment," said Marquis. "The next step is to muster the political will to fund these efforts."
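The intuition that a tenfold larger network shrinks uncertainty can be sketched with a deliberately simplified model in which each site contributes an independent, equally sized measurement error. Real flux-inversion uncertainty depends heavily on atmospheric transport models and site placement, so the square-root scaling below is our assumption for illustration, not the authors' calculation:

```python
import math

def network_uncertainty(n_sites, per_site_error=1.0):
    """Standard error of a network-mean estimate under the (strong)
    assumption that site errors are independent and identical:
    error shrinks as 1/sqrt(n)."""
    return per_site_error / math.sqrt(n_sites)

current = network_uncertainty(100)    # ~100 sites in today's network
proposed = network_uncertainty(1000)  # ~1,000 sites proposed by the authors

# Even under this idealized model, ten times as many sites buys only
# about a threefold (sqrt(10)) reduction in uncertainty:
print(round(current / proposed, 2))  # 3.16
```

The sublinear payoff is one reason site placement and better transport models matter alongside raw site count.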
Scientists currently sample CO2 using air flasks, in-situ instruments mounted on transmitter towers up to 2,000 feet tall, and aircraft sensors. The authors proposed adding CO2 sensors to existing and new transmitter towers, which can gather large volumes of climate data. While Europe and the United States have small networks of tall transmitter towers equipped with CO2 instruments, such towers are rare across the rest of the planet, she said.
Satellites queued for launch in the next few years to help monitor atmospheric CO2 levels include the Orbiting Carbon Observatory and the Greenhouse Gases Observing Satellite, said Marquis. The satellites will augment ground-based and aircraft measurements charting terrestrial photosynthesis, carbon sinks, CO2 respiration sources, ocean-atmosphere gas exchanges and CO2 emissions from wildfires.
National emissions inventories, mandated by the U.N. Framework Convention on Climate Change in 1994, rely primarily on economic statistics to estimate greenhouse gases entering and leaving the atmosphere, said the authors. Such inventories are "reasonably accurate" for estimating atmospheric CO2 from burning fossil fuels in developed countries.
But they are less accurate for other sources of CO2, like deforestation, and for emissions of other greenhouse gases, like methane, which is emitted as a result of rice farming, cattle ranching and natural wetlands, said the authors.
There is a growing need to measure the effectiveness of particular mitigation efforts by states or regions involved in pollution caps, auto emission reduction campaigns and intensive tree-planting efforts, Marquis said. The Western Climate Initiative, for example -- a consortium of seven western U.S. states and British Columbia -- set a goal last year of reducing greenhouse gas emissions 15 percent by 2020.
Precise regional CO2 measurements also could help chart the accuracy of carbon trading systems involving "credits" and "offsets" now in use in various countries around the world, said Marquis. In such systems, companies exceeding CO2 emission caps can buy carbon credits from companies under the caps, and groups or companies can buy voluntary carbon offsets to compensate for personal lifestyle choices, such as airline travel.
"Independent verification through regional CO2 monitoring could help determine whether carbon credits or offsets being bought or sold are of value," Marquis said.