Reducing global emissions of carbon dioxide this century is going to be more challenging than society has been led to believe, according to a new research commentary article appearing April 3 in Nature.
The authors, from the University of Colorado at Boulder, the National Center for Atmospheric Research in Boulder and McGill University in Montreal, said the technological challenges of reducing CO2 emissions have been significantly underestimated by the Intergovernmental Panel on Climate Change. The study concludes the IPCC is overly optimistic in assuming that, even without action by policymakers, new technologies will result in dramatic reductions in the growth of future emissions.
Recent trends in “carbon intensity” -- the amount of CO2 emitted per unit of energy consumed -- are already running higher than IPCC projections because of rapid economic development, said lead author and CU-Boulder Professor Roger Pielke Jr. of the environmental studies program. In Asia, for instance, the demands of more energy-intensive economies are being met with conventional fossil-fuel technologies, a process expected to continue there for decades and eventually move into Africa.
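The carbon-intensity metric the article refers to can be sketched in a few lines of code. The function and all numbers below are hypothetical illustrations, not figures from the Nature commentary:

```python
# Carbon intensity as defined in the article: CO2 emitted per unit of
# primary energy consumed. All figures below are hypothetical.
def carbon_intensity(emissions_gtco2, energy_ej):
    """Return carbon intensity in tCO2 per GJ of primary energy.

    1 GtCO2 = 1e9 tCO2 and 1 EJ = 1e9 GJ, so the simple ratio of
    GtCO2 to EJ already has units of tCO2/GJ.
    """
    return emissions_gtco2 / energy_ej

# A coal-heavy energy mix emits more CO2 per joule than a gas- or
# nuclear-heavy one, which is why rapid fossil-fuelled growth in Asia
# can push global carbon intensity above scenario projections.
print(carbon_intensity(30.0, 500.0))  # → 0.06 tCO2/GJ (hypothetical)
```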
The IPCC's underestimate of carbon intensity stems in part from how the panel's scenarios divide future emission reductions between those expected to occur spontaneously and those driven by climate policies, according to the Nature authors. Titled “Dangerous Assumptions,” the Nature commentary was co-authored by Senior Scientist Tom Wigley of NCAR’s Climate and Global Dynamics Division and economics Professor Christopher Green of McGill University’s Global Environmental and Climate Change Center.
“According to the IPCC report, the majority of the emission reductions required to stabilize CO2 concentrations are assumed to occur automatically,” said Pielke. “Not only is this reduction unlikely to happen under current policies, we are moving in the opposite direction right now. We believe these kinds of assumptions in the analysis blind us to reality and could potentially distort our ability to develop effective policies.”
Stabilization of atmospheric concentrations of CO2 and other greenhouse gases was the primary objective of the 1992 United Nations Framework Convention on Climate Change approved by almost all countries, including the United States, said Wigley.
“Stabilization is a more daunting challenge than many realize and requires a radical ‘decarbonization’ of energy systems,” said Wigley. “Global energy demand is projected to grow rapidly, and these huge new demands must be met by largely carbon-neutral energy sources -- sources that either do not use fossil fuels or that capture and store any emitted CO2.”
Calculations by the Nature authors show that at least two-thirds of the carbon that must be removed from the energy supply to stabilize atmospheric CO2 concentrations at roughly 500 parts per million is assumed by the IPCC authors to occur automatically. Atmospheric CO2 levels are currently at about 390 parts per million.
Enormous advances in energy policy in the coming decades will be needed to stabilize atmospheric concentrations of CO2 at levels currently considered “acceptable,” the Nature authors concluded.
Unlike the IPCC authors, who built in assumptions about future “spontaneous” technological innovations, the Nature commentary authors began with a set of “frozen technology” scenarios as baselines -- scenarios in which energy technologies are assumed to stay at present levels. “With a frozen technology approach, the full scope of the carbon-neutral technology challenge is placed into clear view,” said Green.
“In the end, there is no question whether technological innovation is necessary -- it is,” the authors wrote in Nature. “The question is, to what degree should policy focus explicitly on motivating such innovation? The IPCC plays a risky game in assuming that spontaneous advances in technological innovation will carry most of the burden of achieving future emissions reductions, rather than focusing on those conditions that are necessary and sufficient for those innovations to occur.”
Roger Pielke Jr. | EurekAlert!