"Climate researchers often use a scenario approach," says Dr. Klaus Keller, assistant professor of geosciences, Penn State. "Nevertheless, scenarios are typically silent on the question of probabilities."
The Intergovernmental Panel on Climate Change, which is in its third round of climate assessment, uses models driven by scenarios of human climate forcing. These forcing scenarios are, the researchers say, overconfident.
"One key question is which scenario is likely, which is less likely and which they can neglect for practical purposes," says Keller who is also affiliated with the Penn State Institutes of Energy and the Environment. "At the very least, the scenarios should span the range of relevant future outcomes. This relevant range should also include low-probability, high-impact events."
The researchers provide evidence that current practice neglects a sizeable fraction of these low-probability events and results in biased outcomes. Keller; Louis Miltich, graduate student; Alexander Robinson, Penn State research assistant now on a Fulbright Fellowship in Berlin; and Richard Tol, senior research officer, Economic and Social Research Institute, Dublin, Ireland, developed an Integrated Assessment Model to derive probabilistic projections of carbon dioxide emissions on a century time scale. Their results extended far beyond the range of previously published scenarios, the researchers told attendees today (Dec. 15) at the fall meeting of the American Geophysical Union in San Francisco.
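The release describes the method only in outline; as a rough illustration of how a probabilistic projection differs from a handful of fixed scenarios, the sketch below runs a Monte Carlo ensemble through a deliberately oversimplified emissions model. Every number and distribution here is a hypothetical placeholder, not the authors' calibrated Integrated Assessment Model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed priors on the uncertain drivers (illustrative placeholders,
# not the study's calibrated parameters).
n_samples = 10_000
years = np.arange(2000, 2101)                                   # century time scale
growth = rng.normal(loc=0.01, scale=0.01, size=n_samples)       # mean annual growth rate
shocks = rng.normal(scale=0.005, size=(n_samples, years.size))  # year-to-year variability

# Propagate each sample forward from an assumed base-year emission
# level (GtC/yr) to obtain one emissions trajectory per sample.
e0 = 8.0
log_emissions = np.log(e0) + np.cumsum(growth[:, None] + shocks, axis=1)
emissions = np.exp(log_emissions)   # shape: (n_samples, n_years)

# The ensemble yields a probabilistic projection, e.g. percentiles of
# emissions in 2100, rather than a few discrete scenario trajectories.
print(np.percentile(emissions[:, -1], [5, 50, 95]))
```

The point of such an ensemble is that it assigns probability across the whole range of outcomes, which a small fixed set of scenarios cannot do.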
Noting that overconfidence is a commonly observed effect, Keller cites as an example a study reviewing estimates of the mass of the electron. The reported range for the electron's mass from 1955 to the mid-1960s did not include the value considered correct today. On a more closely related topic, the range of energy-use projections made in the 1970s typically missed the observed trends.
"We need to identify key sources of overconfidence and critically reevaluate previous studies," says Keller.
According to their study, past scenarios of carbon dioxide emissions can miss as much as 40 percent of the probabilistic projection, omitting a large number of low-probability events. The omitted scenarios may include low-probability, high-impact events.
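To see how such a coverage figure could be computed, the snippet below checks what fraction of a probabilistic projection falls outside the envelope spanned by a fixed scenario set. The distribution and the envelope bounds are made-up illustrative values, not numbers from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder probabilistic projection of emissions in 2100 (GtC/yr);
# in practice this would be the Monte Carlo ensemble from the model above.
e2100 = rng.lognormal(mean=np.log(12.0), sigma=0.6, size=100_000)

# Hypothetical envelope spanned by a fixed set of published scenarios.
scenario_low, scenario_high = 4.0, 30.0

# Probability mass the scenario set fails to cover.
outside = np.mean((e2100 < scenario_low) | (e2100 > scenario_high))
print(f"probability mass outside the scenario envelope: {outside:.1%}")
```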
"If low-probability, high-impact events exist, such as threshold responses of ocean currents or ice sheets, omitting these scenarios can lead to poor decision making," says Keller. "We need to see the full range of possible scenarios, because the actual outcome may not be contained in the central estimate.
"New tools and faster computers enable a considerably improved uncertainty analysis," he adds. "If you do not tell how likely the probability of a scenario is, people are left to guess. A sound scientific analysis can at least tell how consistent these guesses are with the available observations and simple, but transparent assumption."
A'ndrea Elyse Messer | EurekAlert!