NASA's Goddard Institute for Space Studies (GISS) in New York, which monitors global surface temperatures on an ongoing basis, released an updated analysis comparing temperatures around the globe in 2011 to the average global temperature from the mid-20th century. The comparison shows that Earth continues to experience warmer temperatures than several decades ago: the average temperature around the globe in 2011 was 0.92 degrees F (0.51 C) warmer than the mid-20th century baseline.
"We know the planet is absorbing more energy than it is emitting," said GISS Director James E. Hansen. "So we are continuing to see a trend toward higher temperatures. Even with the cooling effects of a strong La Niña influence and low solar activity for the past several years, 2011 was one of the 10 warmest years on record."
The difference between 2011 and the warmest year in the GISS record (2010) is 0.22 degrees F (0.12 C). This underscores the emphasis scientists put on the long-term trend of global temperature rise. Because of the large natural variability of climate, scientists do not expect temperatures to rise consistently year after year. However, they do expect a continuing temperature rise over decades.
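Because the figures quoted above are temperature differences (anomalies) rather than absolute temperatures, converting them between Celsius and Fahrenheit uses only the 9/5 scale factor; the familiar 32-degree offset cancels when taking a difference. A minimal sketch of that conversion:

```python
def anomaly_c_to_f(anomaly_c: float) -> float:
    """Convert a temperature *difference* from Celsius to Fahrenheit.

    Unlike absolute temperatures, anomalies need only the 9/5 scale
    factor; the 32-degree offset cancels in the subtraction.
    """
    return anomaly_c * 9.0 / 5.0

# The anomalies reported in this article:
print(round(anomaly_c_to_f(0.51), 2))  # 0.92 (2011 vs. baseline)
print(round(anomaly_c_to_f(0.12), 2))  # 0.22 (2010 vs. 2011)
```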
The first 11 years of the 21st century experienced notably higher temperatures compared to the middle and late 20th century, Hansen said. The only year from the 20th century in the top 10 warmest years on record is 1998.
Higher temperatures today are largely sustained by increased atmospheric concentrations of greenhouse gases, especially carbon dioxide. These gases absorb infrared radiation emitted by Earth and release that energy into the atmosphere rather than allowing it to escape to space. As their atmospheric concentration has increased, the amount of energy "trapped" by these gases has led to higher temperatures.
The carbon dioxide level in the atmosphere was about 285 parts per million in 1880, when the GISS global temperature record begins. By 1960, the average concentration had risen to about 315 parts per million. Today it exceeds 390 parts per million and continues to rise at an accelerating pace.
The temperature analysis produced at GISS is compiled from weather data from more than 1,000 meteorological stations around the world, satellite observations of sea surface temperature and Antarctic research station measurements. A publicly available computer program is used to calculate the difference between surface temperature in a given month and the average temperature for the same place during 1951 to 1980. This three-decade period functions as a baseline for the analysis.
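The anomaly calculation described above — subtracting the 1951–1980 average for a given place and month from the observed temperature — can be sketched as follows. The data and function names here are illustrative assumptions for a toy example, not the actual GISTEMP program, which is publicly available from GISS:

```python
from statistics import mean

def baseline_mean(records: dict[int, float],
                  start: int = 1951, end: int = 1980) -> float:
    """Average temperature over the baseline years (inclusive)."""
    return mean(t for year, t in records.items() if start <= year <= end)

def anomaly(records: dict[int, float], year: int) -> float:
    """Difference between one year's temperature and the baseline average."""
    return records[year] - baseline_mean(records)

# Toy data for a single station and month: a constant 14.0 C across the
# 1951-1980 baseline period, then 14.5 C observed in 2011.
records = {y: 14.0 for y in range(1951, 1981)}
records[2011] = 14.5

print(round(anomaly(records, 2011), 2))  # 0.5
```

In the real analysis this per-station, per-month anomaly is computed from station records, sea surface observations and Antarctic measurements, then combined into regional and global averages.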
The resulting temperature record is very close to analyses by the Met Office Hadley Centre in the United Kingdom and the National Oceanic and Atmospheric Administration's National Climatic Data Center in Asheville, N.C.
Hansen said he expects a record-breaking global average temperature in the next two to three years, because solar activity is on the upswing and the next El Niño will increase tropical Pacific temperatures. The warmest years on record were 2005 and 2010, in a virtual tie.
"It's always dangerous to make predictions about El Niño, but it's safe to say we'll see one in the next three years," Hansen said. "It won't take a very strong El Niño to push temperatures above 2010."
Leslie McCarthy | EurekAlert!