Globally, Earth's atmosphere warmed an average of about 0.4 C (or about 0.72 degrees Fahrenheit) in 30 years, according to data collected by sensors aboard NOAA and NASA satellites. More than 80 percent of the globe warmed by some amount.
A map of Earth's climate changes since December 1, 1978 (when satellite sensors began tracking the climate) doesn't show uniform global warming. It looks more like a thermometer: hot at the top, cold at the bottom, and varying degrees of warm in the middle.
This is a pattern of warming not forecast by any of the major global climate models.
The area of fastest warming is clustered around the Northern Atlantic and Arctic oceans, stretching from Arctic Canada across Greenland to Scandinavia. The greatest warming has been on opposite ends of Greenland, where temperatures have jumped as much as 2.5 C (about 4.6 degrees F) in 30 years.
During the same time, however, much of the Antarctic has cooled, with parts of the continent cooling as much as Greenland has warmed. But areas of cooling were isolated: Only four percent of the globe cooled by at least half of one degree Fahrenheit.
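The Celsius figures above are quoted alongside Fahrenheit equivalents. These are temperature *changes*, so the conversion uses only the 9/5 scale factor; the +32 offset in the usual formula cancels out. A minimal sketch of that arithmetic:

```python
def delta_c_to_f(delta_c):
    """Convert a temperature *change* from Celsius to Fahrenheit.

    For differences (anomalies), only the 9/5 scale factor applies;
    the +32 offset used for absolute temperatures cancels out.
    """
    return delta_c * 9.0 / 5.0

# Global average warming reported in the article: 0.4 C
print(delta_c_to_f(0.4))  # -> 0.72 F
```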
"If you look at the 30-year graph of month-to-month temperature anomalies, the most obvious feature is the series of warmer than normal months that followed the major El Nino Pacific Ocean warming event of 1997-1998," said Christy. "Right now we are coming out of one La Nina Pacific Ocean cooling event and we might be heading into another. It should be interesting over the next several years to see whether the post La Nina climate 're-sets' to the cooler seasonal norms we saw before 1997 or the warmer levels seen since then."
Virtually all of the warming found in the satellite temperature record has taken place since the onset of the 1997-1998 El Niño. Earth's average temperature showed no detectable warming from December 1978 until the 1997 El Niño.
As part of an ongoing joint project between The University of Alabama in Huntsville, NOAA and NASA, Christy and Dr. Roy Spencer, a principal research scientist in the ESSC, use data gathered by microwave sounding units on NOAA and NASA satellites to get accurate temperature readings for almost all regions of the Earth. This includes remote desert, ocean and rain forest areas for which reliable climate data are not otherwise available.
The satellite-based instruments measure the temperature of the atmosphere from the surface up to an altitude of about eight kilometers above sea level.
Once the monthly temperature data are collected and processed, they are placed in a "public" computer file for immediate access by atmospheric scientists in the U.S. and abroad.
Neither Spencer nor Christy receives any research support or funding from oil, coal or industrial companies or organizations, or from any private or special interest groups. All of their climate research funding comes from state and federal grants or contracts.

Dr. John Christy, 256.961.7763
Phillip Gentry | Newswise Science News