Berkeley Lab researcher says climate science is entering a new golden age.
Not long ago, it would have taken several years to run a high-resolution simulation on a global climate model. But using some of the most powerful supercomputers now available, Lawrence Berkeley National Laboratory (Berkeley Lab) climate scientist Michael Wehner was able to complete a run in just three months.
He found that the high-resolution simulations not only matched actual observations more closely but were also far better at reproducing intense storms, such as hurricanes and cyclones. The study, “The effect of horizontal resolution on simulation quality in the Community Atmospheric Model, CAM5.1,” has been published online in the Journal of Advances in Modeling Earth Systems.
“I’ve been calling this a golden age for high-resolution climate modeling because these supercomputers are enabling us to do gee-whiz science in a way we haven’t been able to do before,” said Wehner, who was also a lead author for the recent Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC). “These kinds of calculations have gone from basically intractable to heroic to now doable.”
Using version 5.1 of the Community Atmospheric Model, developed by the Department of Energy (DOE) and the National Science Foundation (NSF) for use by the scientific community, Wehner and his co-authors conducted an analysis for the period 1979 to 2005 at three spatial resolutions: 25 km, 100 km, and 200 km. They then compared those results to each other and to observations.
One simulation generated 100 terabytes of data, or 100,000 gigabytes. The computing was performed at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility. “I’ve literally waited my entire career to be able to do these simulations,” Wehner said.
The higher resolution was particularly helpful in mountainous areas, since the models represent terrain as a single average altitude per grid cell (cells 25 km on a side in the high-resolution run versus 200 km in the low-resolution run). With a more accurate representation of mountainous terrain, the higher-resolution model is better able to simulate snow and rain in those regions.
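The effect of that per-cell averaging can be sketched with a toy example. The code below is purely illustrative (the elevation data is synthetic, not from the study): it block-averages a fine terrain grid into coarser cells, showing how a sharp peak that survives at 25 km resolution is flattened away at 200 km.

```python
import numpy as np

def block_average(elevation, factor):
    """Average elevation over factor x factor blocks of fine-grid cells,
    mimicking how a coarser model grid sees only one altitude per cell."""
    n = elevation.shape[0] // factor * factor
    e = elevation[:n, :n]
    return e.reshape(n // factor, factor, n // factor, factor).mean(axis=(1, 3))

rng = np.random.default_rng(0)
# Synthetic 25 km grid: a rough plain (0-500 m) with one 3000 m peak.
fine = rng.uniform(0, 500, size=(16, 16))
fine[7, 7] = 3000.0

# 8 fine cells of 25 km = one coarse cell of 200 km on a side.
coarse = block_average(fine, 8)

print(fine.max())    # the peak is fully represented at 25 km resolution
print(coarse.max())  # at 200 km it is averaged into the surrounding plain
```

The coarse grid's highest cell ends up only a few hundred meters high, because the peak is averaged with 63 lowland neighbors; this is the mechanism by which coarse models misplace orographic snow and rain.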
“High resolution gives us the ability to look at intense weather, like hurricanes,” said Kevin Reed, a researcher at the National Center for Atmospheric Research (NCAR) and a co-author on the paper. “It also gives us the ability to look at things locally at a lot higher fidelity. Simulations are much more realistic at any given place, especially if that place has a lot of topography.”
The high-resolution model produced stronger storms and more of them, which was closer to the actual observations for most seasons. “In the low-resolution models, hurricanes were far too infrequent,” Wehner said.
The IPCC chapter on long-term climate change projections that Wehner was a lead author on concluded that a warming world will cause some areas to be drier and others to see more rainfall, snow, and storms. Extremely heavy precipitation was projected to become even more extreme in a warmer world. “I have no doubt that is true,” Wehner said. “However, knowing it will increase is one thing, but having a confident statement about how much and where as a function of location requires the models do a better job of replicating observations than they have.”
Wehner says the high-resolution models will help scientists to better understand how climate change will affect extreme storms. His next project is to run the model for a future-case scenario. Further down the line, Wehner says scientists will be running climate models with 1 km resolution. To do that, they will have to have a better understanding of how clouds behave.
“A cloud system-resolved model can reduce one of the greatest uncertainties in climate models, by improving the way we treat clouds,” Wehner said. “That will be a paradigm shift in climate modeling. We’re at a shift now, but that is the next one coming.”
The paper’s other co-authors include Fuyu Li, Prabhat, and William Collins of Berkeley Lab; and Julio Bacmeister, Cheng-Ta Chen, Christopher Paciorek, Peter Gleckler, Kenneth Sperber, Andrew Gettelman, and Christiane Jablonowski from other institutions. The research was supported by the Biological and Environmental Division of the Department of Energy’s Office of Science.
Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit www.lbl.gov.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
Julie Chao | EurekAlert!