Until now, atmospheric models and hydrodynamic models have remained largely separate in the Great Lakes region, with only a few attempts to loosely couple them. A new study, published online this week in the Journal of Climate, presents an integrated model that brings climate and water models together.
The collaborative work is the product of researchers from Michigan Technological University, Loyola Marymount University, LimnoTech and the National Oceanic and Atmospheric Administration's Great Lakes Environmental Research Laboratory. Pengfei Xue, an assistant professor of civil and environmental engineering at Michigan Tech, led the study through his work at the Great Lakes Research Center on campus.
"One of the important concepts in climate change, in addition to knowing the warming trend, is understanding that extreme events become more severe," Xue says. "That is both a challenge and an important focus in regional climate modeling."
To make those connections, the model uses two-way coupling and 3-D modeling to link processes in the atmosphere and the lakes. Two-way coupling works like a two-way street, allowing feedback between variables; other models use preset inputs that act more like one-way streets. Current models also rely on 1-D lake models that cannot capture the dynamic hydrologic processes of water bodies as large as the Great Lakes.
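To make the two-way street concrete, here is a minimal, purely illustrative sketch of a coupling loop. The component names, relaxation rates and temperatures are hypothetical stand-ins, not anything from the study's actual code; the point is only the loop structure, in which each component steps forward using the other's freshly updated state.

```python
"""Toy illustration of one-way vs. two-way model coupling.

Hypothetical sketch only: the real study couples full atmospheric and
3-D hydrodynamic lake models; these toy components swap one scalar each
(an air temperature and a lake surface temperature) to show the loop.
"""

class ToyAtmosphere:
    def __init__(self):
        self.air_temp = 5.0  # degrees C over the lake (made-up value)

    def step(self, dt, lake_surface_temp):
        # A warmer lake surface warms the overlying air (crude relaxation).
        self.air_temp += dt * 0.1 * (lake_surface_temp - self.air_temp)


class ToyLake:
    def __init__(self):
        self.surface_temp = 4.0  # degrees C (made-up value)

    def step(self, dt, air_temp):
        # Air-water heat exchange nudges the lake surface temperature.
        self.surface_temp += dt * 0.05 * (air_temp - self.surface_temp)


def run_two_way(n_steps=100, dt=1.0):
    """Two-way coupling: each component sees the other's updated state."""
    atm, lake = ToyAtmosphere(), ToyLake()
    for _ in range(n_steps):
        lake.step(dt, air_temp=atm.air_temp)               # air forces lake
        atm.step(dt, lake_surface_temp=lake.surface_temp)  # lake feeds back
    return atm.air_temp, lake.surface_temp


def run_one_way(n_steps=100, dt=1.0, prescribed_lake_temp=4.0):
    """One-way forcing: the atmosphere sees only a preset lake temperature."""
    atm = ToyAtmosphere()
    for _ in range(n_steps):
        atm.step(dt, lake_surface_temp=prescribed_lake_temp)
    return atm.air_temp


if __name__ == "__main__":
    print("two-way:", run_two_way())
    print("one-way:", run_one_way())
```

In the one-way version, the atmosphere never feels the lake it is driving; the feedback present only in the two-way loop is exactly what the coupled model is designed to capture.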
For comparison, most widely used global climate models use only tens of grid points (roughly 0.5-degree resolution) to cover all of the Great Lakes, if they account for the lakes at all.
To create a more nuanced view, similar to what has already been accomplished in ocean coastline modeling, the new model simulates the hydrodynamics of the Great Lakes region with a 3-D hydrodynamic model built from 40 vertical layers at 2-kilometer horizontal grid resolution. That's roughly 50,000 grid cells per layer, a resolution fine enough to support realistic feedback between air and water.
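As a rough sanity check on those grid counts, consider the back-of-the-envelope arithmetic below. The combined Great Lakes surface area of about 244,000 square kilometers is an assumed, commonly cited figure, not a number taken from the study.

```python
import math

# Back-of-the-envelope check of the grid counts quoted in the article.
# Assumption: combined Great Lakes surface area ~244,000 km^2.
LAKES_AREA_KM2 = 244_000

# 3-D hydrodynamic model: 2 km horizontal resolution -> 4 km^2 per cell.
fine_cells_per_layer = LAKES_AREA_KM2 / (2.0 * 2.0)
print(f"~{fine_cells_per_layer:,.0f} cells per layer at 2 km")   # ~61,000

# With 40 vertical layers:
print(f"~{fine_cells_per_layer * 40:,.0f} cells in the full 3-D grid")

# Typical global model: ~0.5-degree cells. One degree of latitude is
# ~111 km; longitude shrinks with latitude (~45 N for the Great Lakes).
cell_km2 = (0.5 * 111.0) * (0.5 * 111.0 * math.cos(math.radians(45)))
print(f"~{LAKES_AREA_KM2 / cell_km2:,.0f} cells at 0.5 degrees")  # ~110
```

The ~61,000-cell estimate is the same order as the article's "roughly 50,000" (the model's actual mesh need not cover the area uniformly), and coarser global grids nearer 1 degree would indeed leave only a few dozen points over the lakes.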
The model and the datasets it ingests are so large that they can only run on a supercomputer; Xue uses the Superior supercomputer at the Great Lakes Research Center. He and his team vetted the model's accuracy by comparing its simulations to historical records and satellite data.
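The article doesn't detail how that vetting was scored, but a typical skill check compares simulated and observed quantities, such as lake surface temperatures, with simple statistics. The sketch below uses made-up placeholder values; nothing in it comes from the study's actual data.

```python
import numpy as np

# Hypothetical skill check: simulated vs. observed lake surface
# temperatures (e.g., from buoys or satellite retrievals). The values
# below are placeholders, not the study's data.
observed  = np.array([4.2, 6.8, 12.1, 18.5, 21.0, 16.3])  # deg C
simulated = np.array([4.0, 7.1, 11.5, 18.9, 20.2, 16.8])  # deg C

bias = np.mean(simulated - observed)                  # systematic offset
rmse = np.sqrt(np.mean((simulated - observed) ** 2))  # overall error
corr = np.corrcoef(simulated, observed)[0, 1]         # pattern agreement

print(f"bias={bias:+.2f} C  rmse={rmse:.2f} C  r={corr:.3f}")
```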
"This kind of approach has been recognized as a critical step in the Great Lakes region that has been building over the past decade," Xue says.
The next stage of the research will expand the model to include surface water runoff. Refining the model is a community effort, and the team plans to work with current collaborators to apply and test the limits of the model.
In its current version, the new model provides better footing for further Great Lakes research. With it, scientists can glean more information about everything from regional climate change and shipping to oil spill mitigation and invasive species.
Allison Mills | EurekAlert!