Powerful new model indicates that current pollution standards may be inadequate to ward off worsening algae blooms
New research suggests that Lake Champlain may be more susceptible to damage from climate change than was previously understood--and that, therefore, the rules created by the EPA to protect the lake may be inadequate to prevent algae blooms and water quality problems as the region gets hotter and wetter.
What future for Lake Champlain? A powerful new model from a team of scientists at the University of Vermont suggests that climate change may pose greater risks to the health of the lake than previously realized. The results may have implications for how the EPA and others manage and regulate not just this international lake but other freshwater lakes across the nation.
Credit: Joshua Brown/UVM
"This paper provides very clear evidence that the lake could be far more sensitive to climate change than is captured by the current approach of the EPA," said University of Vermont professor Asim Zia, the lead author of the new study. "We may need more interventions--and this may have national significance for how the agency creates regulations."
The research was published November 17 in the journal Environmental Research Letters.
MORE THAN MODEST
The study, led by a team of ten scientists from UVM and one from Dartmouth College, used a powerful set of computer models that link the behavior of social and ecological systems. Their results show that accelerating climate change could easily outpace the EPA's land-use management policies aimed at reducing the inflow of pollution from agricultural runoff, parking lots, deforestation, cow manure, lawn fertilizer, pet waste, streambank erosion--and other sources of excess phosphorus that cause toxic algae and lake health problems.
The EPA's modeling to prepare its rules under what's called the TMDL, for "total maximum daily load," concluded that "any increases in the phosphorus loads to the lake due to the climate change are likely to be modest (i.e. 15%)," the agency writes. But the eleven scientists within the Vermont EPSCoR program at UVM who led the new modeling were concerned that this approach might underestimate the range of likely outcomes in a warmer future.
UVM professor Chris Koliba, a co-author and social scientist on the new study, observed: "There have been extensive efforts by federal regulators, the State of Vermont, and many other stakeholders to try to remediate and improve water quality in our watersheds. These should be honored. The message of our research is not to demean that work, but to say that in the long run protecting the lake is going to take a lot more than what's being proposed right now."
The new lake model, developed with support from the National Science Foundation, integrates a much larger assembly of global climate models and greenhouse gas pathways than the EPA used in its TMDL modeling. And the Vermont scientists delved deeply into the indirect and interactive effects of land use changes, "legacy phosphorus" that has been piling up for decades in the sediment at the bottom of the lake, and other factors. From this, they created a set of forecasts for what might happen to Lake Champlain over the coming decades, out to 2040--including changes in water quality, temperature, and the severity of algae blooms. Their result: a much more dramatic range of possible outcomes--and greater uncertainty--than those assumed in the EPA's approach.
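The idea of running one projection across many climate scenarios to expose the spread of outcomes can be sketched in a few lines. This is an illustrative toy only--the coefficients, baseline load, and scenario ranges below are invented for demonstration and are not the UVM team's integrated assessment model:

```python
# Toy ensemble sketch: project a hypothetical phosphorus load under many
# randomly drawn climate scenarios and report the resulting spread.
import random

def project_annual_load(baseline_load, precip_change, temp_change, years=25):
    """Toy projection: load grows with annual precipitation change (more
    runoff) and, more weakly, with warming. Coefficients are illustrative."""
    load = baseline_load
    for _ in range(years):
        load *= (1 + 0.8 * precip_change + 0.2 * temp_change)
    return load

random.seed(42)
baseline = 100.0  # hypothetical baseline, metric tons P per year
# 500 hypothetical scenarios: 0-1.5%/yr wetter, 0-1%/yr warming effect
scenarios = [(random.uniform(0.0, 0.015), random.uniform(0.0, 0.01))
             for _ in range(500)]
outcomes = sorted(project_annual_load(baseline, p, t) for p, t in scenarios)

low, high = outcomes[25], outcomes[-26]  # roughly 5th and 95th percentiles
print(f"Projected load range (5th-95th pct): {low:.0f} to {high:.0f} t/yr")
```

The point of the exercise is the width of the interval, not its endpoints: a single "best guess" scenario, like the modest increase assumed in the TMDL, hides the upper tail that an ensemble reveals.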
In several of the plausible hotter and wetter scenarios that the model considers, a cascading set of problems could lead to phosphorus pollution levels in segments of Lake Champlain that "drastically limit land management options to maintain water quality," the team wrote--especially in shallow bays like Missisquoi Bay, the focus of the new study. In the long run, the risk of underestimating the impacts of climate change could lead to what the scientists call "intractable eutrophic conditions"--a permanent change in the lake that leads to self-perpetuating algae blooms, lost fisheries, and poor water quality.
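The "self-perpetuating" part comes from a feedback: once enough legacy phosphorus accumulates, sediment can release it back into the water faster than the lake flushes it out, so concentrations stay high even after external inputs are cut. A minimal threshold model, with invented numbers and not the study's equations, shows how two lakes under the same pollution rules can end up in different states:

```python
# Toy lake-phosphorus model with a sediment "legacy phosphorus" feedback.
# Thresholds and rates are hypothetical, chosen only to illustrate how
# internal loading can lock a lake into a eutrophic state.

def simulate(external_load, initial_p, years=50):
    """Each year the lake flushes 20% of its phosphorus; above a threshold,
    warm sediment adds a fixed internal release on top of external inputs."""
    p = initial_p
    for _ in range(years):
        internal_release = 15.0 if p > 50.0 else 0.0  # legacy-P feedback
        p = 0.8 * p + external_load + internal_release
    return p

# Identical external load, different starting conditions:
clean = simulate(external_load=8.0, initial_p=40.0)  # stays low
eutrophic = simulate(external_load=8.0, initial_p=60.0)  # feedback holds it high
print(f"low start -> {clean:.1f}, high start -> {eutrophic:.1f}")
```

Under these toy numbers the low-start lake settles at its clean equilibrium while the high-start lake converges to a much higher one, despite identical management--a simple analogue of why acting before the threshold is crossed matters.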
The new integrated assessment model created by the NSF-funded team under the science leadership of Asim Zia provides a powerful tool that goes far beyond understanding Lake Champlain.
By connecting sub-models--of human behavior and land use, watershed dynamics, global climate models "downscaled" to the local region, and the hydrology of the lake itself--the overall model links together "the behavior of the watershed, lake, people and climate," said Judith Van Houten, UVM professor of biology, director of Vermont EPSCoR, and co-author on the new study. This provides "a way forward to pull back the veil that often surrounds effects of climate change," she said.
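The coupling structure Van Houten describes can be sketched as a chain of small functions, one per sub-model, stepped forward together. Everything below is hypothetical stand-in code--invented forcings and coefficients meant only to show how climate, watershed, lake, and bloom components pass state to one another each year, not how the study's model actually works:

```python
# Minimal sketch of a coupled model-of-models: climate forcing drives
# watershed phosphorus export, which drives lake concentration, which
# (with temperature) drives an algae-bloom indicator.

def climate(year):
    """Downscaled-forcing stub: gradually warmer and wetter years."""
    return {"temp_c": 20.0 + 0.05 * year, "precip_mm": 900 + 5 * year}

def watershed(forcing, land_use_runoff=0.02):
    """P export (t/yr) scales with precipitation and a land-use coefficient
    that a policy sub-model could adjust."""
    return land_use_runoff * forcing["precip_mm"]

def lake(p_conc, p_load, flush=0.25):
    """Toy mass balance: flushing loss plus the year's new load."""
    return (1 - flush) * p_conc + p_load

def algae_risk(p_conc, temp_c):
    """Bloom indicator rising with both phosphorus and water temperature."""
    return p_conc * max(0.0, temp_c - 15.0) / 100.0

p = 50.0
for year in range(25):  # run forward toward 2040
    f = climate(year)
    p = lake(p, watershed(f))
    risk = algae_risk(p, f["temp_c"])
print(f"final P concentration: {p:.1f}, bloom risk index: {risk:.1f}")
```

Because each sub-model is a separate function with a narrow interface, any one of them can be swapped out--a different climate pathway, a stricter land-use policy--and the whole chain re-run, which is what makes this architecture useful for testing scenarios.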
"Integrating these models is an enormous achievement that will be exportable across the US and be of practical use to many states and countries as they try to develop policies in the face of climate change," she said. It can allow lake and land managers to test scenarios that draw in a huge range of time scales and types of interactions, ranging from water chemistry to air temperature to land use policies.
Only by solving this kind of model-of-many-models problem, "as we have done," Van Houten said, could a tool be created that has predictive power for decades ahead, "allowing stakeholders to test their ideas" and even "describing the health of the lake out to the turn of the century."
UVM hydrologist Arne Bomblies, a co-author on the study, noted: "We show through this modeling work the importance of a more comprehensive consideration of climate change impact mechanisms to achieve water quality goals, and the need to adequately address climate change uncertainty."
"Lake Champlain's future is sensitive to climate change," Bomblies said, "and similar challenges are faced by other impaired waters throughout the United States."
Joshua Brown | EurekAlert!