Xuhui Lee, professor of meteorology, and Jeffrey Sigler, a recent Yale Ph.D. and now a postdoctoral researcher at the University of New Hampshire, co-authored the Yale study "Recent Trends in Anthropogenic Mercury Emission in the Northeast United States." They found that between 2000 and 2002 the emission rate of mercury decreased by 50 percent, but between 2002 and 2004 it increased by 50 to 75 percent. Over that five-year period, overall emissions declined by 20 percent.
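These year-to-year figures are mutually consistent: compounding a 50 percent drop with a 50 to 75 percent rebound brackets the reported net decline. A quick check, using a notional baseline of 100 units (the baseline value is an assumption for illustration only):

```python
# Notional baseline emission rate in 2000 (arbitrary units).
baseline = 100.0
after_2002 = baseline * (1 - 0.50)       # 50% decrease by 2002 -> 50.0
low_2004 = after_2002 * (1 + 0.50)       # +50% rebound -> 75.0
high_2004 = after_2002 * (1 + 0.75)      # +75% rebound -> 87.5

# Net decline over the full period brackets the reported ~20%.
net_low = (baseline - high_2004) / baseline    # 12.5% decline
net_high = (baseline - low_2004) / baseline    # 25% decline
print(net_low, net_high)  # 0.125 0.25
```

The reported 20 percent overall decline falls inside that 12.5 to 25 percent range.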
The dramatic year-to-year changes in mercury emissions, the study's authors say, cannot be explained by climatological factors such as air-flow patterns that would bring either cleaner or more polluted air into the region.
Mild winters and a corresponding decrease in the need for regional power plants to burn coal could partially explain the decline in mercury emissions, according to the authors. The study, published this summer in the Journal of Geophysical Research-Atmospheres, estimates that power plants account for up to 40 percent of total emissions in New Jersey, New York and Pennsylvania and in New England.
"The study highlights just how important power plants are in influencing regional mercury emission," said Sigler. "We should not forget other source categories when formulating abatement policies, since they also contribute significant amounts to the total emissions," Lee added.
Mercury, which converts to highly toxic methyl mercury in ground water, accumulates in fish and can cause neurological problems in developing fetuses, as well as dementia and organ failure in adults who eat large amounts of fish over long periods.
The Yale study was conducted at Great Mountain Forest in northwestern Connecticut. The measurements were restricted to wintertime so data on carbon dioxide that comes from the same combustion sources as mercury would not be distorted by photosynthesis. The researchers used carbon dioxide to trace mercury back to its sources with a unique method called "tracer analysis."
"To our knowledge, using the carbon dioxide to trace mercury over a long time period hasn't been done before," said the authors. "We started with actual mercury that's in the atmosphere, worked back to sources that emit it, then calculated the emission rate."
The U.S. Environmental Protection Agency, which does not regulate mercury emissions, determines the mercury emission rate by taking an inventory of existing sources. "Although the EPA's approach is highly useful, it requires accurate measurements of mercury emitted from the smokestack per ton of fuel burned," said Sigler. "These data are hard to come by. Our top-down technique circumvents those rather cumbersome problems and allows for much more timely estimates of mercury emission. It's difficult to get annual changes in the emission rate with the inventory approach."
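The tracer analysis described above can be sketched in a few lines. The idea, as the article describes it, is that mercury and carbon dioxide come from the same combustion sources, so the slope of the mercury enhancement against the CO2 enhancement (both relative to background), scaled by a known regional CO2 emission rate, yields a top-down mercury emission estimate. All numbers below (measurement values, background levels, the regional CO2 emission figure, and the ppm-to-mass conversion) are illustrative assumptions, not values from the study:

```python
import numpy as np

# Hypothetical wintertime ambient measurements (photosynthesis minimal):
# co2 in ppm, hg in ng/m^3. Background values are assumed.
co2 = np.array([395.0, 402.0, 410.0, 398.0, 415.0, 405.0])
hg = np.array([1.52, 1.60, 1.71, 1.55, 1.78, 1.64])
co2_bg, hg_bg = 392.0, 1.48  # assumed regional background levels

d_co2 = co2 - co2_bg  # enhancement above background
d_hg = hg - hg_bg

# Emission ratio: least-squares slope through the origin of the
# Hg enhancement versus the CO2 enhancement.
ratio = np.sum(d_hg * d_co2) / np.sum(d_co2**2)  # ng m^-3 per ppm

# Convert the ratio to a mass ratio: 1 ppm of CO2 corresponds to
# roughly 1.9e6 ng of CO2 per m^3 of air near surface conditions.
hg_per_co2_mass = ratio / 1.9e6

# Scale by an assumed regional CO2 emission rate to get the mercury
# emission rate implied by the ambient data.
co2_emission_t_per_yr = 5.0e8  # illustrative regional CO2 emission
hg_emission_t_per_yr = hg_per_co2_mass * co2_emission_t_per_yr

print(f"Hg/CO2 ratio: {ratio:.4f} ng m^-3 ppm^-1")
print(f"Implied Hg emission: {hg_emission_t_per_yr:.2f} tonnes/yr")
```

Because the estimate starts from ambient concentrations rather than per-smokestack measurements, it sidesteps the inventory problem Sigler describes and can be updated as often as new ambient data come in.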
Janet Rettig Emanuel | EurekAlert!