The amount of gaseous mercury in the atmosphere has dropped sharply from its peak in the 1980s and has remained relatively constant since the mid 1990s. This welcome decline may result from control measures undertaken in western Europe and North America, but scientists who have just concluded a study of atmospheric mercury say they cannot reconcile the amounts actually found with current understanding of natural and manmade sources of the element.
An international group of scientists, led by Franz Slemr of the Max Planck Institute for Chemistry [Max-Planck-Institut für Chemie] in Mainz, Germany, studied the worldwide trend of total gaseous mercury at six sites in the northern hemisphere, two sites in the southern hemisphere, and on eight transatlantic ship cruises since 1977. They have published their findings in Geophysical Research Letters, a journal of the American Geophysical Union.
The fixed sites ranged from the Canadian Arctic to Antarctica. In both hemispheres, total gaseous mercury increased in the late 1970s, apparently peaked in the late 1980s, decreased to a minimum in the mid 1990s, and has remained relatively constant since then. Concentrations in the southern hemisphere are about one-third less than in the northern hemisphere. These observations accord well, the researchers say, with data on mercury deposited in peat bogs and found in ice cores.
Scientists have believed that natural processes and human activities put about equal amounts of mercury into the atmosphere. Assuming that natural emissions and re-emissions of the historically deposited mercury have remained constant, the observed reduction of about 17 percent in concentration from 1990 to 1996 would have to result from a reduction of about 34 percent in manmade emissions during that period. This, the scientists say, is three to four times larger than the 10 percent decrease in manmade emissions suggested by previous studies. Therefore, either our understanding of manmade emissions or of the ratio of natural to manmade emissions probably has to be refined, they say.
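The mass-balance reasoning above can be sketched numerically. This is a minimal illustration using only the figures stated in the article, under the assumed 50/50 split between natural and manmade emissions; the variable names are my own:

```python
# Assumed split between natural and manmade emissions (per the article,
# "about equal amounts"):
manmade_share = 0.5

# Observed drop in total gaseous mercury concentration, 1990-1996:
total_reduction = 0.17

# If natural emissions and re-emissions stayed constant, the entire
# observed drop must come from the manmade half alone:
required_manmade_cut = total_reduction / manmade_share
print(f"required cut in manmade emissions: {required_manmade_cut:.0%}")  # 34%

# Previous emission inventories suggest only about a 10 percent decrease:
reported_cut = 0.10
print(f"discrepancy factor: {required_manmade_cut / reported_cut:.1f}x")  # 3.4x
```

The discrepancy factor of roughly 3.4 is what the authors describe as "three to four times larger" than the inventoried decrease, which is why they argue the emission estimates need refinement.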
The level of atmospheric mercury is important even though, at current concentrations, it is not directly toxic. The problem, says Slemr, "is that some 5,000 metric tons of atmospheric mercury are currently deposited worldwide every year. The atmospheric lifetime of elemental mercury is about one year and, thus, the mercury is deposited even in remote areas."
Further, Slemr says, some of the atmospheric mercury is deposited into soil and water, where it can be "transformed to methyl mercury, one of the most toxic compounds." In ocean water, methyl mercury concentrates in plankton and further accumulates in fish, especially those high in the food chain, such as tuna. High methyl mercury levels in tuna can lead to chronic diseases in persons who eat the fish, with pregnant women most in danger.
Therefore, the researchers say, it is essential that we better understand the amount and sources of mercury in the atmosphere. The amount of mercury emitted naturally is not well understood at present. With regard to manmade emissions, coal burning definitely emits mercury, and it was recently discovered that biomass burning is another important source. Waste incineration is also a source, but not yet well quantified. Further, says Slemr, the annual re-emission of a small fraction of the 200,000 metric tons of mercury deposited into the environment since Roman times is uncertain.
Slemr and his colleagues conclude that future emission inventories must take into account the difference between atmospheric mercury levels in the northern and southern hemispheres, as well as historic and present-day emission trends. Further research on the quantitative and qualitative sources of atmospheric mercury, both natural and manmade, will be necessary for any emission inventory to be credible.
The study was funded in part by the Deutsche Forschungsgemeinschaft.
Harvey Leifert | AGU