A team of scientists has, for the first time, completely characterized an important chemical reaction that is critical to the formation of ground-level ozone in urban areas. The team's results indicate that computer models may be underestimating ozone levels in urban areas during episodes of poor air quality (smoggy days) by as much as 5 to 10 percent.
Ground level ozone poses significant health hazards to people, animals and plants; is the primary ingredient of smog; and gives polluted air its characteristic odor. It is known that even small increases in ozone concentrations can lead to increases in death from respiratory problems. Because of the health hazards caused by ozone exposure, the research team's results may have regulatory implications.
The team's research, which was funded by the National Science Foundation (NSF), NASA and the California Air Resources Board (CARB), appears in the October 29 issue of Science.
Big role of one reaction in predicting ozone in smoggy air
The reaction studied by the researchers plays an important role in controlling the efficiency of a sunlight-driven cycle of reactions that continuously generates ozone. In this reaction, a hydroxyl radical (OH) combines with nitrogen dioxide (NO2), which is produced from emissions generated by vehicles, various industrial processes and some biological processes.
When a hydroxyl radical and nitrogen dioxide collide, these molecules may stick together to form a stable byproduct known as nitric acid (HONO2). Because of the stability of nitric acid, its formation locks up hydroxyl radicals and nitrogen dioxide, and thereby prevents these molecules from contributing to ozone formation; this reaction thereby slows the formation of ozone.
Although scientists have long recognized the importance of the formation of nitric acid, they have, until now, been unable to agree on the speed, or "rate," at which hydroxyl radicals and nitrogen dioxide combine to form this end product. "This reaction, which slows down ozone production, has been among the greatest sources of uncertainty in predicting ozone levels," said Mitchio Okumura of the California Institute of Technology--a member of the research team. This uncertainty has affected computer models that simulate air pollution chemistry.
An experimental challenge
Why is there so much uncertainty about the rate of formation of nitric acid? In large part because, instead of combining to form the stable form of nitric acid, a hydroxyl radical may combine with nitrogen dioxide to form a less stable form of nitric acid (HOONO), a snake-like molecule that quickly breaks apart in the atmosphere. This breakdown of the unstable form of nitric acid releases its hydroxyl radical back into the atmosphere, where it may once again become available to form ozone; this breakdown therefore speeds the formation of ozone. Nevertheless, questions about the existence, amount and speed of formation of the unstable form of nitric acid have, until now, complicated measurements of the rate of formation of the more stable form of nitric acid.
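The competition described above can be sketched as a simple branching calculation: every OH + NO2 collision either locks OH away as stable HONO2 or forms HOONO, which soon returns the OH. The rate constant, branching fraction and concentrations below are illustrative placeholders, not the team's measured values.

```python
# Toy sketch of the two competing OH + NO2 channels described in the text.
# All numbers are ILLUSTRATIVE assumptions, not measured results.

k_total = 1.0e-11        # cm^3 molecule^-1 s^-1, hypothetical overall rate constant
branch_hono2 = 0.85      # hypothetical fraction forming stable HONO2
branch_hoono = 1.0 - branch_hono2  # remainder forms unstable HOONO

oh = 1.0e7               # molecules cm^-3, rough order of magnitude for polluted air
no2 = 1.0e12             # molecules cm^-3, hypothetical urban NO2 level

rate = k_total * oh * no2           # total OH + NO2 reactions per cm^3 per second
rate_locked = rate * branch_hono2   # OH permanently removed as stable HONO2
rate_recycled = rate * branch_hoono # OH returned when HOONO breaks apart

print(f"total reaction rate:  {rate:.2e} cm^-3 s^-1")
print(f"OH locked up (HONO2): {rate_locked:.2e} cm^-3 s^-1")
print(f"OH recycled (HOONO):  {rate_recycled:.2e} cm^-3 s^-1")
```

The point of the sketch is that only the HONO2 channel actually removes OH from the ozone-forming cycle, so an accurate branching fraction is as important as the overall rate.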
But through experiments conducted at the Jet Propulsion Laboratory (JPL) and the California Institute of Technology using state-of-the-art techniques, Okumura and his colleague Stanley P. Sander at JPL led a team of researchers that accurately measured: 1) the overall speed at which hydroxyl radicals and nitrogen dioxide combine, or react, under given atmospheric conditions; and 2) the ratio of stable to unstable nitric acid formed under those conditions.
In addition, new laser methods enabled researchers to directly detect the presence of the unstable form of nitric acid in microseconds. And with the help of companion calculations performed at Ohio State by Anne McCoy, they could quantify its yield as soon as it was formed.
The research team's experiments show that the stable form of nitric acid forms more slowly than previously believed. These results indicate that more OH is available in polluted, ground-level air for the formation of ozone than previously thought, and thus that there is probably more ozone in the atmosphere than previously predicted.
More ozone than previously believed
To demonstrate the significance of the new results, modelers on the research team led by Robert Harley and William Carter fed their newly quantified reaction rates and ratios into computer models to predict levels of ground-level ozone during the summer of 2010 in the Los Angeles Basin. Their results indicate that many current models have been underestimating ground-level ozone levels in the most polluted areas (where nitrogen dioxide is highest) by about 5 to 10 percent. The research team concluded that relatively small changes in the rates and proportions of reactions forming unstable and stable nitric acid could lead to small but significant changes in ground-level ozone levels.
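The logic of that model result can be illustrated with a crude steady-state argument: if nitric acid formation is the dominant OH sink, slowing that sink raises the standing OH concentration and, with it, the throughput of the ozone-forming cycle. The 13 percent slowdown below is a hypothetical figure for illustration only; real ozone responses are nonlinear and smaller than the OH change, consistent with the roughly 5 to 10 percent effect the team reports.

```python
# Hedged sketch: in a crude steady state where the nitric-acid sink is the
# only OH loss, [OH] scales as production / k_sink. The 13% slowdown is a
# HYPOTHETICAL figure, not the team's measured value, and the actual ozone
# response in a full model is nonlinear and smaller than this OH change.

def relative_oh(k_sink, k_sink_ref=1.0, production=1.0):
    """Steady-state OH relative to a reference sink rate."""
    return (production / k_sink) / (production / k_sink_ref)

boost = relative_oh(k_sink=0.87)  # sink 13% slower than the reference
print(f"steady-state OH increase: {(boost - 1) * 100:.0f}%")
```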
The importance of the study
"The study illustrates the importance of developing new and improved experimental approaches that interrogate atmospheric systems at the molecular level with high accuracy," said Zeev Rosenzweig, an NSF program officer. "This is imperative to reducing uncertainties in atmospheric model predictions."
"The determination of a more accurate value of the rate of nitric acid formation from a hydroxyl radical and nitrogen dioxide will be important in future air-quality modeling," said Anne B. McCoy, a member of the research team. "The research was made possible by bringing together several laboratories with different capabilities and expertise, including my lab at Ohio State, and labs at CalTech, JPL and Berkeley."
The ozone prediction models incorporated into the research team's study are similar to those used by regulatory agencies, such as the Environmental Protection Agency and the California Air Resources Board. Therefore, the team's results may have implications for future predictions of ground-level ozone used by regulatory agencies in developing air quality management plans.

Media Contacts
The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering. In fiscal year (FY) 2010, its budget is about $6.9 billion. NSF funds reach all 50 states through grants to nearly 2,000 universities and institutions. Each year, NSF receives over 45,000 competitive requests for funding, and makes over 11,500 new funding awards. NSF also awards over $400 million in professional and service contracts yearly.
Lily Whiteman | EurekAlert!