A special section being published next month in the Journal of Environmental Quality addresses that question. The collection of papers grew out of a symposium at the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America 2011 Annual Meetings.
The section acknowledges the problems that have been encountered with P Index development and implementation, such as inconsistencies between state indices, and also suggests ways in which the indices can be tested against data or models to improve risk assessment and shape future indices.
The P Index was proposed at a 1992 symposium, after awareness grew of the environmental impacts of P loss from fields. Many farmers were applying manure or other biosolids to their fields at rates that supplied more P than crops required. Researchers realized that assessing the risk of P loss from those products was important to protect water quality.
A tool like the P Index was needed to integrate these conditions, because P loss is influenced by both site characteristics (e.g., soil test levels, connectivity to water) and the sources of P applied (e.g., inorganic fertilizer, organic sources). It was therefore a great improvement over using agronomic soil testing alone for P risk assessment.
"The objective of the original P Index was to identify fields that had high risk of P loss and then guide producers' decisions on implementing best management practices," says Nathan Nelson, ASA and SSSA member and co-author of the special section's introductory paper. "The P Index has developed into a widely used tool to identify appropriate management practices for P application and fields suitable for such application."
The original 1993 paper by Lemunyon and Gilbert laid out three short-term objectives for the P Index: 1) to develop a procedure to assess the risk for P leaving a site and traveling toward a water body; 2) to develop a method of identifying critical parameters that influence P loss; and 3) to select management practices that would decrease a site's vulnerability to P loss.
These objectives were to be met using fairly simple calculations that took into account both source factors and transport factors. Source factors included levels of P in the soil, rates of P fertilization, and methods or timing of P addition. Features such as soil erosion, runoff, and distance to streams made up the transport factors.
"P loss is high when you have both a lot of P present and an easy transport pathway," explains Nelson. "The index has been designed to evaluate the interaction between these different factors."
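The interaction Nelson describes can be illustrated with a toy calculation. The sketch below is hypothetical and greatly simplified: the factor names, weights, and rating breakpoints are invented for illustration and are not taken from any state index. It follows the common multiplicative structure in which a source rating and a transport rating are combined, so that the risk score is high only when both components are high.

```python
# Toy phosphorus (P) Index sketch -- all weights and breakpoints below are
# invented for illustration; real state indices differ substantially.

def source_rating(soil_test_p_ppm, p_applied_kg_ha):
    """Combine soil test P and applied P into a 0-10 source score."""
    soil_score = min(soil_test_p_ppm / 20.0, 5.0)     # capped at 5
    applied_score = min(p_applied_kg_ha / 20.0, 5.0)  # capped at 5
    return soil_score + applied_score

def transport_rating(erosion_t_ha, runoff_class, distance_to_stream_m):
    """Combine transport pathways into a 0-10 transport score."""
    erosion_score = min(erosion_t_ha, 4.0)
    runoff_score = {"low": 1.0, "medium": 2.0, "high": 3.0}[runoff_class]
    proximity_score = 3.0 if distance_to_stream_m < 30 else 1.0
    return erosion_score + runoff_score + proximity_score

def p_index(source, transport):
    """Multiplicative interaction: risk is high only when both are high."""
    return source * transport

def risk_category(index):
    if index < 15:
        return "low"
    elif index < 40:
        return "medium"
    else:
        return "high"

# A field with high soil P, heavy application, erosion, and a nearby stream:
src = source_rating(soil_test_p_ppm=120, p_applied_kg_ha=60)    # 5 + 3 = 8
trn = transport_rating(erosion_t_ha=3, runoff_class="high",
                       distance_to_stream_m=20)                 # 3 + 3 + 3 = 9
print(risk_category(p_index(src, trn)))  # prints "high"
```

The multiplicative form captures the key idea in the quote above: a field with abundant P but no transport pathway (or vice versa) scores low, because one factor near zero pulls the whole product down.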
Because the P Index can be used to guide conservation practices, the USDA Natural Resources Conservation Service (NRCS) adopted it as part of its management planning process. The NRCS then left it up to each state to develop its own P Index best suited to its environment and concerns.
"The P Index was meant to be something that could be easily computed with readily available data, so an NRCS agent would be able to obtain the necessary inputs," says Nelson. "But there are many different factors that influence P loss as you move from one physiographic region to the next. The differences in transport processes, soils, and landscapes in each state have led to 48 different versions of the P Index, and some of them are very different."
The inconsistencies of indices across states, along with a perceived lack of improvement in water quality in some regions, are now bringing the accuracy of the P Index into question. With different calculations in place, a set of factors may be categorized as low risk in one state and medium, or even high, risk in another. These discrepancies become especially obvious along state borders.
Researchers understand the need to improve P indices and have made it a priority to base any changes on sound scientific data. Efforts to preserve, evaluate, and improve the P Index led the NRCS to release a Request for Proposals within the Conservation Innovation Grant Program. Three regional efforts were funded to evaluate and improve the indices in the Heartland, Southern States, and Chesapeake Bay regions of the U.S. Additionally, a national coordination project and two other state-level efforts (Ohio and Wisconsin) were recently funded through the Conservation Innovation Grant Program.
While the final recommendations for the next generation of the P Index are likely a few years off, the research is currently underway. Because of variations in regional characteristics and the problems previously encountered at state boundaries, it is likely that recommendations for improved indices will be organized along regional lines, Nelson says. The goal is that the evaluations will lead to optimized P indices and better management tools that accurately incorporate site and source characteristics to predict the risk of P loss from fields.
"The scientific community backs the P Index as the best method to assess P loss risk," says Nelson. "The challenge now is to develop consistency in P indices across state boundaries and quantify the accuracy of P Index risk assessments."
The full article is available for no charge for 30 days following the date of this summary. View the abstract at https://www.agronomy.org/publications/jeq/abstracts/41/6/1703.
The Journal of Environmental Quality is a peer-reviewed, international journal of environmental quality in natural and agricultural ecosystems published six times a year by the American Society of Agronomy (ASA), Crop Science Society of America (CSSA), and the Soil Science Society of America (SSSA). The Journal of Environmental Quality covers various aspects of anthropogenic impacts on the environment, including terrestrial, atmospheric, and aquatic systems.
Caroline Schneider | EurekAlert!