The results—published in the Sept. 5, 2011 online issue of the Proceedings of the National Academy of Sciences (PNAS)—are in line with the federal government's official estimates, but just as importantly validate the innovative measuring techniques the team employed.
The accuracy of the measurements was crucial because, "Ultimately, the impact of the oil on the environment depends primarily on the total volume of oil released," according to a report by the Flow Rate Technical Group (FRTG), a collection of research teams charged with using different means to generate an accurate estimate of the amount of oil released into the Gulf.
The new study represents a comprehensive look at the data and findings of the flow rate investigations, focusing on the quality and accuracy of the on-the-fly, under-pressure measurements last summer. "It provides a rigorous assessment of the statistical and systematic uncertainty in our earlier findings," said WHOI scientist Richard Camilli, lead author of the PNAS paper.
On May 19, 2010, prior to commencing investigation of the Deepwater Horizon leak, Camilli testified to Congress that this proposed acoustic measurement technique would be capable of quantifying the flow rate to within "a factor of two." The WHOI-led team found just a 17% uncertainty, or error, associated with their estimate.
Added Chris Reddy of WHOI, another co-author of the study, "Considering all they [the WHOI team] had to do in such a short time frame, I'd be quite pleased with any uncertainty rate under 20 percent."
That low uncertainty rate was due, in large part, to the team's pioneering measuring techniques, devised primarily by Camilli and WHOI colleague Andrew Bowen. In late May of 2010, the WHOI team installed two acoustic instruments on a remotely operated vehicle called Maxx3. The first was an acoustic Doppler current profiler, or ADCP, which measures the Doppler shift in sound, such as the change from the higher pitch of a car as it approaches to a lower pitch as it moves away.
"We aimed (the ADCP) at the jet of oil and gas that was coming out, and based on the frequency change in the echoes that came back from the jet, we could tell just how fast it was moving," said Camilli. Within minutes, they obtained more than 85,000 Doppler measurements.
They also used an imaging multibeam sonar, which operates on the same principles as medical ultrasound. "It gives you the equivalent of black-and-white images of the cross section of the flow of oil and gas," Camilli said, which enabled the researchers to distinguish oil and gas from seawater.
"By using the acoustic techniques, we were able to collect a tremendous amount of data in the limited time window that was available," Camilli said. "We were able to see inside of the flow and make measurements of the velocities. With optical systems, you see only the outside. This was sort of like x-ray vision."
The more than 2,500 sonar images of the jets gave the team a detailed view of their cross-sectional areas. Multiplying these average areas by their average velocities yielded an accurate estimate of the rate of oil and gas released. The method was able to capture the full flow by directly measuring the flow at the well's leak sources before the fluids could disperse, the FRTG report stated.
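The area-times-velocity step is a straightforward volume-flux calculation. The sketch below uses an illustrative jet cross section and speed, not the study's measured values, and converts the result into the barrels-per-day units used in the spill estimates.

```python
# Volumetric flow rate from sonar-derived area and Doppler-derived velocity.
# The jet area and speed below are illustrative placeholders, not the
# study's measurements.

M3_PER_BARREL = 0.158987  # one oil barrel in cubic meters
SECONDS_PER_DAY = 86400


def flow_barrels_per_day(area_m2, velocity_m_per_s):
    """Volume flux Q = A * v, converted from m^3/s to barrels per day."""
    q_m3_per_s = area_m2 * velocity_m_per_s
    return q_m3_per_s * SECONDS_PER_DAY / M3_PER_BARREL


# Example: a 0.05 m^2 jet cross section moving at 2 m/s
print(round(flow_barrels_per_day(0.05, 2.0)))
```

In practice the team averaged over thousands of sonar frames and Doppler pings rather than using single snapshot values, which is what the turbulent-jet modeling described below refined further.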
While working at the disaster site, Camilli set up a satellite link with a team of researchers throughout the country to meticulously analyze the data. Using computer models of turbulent jet flow, they came up with an estimate of how fast the fluids were flowing out of the pipe.
To collect and analyze the well fluid itself, WHOI used an isobaric gas-tight sampler, or IGT, a deep-sea device developed at WHOI to sample hydrothermal vent fluids. A pristine fluid sample from within the well was crucial to understand what fraction of the flow was oil.
Analysis of the sample showed that, by mass, the Macondo well fluid contained 77 percent oil, 22 percent natural gas, and less than one percent other gases. With data on how much of what was escaping, the scientists could make a preliminary calculation of how much oil was flowing out of the well.
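Converting the total well-fluid rate into an oil-only rate is then a matter of applying the measured mass composition. The total rate in this sketch is a made-up placeholder, not a figure from the study; only the 77 percent oil fraction comes from the sample analysis described above.

```python
# Oil-only mass rate from the total well-fluid rate, using the measured
# composition (77% oil, 22% natural gas, <1% other gases by mass).
# The total rate used in the example is illustrative.

OIL_MASS_FRACTION = 0.77


def oil_mass_rate(total_fluid_kg_per_s, oil_fraction=OIL_MASS_FRACTION):
    """Mass rate of oil alone, given the total well-fluid mass rate."""
    return total_fluid_kg_per_s * oil_fraction


print(oil_mass_rate(100.0))  # 77.0 kg/s of oil for a 100 kg/s total flow
```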
An accurate flow rate gave engineers a clearer picture of what was happening below the surface and a better chance of determining how to stem the flow, how much dispersant to apply to keep oil from reaching the surface, and how to map strategies to regain control of the well, collect the oil, and limit the environmental damage.
Of the nearly 5 million barrels of oil released in total, an estimated 800,000 barrels were recaptured directly from the well by containment measures and never reached the environment, according to the FRTG report.
Unlike most oil spills in the ocean, which occur at or near the surface, this one was happening nearly a mile deep. It had not been known exactly how petroleum released under the intense pressure and cold temperatures in the depths would behave chemically or physically, but many suspected that not all of it would make it to the surface. "No proven techniques existed for estimating the flow under such conditions," said an FRTG report dated March 11, 2011.
"Over the past decade ultra deepwater oil platforms have gone from non-existent to representing about 1/3 of the Gulf of Mexico's oil production and plans call for a growing number of such facilities," Camilli said. "Society benefits when industry, government, and academia work cooperatively to improve assessment and intervention capabilities" in that setting. The new study confirms "a new tool in the repertoire" to monitor ultra deepwater facilities, he added.
Reddy concurred. "If there is any silver lining" to the Deepwater Horizon spill, he said, "it's that the techniques and equipment developed and used by [Camilli and Bowen] could be used in the future" to help monitor and control any problems with deep-water oil facilities.
Other WHOI members of the research team include Dana R. Yoerger, Jeffrey S. Seewald, Sean P. Silva, Judith Fenwick and Louis L. Whitcomb (also of Johns Hopkins University); along with Daniela Di Iorio of the University of Georgia, and Alexandra H. Techet of MIT.
The study was funded by the U.S. Coast Guard with additional support from a National Science Foundation RAPID grant and the WHOI Coastal Ocean Institute.
The Woods Hole Oceanographic Institution is a private, independent organization in Falmouth, Mass., dedicated to marine research, engineering, and higher education. Established in 1930 on a recommendation from the National Academy of Sciences, its primary mission is to understand the ocean and its interaction with the Earth as a whole, and to communicate a basic understanding of the ocean's role in the changing global environment.
Media Relations | EurekAlert!