The Berkeley Lab team found that reservoir permeability strongly influences oil flow rate. This graphic tracks oil flow rate, in barrels per day, as a function of reservoir permeability and gas-oil ratio in a model in which the pressure at the blowout preventer is 4,400 pounds per square inch.
The clock was ticking: Their work would help assess the environmental impact of the disaster, as well as develop ways to cap the well, which had been spewing unchecked since April 20.
They used some of the world’s most sophisticated numerical modeling tools, developed at Berkeley Lab over the past two decades for applications ranging from geothermal energy production to environmental hydrology.
Working quickly and amid abundant uncertainties, they estimated that between 60,000 and 100,000 barrels of oil were flowing into the Gulf each day. Their calculations were in line with a final estimate derived two months later based on much more information.
Their research is recounted in an article published in this week’s online early edition of the Proceedings of the National Academy of Sciences.
“We were able to harness Berkeley Lab’s expertise in multiphase flow and computational tools to quickly take on this urgent problem,” says Curt Oldenburg, a staff scientist in Berkeley Lab’s Earth Sciences Division and lead author of the article. Also on the team were fellow Earth Sciences Division scientists Barry Freifeld, Karsten Pruess, Lehua Pan, Stefan Finsterle, and George Moridis.
The scientists were part of a group established by the National Incident Commander in May 2010 to estimate the oil flow rate from the wellhead. One component of this effort comprised scientists from five Department of Energy national laboratories, including Berkeley Lab.
The Berkeley Lab team first developed a simplified conceptual model of the system despite a lack of knowledge about the flow path from the reservoir into the well, reservoir permeability, and pressure in the blowout preventer. They then developed a coupled model of the reservoir and wellbore using a numerical program, called TOUGH2, which simulates fluid and heat flow in porous and fractured media.
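The article does not reproduce the TOUGH2 equations, but the permeability sensitivity it describes can be illustrated with a textbook steady-state radial Darcy inflow formula. This is a back-of-envelope sketch, not the team's coupled reservoir-wellbore model, and every parameter value below is a hypothetical illustration rather than data from the Macondo well:

```python
import math

def radial_inflow_bbl_per_day(k_m2, h_m, dp_pa, mu_pa_s, r_e_m, r_w_m):
    """Steady-state radial Darcy inflow rate, converted to barrels per day.

    A standard reservoir-engineering approximation, NOT the TOUGH2
    coupled reservoir-wellbore simulation used by the Berkeley Lab team.
    """
    q_m3_s = (2.0 * math.pi * k_m2 * h_m * dp_pa
              / (mu_pa_s * math.log(r_e_m / r_w_m)))
    return q_m3_s * 86400.0 * 6.2898  # m^3/s -> m^3/day -> barrels/day

# All parameter values are hypothetical illustrations.
k = 5.0e-13  # permeability of roughly 0.5 darcy, in m^2
q = radial_inflow_bbl_per_day(k, h_m=30.0, dp_pa=1.0e7,
                              mu_pa_s=1.0e-3, r_e_m=300.0, r_w_m=0.1)
print(f"{q:,.0f} barrels/day")

# In this simplified picture the rate is linear in permeability, which
# is why uncertainty in that parameter matters so much.
print(radial_inflow_bbl_per_day(2 * k, 30.0, 1.0e7, 1.0e-3, 300.0, 0.1) / q)
```

With these illustrative inputs the formula lands in the tens of thousands of barrels per day, the same order of magnitude as the team's estimates, though the real calculation must also couple the reservoir to multiphase flow up the wellbore.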
Their simulations yielded a range of flow rates, from a low of 60,000 barrels of oil per day to a high of 100,000. These initial estimates were in line with a final estimate established in August 2010 by the entire group, based on independent analyses and observations, which pegged the rate at 62,200 barrels of oil per day upon the initial blowout in April, tapering to 52,700 barrels per day just before the well was capped in mid-July.
The Berkeley Lab team’s modeling approach also allowed them to determine the role played by various uncertainties. For example, they found that the oil flow rate increases sharply as the length of the well in contact with the reservoir increases.
Surprisingly, they also determined that the oil flow rate is relatively insensitive to the pressure at the bottom of the blowout preventer. Intuition suggests that as this pressure drops, the oil flow rate should increase. Instead, the scientists found that the lower the pressure, the more natural gas exsolves from the oil; the exsolved gas interferes with oil flow, counteracting the extra pressure difference that would otherwise drive oil up the well.
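The exsolution effect can be sketched with a deliberately crude solubility model: assume dissolved gas is proportional to pressure below the bubble point, so whatever no longer fits in solution becomes free gas. This Henry's-law-style toy model, with entirely hypothetical parameter values, is only meant to show the direction of the effect, not the multiphase physics TOUGH2 actually solves:

```python
# Illustrative only: a linear solubility model with hypothetical numbers,
# not the multiphase-flow physics in the team's TOUGH2 simulations.

def free_gas_fraction(p_psi, p_bubble_psi=6000.0, gor_scf_per_bbl=2000.0):
    """Fraction of the total gas that has come out of solution at pressure p.

    Below the assumed bubble point, dissolved gas scales linearly with
    pressure; the remainder is free gas that impedes oil flow.
    """
    dissolved = gor_scf_per_bbl * min(p_psi / p_bubble_psi, 1.0)
    return (gor_scf_per_bbl - dissolved) / gor_scf_per_bbl

# Lower pressure at the blowout preventer -> larger free-gas fraction,
# which works against the higher pressure drop driving the oil upward.
for p in (4400.0, 3000.0, 1500.0):
    print(p, round(free_gas_fraction(p), 2))
```

The two effects pull in opposite directions, which is consistent with the team's finding that the net flow rate changes little as the pressure at the blowout preventer varies.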
The work was supported by the Department of Energy’s National Energy Technology Laboratory and Assistant Secretary for Fossil Energy.
Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 12 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy’s Office of Science. For more, visit www.lbl.gov.
Dan Krotz | EurekAlert!