Forum for Science, Industry and Business

Supercomputers to enable safer, more efficient oil drilling

11.10.2005


Oil companies could soon harness the power of distant supercomputers to tackle problems such as where to place equipment and how to clean up oil spills.



For decades, the industry has used computers to maximize profit and minimize environmental impact, explained Tahsin Kurc, assistant professor of biomedical informatics at Ohio State University.

Typically, companies take seismic measurements of an oil reservoir and simulate drilling scenarios on a local computer. Now Kurc and his colleagues are developing a software system and related techniques to let supercomputers at different locations share the workload. The system runs simulations faster and in much greater detail – and enables analysis of very large amounts of data.


The scientists are employing the same tools and techniques that they use to connect computing resources in biomedical research. Whether they are working with images from digitized microscopes or MRI machines, their focus is on creating software systems that pull important information from the available data.

From that perspective, a seismic map of an oilfield isn’t that different from a brain scan, Kurc said. Both involve complex analyses of large amounts of data.

In an oilfield, rock, water, oil and gas mingle in fluid pools underground that are hard to discern from the surface, and seismic measurements don’t tell the whole story.

Yet oil companies must couple those measurements to a computer model of how they can utilize the reservoir, so that they can accurately predict its output for years to come. And they can’t even be certain that they’re using exactly the right model for a field’s particular geology.

“You never know the exact properties of the reservoir, so you have to make some guesses,” Kurc said. “You have a lot of choices of what to do, so you want to run a lot of simulations.”

The same problems arise when a company wants to minimize its effects on the environment around the reservoir, or track the path of an oil spill.

Each simulation can require hours or even days on a PC, and generate tens of gigabytes (billions of bytes) of data. Oil companies have to greatly simplify their computer models to handle such large datasets.
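The appeal of distributed computing here is that each guess at the reservoir’s properties is an independent simulation, so an ensemble of runs can be farmed out in parallel. As a minimal sketch (the real models solve flow equations over 3-D grids for hours or days; the simulation function and parameter names below are hypothetical stand-ins), a parameter sweep might look like:

```python
from multiprocessing import Pool

def simulate(params):
    """Toy stand-in for a reservoir simulation run.
    Real simulations take hours to days and emit gigabytes of output."""
    permeability, porosity = params
    # Hypothetical predicted output of the field under these guessed properties.
    return permeability * porosity * 1000.0

# Ensemble of guesses at the unknown reservoir properties.
guesses = [(k, p) for k in (50, 100, 200) for p in (0.1, 0.2, 0.3)]

if __name__ == "__main__":
    # Each guess is independent, so the runs can proceed in parallel --
    # on local cores here, or across supercomputer sites in a grid setting.
    with Pool() as pool:
        outputs = pool.map(simulate, guesses)
    best_output, best_guess = max(zip(outputs, guesses))
    print(best_output, best_guess)
```

The same pattern scales from local cores to grid sites: only the scheduling layer changes, which is exactly the gap middleware like DataCutter fills.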

Kurc and his Ohio State colleagues – Joel Saltz, professor and chair of the Department of Biomedical Informatics, assistant professor Umit Catalyurek, research programmer Benjamin Rutt and graduate student Xi Zhang – are developing technologies to spread that data across supercomputers at different institutions. In a recent issue of the journal Concurrency and Computation: Practice and Experience, they described a software program called DataCutter that portions out data analysis tasks among networked computer systems.

This project is part of a larger collaboration with researchers at the University of Texas at Austin, Oregon State University, University of Maryland, and Rutgers University. The institutions joined together to utilize the TeraGrid network, which links supercomputer centers around the country for large-scale studies.

Programs like DataCutter are called “middleware,” because they link different software components. The goal, Kurc said, is to design middleware that works with a wide range of applications.

“We try to come up with commonalities between the applications in that class,” he said. “Do they have a similar way of querying the data, for instance? Then we develop algorithms and tools that will support that commonality.”
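One way to picture that approach (a conceptual sketch, not the team’s actual code — the class and method names here are hypothetical) is a shared query interface that both an oilfield dataset and a medical-imaging dataset implement, so the middleware can be written once against the abstraction:

```python
from abc import ABC, abstractmethod

class Dataset(ABC):
    """Hypothetical common interface: middleware targets this
    abstraction rather than any one application's data format."""
    @abstractmethod
    def query(self, region):
        """Return the data values falling inside a spatial region."""

class SeismicMap(Dataset):
    def __init__(self, samples):
        self.samples = samples  # {(x, y): amplitude}
    def query(self, region):
        (x0, y0), (x1, y1) = region
        return [v for (x, y), v in self.samples.items()
                if x0 <= x <= x1 and y0 <= y <= y1]

class BrainScan(Dataset):
    def __init__(self, voxels):
        self.voxels = voxels  # {(x, y): intensity}
    def query(self, region):
        (x0, y0), (x1, y1) = region
        return [v for (x, y), v in self.voxels.items()
                if x0 <= x <= x1 and y0 <= y <= y1]

# The same middleware-side routine works on either kind of dataset.
def total_signal(ds: Dataset, region):
    return sum(ds.query(region))

print(total_signal(SeismicMap({(0, 0): 2.0, (5, 5): 3.0}), ((0, 0), (1, 1))))
```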

DataCutter coordinates how data is processed on the network, and filters the data for the end user.
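DataCutter is built around chains of filters connected by data streams. As an illustration only (this is not DataCutter’s API, which is a C++ component framework; the stage names here are invented), a three-stage filter pipeline can be sketched with Python threads and queues:

```python
from queue import Queue
from threading import Thread

def read_filter(out_q):
    # Stage 1: read raw data in chunks (synthetic stand-in values).
    for chunk in ([3, 1, 4], [1, 5, 9], [2, 6, 5]):
        out_q.put(chunk)
    out_q.put(None)  # end-of-stream marker

def process_filter(in_q, out_q):
    # Stage 2: analyze each chunk (a sum stands in for real analysis).
    while (chunk := in_q.get()) is not None:
        out_q.put(sum(chunk))
    out_q.put(None)

def consume_filter(in_q, results):
    # Stage 3: deliver the filtered results to the end user.
    while (value := in_q.get()) is not None:
        results.append(value)

q1, q2, results = Queue(), Queue(), []
threads = [
    Thread(target=read_filter, args=(q1,)),
    Thread(target=process_filter, args=(q1, q2)),
    Thread(target=consume_filter, args=(q2, results)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # stages ran concurrently, streaming chunk by chunk
```

In the grid setting, each filter can be placed on a different machine, so data flows across the network from where it is stored toward the analysis and the user.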

The researchers tested DataCutter with an oilfield simulation program developed at the University of Texas at Austin. They ran three different simulations over the TeraGrid: one to assess the economic value of an oilfield, one to locate sites of bypassed oil, and one to evaluate different production strategies – such as the placement of pumps and outlets in an oil field.

The source data came from simulation-based oilfield studies at the University of Texas at Austin. That data and the output data from the simulations were spread around three sites: the San Diego Supercomputer Center, the University of Maryland, and Ohio State.

Using distributed computers, they were able to reduce the execution time of one simulation from days to hours, and another from hours to several minutes. But Kurc feels that speed isn’t the only benefit that oil companies would get from doing their simulations on computing infrastructures such as TeraGrid. They would also have access to geological models and datasets at member institutions, which could boost the accuracy of their simulations.

The National Science Foundation funded this project to produce publicly available, open-source software for industry and academia; potential users can download the software under an open-source license and use it in their own projects.

Tahsin Kurc | EurekAlert!
Further information:
http://www.osu.edu
