The Impact of High Performance Computing on US Oil Independence

Published December 4th, 2000 - 02:00 GMT

The U.S. Office of Technology Assessment estimates that U.S. dependency on foreign oil could exceed 70 percent by the year 2000, unless new resources are identified and tapped.  

 

New sub-surface imaging techniques, made possible by the advent of massively parallel high performance computers, are significantly reducing the need for imports by enabling the petroleum industry to identify new domestic oil and natural gas sources and to increase production of existing fields.  

 

Academic supercomputing centers were among the earliest users of massively parallel processing (MPP) and, over the past 20 years, have contributed greatly to the technology's development.  

 

MPP ties together many inexpensive microprocessors to create a single powerful machine that can visualize and analyze huge amounts of data by splitting complex scientific and mathematical problems into small parts, then solving the pieces simultaneously -- or in parallel.  
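
To make the split-and-solve idea concrete, here is a minimal sketch of the same pattern in Python using the standard multiprocessing module. The workload (summing squared sample values) and every name in it are illustrative stand-ins, not the proprietary codes that petroleum companies actually run.

```python
from multiprocessing import Pool

def process_chunk(samples):
    # Stand-in compute kernel: each worker handles one slice of the data.
    return sum(s * s for s in samples)

def parallel_process(samples, n_workers=4):
    # Split the full data set into roughly equal pieces, one per worker.
    size = (len(samples) + n_workers - 1) // n_workers
    pieces = [samples[i:i + size] for i in range(0, len(samples), size)]
    with Pool(n_workers) as pool:
        partials = pool.map(process_chunk, pieces)  # pieces solved simultaneously
    return sum(partials)                            # partial results recombined

if __name__ == "__main__":
    data = [float(i % 100) for i in range(1_000_000)]
    print(parallel_process(data))
```

The economics follow from the same picture: each worker can be an ordinary, inexpensive processor, yet together they complete a job that would overwhelm any one of them.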

 

Because it relies on lower-cost equipment, parallel processing is transforming the economics of supercomputing.  

 

Today, nearly every major petroleum company has invested in massively parallel computers and the dividends are already great.  

 

The recent discovery by Chevron and PGS Tensor, Inc., of 10 billion barrels of oil in the Gulf of Mexico, beneath a 3,000-foot-thick salt dome five miles below the Earth's surface, is a case in point. 

 

This extraordinary find is attributed directly to advanced seismic imaging, made technically possible and economically feasible by MPP.  

 

The power derived from parallel computing enabled the move from two-dimensional to more complex three-dimensional modeling.  

 

In the quest for new oil sources, this '3-D pre-stack depth migration' makes possible intricate computational manipulation of sound-wave data from land and sea, yielding images of huge sub-surface layers and formations with unprecedented detail and clarity.  
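
As a rough illustration of the principle rather than the production algorithms, the sketch below implements a toy 2-D, zero-offset, constant-velocity 'diffraction stack' in Python with NumPy: each candidate sub-surface point is imaged by summing the recorded amplitudes along the travel-time curve a reflector at that point would have produced. The geometry, velocity, and function names are assumptions made purely for this example.

```python
import numpy as np

def kirchhoff_migrate(traces, rec_x, dt, velocity, image_x, image_z):
    """traces: (n_receivers, n_samples) zero-offset recordings; rec_x: receiver positions (m)."""
    n_samples = traces.shape[1]
    image = np.zeros((len(image_z), len(image_x)))
    for iz, z in enumerate(image_z):
        for ix, x in enumerate(image_x):
            # Two-way travel time from each receiver down to the point (x, z) and back.
            t = 2.0 * np.sqrt((rec_x - x) ** 2 + z ** 2) / velocity
            idx = np.round(t / dt).astype(int)
            valid = np.flatnonzero(idx < n_samples)
            # Diffraction stack: sum what each receiver recorded at that travel time.
            image[iz, ix] = traces[valid, idx[valid]].sum()
    return image

# Tiny synthetic test: one point scatterer at x = 1000 m, depth = 500 m.
rec_x = np.linspace(0.0, 2000.0, 81)                 # receiver line (m)
dt, v = 0.004, 2000.0                                # sample interval (s), velocity (m/s)
traces = np.zeros((81, 750))
hit = np.round(2.0 * np.sqrt((rec_x - 1000.0) ** 2 + 500.0 ** 2) / v / dt).astype(int)
traces[np.arange(81), hit] = 1.0                     # spike along the scatterer's travel-time curve
img = kirchhoff_migrate(traces, rec_x, dt, v,
                        np.linspace(0.0, 2000.0, 101), np.linspace(100.0, 1000.0, 46))
# The brightest pixel of 'img' should fall near (x = 1000 m, z = 500 m).
```

Genuine 3-D pre-stack depth migration applies the same summation idea to billions of source-receiver pairs traced through spatially varying velocity models, which is exactly the kind of workload that makes massively parallel machines attractive.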

 

The resulting 'pictures' of the Earth's interior are used by geophysicists to identify potential sources of oil and natural gas, where earlier conventional seismic exploration provided insufficient information.  

 

Massively parallel systems are also enabling geologists and petroleum engineers to:  

- examine the sub-surface flow of existing oil fields, to better predict long-term capacity;  
- reexamine seismic data from older sites to identify previously unnoticed deposits;  
- evaluate enhanced oil recovery techniques;  
- gain a better understanding of petroleum production and its effect on local environments.  

 

Chevron continues working as an industrial extension partner with the Pittsburgh Supercomputing Center, where it is running simulations to discern how much geological detail is actually needed to map oil reserves accurately, and to determine the cost-effectiveness of various oil extraction techniques.  

 

Massively parallel processing has been applied to a wide range of industrial and scientific problems, including those of the oil industry.  

 

Federal support through several agencies -- the Department of Defense, the National Security Agency, the National Science Foundation and others -- has provided the catalyst for such efforts.  

Source: www.csac.org 

 

© 2000 Mena Report (www.menareport.com)
