Explorationists will look to computer technology, other industries and the history of climate change to find hydrocarbons.

Increased computer power, along with lessons learned in other industries, is ushering in an amazing new era for oil and gas exploration. In the future, computers will be stretched to emulate human thought processes, represent subsurface images in ways that we can better comprehend and compile data from the Earth's history to indicate more precisely where to find the next big field.

Fuzzy logic, clearer results
The emergence of artificial intelligence tools promises to get computers thinking more like the human brain, a trait that can be extremely useful in oil and gas exploration. Often the best explorationists are not the ones who are the most adept at using the latest software but rather the ones who know the geology of an area inside and out.
At the New Mexico Institute of Mining and Technology's Petroleum Recovery Research Center (PRRC), the REACT group is developing a system that mimics the explorationist's thought process. The system uses neural networks to combine a deterministic database of known information from adjacent fields with a knowledge base of exploration rules based on the subjective judgments of expert oil finders, expressed as "if-then" rules. A consortium of small and large oil companies is providing data and exploration expertise.
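Such a rule base might be organized along these lines, shown here as a minimal Python sketch; the conditions, weights and attribute names are invented for illustration rather than taken from the REACT system:

```python
# A minimal sketch of "if-then" exploration rules; the conditions and
# weights here are invented and are not the REACT system's actual rules.

def rule_oil_shows(prospect):
    """If nearby wells report oil shows, the outlook improves."""
    if prospect.get("oil_shows_within_km", 999) < 10:
        return 0.8
    return 0.0

def rule_structure(prospect):
    """A mapped structural high is favorable; its absence is a negative."""
    if prospect.get("on_structural_high", False):
        return 0.6
    return -0.3

RULES = [rule_oil_shows, rule_structure]

def assess(prospect):
    """Average the fired rule weights into a crude score in [-1, 1]."""
    score = sum(rule(prospect) for rule in RULES) / len(RULES)
    return max(-1.0, min(1.0, score))

print(assess({"oil_shows_within_km": 5, "on_structural_high": True}))  # 0.7
```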
The deterministic base has been loosely grouped into three criteria: proximity to oil occurrences or oil shows, regional and local geologic structure, and regional potential field data. Proximity to oil occurrences can be assessed from oil and gas production records or records of oil shows on mud logs or drill stem tests. In undrilled basins, oil seeps are direct indicators of the presence of hydrocarbons. If oil seeps are not present, source beds and migration paths are substituted.
Maps provide a first view of the regional structure, and well logs provide a more detailed picture of the subsurface reservoir elevations. Seismic, aeromagnetic and gravity measurements plus outcrop analogs can provide additional constraints on structure.
These criteria form the deterministic part of the knowledge base. If the density or proximity of these data is sufficient, a trained neural network can be used to forecast risk at an undrilled location. For exploration, however, the data density typically is not sufficient, and the system will apply the if-then rules simulating a human expert's knowledge. These rules evaluate a range of attributes, and the system assigns risk based on the different possibilities they imply. Researchers estimate 1,000 to 5,000 rules may be necessary to emulate an explorationist's thought process.
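The data-density decision itself might look roughly like the following sketch, in which the threshold, the stand-in network and the rules are all illustrative assumptions:

```python
# Sketch of the density test: with enough well control a trained network
# forecasts risk; otherwise the expert rules take over. The threshold,
# stand-in network and rules are illustrative assumptions.

def forecast_risk(prospect, nearby_wells, net, rules, min_wells=20):
    if len(nearby_wells) >= min_wells:
        return net(prospect)  # dense data: trust the trained network
    # Sparse data: average the risk implied by each if-then rule.
    return sum(rule(prospect) for rule in rules) / len(rules)

net = lambda p: 0.25  # a trained model would compute this from features
rules = [
    lambda p: 0.2 if p["oil_shows_nearby"] else 0.7,
    lambda p: 0.3 if p["on_structural_high"] else 0.6,
]

prospect = {"oil_shows_nearby": True, "on_structural_high": False}
print(forecast_risk(prospect, nearby_wells=[], net=net, rules=rules))  # 0.4
```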
In addition to geologic risk, economic risk is an essential part of the package. The PRRC has tracked crude oil price history and availability since 1986. Past prices alone are not an accurate predictor of the future, but combined with futures prices they can provide estimates of deviations from the long-term trend. These deviations will be used to gauge the upward or downward direction of the current oil price and will be factored into risk assessments for new projects.
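A least-squares line is one simple way to extract such deviations; the sketch below uses invented prices and makes no claim about the PRRC's actual trend model:

```python
# Minimal sketch of trend-deviation analysis on a price series; the
# prices are invented and a least-squares line stands in for whatever
# trend model the PRRC actually uses.
prices = [18.0, 15.5, 17.2, 19.8, 14.1, 12.3, 17.9, 25.4]  # $/bbl per year

n = len(prices)
xs = list(range(n))
x_mean = sum(xs) / n
y_mean = sum(prices) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, prices))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean

# Deviation of each year's price from the fitted trend line.
deviations = [y - (intercept + slope * x) for x, y in zip(xs, prices)]
direction = "up" if deviations[-1] > 0 else "down"
print(f"latest deviation {deviations[-1]:+.2f} $/bbl ({direction} vs. trend)")
```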
The technology has been tested and used to predict which fields in the Nebraska Panhandle would benefit from waterflood. The procedure, applied to the Cliff Farms Unit, predicted waterflood response sufficient to justify the installation.
In addition, projects in New Mexico's Delaware Basin use a trained neural network to predict the future oil rate from new and behind-pipe Lower Brushy Canyon completions. Predictions on new wells can be made based on log data in less than a day using the database of Brushy Canyon log and core data from 34 wells.
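A simplified version of that workflow, with scikit-learn's MLPRegressor standing in for the trained network and invented attributes in place of the Brushy Canyon data, might look like this:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Sketch of the workflow: fit a network on log-derived attributes from
# existing wells, then score a new completion. Attributes and data are
# invented; the real study used log and core data from 34 wells.
rng = np.random.default_rng(0)
X_train = rng.random((34, 3))  # e.g. porosity, resistivity, net thickness
y_train = X_train @ [40.0, 25.0, 60.0] + rng.normal(0, 2, 34)  # bbl/d

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(X_train, y_train)

new_well = [[0.18, 0.55, 0.40]]  # log attributes for an undrilled well
print(f"predicted initial rate: {net.predict(new_well)[0]:.0f} bbl/d")
```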
Conclusions from the research show that fuzzy ranking can reveal obscure relationships and that neural networks can correlate multiple fuzzy relationships with oil production to predict recovery.
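Fuzzy ranking replaces hard cutoffs with graded membership, as in this small sketch with illustrative breakpoints:

```python
# Sketch of fuzzy ranking: grade an attribute on a 0-to-1 ramp instead
# of a hard cutoff, so weak but real relationships survive into the
# ranking. The breakpoints are illustrative assumptions.

def fuzzy_high_porosity(phi):
    """Degree (0..1) to which a porosity value counts as 'high'."""
    if phi <= 0.05:
        return 0.0
    if phi >= 0.20:
        return 1.0
    return (phi - 0.05) / 0.15  # linear ramp between the breakpoints

wells = {"A": 0.04, "B": 0.12, "C": 0.22}
ranking = sorted(wells, key=lambda w: fuzzy_high_porosity(wells[w]),
                 reverse=True)
print(ranking)  # ['C', 'B', 'A']
```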
For more information, visit www.baervan.nmt.edu/REACT/reacthomepage.htm.
Imaging technology
As many oil companies scale back their research and development spending, they are finding success collaborating with each other, service companies, universities and even other industries. Geophysics has a long history of synthesizing techniques from multiple fields to provide and analyze images of the Earth's structure. Techniques from medicine, nuclear physics, satellite reconnaissance, nondestructive testing, holography and even economics have been adapted successfully to improve geophysical testing techniques.
One of the best examples is volume rendering, a concept developed in the medical industry that has found its way into such commercial exploration and production technology as Paradigm Geophysical's Voxel Geo product. Thomson Marconi also has transferred technology from the military and developed solid streamers that provide better seismic measurements in rough sea conditions. And Bell Geospace is using US Navy full-tensor gradient technology to obtain better gravity readings.
Technology transfer has been a two-way street. Upstream exploration and production drove much of the early innovation in parallel supercomputing and 3-D volume visualization. Geophysical developments in acoustic, gravity and electromagnetic sensor technologies have had significant applications in the defense, environmental, space and medical communities. Seismic imaging techniques have been deployed to solve problems in areas such as ground-penetrating and airborne radar, medical ultrasound and even solar and planetary investigations.
In hopes of continuing this cross-pollination, the Society of Exploration Geophysicists led a summer research workshop titled "Synergies in Geophysical, Medical and Space Imaging." The workshop brought together a multidisciplinary group of researchers to share the latest advances in imaging in their various specialties.
According to Geoff Dorn, executive director of the BP Center for Visualization at the University of Colorado, there are important distinctions between the application areas.
"First, the three fields use different types of waves to image different kinds of media," he said. "Seismic imaging is applied to elastic anisotropic wave propagation through a very complex medium. Medical imaging uses a variety of modalities, such as acoustic, X-ray and nuclear magnetic resonance, traveling through a 3-D medium, but it's one that is typically less complex and where the 3-D surfaces tend to be smoother than those encountered in seismology. Space imaging uses a variety of frequency ranges in the electromagnetic spectrum to image the surface to shallow subsurface of the Earth or other planetary bodies.
"Second, constraints on acquisition geometry vary significantly. In surface seismic, we are restricted to sources and receivers both being on the surface of the material which we are trying to image, which requires the use of reflections to image the geology. In medical imaging, transmission is used for most imaging modalities rather than reflection, ultrasound being one exception. Space imaging typically uses sources and receivers at a significant distance from the surface being imaged.
"Third, the need for real-time results varies between the fields, with real-time imaging results being most important to medical imaging."
Even with these differences, the areas of overlap are more than sufficient to provide promise for technology sharing among these disciplines, Dorn said. Scattering theory is used in image processing in all three application areas, although the demands of data and signal strength vary. Repeated or time-lapse studies may be able to benefit significantly from the work in time-lapse medical imaging, particularly in registration of data. Techniques also may be shared to improve registration between volumes acquired using different modalities.
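At its simplest, registration amounts to finding the shift that best aligns two datasets. The sketch below does this by cross-correlation on a synthetic 1-D trace standing in for the 3-D volumes used in practice:

```python
import numpy as np

# Sketch of time-lapse registration by cross-correlation: find the lag
# that best aligns a repeat trace with its baseline. A 1-D trace stands
# in for the 3-D volumes registered in practice.
baseline = np.sin(np.linspace(0, 6 * np.pi, 200))
repeat = np.roll(baseline, 7)  # repeat survey, misaligned by 7 samples

corr = np.correlate(repeat, baseline, mode="full")
lag = int(np.argmax(corr)) - (len(baseline) - 1)
print(f"estimated misalignment: {lag} samples")  # 7
```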
Visualization is another area in which there might be significant technology crossover, Dorn said, and the three application areas have focused on different aspects of visualization. For example, medical and biological visualization has made significant strides in incorporating motion, both in terms of movement in the data, such as fluid flow, and in terms of using motion as an aid to data analysis.
Dave Ridyard, chief operating officer at Continuum Resources, added that visual collaboration in the oil industry might benefit from gains made in military training and simulation and in the emerging Internet games marketplace. While some companies let interpreters in different locations view the same data simultaneously, the types of collaboration used in the oil industry typically force all participants to look at the same view.
"It would be like two geologists going to a rock outcrop, taping their heads together and looking at exactly the same thing," he said. "On a real field trip, one would go downdip and the other updip, and they'd holler if they found something. That's real collaboration."
High-frequency climate history
For years explorationists have looked for clues in the Earth's stratigraphic record to determine the most likely hiding places for oil and gas deposits. A team of researchers from Argonne National Laboratory, ChevronTexaco, the University of Illinois at Chicago and NASA hopes to develop tools to predict the basins with the highest reservoir potential for a specific geologic time period on a global scale and forecast the depositional system with the highest potential in a stratigraphic interval within a particular basin. In addition, they are developing a statistical approach to determine the highest potential areas for new fields in mature basins.
The techniques to predict reservoir potential are based on understanding the causes of high-frequency climate cycles, the regions where these cycles will have the greatest impact and how these cycles interact with sea level change. Understanding large climatic shifts is important for determining reservoir potential because the most extreme conditions can occur relatively rapidly and repetitively. Being able to predict where these changes occur allows the researchers to evaluate how the river systems that deliver sediment to a depositional basin will be affected over time.
"Essentially, it is supply-side stratigraphy," said Marty Perlmutter, manager of basin evaluation for ChevronTexaco. For instance, a change from arid to wet conditions can occur within a time period as short as 5,000 years. This kind of change can significantly increase the volume of sand transported by a river. The most likely location of this sand deposit - a delta, a deepwater submarine fan or something in between - is affected by where the river mouth was on the continental shelf at the time of the sand pulse. In other words, if sea level was high when the pulse occurred, the highest potential for reservoir might be in a delta. If sea level was low when the pulse occurred, reservoirs would be more likely found in a submarine fan.
Modeling climate change will give the researchers the ability to create a global map indicating the locations of high-potential basins and reservoirs through time. "The climate modeling research is basically attempting to find out how stable the climate systems are in any one location over long periods of time," said Thomas Moore, a geologist in the Energy Systems Division at the Argonne National Laboratory. "Since we believe that most erosional processes and sediment transport are governed in part by climate effects such as precipitation, that would give us a better idea of how to properly model sedimentary basins."
The research team is also working on a statistical approach, developing a software package called Stratistics, which includes spatial statistics that aren't typically applied to geological data but can be used to find patterns and relationships within and between various datasets. Perlmutter said concepts and techniques are being tested on known hydrocarbon-producing areas in the Gulf of Mexico. "We're evaluating the distribution of oil and gas fields relative to specific geologic characteristics of the basin," he said. "Our initial analyses indicate that there is a definite relationship between these datasets. We are very optimistic that this approach will result in a technique that will be used to rapidly define and rank prospective areas of mature basins away from production."
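One simple spatial test in that spirit grids the basin, counts fields per cell and correlates field density with a mapped attribute; the sketch below uses synthetic data and is not drawn from Stratistics itself:

```python
import numpy as np

# Sketch of one such spatial test: grid the basin, count fields per
# cell, and correlate field density with a mapped geologic attribute.
# The data are synthetic; Stratistics' actual methods are richer.
rng = np.random.default_rng(1)
attribute = rng.random((10, 10))  # e.g. sand thickness mapped per cell

# Synthetic "discoveries": more likely where the attribute is high.
field_counts = rng.poisson(5 * attribute)

r = np.corrcoef(attribute.ravel(), field_counts.ravel())[0, 1]
print(f"field density vs. attribute: r = {r:.2f}")
```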