An integrated study is attempting to minimize risk in Deep Shelf wells through careful analysis of available and new data.

The Deep Shelf play in the Gulf of Mexico has operators pretty excited. It's under-explored and holds the promise of vast stores of natural gas.

It's also one of the trickiest new plays in the world.

Deep gas reserves mean high-pressure/high-temperature wells, and the Gulf's enigmatic geology causes reservoir characterization headaches on a regular basis. With very little well control and seismic datasets that are often less than reliable due to salt structures and other problems, success rates have been far lower than operating companies (and their stakeholders) are comfortable with.

The Integrated Reservoir Solutions division of Core Laboratories is launching an ambitious project to help increase success rates and reduce risk. A data integration project on a grand scale, its goal is to take what is known, extrapolate it to what isn't known, and gather as much new data as possible to unlock some of the secrets of this difficult play.

Collecting the data

Core Lab is coordinating a multi-company study to characterize the reservoir quality and petrophysical properties of deep shelf reservoirs, predict reservoir quality in undrilled areas, and calculate the seal capacity of the seal rocks bounding the reservoirs. To gather the data needed to complete the study, Core has invited operators active in the Deep Shelf play to sign on.

The idea is not new. "The business model that our group has been working from for 25 years is we work closely with our clients, identify where they're having problems and provide them with solutions to those problems," said Wayne Sealey, senior technical advisor for the Integrated Reservoir Solutions Division. Over the years Sealey's group has done a multitude of regional studies with input from operators. In the Wilcox Formation in South Texas, for instance, clients were having problems identifying pay, predicting permeability and completing wells. "We put together the technologies we had available to us to address those questions," he said. "We then used the information we generated in the lab, combined with support information such as logs and drilling and completion reports, and developed a series of analogs.

"The next time an operator drills a well in the Wilcox, we can evaluate this rock to see if it's a similar rock type to what we've profiled in the lab. We can say, this is rock type A, yes it was productive, here's how we completed it, etc."

While data in the Deep Shelf is scarce, data in other parts of the region is not. Core Lab has done regional studies in the deepwater Gulf, the offshore Miocene play, and onshore from the Jurassic through to the Miocene. Across these studies, Sealey said, more than 500 conventionally cored wells have been evaluated in detail.

Core Lab also has worked with most of the operators active in the Deep Shelf play in conducting these other studies. "Because the Deep Shelf wells are so expensive, they've asked us to use the work we've done elsewhere in the Gulf to help them minimize their risk by predicting the reservoir quality in these undrilled Deep Shelf areas," he said. "That was the challenge they laid down before us - to help them minimize their risk and increase their success rate in these deep exploration wells."

With very little well or core data to go by, Sealey's group was encouraged by clients to team up with Geocosm. Geocosm has developed a rigorous and reproducible method that uses petrographic input parameters and core analysis data to model diagenesis. The software program Geocosm developed to do the diagenetic modeling is called Touchstone. "Combining Geocosm's expertise in reservoir modeling with Core's extensive existing core database seemed a logical step toward helping our customers predict reservoir quality in the under-explored Deep Shelf reservoirs," he said.

Ten companies have joined the project to date. As a participant in the project, each company is asked to contribute conventional or rotary sidewall cores, percussion sidewalls and/or drill cuttings from four wells for analysis, evaluation and inclusion into the database.

From existing and newly contributed cores, a series of calibration wells will be developed. Developing a calibration well entails detailed petrographic analysis of grain size, sorting, clay content and quartz cement abundance; all are important input parameters for diagenetic modeling. A burial history profile is also developed for each calibration well. The well is then ready to be used as an analog for reservoir quality prediction in appropriate undrilled areas.
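Touchstone's internals are proprietary, so as a rough sketch of how these calibration inputs feed a diagenetic model, the published Walderhaug (1996) kinetic quartz-cementation model can serve as a stand-in: cement precipitates at a temperature-dependent rate on quartz grain surfaces over the burial history. Everything below (the example burial history, surface area, starting porosity) is hypothetical, illustrative data.

```python
import math

# Illustrative only: a Walderhaug-type kinetic quartz-cementation model,
# standing in for proprietary diagenetic-modeling software.
A = 1.98e-22          # precipitation rate constant, mol/(cm^2 * s)
B = 0.022             # temperature coefficient, 1/degC
MOLAR_VOLUME = 22.69  # molar volume of quartz, cm^3/mol
SEC_PER_MYR = 3.1536e13

def quartz_cementation(thermal_history, surface_area=100.0, porosity_0=0.26):
    """Return (cement volume fraction, remaining porosity) after a burial
    history given as (duration in Myr, temperature in degC) steps.

    surface_area: quartz grain surface area per unit rock volume (cm^2/cm^3),
    the quantity that petrographic grain size, sorting and clay-coating data
    constrain. It is assumed to shrink in proportion to remaining porosity,
    so each burial step decays porosity exponentially.
    """
    phi = porosity_0
    for duration_myr, temp_c in thermal_history:
        rate = A * 10 ** (B * temp_c)  # mol/(cm^2 * s) at this temperature
        decay = MOLAR_VOLUME * rate * surface_area / porosity_0  # 1/s
        phi *= math.exp(-decay * duration_myr * SEC_PER_MYR)
    return porosity_0 - phi, phi

# A hypothetical deep-burial history: progressively hotter residence.
history = [(40.0, 60.0), (30.0, 100.0), (20.0, 140.0)]
cement, phi_final = quartz_cementation(history)
print(f"quartz cement: {cement:.2f}, remaining porosity: {phi_final:.2f}")
```

In this toy run the long, hot residence destroys most of the starting porosity, which is precisely the kind of reservoir-quality risk in deep, hot wells that calibration wells are meant to quantify.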

"With the calibration well we know what the rock properties are and what the rock has been through in terms of thermal history and stress regimes during burial," Sealey said. "The calibration wells can be used very successfully to model reservoir properties in the undrilled areas. Our clients have seismic data, we'll have worked with them on the basin modeling and we'll have some idea of the burial history for their prospect area. We can now model an undrilled prospect using the calibration wells where the geology makes sense in terms of age and provenance."

Permeability and porosity predictions will be more reliable when made with the correct calibration wells, and some predictions can also be made in terms of cementation and advanced rock properties, he said.
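As a sketch of how a calibration well's core measurements can anchor such a prediction, a generic log-linear porosity-permeability transform fitted to core-plug data is shown below. This is not Core Lab's actual workflow, and all measurement values are hypothetical.

```python
import math

# Hypothetical core-plug measurements from a calibration well:
# (porosity as a fraction, permeability in millidarcies).
core_data = [(0.08, 0.5), (0.12, 5.0), (0.16, 40.0), (0.20, 300.0)]

def fit_poro_perm(data):
    """Least-squares fit of log10(k) = a * phi + b to core measurements."""
    n = len(data)
    xs = [phi for phi, _ in data]
    ys = [math.log10(k) for _, k in data]
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def predict_perm_md(phi, a, b):
    """Predict permeability (mD) at a given porosity using the transform."""
    return 10 ** (a * phi + b)

a, b = fit_poro_perm(core_data)
print(f"k at 14% porosity: {predict_perm_md(0.14, a, b):.1f} mD")
```

The transform is only as good as its calibration set, which is why matching the analog well to the prospect's age and provenance matters before applying it.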

Reservoir quality prediction will be performed for three selected prospect areas for each member company using Touchstone.

The various reservoir quality, reservoir quality prediction and seal rock data generated on contributed wells will be compiled, interpreted and presented to each company in individual digital well reports and in periodic summary reports. For newly drilled wells, the data and interpretations will first be released only to the contributing company. Once the well logs are released to the MMS, however, all other participants will also receive the data, meaning that eventually they'll have data and interpretations of every well contributed to the project.

A competitive advantage

The result will, Sealey said, be a powerful risk analysis tool for participating companies. "They do risk analysis now, but it's not based on rock properties," he said. "It's based predominantly on public domain production data from analogous fields. Our approach is based on rigorous scientific methodology using rock property data and information."

The project hasn't been a tough sell. Sealey said that a representative of one of the first companies to sign on told him that he'd had no problem promoting the concept internally because "the value was so readily apparent." "Different companies have different reasons for participating in this project," he added. "Some companies don't have time or internal expertise to tweak the burial history curves and run multiple Touchstone simulations. Others have in-house basin modelers and may have an alternative burial history, so they'll want to run their own models.

"We want to get to a standardized approach so that when you're using these wells in an analogous situation, you're comparing apples to apples."

In response Core Lab has set up technical sub-committees to build consensus on issues such as burial history development.

For many companies the real enticement to participate in the study is the calibration dataset. A significant portion of the participation fee is budgeted to develop calibration wells; costs to develop this size of calibration set independently would be substantial.

Despite the regional nature of much of Core Lab's work, analogs can exist in any part of the world. One company, for instance, is interested in the data to risk wells for its Southeast Asia drilling program. As more companies come on board, more data will be collected. "Every time another company joins, we gain a better understanding of Deep Shelf reservoir and seal rock quality and expand the calibration dataset," he said.

Data integration

Sealey said his group is comfortable with the sometimes difficult process of scaling information up from pore system scale at the core level to kilometer scale at the seismic level. "We're able to evaluate rock from cuttings, and we have a unique way of classifying rocks that goes back to systems developed years ago," he said. "So we do feel comfortable being able to apply what we know on a core sample level up through seismic scale."