An iterative process of reservoir modeling helps avoid costly mistakes.

Approximately 80,000 wells are drilled worldwide each year, but 40% of those are dry or economically marginal. Sub-optimal drilling, operational and reservoir management decisions result in nearly 60% of known hydrocarbons being left in the ground. The average rate of depletion, based on the first year of production, is accelerating in mature basins like those found in the United States. In the last decade this rate increased from 19% to 28%, highlighting the challenge E&P enterprises face in both replacing and growing their reserve base. We call this "going up the down escalator."

In response to these challenges, the industry has pursued new strategies such as moving into higher-risk basins internationally, drilling deeper in mature basins, specializing in and dominating regional plays, and focusing on unraveling more complex geology. Yet the technologies dedicated to furthering the understanding of the subsurface, and thus supporting the pursuit of these strategies, have aged and are often inadequate.

Technology overview

A number of key technologies have been applied to reservoir characterization over the past decades: well logs, seismic, engineering analytics and reservoir simulation. Collected at resolutions of millimeters to meters, well logs aid in determining where to complete in the rock column and in describing highly localized reservoir features and properties in and around the well. Their limitation is a lack of visibility beyond the immediate vicinity of the wellbore.

Over the past two decades the application of seismic technology, particularly 3-D seismic, has had a dramatic impact on predicting the location of hydrocarbon-bearing rock and thus improving exploration success. It is useful in identifying features such as faults and can distinguish changes in reservoir characteristics and possible fluid content. More recently, high-resolution seismic has been used to characterize smaller reservoir features and, to a lesser extent, rock properties. However, seismic resolution is fundamentally limited by the attenuation of the high frequencies needed for the detail that is critical to understanding reservoir behavior and recovery. Further, at depths of 15,000 ft (4,600 m) or greater, and in and around features such as salt diapirs, seismic resolution degrades and interpretation can be highly unreliable. Time-lapse seismic, the next seismic innovation, is very costly, and thus its rate of adoption is slow.
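As a rough illustration of why that attenuation matters, the widely used quarter-wavelength (Rayleigh) criterion ties the thinnest resolvable bed to the dominant frequency that survives the trip to the target. The sketch below applies that rule of thumb with assumed, typical velocities and frequencies; none of the numbers come from this article:

```python
# Quarter-wavelength (Rayleigh) rule of thumb for seismic vertical resolution.
# Velocities and dominant frequencies below are assumed typical values, not
# figures taken from this article.

def vertical_resolution_ft(velocity_ft_per_s: float, dominant_freq_hz: float) -> float:
    """Thinnest resolvable bed ~ wavelength / 4, with wavelength = velocity / frequency."""
    return velocity_ft_per_s / dominant_freq_hz / 4.0

# Shallow target, higher frequencies preserved: finer detail is resolvable.
print(vertical_resolution_ft(10_000, 50))   # ~50 ft
# Deep target: attenuation strips the high frequencies, coarsening resolution.
print(vertical_resolution_ft(15_000, 20))   # ~188 ft
```

With the high end of the spectrum gone at depth, bed-scale detail several times coarser is the best the method can offer, regardless of processing effort.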

Tools used today by reservoir engineers include well test analysis, material balance methods and decline curve analysis. Each of these three methods provides information that addresses a specific reservoir concern. Well test analysis, for instance, provides permeability estimates and near-wellbore effects such as skin and boundaries, but its interpretation rests on a homogeneous-reservoir assumption. Material balance methods can estimate in-place volume, but they become problematic when the drive mechanism is more than simple volumetric depletion; they use only static build-up pressure and cumulative production. Decline curve analysis is also useful in estimating in-place volume and provides some information on average permeability, but it cannot determine the shape or connectivity of the reservoir; it uses flowing pressure, rate and cumulative production but not the important shut-in build-up pressure information. These engineering methods are independent of one another, and they provide only partial information about the reservoir and, in many cases, inconsistent or contradictory interpretations. In addition, they all require considerable production history, on the order of 10% to 20% of the recovery, before a consistent trend can be discerned.
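For readers unfamiliar with these methods, the textbook forms are compact. The following is a minimal sketch of the gas material balance (the p/z plot) and the Arps hyperbolic decline curve; the functions are simplified illustrations, not field-ready tools, and every input value is hypothetical:

```python
# Simplified textbook forms of two of the methods described above.
# All input numbers are hypothetical illustrations.

def ogip_pz(pz_initial: float, pz_current: float, cum_prod_bcf: float) -> float:
    """Gas material balance under volumetric depletion: p/z declines linearly
    with cumulative production Gp, reaching zero at G (the OGIP).
    G = Gp / (1 - (p/z) / (p/z)_initial)."""
    return cum_prod_bcf / (1.0 - pz_current / pz_initial)

def arps_rate(qi: float, di: float, b: float, t_years: float) -> float:
    """Arps hyperbolic decline: q(t) = qi / (1 + b * di * t)^(1/b)."""
    return qi / (1.0 + b * di * t_years) ** (1.0 / b)

# A drop in p/z from 5,000 to 4,500 psia after 1.4 Bcf produced implies ~14 Bcf in place.
print(ogip_pz(pz_initial=5000.0, pz_current=4500.0, cum_prod_bcf=1.4))
# Rate after 2 years, starting at 10 MMcf/d with 80%/yr initial decline and b = 0.5.
print(arps_rate(qi=10.0, di=0.8, b=0.5, t_years=2.0))
```

Note that neither formula combines shut-in build-up pressures with flowing data, which is precisely the gap described above.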

A major technology, reservoir simulation, was developed more than 40 years ago for field-wide resource estimates and production forecasts. Long the domain of gurus because of its difficulty of use, reservoir simulation has undergone evolutionary, incremental improvement over the past few decades. Parallel processing, interfaces with geocellular models and preprocessing routines have been added with the objectives of making it more accessible and speeding run time. Nevertheless, reservoir simulation technology has limited flexibility in adequately modeling across the continuum of scale inherent in reservoirs. Small-scale features may be obliterated in the up-scaling process or inadequately captured by the grid architecture. Further, because of the difficulty of modifying the grid system, the geological and geophysical (G&G) interpretation may be accepted as a given and not routinely subjected to comprehensive hypothesis testing and exacting interplay with the engineering data.

The conundrum with these maturing technologies is that companies increasingly have to deal with subtle reservoir features and complex reservoir behavior that well logs, seismic interpretation, engineering analysis and reservoir simulation may not detect, capture or adequately model. Limitations in reach, resolution, analysis, and grid scaling and architecture; in the integration of G&G interpretation with reservoir engineering; and in cycle time may stymie these efforts. Most importantly, the modeling and characterization must be completed early enough to proactively and skillfully influence decision-making and reservoir management.

A new paradigm

There is a new technology-based paradigm that overcomes many of the shortcomings in the industry's existing technology toolkit. Dynamic Reservoir Characterization™ (DRC™) is a process built on technology specifically designed to accommodate the dynamic range of scale in both time and space needed to accurately characterize and model reservoir features and behavior. It supports interactive and iterative collaboration between the G&G and reservoir engineering disciplines. The process begins with the geologic interpretation (a static model) and subjects it to the dynamic data stream of pressure and production readings; hundreds of automated realizations are generated to test boundaries, drainage, size, properties, interpretation and drive mechanism. Finally, there is convergence to a precise, predictive and consistent model that honors the reservoir data, including production, flowing and shut-in pressure information; has tested and typically modified the geologic interpretation; and replicates observed reservoir behavior.
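DRC's internals are proprietary, but the iterate-and-converge idea it describes resembles classical history matching. The hypothetical sketch below shows only that general pattern; the perturb and simulate callables, the misfit measure and the realization count are placeholder assumptions, not the DRC algorithm:

```python
# Generic, hypothetical sketch of iterative history matching: perturb a static
# geologic model, simulate its pressure response, and keep realizations that
# better honor the observed dynamic data. Illustration only; not DRC itself.
from typing import Callable, Sequence

def misfit(simulated: Sequence[float], observed: Sequence[float]) -> float:
    """Sum of squared errors between simulated and observed pressures."""
    return sum((s - o) ** 2 for s, o in zip(simulated, observed))

def history_match(static_model, observed: Sequence[float],
                  perturb: Callable, simulate: Callable,
                  n_realizations: int = 500):
    """Test many perturbed realizations; return the best-fitting model."""
    best, best_err = static_model, misfit(simulate(static_model), observed)
    for _ in range(n_realizations):
        candidate = perturb(best)                  # vary boundaries, size, drive, properties
        err = misfit(simulate(candidate), observed)
        if err < best_err:                         # keep models that better honor
            best, best_err = candidate, err        # the pressure/production data
    return best
```

The essential point the article makes is that this loop runs against the full dynamic data stream, so the geologic interpretation itself is tested rather than taken as a given.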

The process has universal applicability in quickly and precisely characterizing and modeling reservoir features and behavior. Converged models are usually arrived at in days to weeks. The process leverages existing data and work, including collected and real-time pressure and production data, lab analysis, test and shut-in data, and geologic interpretations. The technology adapts to data spanning millimeters to kilometers and seconds to months, all facilitated by its automated, unstructured and adaptive meshing capability. Sensitivity analysis and rapid scenario testing are enabled for richer understanding and investigation of planned operations such as development drilling. Thus modeling can commence ahead of the drill bit, from first test or with initial production. Reservoir surveillance is enabled because the model can be continuously updated as new readings and information are delivered from the field. The dynamic scaling, automated meshing, synthesis of G&G interpretation with engineering data, and rapid and real-time workflow mean subtle features can be captured beyond the vicinity of the well, seismic interpretation can be tested, and reservoir behavior can be understood and predicted in real time. The result is earlier, more informed and predictive decision-making and management, spanning drilling and production decisions, resource estimates, forecasts, well test design and performance, and completion and fracing operations. Used proactively, the process makes the challenges of drilling deeper, and of targeting more complex and subtle traps, far less daunting.

A case study

This example involves a gas well drilled to a depth of more than 10,000 ft (3,000 m) into a sandstone reservoir. The flow rate averaged more than 10 MMcf/d over a 60-day flow period, followed by a 2-day shut-in period for build-up. A total of 700 MMcf was produced in the 2-month production history. Wellhead pressure was recorded throughout the history of the well, while bottomhole pressure was obtained during the build-up period.

The well was drilled based on 3-D seismic in a reservoir interpreted as a turbidite submarine fan with restricted channels.

The available information consisted of well production and wellhead pressure data, well logs, seismic maps, well completion diagrams, daily reports, core analyses, a cleanup well test report and bottomhole pressure information collected during the build-up period.

The DRC process began with the construction of a model depicting the initial concept of the reservoir based on the information and interpretation at that time. As part of the process, an engineering consultant integrated the geologic information, well information and reservoir conditions into a model that represented all the rock and fluid characteristics of the reservoir. Another crucial step was the identification of the boundaries, faults, fractures and layering that influenced fluid flow in the reservoir and, hence, the performance of the well. In this example, the well was interpreted to be located at the thickest point [more than 50 ft (15 m)] of the reservoir, with the pay thinning away from the well. The reservoir was believed to be an elliptically shaped body (Figure 1).

The objectives of the study were to determine the size of the reservoir and the drive mechanism, since the operating company suspected the presence of an aquifer in the downdip part of the pool.

The iterative modeling process showed a reservoir with the producing well located in a channel connecting to sources of gas at both ends. The shape of the reservoir is consistent with two lobate sand bodies connected by a channel valley fill. The final shape of the reservoir (Figure 2) is substantially different from the initial concept of an elliptical reservoir.

The best model was based on volumetric depletion without the support of an aquifer. After comprehensive testing and hundreds of realizations, the interpreters found no geologic model with an aquifer present that replicated the complete pressure and production data. Subsequent production supported the volumetric depletion conclusion, following the model's prediction. The integrated DRC approach resulted in an estimated original gas in place (OGIP) of more than 25 Bcf for the reservoir. This is significantly higher than the 7 Bcf obtained from a wellhead pressure-cumulative gas decline curve method and the 14 Bcf obtained from a traditional material balance plot. Because the traditional methods do not use all of the flow and build-up information, the operating company would have underestimated the remaining gas in the ground. Further, additional drilling location opportunities were identified in case the operator wanted to accelerate production.

With 2 months of production and a cumulative recovery of 700 MMcf (2.8% of OGIP), the DRC process provided a reserves estimate superior to those of all the traditional methods. The model also was available to test development scenarios.
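The gap between the three estimates is easy to quantify with the figures quoted above; the short check below simply restates the article's numbers (remaining volume here means in-place gas less gas already produced, not recoverable reserves):

```python
# Back-of-the-envelope check using the case-study figures quoted above.
cum_bcf = 0.7  # 700 MMcf produced over the first two months
ogip_bcf = {"DRC": 25.0, "decline curve": 7.0, "material balance": 14.0}

print(f"Recovery to date: {cum_bcf / ogip_bcf['DRC']:.1%} of the DRC OGIP")  # ~2.8%
for method, g in ogip_bcf.items():
    # In-place volume less gas already produced, per each method's estimate.
    print(f"{method}: ~{g - cum_bcf:.1f} Bcf still in the ground")
```

The decline curve method would have left the operator believing barely a quarter of the gas the converged model indicates, a difference with obvious consequences for development planning.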

Conclusion

Recent statistics put average worldwide expenditure on drilling and production in the range of US $113 billion to $124 billion per year. As noted at the beginning of this article, 40% of wells drilled are dry or sub-optimal, which suggests that roughly $45 billion/year may be spent on disappointing results. There is substantial capital at stake. Dynamic Reservoir Characterization can be used for real-time and predictive reservoir modeling and characterization, resulting in more successful and profitable drilling programs and reservoir management.