The oil industry is on the brink of the “perfect storm.” Easy oil is gone, and in its place we’re left with harder-to-find reservoirs that are structurally and geologically more complex.
Figure 1. A new approach enables a highly faulted reservoir model such as this one to be constructed in less than an hour. (Image courtesy of Paradigm)
How do we survive?
We must start with the premise that the ultimate integration between the geosciences and reservoir engineering is based on the concept of a shared earth model. Its central piece (at the reservoir scale) is the reservoir model, resulting from the integration of geophysical (interpretation and attribute), petrophysical (log), geological (conceptual, sedimentological), and engineering (production) data. It is used to understand and predict reservoir behavior. A current drawback is that constructing such a model can be very time-consuming. It often must be done by the few “modelers” — modeling software experts with multidisciplinary knowledge — for whom there is unfortunately no obvious training or career track in either industry or university. The time cost of training geoscientists and engineers during the course of a project is often too large to support the effort. As a result, we need tools that enable anyone to construct accurate reservoir models for fast-turnaround decisions.
On the technical side, current state-of-the-art reservoir modeling technology revolves around the construction of a 3-D “corner-point” geometry stratigraphic grid. This grid is used equally by geologists for geostatistical property modeling and volumetric analysis and by reservoir engineers for flow simulation and reserve assessment. Its construction is dictated by the so-called pillar or extrusion approach, which works well in simple, vertically faulted, layer-cake topologies. However, when even a small amount of structural complexity (e.g., multi-z, reverse or Y-faults) is added, currently available modeling applications create severe distortion of cells near faults. Very often the reservoir grid simply cannot be constructed. This problem forces users to “fudge” their geological model. Common work-arounds include simplifying or removing the offending interpreted fault data to accommodate the modeling application. The application can then proceed, but the resulting model is simplified, often beyond recognition.
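To see why the pillar approach struggles with non-vertical faults, consider how a corner-point grid is stored: every cell corner must lie on one of a fixed lateral mesh of pillars. The sketch below (illustrative only; real corner-point formats such as Eclipse GRDECL store COORD/ZCORN arrays, and all dimensions here are hypothetical) builds the simplest case, vertical pillars with flat horizons. Any fault that does not follow the pillars can only be represented by distorting cells.

```python
import numpy as np

def build_pillar_grid(nx, ny, nz, dx, dy, top_depth, dz):
    """Build corner depths for a simple corner-point (pillar) grid.

    Each of the (nx+1)*(ny+1) pillars is a vertical line, and every
    cell corner must lie on a pillar. That constraint is exactly why
    non-vertical, reverse, or Y-shaped faults cannot be honored
    without severe cell distortion.
    """
    # Pillar (x, y) locations on a regular lateral mesh.
    px, py = np.meshgrid(np.arange(nx + 1) * dx,
                         np.arange(ny + 1) * dy, indexing="ij")
    # Corner depths: nz+1 horizons stacked along each pillar.
    z = top_depth + np.arange(nz + 1)[:, None, None] * dz
    zcorn = np.broadcast_to(z, (nz + 1, nx + 1, ny + 1)).copy()
    return px, py, zcorn

px, py, zcorn = build_pillar_grid(nx=3, ny=2, nz=4, dx=100.0, dy=100.0,
                                  top_depth=2000.0, dz=10.0)
print(zcorn.shape)  # (5, 4, 3): nz+1 horizons on (nx+1) x (ny+1) pillars
```

A vertical fault is easy in this scheme: shift ZCORN values on one side of a pillar line. A dipping fault plane, by contrast, would require corners off the pillars, which the format cannot express.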
Furthermore, deciding which faults to remove or “verticalize” can be extremely time-consuming. As a result, the reservoir model no longer accurately reflects the interpreted underlying geology or the data that supports it. Fault blocks are not adequately represented, which impacts the compartmentalization and connectivity of the reservoir model. The geological-distance and constant-volume assumptions of geostatistical algorithms are violated by cell deformation as soon as faults are not sub-vertical. Therefore, the correlations imposed on the facies, porosity and permeability models will be wrong. With a bad model going into the reservoir flow simulator, history-matching results can only be erroneous and definitely not predictive.
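The distance violation is easy to quantify. Geostatistical algorithms evaluate a variogram at the separation between cells; when cells are stretched along a non-vertical fault, two cells that are one index apart can be hundreds of meters apart in true space. The sketch below uses a standard spherical variogram and hypothetical coordinates to show how far the imposed correlation drifts from the intended one.

```python
import numpy as np

def spherical_variogram(h, sill=1.0, rng=500.0):
    """Spherical variogram: dissimilarity rises with lag h (meters)
    and plateaus at `sill` beyond the correlation range `rng`."""
    h = np.minimum(h, rng)
    return sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)

# Separation vectors between two cells one index apart (hypothetical):
# an undistorted cell vs. one stretched along a non-vertical fault.
undistorted = np.array([100.0, 0.0, 0.0])   # 100 m true separation
distorted = np.array([100.0, 0.0, 400.0])   # ~412 m true separation

for d in (undistorted, distorted):
    h = np.linalg.norm(d)
    print(f"lag {h:7.1f} m -> gamma {spherical_variogram(h):.3f}")
```

The same unit index lag yields a near-uncorrelated pair in the distorted cell, so a simulation that treats index distance as geological distance imprints the wrong spatial continuity on the property model.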
The large amount of money spent collecting and interpreting the data has now been marginalized by the “fudging” of the model. Data that was difficult to incorporate was ignored because of the time needed to include it in the model. Updating the model with new information or changes in interpretation was a long process, as we had to call the consulting modeling expert back and start the model building over again. So we didn’t do that very often either. Thinking of alternative interpretation, geological, or production scenarios to understand the associated uncertainty and manage our risks was out of the question (who has the time?).
And we wonder why the production forecasts were wrong?
Riding the perfect storm
There is a new approach. Instead of calling on the expert, we feel any geoscientist or engineer should be able to construct models without being overwhelmed by the idiosyncrasies of the modeling software. They need to focus on the science, not the buttons. Users should be guided through mostly automated processes, focusing at each step only on decisions that require geological and engineering judgments.
Instead of removing or simplifying data, we encourage the use of all available interpretation data (even when there are hundreds of faults). We also reject the idea of “fudging” the answer to make model building faster or easier. Using all the information available ensures confidence in the accuracy of the model and the answers extracted from it and provides better support for decision-making.
Instead of using the same grid format for both geostatistical property modeling and reservoir flow simulation, we use one adapted to each case, honoring all available data as well as the constraints of each discipline. The “geological grid” is no longer a victim of the pillar nightmare, and geological distances are respected. The “simulation grid” satisfies that discipline’s requirements while accounting for the full complexity of the structural geology. These two grids are intrinsically linked, ensuring accurate upscaling of reservoir properties.
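The link between the two grids is an upscaling step: each coarse simulation cell aggregates the fine geological cells it contains, using an average appropriate to the property. A minimal 1-D sketch (illustrative only; the article does not describe Paradigm's actual upscaling, and commercial tools typically use flow-based upscaling on the full 3-D grid link):

```python
import numpy as np

def upscale(fine, factor, method="arithmetic"):
    """Upscale a 1-D fine-grid property onto a coarse simulation grid.

    Each coarse cell averages `factor` consecutive fine cells.
    Porosity is volume-additive (arithmetic mean); permeability for
    flow in series is better captured by a harmonic mean, which is
    dominated by the low-permeability layers.
    """
    blocks = fine.reshape(-1, factor)
    if method == "arithmetic":
        return blocks.mean(axis=1)
    if method == "harmonic":
        return factor / (1.0 / blocks).sum(axis=1)
    raise ValueError(f"unknown method: {method}")

poro_fine = np.array([0.10, 0.20, 0.30, 0.40])
print(upscale(poro_fine, 2))              # [0.15 0.35]
perm_fine = np.array([100.0, 1.0, 100.0, 1.0])  # mD, hypothetical
print(upscale(perm_fine, 2, "harmonic"))  # low-perm layers dominate
```

Choosing the wrong average is one more way a property model can mislead the simulator, which is why the geological and simulation grids need an explicit, consistent mapping between them.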
Instead of spending days to months constructing a single reservoir model, new technology developed by Paradigm makes it possible for projects to be completed in hours. Recent benchmarks found reductions in modeling time by factors of 50 to 100 while using every piece of available data and no “fudging.”
With modeling so easily available across the geoscience domains, the earth model can now become a true “live document” that engineers and geoscientists share, edit, refine, or branch into several alternatives together.
Conclusion
Three-D reservoir modeling must become a commodity. It is a foundational piece of the shared earth model and should underpin any reservoir management decision. It cannot be the reserved playground of a select few, and its construction must not be cumbersome or constrained by software limitations.
The modeling revolution has arrived.