What do producers in the Bakken and Granite Wash shale plays have in common with Home Depot, Wal-Mart and Capital One Credit? They all benefit from data mining, a powerful technological tool that probes vast volumes of data to uncover patterns and trends that can lead to more effective decision making. With the right tools and the right mind-set, analysts can mine data to identify cause-and-effect relationships and boost companies' bottom lines.
For reservoir asset developers, sophisticated data-mining software can help turn huge drilling, completion and production databases into information that can optimize production from older assets, especially assets acquired in a merger.
The Academic Approach
Influenced by their days in the classroom, engineers often take the academic view. In the case of reservoir development, they have learned that reservoirs deplete and so must be viewed as a diminishing asset. The rate of asset depletion is a function of many factors: permeability, reservoir pressure, drive mechanism and, possibly, drilling and completion practices. Engineers may understand how the reservoir is "supposed" to work, and thus not find it necessary to confirm their views with data.
The academic approach often involves analytical modeling, another term for "discrete modeling." Discrete modeling begins with assumptions, such as "low permeability equals low deliverability," which can lead to conclusions such as "shale reservoirs are not capable of producing hydrocarbons." While discrete models are useful, this example shows why they may not always be the best way to fully understand cause-and-effect relationships.
Discrete models follow this work flow:
• Make assumptions
• Apply engineering principles
• Develop well models
• Evaluate well opportunities.
Making an assumption such as "permeability and reservoir pressure will drive the recovery percentage and ultimate value of the asset" requires measuring permeability and monitoring its change over time. If hydraulic fracturing is applied to low-permeability reservoirs, fracture conductivity is often assumed to be a key driver to recovery. Assumptions from the first step dictate how asset developers continue to optimize, survey and understand their aging assets. But since permeability and conductivity are seldom monitored over the life of a project, it may be futile to use discrete models as the only way to optimize an asset.
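The authors do not specify their analytical models, but a generic illustration of the discrete approach is sketched below in Python, using the standard Arps hyperbolic decline equation to forecast a well's rate. Every parameter value here is hypothetical, and that is the point: in a discrete model the inputs are assumed rather than measured.

```python
# Minimal sketch of a discrete (analytical) well model using Arps
# hyperbolic decline. All parameter values are hypothetical; in practice
# they come from assumed reservoir properties, not monitored data.

def arps_rate(t_months, qi, di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)^(1/b)."""
    return qi / (1.0 + b * di * t_months) ** (1.0 / b)

# Assumed inputs (the "make assumptions" step of the discrete work flow)
qi = 450.0   # initial rate, bbl/d (assumed)
di = 0.15    # initial decline rate per month (assumed)
b = 1.2      # hyperbolic exponent typical of tight rock (assumed)

# Forecast five years of monthly rates and a rough cumulative volume
rates = [arps_rate(t, qi, di, b) for t in range(60)]
cum_bbl = sum(r * 30.4 for r in rates)  # approximate days per month
print(f"Month-60 rate: {rates[-1]:.1f} bbl/d; 5-yr cumulative: {cum_bbl:,.0f} bbl")
```

If the assumed permeability or decline exponent is wrong, every downstream conclusion inherits the error, which is why unmonitored inputs limit the discrete approach over an asset's life.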
Data-driven Modeling
The counterpart of discrete models is the data-driven approach. This approach, called "statistical modeling," seeks causes reflected in the data. A benefit is that while data mining begins with linear relationships, it can uncover nonlinear ones. If investigators have the proper attitude, they can realize, for example, that degrees of permeability may be reflected in rates of depletion, water cuts and gas breakouts.
Since these models require significant amounts of data, they are best suited for older, developed assets, especially where many types of drilling and completion practices have been applied over the asset's life. A criticism of the data-driven approach is that there is a tendency to abandon sound engineering principles. Thus, the practitioner's attitude, experience and approach are keys to success.
Data-driven models' work flow differs from discrete models' flow:
• Gather and integrate data
• Develop field models
• Use process knowledge to select the best model(s)
• Evaluate opportunities.
In data-driven models, unique metrics are developed that enable the expert user to judge and change the effects of inputs. This generates the desired outcome—optimized production.
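A minimal sketch of this step appears below. The column names and synthetic data are invented for illustration; a real study would load the asset's drilling, completion and production database. The idea is simply to fit a statistical model to peak-month oil and read off crude metrics of each input's effect.

```python
# Sketch of a data-driven field model (hypothetical column names and
# synthetic data for illustration only).
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "lateral_length_ft": rng.uniform(4000, 10000, n),
    "frac_stages": rng.integers(10, 40, n),
    "proppant_lb_per_ft": rng.uniform(500, 2000, n),
})
# Synthetic outcome standing in for the field's measured response
df["peak_month_oil_bbl"] = (
    2.0 * df["lateral_length_ft"]
    + 300.0 * df["frac_stages"]
    + 5.0 * df["proppant_lb_per_ft"]
    + rng.normal(0, 2000, n)
)

X, y = df.drop(columns="peak_month_oil_bbl"), df["peak_month_oil_bbl"]
model = LinearRegression().fit(X, y)

# Fitted coefficients act as crude metrics of each input's effect
for name, coef in zip(X.columns, model.coef_):
    print(f"{name}: {coef:+.2f} bbl per unit")
```

A linear fit is only a starting point; in practice the same work flow supports nonlinear learners, with the engineering-validation audit described below deciding which model to trust.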
It is important to note that engineering validation "audits" the results in the data-driven model. This step incorporates engineering principles into the solution. The reservoir must be characterized, and the completion effect measured against the anticipated reservoir performance. For example, poor reservoirs seldom benefit from exotic completion practices. However, high-yielding reservoirs (higher perm and reservoir pressure) usually benefit from exotic completions if performed correctly.
Thus, there is a life-cycle benefit to using both types of models, adjusting their use as data are collected. In the rare case of a new project (such as an untried shale reservoir), discrete modeling is the best method available. As the asset develops, however, data should be collected, and the data-driven approach—looking for opportunities to exploit the asset—can be applied. The Bakken shale offers an example of how both models have been used by experts to improve and capture optimal production.
Bakken Case Study
Development of the Bakken has transitioned from discrete to data-driven modeling. Drilling began in 1953, and production has peaked several times: as a result of finding better-quality reservoirs, and at least twice thanks to enhanced technology. The current technological improvement, highly compartmentalized hydraulic fracturing, is proving effective.
The discrete-model time span focused on finding the zone and evaluating it as a producer—not just as a source rock for the overlying layers. Discrete models helped develop the Bakken's vertical wells. Next, hydraulic stimulation drove production increases.
Further benefits accrued when horizontal drilling was introduced in the early 1990s. The combination of horizontal drilling and hydraulic fracturing boosted production through the mid-2000s. Discrete models helped producers to understand how to place a horizontal lateral, successfully drill and stay in zone, and protect the tubulars at the heel or "bend." The models also helped show how to improve factors such as fracture conductivity.
By 2007, however, this type of modeling had largely run its course as a source of continuous improvement in results, and the data-driven approach gained momentum.
In our case study of the Bakken, we used data-driven models to seek cause-and-effect relationships. Following the data-driven-model work flow, we gathered data on as many types of well completions as possible. We then filtered the data, flagging outliers and testing whether they could be justified by engineering principles; data points we could not justify were removed.
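A sketch of that filtering step, again with hypothetical column names and values: a simple interquartile-range screen flags suspect records, which are then reviewed against engineering principles before anything is dropped.

```python
# Sketch of outlier screening prior to modeling (hypothetical data).
# Flagged records are reviewed against engineering principles; only
# those that cannot be justified are removed.
import pandas as pd

def flag_iqr_outliers(df: pd.DataFrame, column: str, k: float = 1.5) -> pd.Series:
    """Flag rows falling outside the k*IQR fences for one column."""
    q1, q3 = df[column].quantile([0.25, 0.75])
    iqr = q3 - q1
    return (df[column] < q1 - k * iqr) | (df[column] > q3 + k * iqr)

wells = pd.DataFrame({"peak_month_oil_bbl": [8100, 7900, 8300, 8050, 31000]})
suspect = wells[flag_iqr_outliers(wells, "peak_month_oil_bbl")]
print(suspect)  # the 31,000-bbl record is flagged for engineering review
```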
Permeability is a main driver for well delivery, but it is seldom measured. As models were developed, several indicators showed that the reservoir was easily characterized by using data from mud logs. We found three indicators recorded on mud logs that reflect permeability.
Next, we identified several factors that, collectively, represented completion best practices. These steps aided completions by defining reservoir quality and showing how each completion component affected peak oil production. The completion improvements did not overly increase total well costs, but they did enhance production, yielding a very high rate of return on the incremental costs.
The data-mining process allowed us to weigh two factors against each other while holding the other factors (there were 11 in total) constant. This is key to extracting information and metrics from mining, as it reveals the factors to which production is most sensitive.
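One way to implement that "vary two, hold the rest constant" comparison is sketched below, reusing the hypothetical `model` and `X` from the earlier sketch: the fitted model is evaluated over a grid of the two factors of interest while the remaining inputs are pinned at their median values.

```python
# Sketch of a two-factor sensitivity scan (hypothetical factor names).
# A fitted model is evaluated on a grid of two factors while the
# remaining inputs are held constant at their median values.
import itertools
import numpy as np
import pandas as pd

def two_factor_scan(model, X: pd.DataFrame, f1: str, f2: str, steps: int = 5):
    base = X.median()  # pin the other factors at typical values
    grid = []
    for v1, v2 in itertools.product(
        np.linspace(X[f1].min(), X[f1].max(), steps),
        np.linspace(X[f2].min(), X[f2].max(), steps),
    ):
        row = base.copy()
        row[f1], row[f2] = v1, v2
        grid.append(row)
    scan = pd.DataFrame(grid)
    scan["predicted_peak_oil"] = model.predict(scan[X.columns])
    return scan

# e.g. two_factor_scan(model, X, "frac_stages", "proppant_lb_per_ft")
```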
The "Bakken Timeline" shows the impact of fracture initiation control (open hole/sliding sleeve against cased-hole perforations isolating with bridge plugs) on peak-month oil production. Other comparisons yielded similar results, leading to best practices. Additional factors considered in the model include proppant conductivity, crosslinked gel percentage, frac-fluid volume and the number of frac compartments.
An additional data-driven model (see chart) contains data from the Bakken Consortium's Nesson State #41X-36 in Williams County. The benefits of drilling a longer lateral and applying the highly compartmentalized system along the full extent, as well as maintaining hydraulic-fracturing procedures, are clear.
Both discrete and data-driven models have an important role to play at different stages in the life-cycle of an asset. The longer the life of a project, the more important the role data-driven models can play in optimizing recoveries.
Lyle V. Lehman and Robert F. Shelley are principal consultants with StrataGen Engineering (info@stratagenengineering.com), Houston.