Group computing has the potential to direct decisions from knowledgeable individuals to knowledgeable teams. Part II of a two-part series.
What does group computing offer? Simply stated, it offers fast optimization of complex, multi-stage design and operational processes. The optimization of complex processes is at the core of E&P operations, where designs and decisions depend on the input from many disciplines. When these disciplines can work in parallel, sharing ideas and intermediate analytics, the result is faster, better answers.
As an up-front exception, a technical strategy aimed at individual users still can be successful among certain small, highly focused operating companies. These "techno-explorers" tend to play in a particular geographic area and focus on one portion of the E&P supply chain, such as prospect generation. They succeed by providing a small cadre of top-performing individuals with their preferred information technology (IT) toolkits, and by generously recognizing and rewarding their contributions to the bottom line. A narrow business definition can be supported by a narrow IT strategy.
However, larger operating companies that compete across multiple geographic regions and phases of the oilfield life cycle stand to lose by adopting an IT strategy based on a "portfolio of brilliant individuals."
For example, the optimal development plan for an offshore discovery may face the following trade-offs:
* Increasing the number of wells accelerates production, but increases up-front capital costs;
* Increasing completion intervals increases initial production rates, but accelerates potential water and gas breakthrough problems and increases facilities costs;
* Designing for a lower peak production rate due to reservoir quality uncertainties saves capital, but extends field life; and
* Increasing artificial lift capacity increases production rates, but may lead to a water-handling problem.
These issues play out across the five to 10 discipline-based professionals who have input. What "looks" correct in one step may create problems elsewhere. In addition, the relationships are complex and have first-, second- and third-order effects. Consequently, the best way to optimize field development planning is through iteration; i.e., given a sufficient number of comprehensive trials, including geophysical and geological interpretation, dynamic modeling, completion, wellbore and surface facilities, we can get close to the best answer overall. Group computing allows free iteration through the exchange of ideas and leads to an answer that best satisfies every point in the chain. Far from ethereal, this is about matching facilities to production rates, optimizing reserves per well and cutting data-acquisition spending where additional data would not affect the final answer. This is about making money.
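The kind of iterative search described above can be sketched in a few lines of code. The model below is purely illustrative: the cost and recovery coefficients, the two design levers (well count and peak production rate) and the NPV proxy are all invented for this sketch, not taken from the article or any real project.

```python
# Illustrative toy model (not the article's method): a brute-force search
# over two field-development levers -- well count and peak production rate.
# All coefficients are invented for illustration only.

def npv(wells, peak_rate_kbd):
    """Crude proxy for project NPV ($MM) under made-up cost/revenue terms."""
    capex = 60 * wells + 8 * peak_rate_kbd       # drilling + facilities capital
    # More wells recover more reserves, with diminishing returns per well;
    # revenue is capped by the facilities' design rate.
    recovery = 900 * (1 - 0.8 ** wells)
    revenue = 1.5 * min(recovery, 40 * peak_rate_kbd)
    return revenue - capex

# Iterate across the whole design space rather than optimizing one step at a time.
best = max(((w, r) for w in range(2, 21) for r in range(10, 101, 10)),
           key=lambda p: npv(*p))
print("best (wells, peak rate kb/d):", best, "NPV $MM:", round(npv(*best), 1))
```

The point of the sketch is the article's point: the best well count depends on the facilities' design rate and vice versa, so no single discipline optimizing its own step in isolation finds the joint optimum.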
From a management perspective, the difference is that we are computerizing processes rather than individuals. So rather than "tooling up the reservoir engineers," we are computer-enabling the field development planning process. The latter approach includes drilling engineers, production engineers, facilities engineers, geologists and geophysicists. They have all had input before, each with their own toolkits; computerizing the group rather than linking individuals leads to faster, better answers.
The justification and measurement of this approach are better than tracking how much time interpreters spend searching for data. E&P processes tend to have solid metrics, such as unit lifting cost, that are at the core of what matters. Some would argue that process efficiency is too broad a measure to track the technical aspects of IT. But current measurements are so narrow as to be meaningless. As our thinking about IT broadens from the individual to the group level, justifying and measuring IT by overall process efficiency will become common.
The move from individual to group computing also changes how technical computing investments are justified. The classic analysis, available in one form or another for the past 20 years, has been to justify spending by the time saved looking for data. That approach has not helped measure individual productivity improvement all that much, and it would have been a poor fit for group computing even if it had.
Even at an individual level, this analysis is not particularly satisfying since the results of these analyses do not vary much over time; yet, it would be hard to argue that there has been no benefit from digital technology over the past 20 years. More likely, professionals are gathering and integrating more types of relevant information than in the past, resulting in better decisions. This is borne out by the improved productivity of individuals in the face of declining resource quality as fields deplete.
Secondly, the benefits of group computing applied to a business process are more associated with cycle time reduction and accuracy improvement from sharing intermediate results in the work itself than from shaving time from each participant's individual data search. Automating processes through group computing requires a fundamentally different approach to cost justification. Ultimately, the costs of technical computing have to be balanced against the other costs associated with a process. As one company executive put it recently, "My investments in information technology have to compete with investments in drilling wells." The first issue becomes determining what portion of total process costs are represented by technical computing. A breakdown of the design costs has shown technical computing to represent roughly 15% of those costs. This allows the following assertions to be made:
* There is a negative correlation between the design costs and execution costs (presumably, there are diminishing returns at some point);
* Technical computing represents a small portion of design costs, and therefore of the total costs (1% to 2%); and
* There should be a negative correlation (with diminishing returns) between total costs and technical computing costs.
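The 1% to 2% figure in the second assertion follows from simple proportions. The 15% share of design costs is from the article; the design-cost share of total process costs used below (roughly 7% to 13%) is back-solved from the article's numbers rather than stated in it:

```python
# Back-of-envelope check of the article's cost shares. The 15% figure is
# from the article; the design-share-of-total range is inferred, not stated.
tc_share_of_design = 0.15

for design_share_of_total in (0.07, 0.13):   # assumed range
    tc_share_of_total = tc_share_of_design * design_share_of_total
    print(f"design = {design_share_of_total:.0%} of total -> "
          f"technical computing = {tc_share_of_total:.1%} of total")
```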
These assertions are consistent with Figure 1, which compares technical computing spend with total costs. We know these figures with some certainty from 1980 through the present. Point A is back-calculated by attributing 50% of the improvement in unit cost performance to IT, as calculated by CERA. There is a possibility that the current spend is optimal, in which case further expenditures on technical computing would not affect other design or execution process costs. If this is true, a 50% increase in expenditures would result in point B. The slope of the line to B is low since technical computing expenditures account for only 1% to 2% of total process cost. Alternatively, further expenditures could lead to continued improvement in unit cost performance, resulting in point C. The microeconomic issue - now quantified - becomes optimizing the input (technical computing) to minimize total process cost.
What is interesting about Figure 1 is the asymmetry on either side of the current spend. Investing too much in technical computing leads - at worst - to flat process costs. However, investing too little leads to higher process costs. This results in a preference to increase investment when there is uncertainty about the credibility of point C, and suggests the revolution of technical computing in oil and gas E&P is far from over.
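The asymmetry argument can be made concrete with an expected-cost comparison. Every number below is an illustrative assumption, not the article's data: a cost index of 100 for the current spend, technical computing at 2% of total cost, and a 5-point swing in execution costs.

```python
# Toy expected-cost comparison of Figure 1's asymmetry. All numbers are
# illustrative assumptions, not the article's data.
def expected_total_cost(spend_change, p_point_c):
    """Total process cost index (current spend = 100) for a fractional change
    in technical computing spend, taken here as ~2% of the total."""
    tc_delta = 0.02 * spend_change * 100   # direct cost of the spend itself
    if spend_change > 0:
        # Point C: extra spend keeps cutting execution cost (-5 index points);
        # point B: no further effect. Weight by the probability C holds.
        return 100 + tc_delta - p_point_c * 5
    # Cutting spend forgoes the current benefit whether or not C holds.
    return 100 + tc_delta + 5

for p in (0.3, 0.5):
    up, down = expected_total_cost(+0.5, p), expected_total_cost(-0.5, p)
    print(f"P(point C)={p:.0%}: +50% spend -> {up:.1f}, -50% spend -> {down:.1f}")
```

Because the downside of over-investing is only the spend itself (a 1% to 2% sliver of total cost) while the downside of under-investing is a full execution-cost penalty, increasing spend wins even at a modest probability that point C is real.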
For this optimization to take place, it will be necessary to split technical computing from back-office computing for budgetary purposes. Back-office computing and technical computing often are lumped into an aggregate "IT Overhead" category for benchmarking. The problem with this approach is that it assumes some correlation between IT spending on back-office automation (e.g., accounts payable) and technical computing (e.g., 3-D seismic interpretation), where none has been established. It would be better to categorize technical computing spend with process execution costs, since there is an established, negative correlation. Of course, including technical computing spend in the budget for line operations would encourage management to take ownership of these costs as they seek to minimize their total unit costs of finding, developing and lifting oil and gas.
Deploying a process will require more care than deploying an individual toolkit. Companies will have to define the process they are going to computerize, set performance baselines and implement innovations at both the individual and group levels. The way we train people currently on new technology - mostly in traditional classroom settings supplemented by one-on-one mentoring - will not be sufficient. We have to move from "adopting some tools" to "implementing a system."
Ownership changes
Traditionally, technical aspects of IT strategy have been the responsibility of the IT department or a specialized technical computing group. Obviously, leadership by these groups in deploying new process-based IT will still be critical. However, more active involvement of E&P line management in crafting the right IT strategy will be essential. Line management owns operating processes and their efficiency, and must take a larger role as technical IT is applied at the process level.
We in the industry have faced this kind of change before. The introduction of computers at the individual level was a revolution in its own right. The success we have had with computing in E&P over the past couple of decades has bred a set of rules and a mindset about technical computing at the individual level. We will continue to see innovations in traditional applications. The next big level of improved decision-making will depend on process optimization through group computing. Harnessing this potential will require a change in how we manage technical IT.