Today’s decision makers in the E&P industry have entered uncharted territory, with access to more data than they have ever had before. As leaders at E&P companies seek to follow the lead of other industries and transform their organizations into data-driven enterprises, a key question remains: How can new value be unlocked from the data the industry already has?
Finding data that support better decision-making
Trailblazers in the industry have embraced and operationalized digital technologies, and they are already enjoying the “first order” benefits of enterprise data management, real-time information flows, and improved knowledge management and communications—namely, doing the same things faster, cheaper and better.
However, few companies have the datasets they need to take the next step and reach “second order” benefits, where the data lead them to make new or different decisions that improve asset values and reduce HSE risk.
For analytics to deliver the kind of insights expected, companies need to ensure that the algorithms are processing as complete a dataset as possible. Ultimately, industry consortiums will prove to be the most effective way to develop the kind of robust datasets that can transform the industry by unlocking new ways of creating value and new modes of operation. Those companies that are open to pooling data and collaborating on solutions will find themselves collectively outcompeting their larger—but more insular—competitors.
Need for E&P consortiums
Every objection to an E&P data consortium has been heard before: a company’s data are too valuable, too competitive or too complex to share. Yet there is a growing recognition that things need to be done differently. “Big Tech” has proved that data equal power, and E&P companies are eager to see the kind of impressive results that other industries have already achieved. Executives and investors of E&P companies are looking for results and signs of a material return on investment for the business. Increasingly, the people who were originally asked to experiment, innovate and educate the business are finding themselves measured against new and growing key performance indicators, often financial ones.
As more and more E&P companies turn to Big Tech for help, they are also coming to terms with the fact that there is no magic technology that can deliver these kinds of results. If introducing new analytical tools were all it took to improve performance, a marked difference would already be visible between companies using Big Data solutions and those that are not. Instead, subject matter experts are complaining that they spend almost all of their time wrangling data, or worrying that they cannot trust the datasets. There is no question that—when deployed correctly—data and analytics have great potential, and that machine learning, artificial intelligence and other technologies will deliver new value. But this can only happen if that value can be found in the data being analyzed.
Uncovering new value through more integrated datasets
Finding new value from data requires bringing together disparate, cross-functional datasets and using algorithms (appropriately) to find patterns across domains—the kinds of patterns the human brain, working within its functional silos, cannot identify. Most companies have likely already brought all of their data together in a shared environment.
However, no matter how many data types a company collects, or how much granularity and value-add it brings to its internal data, the analytics remain limited to learning from the activities that the individual company operates. An algorithm can only learn from what it is shown, so unless external data are brought into the mix, the analysis will not extrapolate well.
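The extrapolation risk can be made concrete with a toy illustration. The numbers below are entirely hypothetical (the variable names, well ranges and the square-root production curve are invented for the sketch, not drawn from any real dataset): a model fit only to one operator’s narrow range of lateral lengths looks fine in-sample, but misestimates a design the operator has never drilled.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical internal dataset: one operator's wells span only a
# narrow range of lateral lengths (5,000-7,000 ft). Assume the true
# relationship saturates (square root), though within this narrow
# window it looks roughly linear.
lateral_ft = rng.uniform(5_000, 7_000, 200)
true_production = 40 * np.sqrt(lateral_ft)           # assumed "physics"
observed = true_production + rng.normal(0, 100, 200)  # measurement noise

# A linear model fit to the narrow internal dataset...
slope, intercept = np.polyfit(lateral_ft, observed, 1)

# ...extrapolates poorly to a 12,000 ft lateral the company has
# never drilled: the straight line overshoots the saturating curve.
predicted_12k = slope * 12_000 + intercept
actual_12k = 40 * np.sqrt(12_000)
print(f"predicted: {predicted_12k:.0f}, actual: {actual_12k:.0f}")
```

Pooling wells from operators who have drilled 10,000+ ft laterals would put those designs inside the training range rather than beyond it, which is the point of the consortium argument that follows.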
Companies that look to publicly available data will find data that are so severely limited in completeness, accuracy, granularity and timeliness that, while they provide the ability to analyze a much broader population of observations, they do not yield the answers to the more detailed questions. Publicly available data also cannot be combined with robust, high-quality internal datasets, because the underlying data required to correctly and consistently engineer the important features are not available externally.
This is what motivates operators to trade data, but it is hard enough to manage and prepare internal data into tidy, analytics-ready datasets, let alone wrangle datasets provided by multiple other operators.
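A minimal sketch of that wrangling problem, with invented operator names, column labels and values: two operators report the same well attributes under different schemas and units, and a central data team must map both into one shared, analytics-ready table before any cross-company analysis can start.

```python
import pandas as pd

# Hypothetical reporting from two operators: same attributes,
# different column names and units (feet vs. meters, bbl vs. m3).
operator_a = pd.DataFrame({
    "well_id": ["A-1", "A-2"],
    "lateral_len_ft": [7_500, 9_800],
    "oil_bbl_first90": [41_000, 52_000],
})
operator_b = pd.DataFrame({
    "uwi": ["B-7", "B-8"],
    "lateral_m": [2_100.0, 3_050.0],
    "cum_oil_90d_m3": [5_400.0, 7_100.0],
})

M_TO_FT = 3.28084
M3_TO_BBL = 6.2898

# Normalize operator B into the shared schema (feet, barrels).
operator_b_std = pd.DataFrame({
    "well_id": operator_b["uwi"],
    "lateral_len_ft": operator_b["lateral_m"] * M_TO_FT,
    "oil_bbl_first90": operator_b["cum_oil_90d_m3"] * M3_TO_BBL,
})

# One analytics-ready dataset covering both operators' wells.
pooled = pd.concat([operator_a, operator_b_std], ignore_index=True)
print(pooled)
```

Doing this once, centrally and consistently, is the economy of scale a consortium offers over many bilateral data trades in which every operator repeats the same cleanup against every partner’s schema.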
Herein lies the rationale for an industry data exchange or data consortium. Other verticals have discovered the value and power of industry data consortiums, as Wood Mackenzie has grown to appreciate through its parent company, Verisk Analytics, which serves insurance and financial services, two of the most digitally evolved industries.
For insurance companies, pooling data—centrally managed and prepared by a data analytics group—has allowed them to conduct actuarial science on practically the whole population being insured, not just their slice of the market. In consumer finance, banks have been able to analyze their profitability and potential default losses from those they extend credit to, even when they are but one of many credit cards in any given wallet. In both cases, insurers and banks have contributed their data to one data analytics company, a far more effective and economical way to consistently prepare and protect data than multilateral, self-organized data trades.
With that central, analytics-ready dataset, companies can get straight into the analysis to find and optimize the value in their portfolios. Over time, having all of these data in one place leads to new ways of adding value that are only possible with the combined dataset, such as fraud detection and cross-industry predictive analytics.
Decades ago, companies in the insurance and consumer finance industries were at that same point of frustration that E&P companies are at today with data and analytics. The difference is that today insurers and consumer finance companies are enjoying the return on investment they have gained from analyzing data in industry consortiums and finding new ways of generating business value.
The E&P industry could easily do so as well. Instead of trying to develop cutting-edge technologies or introduce new processes, E&P companies should embrace the idea of industry data consortiums to develop the kind of robust, cross-company dataset that the industry already has the means to analyze—and that can adequately support business decisions.