While storytellers inevitably paint artificial intelligence (AI) as a recipe for the apocalypse, they tend to play down the technology’s transformative potential.
The oil and gas industry has used machine learning (ML) and AI for years, but it was only with the advent of cloud computing that ML- and AI-based solutions really started to take off.
Use cases for the technologies abound, but so do the drawbacks. The computer programming maxim of garbage in, garbage out holds painfully true for ML and AI: biased or flawed data used to train the software can skew results profoundly.
More recently, generative AI has shot up the hype curve, dominating headline after headline. The technology makes AI accessible to the masses and has the capacity to transform how people search for information and do their jobs.
At the same time, generative AI will not admit when it doesn’t know something, which leads the nascent technology to supply erroneous information, commonly referred to as “hallucinations.”
Mehdi Miremadi, a senior partner at McKinsey & Co., noted that the definition of AI has evolved over the years as the technology's capabilities have advanced.
“What people considered, say, about 10 years ago as AI now seems rudimentary, and in fact, many may not necessarily call that AI anymore,” he said. “AI and generative AI are value creators, right? And as an industry, we should really look at them as, ‘Where are the areas where this can optimize our performance?’”
Manas Dutta, general manager for the Workforce Excellence Break Through Initiative portfolio at Honeywell Process Solutions, said AI and ML can augment human intelligence by processing information that a person could not digest because of the limits of the human brain's processing capacity.
“If we are using AI, ML, it does not mean that it's going to completely replace human intelligence. It is mainly a co-pilot,” Dutta said.
Jay Shah, principal of energy marketing and innovation programs at Amazon Web Services (AWS), said ML can train itself using vast amounts of data, without manual intervention, and identify trends and patterns in the information.
“AI is a variant of machine learning in that it's built on foundational models, so you can create these models that then contextualize even further those insights for specific workloads,” he said.
ML models are also exploding in size, said Vasi Philomin, vice president and general manager for Generative AI at AWS.
“The machine learning models are getting bigger and bigger and bigger,” he said.
In typical machine learning, a set of data is used to train the program to carry out a specific task, such as detecting objects in an image or summarizing the contents of a document.
“Once you've trained the model, you put it out into the wild on data that it has never seen before, which means there's new documents coming in all the time. And this model will then start extracting the things that it was trained to do,” Philomin said.
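To picture that train-then-deploy cycle, here is a minimal Python sketch using scikit-learn text classification as a stand-in; the documents, labels and task are invented for illustration and are not drawn from any company's workflow.

    # Minimal sketch of the train-then-deploy pattern: fit on labeled data,
    # then apply the model to documents it has never seen.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical labeled training documents.
    train_docs = [
        "stuck pipe at 9,800 ft",
        "H2S alarm triggered on deck",
        "bit trip completed ahead of schedule",
    ]
    train_labels = ["drilling", "safety", "drilling"]

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(train_docs, train_labels)   # training on known data

    # "Out in the wild": classify a new, unseen document.
    new_docs = ["gas detector reading elevated near separator"]
    print(model.predict(new_docs))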
In data we trust?
Honeywell’s Dutta noted the oil and gas industry generates massive volumes of data daily. Often, it is not “refined” into a form in which it can be used properly. AI, he said, helps with that.
“If data is the new oil, AI is the refining,” he said.
AI is particularly useful in the geoscience sector, which acquires huge amounts of data that must be heavily processed in order to be of use in guiding exploration, said Song Hou, who heads CGG’s AI lab.
“Over the last 10 years, the whole industry, including CGG, has been actively adapting AI to optimize our workflows,” he said.
AI can provide more consistency, he noted, because people might interpret seismic imaging differently, whereas an ML model integrated into the workflow will be more objective.
“If we do purely rely on the human, we have limited ability to process big data," he said. When interpreting satellite imagery manually, "we probably look at 10 scenes per day or even less.”
But harnessing the power of the ML model makes it possible to improve efficiency because it can process much more data more rapidly.
“It’s not just for the speed, it's for the complexity because sometimes we get the data from different sources,” he said.
When data comes from different sources, he said, the relationships can be too complex for a human or traditional algorithms to extract.
“That's where machine learning and AI come into play. They effectively analyze our complex data and help us to extract the useful information,” Song said.
But the results depend on a solid starting point: good data. Trust in the training model is critical.
Getting good results out of generative AI hinges on the data that goes into it, so it’s vital to curate the data, said James Brady, chief digital officer for Baker Hughes oilfield services and equipment. Biases in the training data can lead to biased responses, he noted.
“We're in a very high risk industry, right? And our data is not always so good,” he said.
Scrubbing data is part of ensuring the training data is trustworthy.
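What that scrubbing can look like in practice is easier to see with a small, hypothetical example. The pandas sketch below drops duplicates, discards physically impossible readings and fills short gaps; the file, column names and thresholds are placeholders, not anything described by Baker Hughes.

    import pandas as pd

    # Hypothetical sensor export; names and limits are illustrative only.
    df = pd.read_csv("pump_sensor_export.csv")

    df = df.drop_duplicates()                              # remove repeated records
    df = df[df["pressure_psi"].between(0, 15000)]          # discard impossible readings
    df["temperature_c"] = df["temperature_c"].interpolate()  # fill short gaps
    df = df.dropna(subset=["pump_id", "timestamp"])        # require key identifiers

    df.to_csv("pump_sensor_clean.csv", index=False)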
Training days
ML training is now largely unsupervised, a shift from the early days of AI and ML, when humans were heavily involved in the process, Brady said.
The pivot to unsupervised learning, along with cloud computing capabilities, made it possible to use the models to generate something new, he said.
“When you talk about training these large language models, you talk about it in terms of gigawatts,” Brady said. “You don't talk about megaflops or any of that stuff that we used to. You talk about it in terms of energy consumption. That's how big some of this stuff is.”
The approaches used for training standard ML and generative AI are similar, but generative AI models are exponentially larger, leaping from about 300 million parameters in 2019 to 500 billion by 2022, and the training sets needed to build them have grown correspondingly, Philomin said.
“The bigger the model is, the more data you need to use to train those models,” Philomin said.
The learning goes on behind the scenes.
“Given the scale of data that these big models have seen, they've got a whole bunch of patterns and knowledge that they have learned automatically behind the scenes. And now you can, in a very simple way, unlock all of that knowledge and put these models to use,” Philomin said.
Without much effort, he added, it’s possible to get large models to perform multiple tasks, rather than the single task that older-generation ML models were limited to.
“You don't have to build a separate model for each task anymore,” he said.
Instead, the big model can be primed with a few examples, and then it can do the task in question.
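That priming step is commonly called few-shot prompting. A hypothetical sketch of the idea follows; the example reports, the assessments and the send_to_model call are placeholders rather than any particular vendor's API.

    # Few-shot prompting: the same general-purpose model handles a new task
    # after seeing only a handful of examples inside the prompt itself.
    examples = [
        ("Tubing pressure climbing steadily on Well A-12.", "possible blockage"),
        ("ESP current draw erratic, frequent trips.", "pump wear"),
    ]

    def build_prompt(new_report: str) -> str:
        shots = "\n".join(f"Report: {r}\nAssessment: {a}" for r, a in examples)
        return f"{shots}\nReport: {new_report}\nAssessment:"

    prompt = build_prompt("Annulus pressure rising after shut-in.")
    # response = send_to_model(prompt)   # placeholder for whichever hosted model is used
    print(prompt)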
The upshot?
“Generative AI is now within the reach of everybody,” Philomin said.
What has partly thrust generative AI into the limelight this year actually stretches back about six years, to when cloud vendors started offering pay-as-you-go access to computing.
“Machine learning requires a lot of compute,” Philomin said.
Elastic computing allows a user to spin up compute power on thousands of machines for as long as it’s needed and then release them when the work is done. That elastic access to compute made ML accessible to people who couldn’t afford their own hardware and data center.
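As a rough illustration of that spin-up-and-release pattern, the sketch below uses the AWS SDK for Python (boto3) to launch a batch of instances and terminate them afterward; the machine image, instance type and count are placeholders, and the services and settings a real team would use will differ.

    import boto3

    # Spin up a fleet for a compute-heavy job, then release it when done.
    ec2 = boto3.client("ec2", region_name="us-east-1")

    fleet = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # hypothetical machine image
        InstanceType="p3.2xlarge",
        MinCount=1,
        MaxCount=8,
    )
    instance_ids = [i["InstanceId"] for i in fleet["Instances"]]

    # ... submit and monitor the training job ...

    ec2.terminate_instances(InstanceIds=instance_ids)  # stop paying once the work is done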
The cloud was a natural place for data to accumulate, Philomin said.
“When all of that data sits in one place, you can drive a whole bunch of insights and make that data much more useful than the data just sitting there,” Philomin said.
Manoj Saxena, founder of the Responsible AI Institute, said AI needs “guardrails” at every stage, from design to development and deployment to monitoring. That prevents AI from going off-path and creating massive unintended consequences.
“AI is unlike any other technology that humankind built. Every system we have built before was rules-based,” he said.
AI, however, is “pattern based and learning based.”
“These are systems that connect the dots across pieces of information or pieces of image, and then they learn and evolve on their own. And it's that potential and peril that requires responsible AI. So on one hand, you have this ginormous brain that is able to make sense out of a lot of patterns and learn and evolve on its own. Yet on the other hand, if that system learns the wrong things, it has potential to create massive harm.”
Fundamental change
Moe Tanabian, chief product officer at Cognite, said generative AI has the potential to fundamentally change the way people work. Historically, specialists were required to build an app or piece of software to answer specific business questions.
“Now, that application is being replaced by an English—or any language—sentence,” Tanabian said. “That is the application. Now, everybody, as long as they can speak in natural language, they are a software developer.”
In short, he said, generative AI takes a natural language sentence and turns it into code, which is something the underlying software technology can understand. A well-considered query can uncover important information for a business.
“Let's say I want to buy a hundred pumps for my facilities in Alaska, and the temperature gets really low. I can ask a question from my data platform that shows me all the pumps that have failed or have had downtime of more than two hours in the last six months where the preceding week's temperature was below minus 15 degrees,” Tanabian said. “We can do this now. We can answer that question. But what it takes to answer this question, you have to have a digital representation of your refinery, your offshore platform, your factory.”
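To make that concrete, the hypothetical pandas sketch below shows the kind of query a generative AI system might produce from such a sentence, assuming the platform exposes pump downtime events and daily site temperatures as tables; the schema, file names and thresholds are invented for illustration.

    import pandas as pd

    # Hypothetical tables a digital representation of the facility might expose.
    events = pd.read_csv("pump_downtime_events.csv", parse_dates=["start_time"])
    weather = pd.read_csv("daily_site_temps.csv", parse_dates=["date"])

    six_months_ago = pd.Timestamp.now() - pd.DateOffset(months=6)
    recent = events[(events["start_time"] >= six_months_ago) &
                    (events["downtime_hours"] > 2)].copy()

    # Minimum temperature over the week preceding each downtime event.
    def preceding_week_min(row):
        window = weather[(weather["date"] >= row["start_time"] - pd.Timedelta(days=7)) &
                         (weather["date"] < row["start_time"])]
        return window["temp_c"].min()

    recent["prior_week_min_c"] = recent.apply(preceding_week_min, axis=1)
    cold_failures = recent[recent["prior_week_min_c"] < -15]
    print(cold_failures[["pump_id", "start_time", "downtime_hours"]])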
Sriram Srinivasan, senior vice president for Halliburton Global Technology, said generative AI is especially helpful with coding because programming languages require precision.
“The more precise the language is, the better the generated AI based on that language will be. So it's no surprise that coding is seeing the most impact at the moment,” he said.
That is one reason training sets have to be so large: they are needed to create the large language models that generative AI runs on.
But one of the biggest shortcomings of generative AI is its inability to admit when it doesn’t know something.
As Song put it, “Generative AI can talk nonsense confidently.” He said the key to successful applications at CGG is supplying it with precise and accurate data.
The faulty responses it provides—the hallucinations—happen fairly frequently. Sometimes they can come across as funny. But in a safety-minded industry like oil and gas, a wrong response can be disastrous.
As an example, Brady said an offshore worker in the Caspian Sea might ask a generative AI assistant what to do if they smell eggs.
“Some of us that have been around know exactly what that means,” he said.
But a response from a generative AI system not trained on oil and gas safety might recommend a trip to the canteen to order eggs and bacon rather than advise the worker to take precautions against hydrogen sulfide poisoning, he said.
“I think within our industry, there's going to be a lot of caution around that, where you have really high liability decisions and impact,” Brady said. “I think it's incredibly powerful. I don't want to paint all the bad things, but I think there's a certain sobriety that we all have to have with any new technology.”
Editor’s note: This is the first part of a multi-part series examining the use of artificial intelligence in the oil patch.