Growing dataset sizes and computational demands, driven by techniques for better understanding the subsurface as exploration progresses, have spurred a wave of developments to meet digital oilfield needs.
E&P companies are finding data solutions by turning to systems developed by companies such as NVIDIA. Technical experts from several companies, including Petrobras, recently participated in Hart Energy’s three-part “Big Data and the Cloud” webinar series. The second installment focused on how to make more accurate prospect decisions in less time with hybrid compute environments.
Nowadays, “it’s not enough just to have good seismic data. You need to have the content to be able to make better drilling decisions and well path locations. So you have data from electromagnetic surveys to help correlate that,” said Ty McKercher, an HPC solution architect from NVIDIA. “And every time you add another data type or a different data type or a different discipline, it introduces delays in the system.”
That’s where graphics processing unit (GPU) systems step in, reducing those delays and allowing diverse teams to collaborate and view the data. Such systems are affordable and let companies apply existing knowledge to a familiar tool, McKercher continued. GPUs are being used not only for reverse-time migration but also for wave-equation migration, Kirchhoff time and depth migration, and surface multiple elimination.
Placing Sandy Bridge and GPU hybrid systems side by side, McKercher pointed out how the GPU hybrid fares better. With Sandy Bridge (84 central-processing-unit [CPU] servers), there are two racks drawing 25.2 kW and delivering 58 teraflops (TFLOPS, or trillions of floating-point operations per second) for an initial cost of US $600,000. With a GPU hybrid system (34 servers plus 136 GPUs), there is one rack drawing 25.5 kW and delivering 323 TFLOPS for an initial cost of $400,000. The GPU hybrid is 5.5 times faster.
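A quick back-of-the-envelope calculation, using only the figures quoted above, shows how the throughput, power, and cost numbers translate into efficiency gains for the GPU hybrid:

```python
# Figures quoted from McKercher's Sandy Bridge vs. GPU-hybrid comparison.
cpu_system = {"tflops": 58.0, "kw": 25.2, "cost_usd": 600_000}   # 84 CPU servers, 2 racks
gpu_hybrid = {"tflops": 323.0, "kw": 25.5, "cost_usd": 400_000}  # 34 servers + 136 GPUs, 1 rack

def gain(metric):
    """GPU-hybrid advantage over the CPU-only system for a given metric."""
    return metric(gpu_hybrid) / metric(cpu_system)

perf_per_watt_gain = gain(lambda s: s["tflops"] / (s["kw"] * 1000))  # TFLOPS per watt
perf_per_dollar_gain = gain(lambda s: s["tflops"] / s["cost_usd"])   # TFLOPS per dollar

print(f"performance per watt:   {perf_per_watt_gain:.1f}x")   # ~5.5x
print(f"performance per dollar: {perf_per_dollar_gain:.1f}x")  # ~8.4x
```

Per watt, the gain works out to roughly 5.5 times; per dollar of initial cost, roughly 8.4 times.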
“You can use [fewer] servers and therefore use less power and cooling. You get a lower cost of ownership. It’s not just the initial cost that’s less; it’s the cost over time. And that allows you to have a faster return on investment,” McKercher said. “If you have alternative cooling solutions, you can fill that rack with even more resource, and the peak performance per wattage would go even higher.”
He noted that it’s important for application codes to let GPUs and CPUs cooperate so tasks can be performed simultaneously by the two. For example, data filtering can be done on the CPU while computations run on the GPU.
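The division of labor McKercher describes can be sketched as a simple two-stage pipeline: while one processor works on the current batch, the other prepares the next. The sketch below is a minimal conceptual illustration in Python; `filter_on_cpu` and `compute_on_gpu` are hypothetical stand-ins (the “GPU” step here is just another CPU function), not NVIDIA or Petrobras code.

```python
from concurrent.futures import ThreadPoolExecutor

def filter_on_cpu(batch):
    """Hypothetical CPU-side pre-filter: drop noisy samples."""
    return [x for x in batch if abs(x) < 100]

def compute_on_gpu(batch):
    """Stand-in for a GPU kernel; here just a sum of squares."""
    return sum(x * x for x in batch)

def process(batches):
    """Overlap: while the 'GPU' computes batch i, the CPU filters batch i+1."""
    results = []
    with ThreadPoolExecutor(max_workers=2) as pool:
        filtered = pool.submit(filter_on_cpu, batches[0])
        for nxt in batches[1:]:
            current = filtered.result()
            gpu_job = pool.submit(compute_on_gpu, current)  # "GPU" busy
            filtered = pool.submit(filter_on_cpu, nxt)      # CPU busy at the same time
            results.append(gpu_job.result())
        results.append(compute_on_gpu(filtered.result()))   # drain the last batch
    return results

print(process([[1, 2, 200], [3, 4]]))  # → [5, 25]
```

In a real hybrid code the same shape appears with asynchronous kernel launches and host threads instead of a thread pool, but the scheduling idea is identical.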
Paulo Souza of Petrobras’ geophysical technology group spoke on hybrid computing for seismic processing, pointing out more benefits of GPUs: an abundance of registers (up to 36 terabytes per second [TB/s] of bandwidth on the K10); shared memory at 3 TB/s; hardware-based dynamic load balancing within a single GPU; hardware interpolation; and vector gather/scatter support.
Petrobras started using GPUs in 2006 after moving away from mainframes. Today, GPUs account for more than 90% of the company’s processing power, Souza said.
When using reverse-time migration to image complex structures on GPUs, Souza said, the velocity field is read once per job. Groups of GPUs each process a group of shots, one shot at a time per group, and a group of shots is stacked in memory before being written to disk roughly every three to six hours.
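The job structure Souza outlines — read the velocity field once, migrate shots one at a time, and stack results in memory before flushing to disk — can be sketched roughly as follows. The migration math is a placeholder and names such as `migrate_shot` and `flush_every` are illustrative assumptions, not Petrobras code.

```python
def migrate_shot(shot, velocity):
    """Stand-in for reverse-time migration of one shot (placeholder math)."""
    return [s * v for s, v in zip(shot, velocity)]

def rtm_job(velocity, shots, flush_every=3):
    """Read the velocity field once per job; stack migrated shots in memory,
    flushing the partial stack periodically (every ~3-6 hours in production)."""
    flushed = []                      # stands in for writes to disk
    stack = [0.0] * len(velocity)     # in-memory partial stack
    for i, shot in enumerate(shots, start=1):
        for j, v in enumerate(migrate_shot(shot, velocity)):
            stack[j] += v
        if i % flush_every == 0:      # time to flush the group of shots
            flushed.append(stack)
            stack = [0.0] * len(velocity)
    if any(stack):                    # flush whatever remains at job end
        flushed.append(stack)
    return flushed
```

Reading the velocity model once and batching disk writes keeps I/O off the critical path, which is the point of the design Souza describes.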
Souza also described how to overlap computation and communication between GPUs, which requires a minimum of 1.6 gigabytes per second (GB/s) of transfer bandwidth. The process involves breaking the border data into small pieces and starting a pipeline that overlaps four communication stages.
For example, Souza demonstrated how five tasks run at once. “We have the GPU calculating the bulk of the model. We are moving data from the GPU to the host. Also, we are sending the data to the neighbor, receiving the data from the neighbor, and copying the data from the GPU. It is all done using 1.6 GB/s.”
By using GPUs, Petrobras saw gains of up to 10 times in performance per price and per watt over traditional architectures.
Contact the author, Velda Addison, at vaddison@hartenergy.com.