The advent of cluster computing, massively parallel processing and less expensive memory has paid off in the ability to perform seismic processing steps that would have been unthinkable just a few years ago.
Imaging the subsurface with sound (seismic exploration) has always been a science dominated by the art of compromise. Early explorers used arrays of geophones in the field to filter the spatial bandwidth of signals recorded. Why? Because the galvanometers used on the cameras had a very small dynamic range and it was necessary to let them "see" only the most relevant signal.
Seismic processing is the practice of selectively reducing the data recorded in the field to some manageable dataset that can be digested by the interpreter. The process of getting to the "right" dataset has always been limited by a mixture of what technology allows and the art and skill of the processor who must select what data to keep and what data to throw away.
The technological limits on the processor's art have been reduced by the fantastic developments in computer power we have all experienced in the last two decades. The theory of what we want to do in processing usually leads our ability to actually do it, and the evolution of processing is rather like pulling a spring along the floor: computing capability lags the requirements of theory and then suddenly springs forward, offering the opportunity for new theory and new practices to develop.
Today we can process more data more correctly and with fewer limiting assumptions than we could have imagined a short decade ago. The results presented in this article are compelling and lead one to wonder where we will be a decade from now.
Single-sensor technology
One recent major development in our industry - that of acquiring data from each seismic sensor as a separate digital channel - opens the door to the next level of reservoir-quality data. Techniques such as noise attenuation can now be handled in a purely digital domain, enabling major improvements in the signal-to-noise ratio. This, in turn, encourages the use of high-fidelity algorithms whose cost cannot be justified when sensors are summed into analog groups. To capture all of the value requires a complete reappraisal of the way data are calibrated and used. Just as importantly, it also demands an even greater allocation of compute resources. A single-sensor survey, assuming the processing starts from this point, must cope with between five and 20 times the data volume of a conventional survey. Today's single-sensor surveys generate more than a terabyte of measurements every day, all of which must be analyzed in near-real time and processed to a final result in ever-faster turnaround cycles. This daunting challenge cannot be met by treating data acquisition and processing as separate activities. Indeed, they are so intimately linked that it should not be possible to define whether a particular rack of computers is doing an acquisition, processing or quality control (QC) function at any particular moment.
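To give a concrete picture of what "purely digital" noise handling means, the sketch below forms groups from single-sensor traces only after a simple per-sensor noise edit. The group size, weights and median-based rejection rule are illustrative assumptions, not any contractor's actual algorithm.

```python
import numpy as np

def digital_group_form(traces, group_size=8, weights=None):
    """Sum single-sensor traces into groups after a per-sensor noise edit.

    traces : 2-D array, shape (n_sensors, n_samples)
    Returns one output trace per group of group_size sensors.
    """
    n_sensors, n_samples = traces.shape
    if weights is None:
        weights = np.ones(group_size) / group_size   # uniform weights, like an analog array
    n_groups = n_sensors // group_size
    out = np.zeros((n_groups, n_samples))
    for g in range(n_groups):
        block = traces[g * group_size:(g + 1) * group_size]
        # down-weight sensors whose energy is far above the group median,
        # e.g. a single sensor hit by swell noise (threshold is hypothetical)
        energy = np.sum(block ** 2, axis=1)
        keep = energy < 4.0 * np.median(energy)
        w = weights * keep
        if w.sum() > 0:
            w = w / w.sum()
        out[g] = w @ block
    return out
```

An analog array applies the uniform sum unconditionally in the field; recording each sensor digitally is what makes the data-dependent editing step possible at all.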
It can be argued whether we even need more data. However, techniques such as inverse scattering demultiple work best with a finely sampled wavefield, while real 4-D repeatability depends upon the ability to repeat a configuration both inline and crossline. The use of deterministic 4-D may not be widespread yet; however, as the single-sensor technology that enables it spreads, it will become the most accepted approach. This will have major consequences for the calibration and repeatability demands placed on both acquisition and data processing. Furthermore, this must be done in days, not months, if we as an industry are to make seismic reservoir definition a real business and not just a research project.
Wave equation migration
The basic principles underlying wave propagation in elastic media have been known for well over a century. Yet geophysicists have been unable to make full use of these principles to properly image the Earth's interior through seismic sounding. The culprit has always been the lack of compute power, and geophysicists have had to rely on approximations to make subsurface imaging practical and affordable.
It is, therefore, no surprise that breakthroughs in seismic imaging have always been triggered by revolutions in the computer industry. The most recent development of the past year or two has been the implementation of wave equation migration for 3-D prestack depth imaging. Back in the late 1970s wave equation migration was used for 2-D time migration and was seen as the high end of the available algorithms. Now the wavefield extrapolation approach is again being used at the high end, only the stakes have changed: it is now applied to full 3-D prestack imaging in depth. It is only because of the latest hardware technology that such approaches can be considered economical on an industrial scale. The wave equation approach offers superior results to the previous state-of-the-art Kirchhoff method, both in improved lateral resolution and in the handling of complex multi-pathing, which is particularly relevant for subsalt imaging. Based on the belief that it will rapidly become the standard, the wave equation solution is now being offered for the same price as the Kirchhoff method at CGG. The combination of hardware developments and research has enabled efficient implementation of the wave equation approach, allowing all available seismic shots to be migrated without decimation and the final migrated data to be output as high-quality angle gathers, which in turn allow velocity model updating, post-processing and amplitude-variation-with-azimuth analysis.
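To give a flavor of what a wavefield-extrapolation migration actually computes, the sketch below performs a single downward-continuation step of a one-way extrapolator in the frequency-wavenumber domain for a constant-velocity layer. The 2-D simplification and all names are assumptions for illustration, not CGG's production code.

```python
import numpy as np

def phase_shift_step(wavefield_fkx, freqs, kx, velocity, dz):
    """Downward-continue a 2-D wavefield by one depth step dz.

    wavefield_fkx : complex array, shape (n_freq, n_kx), P(f, kx) at depth z
    freqs, kx     : frequency (Hz) and horizontal wavenumber (cycles/m) axes
    velocity      : constant velocity of the layer (m/s)
    dz            : depth step (m)
    Returns P(f, kx) at depth z + dz.
    """
    f = freqs[:, None]
    k = kx[None, :]
    # vertical wavenumber from the one-way dispersion relation
    kz2 = (2 * np.pi * f / velocity) ** 2 - (2 * np.pi * k) ** 2
    kz = np.sqrt(np.maximum(kz2, 0.0))
    propagator = np.exp(-1j * kz * dz)
    propagator[kz2 <= 0] = 0.0          # discard evanescent energy
    return wavefield_fkx * propagator
```

In a shot-profile implementation, the source and receiver wavefields are stepped down in this way, layer by layer, and combined with an imaging condition at each depth to build the image and the angle gathers.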
Wave equation PSDM solutions
The last decade has seen the prestack depth migration (PSDM) market transition from imaging small target areas using Kirchhoff techniques to large-scale Kirchhoff projects and on to imaging small- and medium-sized projects using wavefield extrapolation based on a solution of the one-way wave equation. This has been made possible by the rise of the Linux PC cluster, whose price/performance is constantly improving, allowing processors to pursue better and more computationally intensive algorithms. As long as this trend continues, the size of projects that use wave equation PSDM will increase, and algorithms will continue to improve toward the ultimate goal of a wavefield extrapolator capable of handling wave propagation at all dips in a medium with arbitrary lateral velocity variations. Such algorithms might be based on a solution of the full two-way wave equation.
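Building on the constant-velocity step sketched earlier, one of the simpler ways a one-way extrapolator accommodates lateral velocity variation is the split-step Fourier correction, shown here under the same illustrative 2-D assumptions.

```python
import numpy as np

def split_step_extrapolate(wavefield_fx, freqs, dx, velocity_slice, dz):
    """One split-step Fourier depth step with laterally varying velocity.

    wavefield_fx   : complex array, shape (n_freq, n_x), P(f, x) at depth z
    velocity_slice : 1-D array of velocity v(x) at this depth (m/s)
    """
    v_ref = np.mean(velocity_slice)            # reference (background) velocity
    kx = np.fft.fftfreq(wavefield_fx.shape[1], d=dx)
    omega = 2 * np.pi * freqs[:, None]

    # 1) phase shift in the wavenumber domain using the reference velocity
    p_kx = np.fft.fft(wavefield_fx, axis=1)
    kz2 = (omega / v_ref) ** 2 - (2 * np.pi * kx[None, :]) ** 2
    kz = np.sqrt(np.maximum(kz2, 0.0))
    p_kx *= np.where(kz2 > 0, np.exp(-1j * kz * dz), 0.0)
    p_x = np.fft.ifft(p_kx, axis=1)

    # 2) space-domain correction for the local departure from the reference velocity
    correction = np.exp(-1j * omega * dz * (1.0 / velocity_slice[None, :] - 1.0 / v_ref))
    return p_x * correction
```

More accurate extrapolators refine this idea with multiple reference velocities or higher-order corrections, which is where much of the extra compute cost of wave equation PSDM goes.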
Figure 2 illustrates the improvements that have been achieved by moving from Kirchhoff PSDM to wave equation PSDM. We can expect further incremental improvements as wave equation algorithms improve. The dataset is from the Gulf of Mexico. The overall improvement in image resolution is clear, as is the improved resolution of the sediment beds on the left edge of the salt body and the improved imaging of the subsalt reflectors.
Subsalt imaging
Beginning in 2003, All-Shot 3-D Prestack Wave Equation Finite-Difference Depth Migration started being applied routinely to improve the imaging around salt bodies in the Gulf of Mexico. For the first time, all shots were being used. This alone would not have been enough to generate a lot of attention. What made this attractive is that:
Full volume iterations could be turned around in a fraction of the time they had previously taken; and
Angle-of-incidence domain gathers were being generated at every bin location for every single migration iteration.
This gave the interpreter access not only to inline, crossline and depth-slice data, but also to the corresponding gathers, all of which could be quickly brought up on one of the multi-screen sessions set up for interpretation. This enhanced his or her ability to interpret complex salt bodies and surrounding sediments in a homogeneous, interactive processing/interpretation workstation environment developed specifically so that processing and interpretation could work hand in hand.
Depth imaging projects are by their nature very interpretive. The interaction between the interpreter and the processor had been overshadowed by the time element and the compromises that had to be made in order to complete projects in a reasonable time. Being able to reduce the full migration iterations to days from weeks or months now allows the geoscientist to quickly test his or her interpretation and update the velocity model.
The added bonus of looking at angle gathers to corroborate the interpretation and make sense of extremely complex geology cannot be overstated. Angle-of-incidence gathers also yield another benefit: the possibility of "tuning" the mutes to sharpen the final migrated stack images.
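As a minimal sketch of what "tuning" a mute means in practice, the snippet below applies an outer angle mute to angle gathers and stacks the result; the array layout, function name and angle limits are hypothetical choices for illustration.

```python
import numpy as np

def mute_and_stack(angle_gathers, angles, max_angle):
    """Apply an outer angle mute to angle gathers, then stack.

    angle_gathers : array, shape (n_bins, n_angles, n_depths)
    angles        : 1-D array of incidence angles (degrees) for axis 1
    max_angle     : mute out traces beyond this angle
    Returns the muted stack, shape (n_bins, n_depths).
    """
    keep = angles <= max_angle
    muted = angle_gathers[:, keep, :]
    return muted.mean(axis=1)

# e.g. compare stacks with different mutes to pick the sharpest subsalt image:
# stack_30 = mute_and_stack(gathers, angles, 30.0)
# stack_45 = mute_and_stack(gathers, angles, 45.0)
```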
3-D SRME
Unfortunately, when we collect seismic data we record a lot of energy that actually degrades our ability to see what we are looking for. One class of such energy is multiples, energy that "bounces around" between interfaces in the subsurface instead of being directly reflected to our recording sensors. This results in apparent reflections on the seismic section that interfere with real mapping of the subsurface. Multiples often are stronger than and may obscure the primary energy we wish to interpret. These events are often easily recognized but can be surprisingly difficult to remove.
One of the more effective techniques used in the marine environment today is free-surface-related multiple elimination (SRME). Here the multiples are modeled from the raw data themselves using cross convolutions, and no additional information is required (assuming positioning data and a source wavelet are available). All multiples associated with the free surface will be accurately estimated. These include "bounces" in the water column and those associated with the top and base of salt.
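In its simplest 2-D form, the prediction step convolves the data with themselves over a shared surface location. The sketch below assumes co-located shots and receivers on a regular grid, a condition that real 3-D SRME works hard to construct through regularization; names and scaling are illustrative.

```python
import numpy as np

def predict_surface_multiples(data, dt):
    """2-D SRME-style multiple prediction by convolving the data with themselves.

    data : array, shape (n_shots, n_receivers, n_samples); shots and
           receivers are assumed co-located on the same regular grid
    Returns a multiple model with the same shape as the input.
    """
    n_s, n_r, n_t = data.shape
    assert n_s == n_r, "sketch assumes co-located shot and receiver grids"
    D = np.fft.rfft(data, axis=2)          # time convolution = frequency-domain product
    model = np.zeros_like(D)
    for i_f in range(D.shape[2]):
        # sum over the shared surface location x: M(s, r) = sum_x D(s, x) D(x, r)
        model[:, :, i_f] = D[:, :, i_f] @ D[:, :, i_f]
    return np.fft.irfft(model, n=n_t, axis=2) * dt
```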
3-D SRME is a combination of steps and processes that attempt to describe the real 3-D nature of the multiples present in our data. The details are quite complex and beyond the scope of this piece. Put simply, it requires:
3-D data regularization and construction;
3-D multiple model prediction;
3-D model deregularization; and
3-D adaptive subtraction of multiple predictions from the raw data (a sketch of this last step follows the list).
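The adaptive subtraction step is commonly implemented with short least-squares matching filters. The single-trace Wiener-filter version below is a simplified sketch, with filter length and pre-whitening chosen purely for illustration; production systems design such filters in space-time windows across many traces.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def adaptive_subtract(trace, multiple_model, filter_len=21, eps=1e-3):
    """Least-squares adaptive subtraction for a single trace.

    Designs a short matching filter that shapes the predicted multiple
    model to the recorded trace, then subtracts the shaped model.
    Assumes trace and multiple_model have the same length.
    """
    # autocorrelation of the model and crosscorrelation with the data
    auto = np.correlate(multiple_model, multiple_model, mode='full')
    cross = np.correlate(trace, multiple_model, mode='full')
    mid = len(multiple_model) - 1              # zero-lag index
    r = auto[mid:mid + filter_len].copy()
    g = cross[mid:mid + filter_len]
    r[0] *= (1.0 + eps)                        # pre-whitening for stability
    f = solve_toeplitz(r, g)                   # solve the Toeplitz normal equations
    shaped = np.convolve(multiple_model, f)[:len(trace)]
    return trace - shaped
```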
While some of the improvements in this technique may be subtle, in high-risk, high-cost environments these subtleties may be the difference between success and failure. And when multiples are not eliminated effectively, the application of prestack depth migration may degrade the image.
Editor's note: The staff of E&P magazine polled several of the largest geophysical contractors to get their insights into the next wave of seismic data processing. Contributors included Chris Cunnel, WesternGeco; Guillaume Cambois, CGG; Clive Gerrard, PGS; Frank Dumanoir and Peter Bennion, TGS Imaging; and Colin Murdoch, Veritas Geoservices. Introduction by Peter Duncan, president of the Society of Exploration Geophysicists.