Ask anyone in geophysics or geology and they will tell you that visualization is extremely important to their job. Yet what they are really referring to is not the renderings themselves but the workflows and how they interact with the data. That is part of the challenge: previous-generation visualization toolkits allowed only limited interaction with the data, often in surprisingly unintuitive ways.
For example, users have been required to view and interact with their prestack and poststack seismic data in separate, disconnected views rather than together.
Figure 1. Prestack for Interpreters enables the user to simultaneously view and interact with the poststack data (as served by Petrel natively) in combination with inline and crossline prestack common midpoint gathers. (All images courtesy of Hue)
“Blame the workflows”
Many vendors who promote visualization tend to overlook the importance of end-user workflows, promoting instead visual rendering features or scalable visualization technology as a cure-all. It is important to understand that the choice of applications (or solution framework) and the choice of visualization solution need to go hand in hand.
For example, it is entirely feasible to scale up the visualization performance of a single-user, single-workstation application. In some cases that may be the most appropriate thing to do, but only if visualization is the main bottleneck to overall workflow efficiency. That in turn implies that all other computations (e.g., post-processing, attribute calculation, simulations and so on) are not the real bottlenecks. As any end-user will tell you, that is rarely the case.
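To make the point concrete, consider a back-of-the-envelope estimate in the spirit of Amdahl's law. The 20/80 split below is purely hypothetical: if rendering accounts for only a fifth of end-to-end turnaround time, even an arbitrarily fast visualization layer caps the overall gain at about 1.25x.

```python
# Hypothetical illustration (Amdahl's law): overall speedup when only the
# visualization stage of a workflow is accelerated.

def overall_speedup(viz_fraction: float, viz_speedup: float) -> float:
    """End-to-end speedup when only the visualization share is accelerated."""
    return 1.0 / ((1.0 - viz_fraction) + viz_fraction / viz_speedup)

# Suppose rendering is 20% of turnaround time and attribute generation,
# processing and simulation make up the remaining 80%.
print(overall_speedup(viz_fraction=0.2, viz_speedup=10))   # ~1.22x
print(overall_speedup(viz_fraction=0.2, viz_speedup=1e9))  # ~1.25x ceiling
```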
The visualization revolution
To tell a tale from our own experience as a visualization toolkit vendor: HueSpace1 was the first commercially available toolkit for cluster-based volume visualization. Introduced to the market in 2003 as an integral part of the GigaViz product from Schlumberger Information Solutions, it relied on the CPU for all computation and visualization. Its benefit was that it allowed end-users to interactively view very large datasets. Unfortunately, that was all HueSpace1 did: with cluster-based visualization it offered end-users little more than scalability.
Three years later, after a significant investment in research and development, HueSpace2 was introduced.
Figure 2. Prestack gathers can also be volume-rendered in a probe along with the poststack section.
Innovative prestack workflows
The HueSpace2 framework/toolkit bears little resemblance to the architecture of other existing toolkits. One of the ways it differs is that HueSpace2 assumes that data, and the relationships between data, are multidimensional in nature. Headwave (formerly Finetooth Inc.) uses the HueSpace2 technology for prestack/poststack quality control, prestack interpretation and related workflows, giving the end-user instant access to terabyte-sized prestack datasets. Prestack datasets are multidimensional by nature. The Headwave platform can present the actual seismic data in multiple ways (wiggle traces, 3-D prestack/poststack linkage and 4-D comparisons) and switch between different domains. The applications also maintain header information and all the metadata, so that end-users and algorithms can use the metadata and the inherent data relationships as part of their workflows.
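The internal data model of HueSpace2 and Headwave is not public, but the idea of serving one prestack trace store under several sort orders, with header metadata preserved, can be sketched as follows. The class and field names are illustrative assumptions, not the actual API.

```python
# Hypothetical sketch of a prestack trace index that preserves header
# metadata and supports more than one sort order (e.g. CMP gathers and
# common-offset planes). Not the Headwave/HueSpace2 data model.
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class TraceHeader:
    inline: int
    crossline: int
    offset: float
    trace_id: int          # position of the trace in the underlying store

class PrestackIndex:
    def __init__(self, headers):
        self.by_cmp = defaultdict(list)      # (inline, crossline) -> traces
        self.by_offset = defaultdict(list)   # offset bin -> traces
        for h in headers:
            self.by_cmp[(h.inline, h.crossline)].append(h)
            self.by_offset[round(h.offset, -1)].append(h)

    def gather_at(self, inline, crossline):
        """Traces of the CMP gather under a poststack (inline, crossline) pick."""
        return sorted(self.by_cmp[(inline, crossline)], key=lambda h: h.offset)

# As the interpreter moves along a poststack section, a linked prestack view
# simply asks for the gather at the picked (inline, crossline) location:
index = PrestackIndex([TraceHeader(100, 250, o, i) for i, o in enumerate(range(0, 3000, 100))])
print(len(index.gather_at(100, 250)))   # 30 traces in this synthetic gather
```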
Figure 1 shows Headwave’s Prestack for Interpreters plug-in for the Petrel Workflow Tools 2007 release. Here the user can simultaneously view and interact with the poststack data (as served by Petrel natively) in combination with inline and/or crossline prestack common midpoint gathers. The gathers are rendered in 3-D as 2-D cross sections linked to the poststack cross section. The interpreter can move the prestack gathers along the poststack cross section, and they are immediately updated, ready for further detailed analysis.
As an alternative, the prestack gathers can be volume-rendered in a probe along with the poststack section and all other data (grids, faults, attributes) in the Petrel project (Figure 2).
Headwave thus uses the HueSpace2 toolkit to load and serve data to the end-user. The prestack and poststack datasets are stored in a multidimensional, compressed format (with support for multiple sort orders). When a user pans through or interacts with the data, the GPUs decompress the data on the fly. Normally, accessing arbitrary parts of terabyte-plus prestack data in real time would require hundreds of cluster nodes; the HueSpace2 engine handles this on a single PC workstation, even at full 32-bit resolution. Our benchmarking of wavelet compression/decompression shows that one GPU does the work of 200 CPUs.
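The wavelet codec and its GPU kernels are proprietary, but the access pattern can be sketched: store the volume as independently compressed bricks and decompress only the bricks touched by the current view. The snippet below uses zlib on the CPU purely as a stand-in for the GPU wavelet codec described above; the brick size and class names are assumptions.

```python
# Illustrative only: a bricked volume whose bricks are compressed at rest
# and decompressed on demand as the user pans. zlib stands in for the
# GPU wavelet codec described in the article.
import zlib
import numpy as np

BRICK = 64  # brick edge length in samples (assumed)

class BrickedVolume:
    def __init__(self, volume: np.ndarray):
        self.bricks = {}
        for i in range(0, volume.shape[0], BRICK):
            for j in range(0, volume.shape[1], BRICK):
                for k in range(0, volume.shape[2], BRICK):
                    chunk = np.ascontiguousarray(volume[i:i+BRICK, j:j+BRICK, k:k+BRICK])
                    self.bricks[(i, j, k)] = (chunk.shape, zlib.compress(chunk.tobytes()))

    def brick(self, i, j, k):
        """Decompress a single brick only when the view actually needs it."""
        shape, blob = self.bricks[(i, j, k)]
        return np.frombuffer(zlib.decompress(blob), dtype=np.float32).reshape(shape)

vol = BrickedVolume(np.random.rand(128, 128, 128).astype(np.float32))
slice_brick = vol.brick(0, 64, 0)   # one brick touched by the current inline
print(slice_brick.shape)            # (64, 64, 64)
```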
Interpreting workflows
Earlier we addressed the need for more intuitive workflows and the role visualization toolkits play in enabling them.
Traditionally, probes used in interpretation workflows are box-shaped or, at best, bounded by two defined surfaces such as seismic horizons. As every interpreter knows, geo-bodies and subsurface regions of interest can take any shape or form. Being limited to box-shaped probes makes interpretation harder and more time-consuming. Ideally, any polygonal shape should be able to define a body or probe/region of interest, and there should be no distinction between “polygons” and “voxels.”
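As a minimal sketch of treating an outline and a voxel probe as the same thing, the snippet below rasterizes a hypothetical map-view polygon into a boolean mask and extrudes it over a depth window. The geometry, grid sizes and depth range are made up for illustration; a production toolkit would evaluate this against real horizons and on the GPU.

```python
# Minimal sketch: turn an arbitrary map-view polygon into a voxel probe mask,
# here extruded over a fixed depth window. Shapes and sizes are hypothetical.
import numpy as np
from matplotlib.path import Path

n_inline, n_xline, n_z = 200, 200, 100
outline = Path([(30, 40), (160, 25), (180, 150), (70, 170), (20, 110)])  # arbitrary shape

il, xl = np.meshgrid(np.arange(n_inline), np.arange(n_xline), indexing="ij")
inside = outline.contains_points(np.column_stack([il.ravel(), xl.ravel()]))
mask2d = inside.reshape(n_inline, n_xline)

probe = np.zeros((n_inline, n_xline, n_z), dtype=bool)
probe[:, :, 20:65] = mask2d[:, :, None]       # extrude between two depth slices

amplitudes = np.random.rand(n_inline, n_xline, n_z)
print(amplitudes[probe].mean())               # statistics restricted to the probe
```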
Removing these obstacles opens up interesting applications. For instance, any interpreted object, such as a tracked geo-body, can itself be used as a probe or region of interest (Figure 3).
Figure 3. Volume visualization of tracked geo-bodies using HueSpace2. Each geo-body is a separate object and can be interacted with independently or in combination with others.
Computation and visualization unite
True workflow efficiencies are achieved when the gap between visualization and processing is bridged transparently for the end-user. It used to be that generating certain attributes was so time- and resource-intensive that the resulting cubes took days to output. Today it is possible to design interactive workflows in which algorithms such as segmentation, connectivity analysis, volume deformation and processing run as part of the visualization pipeline.
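Connectivity analysis of the kind mentioned here can be expressed compactly. The sketch below thresholds a small synthetic amplitude cube and labels face-connected bodies with scipy; it stands in for the GPU-side step an interactive pipeline would actually run, and the threshold is an arbitrary assumption.

```python
# Sketch of connectivity analysis: threshold an amplitude cube and label
# connected geo-bodies. A synthetic cube stands in for real seismic data.
import numpy as np
from scipy import ndimage

cube = np.random.rand(64, 64, 64)
seeds = cube > 0.995                      # hypothetical amplitude threshold
bodies, n_bodies = ndimage.label(seeds)   # face-connected components in 3-D

sizes = np.bincount(bodies.ravel())[1:]   # voxel count per labelled body
print(n_bodies, sizes.max() if n_bodies else 0)
```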
As another example, it is now often much faster to generate a “visual attribute on demand” than to retrieve it from disk. New workflows can therefore be designed that avoid unnecessary and tedious work for the end-users. Instead, end-users can focus on the task at hand and immediately see results regardless of the underlying effort at the toolkit and application level.
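A toy version of a “visual attribute on demand” might look like the following: compute an envelope attribute only for the inline currently on screen and memoize it, rather than writing a full attribute cube to disk up front. The function name, cache size and synthetic data are illustrative assumptions, not a toolkit API.

```python
# Toy "attribute on demand": compute the envelope attribute only for the
# inline currently on screen and cache it, instead of precomputing and
# storing a full attribute cube. Illustrative only.
from functools import lru_cache
import numpy as np
from scipy.signal import hilbert

seismic = np.random.randn(100, 200, 500).astype(np.float32)  # inline, xline, samples

@lru_cache(maxsize=32)
def envelope_inline(inline: int) -> np.ndarray:
    """Instantaneous-amplitude attribute for one inline, computed when first viewed."""
    return np.abs(hilbert(seismic[inline], axis=-1))

view = envelope_inline(42)   # first request computes the attribute
view = envelope_inline(42)   # repeat requests are served from the cache
print(view.shape)            # (200, 500)
```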
Over the next few years, geophysicists and geologists will see a shift in the usability of their subsurface applications. Next-generation data processing, interpretation, modeling and simulation applications will be able to take advantage of combined visualization and compute capabilities similar to those described in this article. We believe this combination will make end-user workflows significantly more intuitive and easier to use.