Lovers of the classics will certainly remember Laurence Olivier screaming "A horse! A horse! My kingdom for a horse!" in the movie "Richard III." Without any additional information, the viewer inevitably concludes that the man is a monster. Shakespeare, forefather of the data interpreters, did a fine job of altering our perception, as historical data-mining reveals a much gentler king. The famous author is not unlike today's geoscientists, who dramatize and caricature data so that it is more easily digested by non-technicians. Is it possible not to misjudge reality? Yes, by collecting all the facts and verifying the sources.
We stay with history but leave England and the House of York to show the importance of contextual information in logs. If contextual information is not recorded and made available to the data user, log data rapidly becomes useless, especially years after acquisition, when it is no longer possible to get in touch with the logging engineer. Passed through the processing of log interpreters, data may sometimes tell the opposite of what it is supposed to say.
What is contextual information?
Before we explore the way it was slowly introduced in the industry, it is necessary to frame the problem. Contextual information is all the data surrounding the main data, the latter generally displayed versus depth. Computer technologists call contextual information metadata. A quick example: a porosity (pu) logging tool is run in a deviated well. At 12,456 ft (3,800 m) measured depth, the value is 23.45 pu.
A few questions need to be asked (a sketch of how the answers might travel with the data follows the list):
Is this a density-, neutron-, nuclear magnetic resonance- or acoustic-derived porosity? Is it corrected? If yes, what are the corrections?
Is it calibrated? If yes, what are the calibration coefficients?
Was the tool properly centralized? With what?
What is the size of the centralizing device?
Is there some signal processing?
What are the processing options?
Where is the volume investigated by the logging tool located?
Is depth documented?
Is the depth-measuring device calibrated?
Is depth stretch-corrected?
Is there a stable procedure to acquire depth?
How were deviation and azimuth measured? If this is through a magnetic survey, how was magnetic declination corrected for?
What is the geodetic reference?
Which logging company ran the log? Who ran the log? What was his or her level of training? Etc.
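To make this concrete, here is a minimal sketch in Python of how such contextual information might travel with the measurement itself. Every field name below is an assumption chosen to mirror the questions above, not an industry standard:

```python
from dataclasses import dataclass, field

@dataclass
class PorositySample:
    """A single porosity reading together with its contextual
    information (metadata). All field names are illustrative."""
    measured_depth_ft: float
    porosity_pu: float
    method: str                               # density, neutron, NMR or acoustic
    corrections: dict = field(default_factory=dict)
    calibration_coeffs: dict = field(default_factory=dict)
    centralizer_size_in: float | None = None
    processing_options: dict = field(default_factory=dict)
    depth_calibrated: bool = False
    stretch_corrected: bool = False
    geodetic_reference: str = ""
    logging_company: str = ""
    engineer: str = ""

# The value 23.45 pu at 12,456 ft from the example above, carrying
# (some of) its context with it. Without these fields, the number
# alone cannot be trusted.
sample = PorositySample(
    measured_depth_ft=12_456.0,
    porosity_pu=23.45,
    method="density",
    depth_calibrated=True,
    stretch_corrected=True,
    geodetic_reference="WGS 84",
)
```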
While the first log was like magic, and the whole process largely a black box, today the informed user can reach a better understanding of the data through the perusal of contextual information. This information builds confidence in the data and thereby reduces risk.
A day in the fall of 1927
The circumstances of the first log are known, but not universally so. Some magazines claim that Conrad and Marcel Schlumberger ran the first log themselves. Fortunately, several witness accounts are available, in particular the narrations of Henri Doll, the first field engineer, and of Roger Jost, who, with Karl Scheibli, completed the three-man team that performed that premiere. Jost's story appeared in the 2006 edition of "Petrophysics."
We find the team in a small village in Alsace on Sept. 5, 1927. Imagine the three men sent to run a prototype tool put together in a matter of days. They have no operating manual, no maintenance manual and no hotline to call for advice. The field engineer is very green: this is his first job. It is the ultimate real-time job, where instant solutions are needed at a diabolical frequency. There is not even a standard telling them how to present the data. Nevertheless, the deliverable, a copy of which can be seen in Figure 1, was outstanding, including the name of the well, the name of the rig and correlations with external information taken from cuttings. Looking closely at this document, one can recognize one name, Henri-Georges Doll, written in a slightly different style. It is difficult to be sure whether it was added some time after the job, but it began the long-standing tradition of printing the name of the father (or mother) of the log on the log itself.

A few lines in Jost's account are critical: Doll writes down the values given by the potentiometers (Figure 2). Two quick computations on his slide rule, and the triumph of this first measured point resounds. It works! The verification is correct. In short, the correctness of the measurement is checked step by step. Notably, this information was not added to the diagram delivered to the client. It would take decades before quality-control curves were displayed on logs or delivered to the data user.
Other firsts
First repeat of anomalies. This predates the first log. A disturbance was suspected at Saint-Bel in 1913, while Conrad Schlumberger was checking a surface survey. The measurements were repeated.
The first wellsite witness and the first repeat section. These date from 1928. An employee of Royal Dutch Shell made a field visit to Alsace in the company of Conrad Schlumberger, qualifying him as the "earliest wellsite witness." It appears that he also requested a repeat section, the world's first made as an official log quality product. His first condition for an acceptable log was that it should wiggle and repeat. This is necessary, but not sufficient; more controls would follow.
Overlapping the same interval. In the late '40s, the recording of overlapping zones between two runs was very important. This process checked the stability (or lack thereof) of the readings under changes in mud salinity or borehole size. Field engineers spent large amounts of time trying to explain invasion and hole effects.
The first quality curve. The first quality curve, a curve that is not directly used by the log analyst, appears to be delta-rho, developed with the second-generation density tool in the '60s. It resulted from the development of a dual-detector density device yielding two densities: one strongly affected by the near-borehole region, the other affected by both the near-borehole region and the formation. Delta-rho tells the log user how much the two densities differ, as sketched below.
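As a rough illustration of the idea, here is a minimal sketch of a delta-rho-style quality indicator. The real compensation uses tool-specific spine-and-ribs charts; the simple difference and the 0.1 g/cm3 cutoff below are assumptions made for illustration:

```python
# Simplified sketch of a delta-rho quality indicator for a
# dual-detector density tool. Illustrative only: actual tools derive
# the correction from tool-specific spine-and-ribs charts.

def delta_rho(rho_near: float, rho_far: float) -> float:
    """Difference between the far (deep-reading) and near
    (shallow-reading) apparent densities, in g/cm3. A large value
    suggests mudcake or a rugose borehole."""
    return rho_far - rho_near

def density_quality_ok(rho_near: float, rho_far: float,
                       threshold: float = 0.1) -> bool:
    """Flag the sample as suspect when |delta-rho| exceeds a cutoff
    (0.1 g/cm3 here is an assumed, illustrative value)."""
    return abs(delta_rho(rho_near, rho_far)) <= threshold

# Example: near detector reads 2.30 g/cm3, far detector 2.45 g/cm3.
print(delta_rho(2.30, 2.45))           # 0.15 -> sizeable disagreement
print(density_quality_ok(2.30, 2.45))  # False -> treat density with care
```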
Traceability before the computer
Before the introduction of the computer, a huge amount of esoteric but critical information was kept only momentarily in the brain of the field engineer, soon to be forgotten on the next job. It included, but was not limited to, the position of dials on panels, the selection of memory lengths, the adjustment of the power sent to the tool and the alignment of galvanometers.
Here comes the computer
In the late '60s, the encoding of logging information started. At the beginning of the digital era, only the depth-sampled data flowed to tape. This information was absolutely useless unless graphical data (prints or film) was available: all parameters, mud data, header information and processing options still belonged to the graphical record. In the '70s, all information was finally recorded to a digital file. It is at this point that traceability became possible.
With the computer, all parameters could be stored. Ultimately, if the data user really wanted to know what went wrong (or right) during a logging job, he or she could obtain the scroll, a record of all the data entries and changes performed during the job (see the sketch below).
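Here is a minimal sketch of what such a scroll might look like as a data structure; the field names and sample entries are invented for illustration:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ScrollEntry:
    """One time-stamped change made at the acquisition panel."""
    timestamp: datetime
    parameter: str   # e.g., tool power, memory length, galvo gain
    old_value: str
    new_value: str
    operator: str    # who made the change

# Hypothetical excerpt of a job's scroll: every entry and change made
# during acquisition is kept, so the data user can later reconstruct
# what went wrong (or right).
scroll = [
    ScrollEntry(datetime(1975, 3, 2, 9, 15),
                "tool_power_V", "180", "200", "engineer_1"),
    ScrollEntry(datetime(1975, 3, 2, 9, 42),
                "logging_speed_ft_per_hr", "3600", "1800", "engineer_1"),
]
```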
A new look at depth
While depth was used on the very first log and is almost always present on a log, it did not get the attention it deserves until recently. In the '90s it was found that depth was the only measurement for which logging companies were not delivering information on calibrations, instruments used and correction parameters. It is therefore not surprising to observe large depth differences between logging companies, between conveyance modes and against the driller's depth. The Data Quality conference held in Taos in 1996 marked a turning point, requesting the delivery of all details on the depth measurement; a sketch of one such correction follows.
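As an example of the kind of depth detail requested, here is a minimal sketch of a linear elastic stretch correction for wireline depth, assuming the simple model delta_L = k * T * L; the coefficient used is illustrative, as real values are cable-specific:

```python
def stretch_corrected_depth(measured_depth_ft: float,
                            cable_tension_lbf: float,
                            k_per_lbf: float = 0.7e-6) -> float:
    """Apply a simple linear elastic stretch correction to wireline depth.

    Assumes the stretch grows linearly with tension and cable length:
        delta_L = k * T * L
    k_per_lbf is an assumed coefficient; actual values are cable-specific
    and supplied by the cable manufacturer.
    """
    stretch_ft = k_per_lbf * cable_tension_lbf * measured_depth_ft
    return measured_depth_ft + stretch_ft

# Example: 12,456 ft of cable under 3,000 lbf of tension stretches
# about 26 ft under the assumed coefficient.
print(stretch_corrected_depth(12_456, 3_000))  # ~12,482.2 ft
```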
At a time when terabytes of data are delivered to users, two categories of data are still poorly covered: complete positioning information, which lets the log analyst know the location of the volume of rock that was surveyed before he or she starts making inferences about it (see the trajectory sketch below), and the uncertainties of the measurements performed. Almost 80 years after the first log, the quest for complete log contextual information is still on.
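To make the positioning point concrete, here is a minimal sketch of the standard minimum-curvature method, which turns measured depth, deviation and azimuth survey stations into positions; the survey values used in the example are invented:

```python
import math

def minimum_curvature_step(md1, inc1, azi1, md2, inc2, azi2):
    """Position change (north, east, tvd) between two survey stations
    using the standard minimum-curvature method. Angles in degrees,
    depths in any consistent length unit."""
    i1, a1 = math.radians(inc1), math.radians(azi1)
    i2, a2 = math.radians(inc2), math.radians(azi2)
    dmd = md2 - md1
    # Dogleg angle between the two station directions.
    cos_dl = (math.cos(i2 - i1)
              - math.sin(i1) * math.sin(i2) * (1 - math.cos(a2 - a1)))
    dl = math.acos(max(-1.0, min(1.0, cos_dl)))
    # Ratio factor smooths the arc between the two tangents.
    rf = 1.0 if dl < 1e-9 else (2.0 / dl) * math.tan(dl / 2.0)
    half = dmd / 2.0 * rf
    dn = half * (math.sin(i1) * math.cos(a1) + math.sin(i2) * math.cos(a2))
    de = half * (math.sin(i1) * math.sin(a1) + math.sin(i2) * math.sin(a2))
    dv = half * (math.cos(i1) + math.cos(i2))
    return dn, de, dv

# Invented survey pair: stations 100 ft apart, inclination building
# from 10 to 12 degrees, azimuth turning from 45 to 47 degrees.
print(minimum_curvature_step(12_400, 10, 45, 12_500, 12, 47))
```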