From nanotechnology to fiber optics, and from P2P to fuzzy logic, advances in information technology look set to digitize the oil patch.
It most definitely is not your grandfather's oil business any longer. In fact, the wildcatters of yore would have a hard time comprehending how computerized the upstream oil business has become. But if you think the digital revolution is over, well, honey, I hate to break it to you, but it's just now starting to get interesting.
Nanotechnology is going to make computers so small, you'll be able to fit a Cray inside a pen. Downhole fiber optics are going to enable engineers to monitor just about everything in real time, even in high-pressure, high-temperature environments. And to help process all that data, distributed peer-to-peer (P2P) computer networks will be able to turn idle PCs into supercomputers, and artificial intelligence will spot trends that otherwise would go unnoticed. Better keep your IT skills up to date, because it's going to be a very digital experience.
Nanotechnology
When Nobel Prize winner Richard Smalley discovered buckminsterfullerene at Rice University in Houston, Texas, in 1985, little did he know how his research would affect the oil industry. The US Department of Energy (DOE) has since increased its nanotechnology funding by 62%, to US $94 million, to study so-called buckyballs (C60) and buckytubes - carbon cylinders only one-billionth of a meter in diameter. Buckytubes have tensile strength 100 times that of steel at one-sixth the weight, the electrical conductivity of copper, and the thermal conductivity of diamond.
Kellogg Brown & Root, a business unit of Halliburton, recently signed an engineering services agreement with Carbon Nanotechnologies Inc. (CNI) for the commercialization of buckytubes. These hollow carbon structures are the world's strongest, stiffest and toughest fibers - perfect candidates for lightweight offshore tethers. Nanofilters would help with oilfield separations, and nanocatalysts could have a multibillion-dollar impact on refining and processing. Buckytubes also have many potential IT applications, including electromagnetic interference shielding, flat-panel displays, novel composite materials and high-performance electronics.
Kellogg Brown & Root will provide the conceptual and detailed design, fabrication, construction and operation support to CNI, which has leased about 6,000 sq ft (540 sq m) of office and laboratory space at the KBR Technology Center in Houston. Pilot plants for testing and producing buckytubes will be built at this center.
"Kellogg Brown & Root's extensive experience in developing and commercializing proprietary technologies permits us to supply to CNI a unique set of skills and services," said Martin Van Sickels, vice president and chief technology officer. "We are pleased to work with them in developing this exciting and revolutionary technology."
Houston-based Molecular Electronics Corp. is building tiny circuits from molecules, arranging them to function as electrical switches and memory. Through a chemical phenomenon known as self-assembly, groups of atoms or molecules spontaneously arrange themselves into units that behave as transistors when they come in contact with a metal such as gold.
Dr. Jim Tour, co-founder of the company and chemistry professor at Rice University, said, "If today's chips that power modern computers are, for example, the size of a sheet of paper, a chip even more powerful would be the size of a pinhead using molecular electronics." Within 10 years, these tiny, heat-resistant circuits made from buckytubes may be incorporated into slimhole tools.
Downhole fiber optics
DOE awarded a $2 million research contract to the Photonics Laboratory at Virginia Tech to develop optical-fiber sensors for exploration and production applications. The goal is to design and develop pressure, temperature, acoustic and fluid-flow sensors that will operate in the harsh downhole environment, where temperatures can exceed 257°F (125°C) and pressures can reach 5,000 psi.
"Commercially available sensors have very short lifetimes under such conditions, and replacing them is time-consuming and costly. The sensors that we are developing are engineered specifically to survive these harsh environments for extended periods, hopefully up to 2 years," said associate professor Anbo Wang, who is heading up the research program. The Photonics Laboratory has teamed up with the University of Tulsa and oil companies to demonstrate the new sensors in laboratory tests and in an oil field.
The sensors under development are based on a new optical-fiber sensor technology called the Self-Calibrating Interferometric/Intensity-Based sensor, recently invented by Wang. This new approach to optical-fiber sensor design combines high sensitivity, simple signal processing and ease of manufacturing. "Most pressure sensors based on optical fibers are unfortunately also sensitive to changes in temperature, so that they are only useful when you keep the temperature constant. We solved that problem by changing the way we process the optical signal, giving the sensors a low cross-sensitivity to undesired effects. Therefore, our pressure sensors are not affected by changes in temperature."
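The broad idea behind pairing an interferometric channel with an intensity reference can be sketched numerically. This is only an illustration of ratio-based self-referencing, not Wang's actual sensor design; every coefficient and value below is made up:

```python
# Illustrative only: two optical channels that share the same
# temperature-driven intensity drift. Dividing one by the other
# cancels the common drift, leaving only the pressure response.

def measured_channels(pressure_psi, temp_c):
    """Return (signal, reference) intensities for invented sensitivities."""
    drift = 1.0 + 0.002 * (temp_c - 25.0)          # common thermal drift
    signal = drift * (0.5 + 1e-4 * pressure_psi)   # pressure-sensitive channel
    reference = drift * 0.5                        # pressure-insensitive channel
    return signal, reference

def recovered_pressure(signal, reference):
    # The ratio removes the shared drift term.
    ratio = signal / reference
    return (ratio - 1.0) / 2e-4

# The same 3,000-psi reading, recovered at surface and downhole temperatures.
cool = recovered_pressure(*measured_channels(3000.0, 25.0))
hot = recovered_pressure(*measured_channels(3000.0, 125.0))
```

Both `cool` and `hot` come back at about 3,000 psi despite the 100°C swing, which is the low cross-sensitivity Wang describes.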
Artificial intelligence
DOE's National Energy Technology Laboratory (NETL) has several ongoing projects involving neural networks, genetic algorithms and fuzzy logic to improve exploration and production.
The Gas Technology Institute, along with specialists from West Virginia University, Intelligent Solutions Inc. and TechnoMatrix Inc., will develop computer-assisted methods for identifying and optimizing preferred management practices in upstream oil production operations. The resulting Virtual Intelligence technique is intended to enable operators to increase oil production by 15% within 5 years at a cost equal to or less than that of common practices.
Benson-Montin-Greer Drilling Corp. (BMG), with subcontractor Correlations Co. of Socorro, N.M., will use new log interpretation methods based on artificial intelligence and neural networks to evaluate well recompletion opportunities in BMG's Gavilan and West Puerto Chiquito Mancos fields in the San Juan Basin.
Luff Exploration Co. is developing an intelligent computing system to apply to reservoir-production models for analysis of seismic and geologic data from Class II reservoirs. Nearly 25 years of data collected on the Red River carbonate formation in the Williston Basin will be assimilated into a computer-based reservoir analysis system. The technology will be used to identify and map nonlinear relationships among 3-D seismic, production, geological and petrophysical data using neural-net computing, fuzzy logic and probabilistic reasoning, as well as such conventional techniques as geostatistics and classical pattern recognition.
Neuro3 is a 32-bit MS Windows software application developed by DOE's National Petroleum Technology Office as part of the Risk Analysis tool set. Common oil and gas applications include forecasting of reservoir properties from wireline log signatures, extension of reservoir properties for simulation, and seismic interpretation. While this application was written for the oil and gas community, it is generic enough to apply to any problem for data-mining, correlation or categorization needs. The application, which can be downloaded from www.npto.doe.gov, has a spreadsheet interface to allow import and export of external data sets.
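A minimal sketch of the kind of log-to-property regression Neuro3 performs: a tiny hand-rolled neural network fits a reservoir property (say, porosity) from wireline-log features. The features, the target relationship and the network shape are all invented for illustration, not taken from Neuro3:

```python
import numpy as np

# Invented data: two scaled log curves and a made-up porosity rule.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = 0.2 - 0.1 * X[:, 0] + 0.05 * X[:, 1] ** 2

# One hidden layer, trained by plain full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1

def predict(X):
    h = np.tanh(X @ W1 + b1)
    return h, (h @ W2 + b2).ravel()

loss_start = np.mean((predict(X)[1] - y) ** 2)
for _ in range(2000):
    h, out = predict(X)
    g = 2.0 * (out - y)[:, None] / len(y)      # dLoss/dOutput
    dh = (g @ W2.T) * (1.0 - h ** 2)           # backprop through tanh
    W2 -= lr * (h.T @ g);  b2 -= lr * g.sum(0)
    W1 -= lr * (X.T @ dh); b1 -= lr * dh.sum(0)
loss_end = np.mean((predict(X)[1] - y) ** 2)
```

After training, the fitted network reproduces the synthetic porosity rule far better than the random starting weights - the same forecasting-from-signatures workflow the tool supports, at toy scale.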
Other outfits are also using fuzzy logic and neural networks. New Mexico Institute of Mining and Technology's Petroleum Recovery Research Center (PRRC) is developing an artificial intelligence system to provide realistic estimates of risk. Using fuzzy logic, inexact and incomplete information can be integrated with modern computational methods to derive usable conclusions. Using this concept, PRRC has developed a Fuzzy Expert Exploration (FEE) Tool to model the decision-making mechanisms explorationists use to produce predictions with different levels of certainty or confidence. The FEE Tool, which will be made available via the Internet, is expected to reduce exploration time and expense. On Aug. 29, William Weiss of PRRC presented results of research that used fuzzy logic to identify trends from seemingly unrelated datasets to predict which fields in the Nebraska panhandle would benefit from waterflooding. When applied to the Cliff Farms unit, the new technology correctly predicted the value of waterflooding.
Project engineers in the Delaware Basin of New Mexico use a trained neural network to determine the best chances for completion in Brushy Canyon wells. Predictions on new wells based on log data can be made in less than a day using the database of Brushy Canyon log and core data from 34 wells. Conclusions from the research show that fuzzy ranking can see obscure relationships, and that neural networks can correlate multiple fuzzy relationships with oil production to predict recovery.
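The fuzzy-ranking idea is easy to see in miniature: each candidate gets a soft membership score for each criterion, and the scores are combined with a fuzzy AND. The attribute names, membership shapes and well data below are invented for illustration and are not from the Brushy Canyon study:

```python
# Toy fuzzy ranking of candidate wells on two soft criteria.

def ramp(x, lo, hi):
    """Membership rising linearly from 0 at lo to 1 at hi."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

wells = {
    # name: (porosity %, water saturation %)  -- made-up values
    "A": (18.0, 35.0),
    "B": (9.0, 30.0),
    "C": (15.0, 70.0),
}

def score(porosity, sw):
    good_phi = ramp(porosity, 8.0, 20.0)   # "good porosity"
    low_sw = 1.0 - ramp(sw, 20.0, 80.0)    # "low water saturation"
    return min(good_phi, low_sw)           # fuzzy AND: take the weaker criterion

ranked = sorted(wells, key=lambda w: score(*wells[w]), reverse=True)
```

Here well A ranks first because it is fairly strong on both criteria, while B and C each fail one criterion badly - the kind of relationship a crisp threshold on either attribute alone would miss.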
Halliburton is using neural networks to make well construction safer. Research engineer Roger Schultz used neural networks to dramatically improve the detection of weak subsurface signals in the presence of noise contamination at oil well sites. During perforating operations, it is vital to know whether all the explosives have detonated before the apparatus is brought back to the surface. Accelerometers that capture acoustic waves generated while perforating also capture signals from pumps and other equipment. Schultz needed a filter that would separate the signal from these ambient sounds, but linear filters would not work; they assume the signal can be modeled linearly, which is an oversimplification. Neural networks, however, can predict any real signal, require no limiting assumptions of normality or linearity and are resilient to distortion.
Using MATLAB and the Neural Network Toolbox from The MathWorks, Schultz developed an adaptive, predictive, nonlinear filter that strips the noise from the signals, leaving only the impulsive components, including the signal generated by the explosion. Schultz chose MATLAB for this project because "you can do really fast matrix manipulation. Neural networks are formulated in terms of matrices, and so MATLAB is a perfect fit." The next step was to create a stand-alone application for use in the field. Schultz used the MATLAB Compiler to convert the application to freely distributable, stand-alone C++ code. Following successful trials, Halliburton initiated patent protection for the technology.
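The predict-and-subtract idea behind such a filter can be shown in a few lines. For brevity this sketch uses a least-squares autoregressive predictor on a synthetic sinusoid rather than a neural network; the production system needed a nonlinear network precisely because real rig noise is not this well behaved:

```python
import numpy as np

# Synthetic "ambient noise": a predictable sinusoid, with one impulsive
# event (standing in for the detonation signature) at a known sample.
n = np.arange(400)
x = np.sin(2 * np.pi * 0.03 * n)
impulse_at = 250
x[impulse_at] += 0.8

# Fit an order-2 autoregressive predictor by least squares on the clean
# early part of the record (the impulse lies outside this window).
A = np.column_stack([x[1:199], x[0:198]])
coef, *_ = np.linalg.lstsq(A, x[2:200], rcond=None)

# Predict each sample from its two predecessors; the residual is near
# zero wherever the signal is predictable and spikes around the impulse.
pred = coef[0] * x[1:-1] + coef[1] * x[:-2]
residual = x[2:] - pred
hits = np.flatnonzero(np.abs(residual) > 0.1) + 2
detected = int(hits[0])
```

The pump-like background predicts itself almost perfectly, so subtracting the prediction cancels it, and the first residual spike lands exactly on the buried impulse.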
Distributed processing and P2P
Rather than each company maintaining an in-house computer network, processing power in the next few years will be sold over the Internet like a utility. A few companies are already constructing secure server farms. For example, Randy Premont, formerly chief technology officer of GeoNet Services, is creating a Linux server farm called the Houston Open Technology Grid, or HOTg. Computing time is available for seismic processing by the hour, day or week, with no long-term contracts.
"It's an aggregation of many sites coming together as one supercomputer," Premont said.
Compaq, Hewlett-Packard and IBM have rolled out similar clusters. IBM has invested $4 billion in 50 server farms around the world, and is helping the UK and Dutch governments set up national computing grids for science research. Hundreds of nodes or machines are linked together, and open-source protocols and Globus software can focus their combined power on one task. The utility pricing model means companies won't have to invest in servers, bunkers to store and protect them, or staff to operate them.
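The grid pattern itself is simple: split one big job into chunks, farm the chunks out to whatever workers are available, then stitch the partial results back together in order. The sketch below uses a local thread pool as a stand-in for grid nodes, and the per-chunk computation is a placeholder, not anything from Globus:

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Stand-in per-chunk computation (one node's share of the traces);
    # any pure per-element transform would do here.
    return [v * v + 1 for v in chunk]

def run_on_grid(data, n_workers=4):
    """Split one job across workers, then reassemble results in order."""
    size = -(-len(data) // n_workers)          # ceiling division
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(n_workers) as pool:
        parts = pool.map(process_chunk, chunks)
    return [v for part in parts for v in part]
```

Because the chunks are independent, adding nodes speeds the job up almost linearly - the property that makes seismic processing such a natural fit for grids and P2P clusters.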
DataSynapse's distributed computing platform, LiveCluster, uses P2P technology to speed up processing (Table 1).
"Firms in the industry have generally made substantial investments in technology with thousands of resources such as PCs, file servers, redundant environments and disaster recovery sites, often underutilized and present in virtually every business unit," said Tom Eliseuson of DataSynapse. "By drawing on the untapped capabilities of existing resources, companies in the industry can quickly realize measurable benefits."
"This P2P technology will revolutionize seismic processing," added Chief Executive Officer Peter Lee. Because P2P can be implemented inside the enterprise's firewall, security risks are greatly diminished.
Intel Chief Executive Officer Craig Barrett summed up P2P: "We think it's one of those things that will shape the future of computing. It allows users to do much more with resources that are currently unused."