Accurate measurement has perennially been one of the upstream sector’s greatest challenges, with many believing that data and digitalization could help solve measurement dilemmas. This will only become reality if the data are reliable and reproducible.
Senior managers at oil companies are often drawn into disputes over what was measured and how much, and such claims are very difficult to prove to a legal standard. Even during relatively straightforward operations, such as sending crude from storage to measuring stations, there can be disagreements about the volumes sent and received. Once a dispute reaches court, it typically costs millions of dollars to resolve, with decisions ultimately resting on unreliable data.
Measurement ambitions
When a combination of fluids such as oil, water and gas flows from a well, it has traditionally been difficult to measure the individual phases correctly without separating them. Multiphase metering first started with the aim of addressing inaccuracy and reliability, providing valuable data and delivering cost reductions. Those newly realized savings also would facilitate single well tiebacks, shared use of existing pad and pipeline facilities and continuous online monitoring for economically marginal wells—or so the theory went.
The upstream sector takes great pride in its use of technology, science and risk assessment. However, there is a data trust gap between what is being promised and what is being delivered. This is due to the multiphase metering market not fulfilling its potential over the years.
It is rare to find a business that claims to get better than 10% accuracy from its multiphase meters. But more astonishingly, it is very common to hear people in the field say they are lucky to get 20% to 30%.
Such a level of uncertainty matters. For example, one field operator noted that an emulsion formation would develop as soon as the well went above a 40% water cut, resulting in full storage tanks and the need to shut in the field to empty everything. This is just one example, but it is a ubiquitous issue. Even on a 1,000-bbl/d field an emulsion incident can really hurt the bottom line.
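To illustrate why that spread matters, consider a minimal sketch in Python. The 40% threshold comes from the operator's account above; the true water cut, the uniform error model and the trial count are assumptions for demonstration only, not field data.

```python
# Illustrative sketch (not vendor code): why +/-10% vs +/-30% relative error
# matters around the 40% water-cut emulsion threshold described above.
# All numbers here are assumptions for demonstration only.
import random

THRESHOLD = 0.40   # assumed water cut at which emulsions start to form
TRUE_WC = 0.37     # assumed true water cut, safely below the threshold

def false_alarm_rate(relative_error, trials=100_000, seed=1):
    """Fraction of noisy readings that wrongly exceed the threshold,
    modeling meter error as uniform within +/- relative_error of truth."""
    rng = random.Random(seed)
    hits = sum(
        TRUE_WC * (1 + rng.uniform(-relative_error, relative_error)) >= THRESHOLD
        for _ in range(trials)
    )
    return hits / trials

for err in (0.10, 0.30):
    print(f"+/-{err:.0%} meter: {false_alarm_rate(err):.1%} of readings cross 40%")
```

Under these assumed numbers, a meter with ±10% error flags a well below the threshold roughly one reading in ten, while a ±30% meter does so more than a third of the time. Either way, the operator cannot trust the alarm near the threshold that matters most.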
This inaccuracy is in large part due to traditional meters not being optimized for reliability. Manufacturers have prioritized expensive technology that embeds uncertainty in flow-rate measurement over accuracy and repeatability in parameters that can be directly measured. This inherently leads to complexity, human intervention and validation-hungry systems.
To address these weaknesses, cumbersome and expensive test separators remain in operation, but they provide only piecemeal or fragmented information that rarely delivers more than limited value. So while oil and gas companies seek the benefits of access to data, they’ve been consistently unable to access lower-cost, reliable and reproducible information sets.
Quantifying change
There is no doubt the industry needs to spend less per barrel. It is a theme that is coming straight from the top: How can the industry reduce its biggest costs? Can technology stop the industry’s salary graph from going up and its production graph from going down?
In the onshore industry, where costs are often well-driven, operators that want to reduce expenditure have generally cut people or workovers, because that is where the greatest costs are found. However, eliminating workovers will lead to trouble later on. Workovers and other well interventions need to be done at some point, and once completed, it is important to know what worked best to determine future strategy. Without measurement, it is difficult to get a clear picture of what has been effective. Continuous well-by-well data will deliver significant improvement on this front.
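As a rough illustration of that point, the sketch below compares average production before and after a workover using a continuous daily series. The decline rates, the uplift and the workover date are all hypothetical; only the principle—that continuous well-by-well data make the before/after comparison straightforward—comes from the text.

```python
# Minimal sketch: quantifying a workover's effect from continuous daily data.
# The rate series below is invented; real input would come from wellhead meters.

def workover_uplift(daily_rates, workover_day, window=30):
    """Average daily rate over `window` days before vs. after a workover,
    returning the absolute and relative uplift."""
    before = daily_rates[workover_day - window:workover_day]
    after = daily_rates[workover_day + 1:workover_day + 1 + window]
    avg_before = sum(before) / len(before)
    avg_after = sum(after) / len(after)
    return avg_after - avg_before, (avg_after - avg_before) / avg_before

# Hypothetical well declining from ~500 bbl/d, worked over on day 60.
rates = [500 - 1.5 * d for d in range(60)] + [520 - 0.8 * d for d in range(60)]
uplift_bbl, uplift_pct = workover_uplift(rates, workover_day=60)
print(f"Uplift: {uplift_bbl:+.0f} bbl/d ({uplift_pct:+.1%})")
```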
Meters at every well would be a best practice if it were cost-effective, and with M-Flow that is now a reality. Operators can manage an oil field without having to amend the operational pattern every time something changes. But as with other process industries, this will involve smart monitoring systems throughout the upstream production chain.
One factor that will govern the future price of oil is the extent to which the standardized, manufacturing-like processes that characterize tight oil production are implemented across the industry. For example, best practice in the U.S. shale plays has transferred swiftly between operators and operations (e.g., pad drilling, high-volume completions and tighter well spacing). All have made statistically visible differences in costs and in how quickly and successfully projects are brought to commercialization.
At the heart of this is the requirement for reliable data, because reproducibility and tight process control deliver the marginal gains that compound into lasting performance improvements.
Removing statistical doubt
M-Flow rethought the challenges that have inhibited the growth of multiphase data in the production optimization market and developed a technology that provides confidence at the wellhead.
M-Flow focuses on understanding well performance through phase fraction measurement because direct measurement of these quantities delivers the key parameters that quantify and signal production change. The system can be combined with other measurement systems and data points to provide a more complete understanding.
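For readers unfamiliar with the terminology, the phase-fraction quantities involved are sketched below. The formulas are the standard definitions of water cut and gas volume fraction; the input rates are invented for illustration and are not M-Flow data.

```python
# Worked sketch of standard phase-fraction definitions. Input rates are
# made-up numbers for illustration only.

def phase_fractions(q_oil, q_water, q_gas):
    """Water cut and gas volume fraction from volumetric phase rates
    (all rates in the same units, e.g. bbl/d at line conditions)."""
    q_liquid = q_oil + q_water
    water_cut = q_water / q_liquid        # fraction of the liquid that is water
    gvf = q_gas / (q_gas + q_liquid)      # gas share of the total volume
    return water_cut, gvf

wc, gvf = phase_fractions(q_oil=600, q_water=400, q_gas=1500)
print(f"Water cut: {wc:.0%}, GVF: {gvf:.0%}")   # Water cut: 40%, GVF: 60%
```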
M-Flow’s carbon fiber construction creates a transparent window on the pipe flow and makes it possible to deploy sensor systems fully protected from aggressive oil well fluids.
The company’s meters experience none of the harsh-fluid-induced degradation or calibration changes that are the main drivers for multiphase flowmeter intervention.
In contrast to traditional meters, the company’s new carbon fiber multiphase meters require minimal manpower, lower capex and almost zero opex. Costs over the meter’s five-year life cycle average 20% of those of a traditional multiphase flowmeter.
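A back-of-envelope reading of that claim, with placeholder figures: only the 20% ratio comes from the article, while the capex and opex numbers below are assumptions chosen purely to make the arithmetic concrete.

```python
# Back-of-envelope sketch of the five-year life-cycle claim above.
# Only the 20% ratio is from the article; every dollar figure is a placeholder.

def lifecycle_cost(capex, annual_opex, years=5):
    """Total cost of ownership over a fixed horizon (no discounting)."""
    return capex + annual_opex * years

traditional = lifecycle_cost(capex=250_000, annual_opex=30_000)  # assumed figures
carbon_fiber = 0.20 * traditional  # the article's "on average 20%" ratio

print(f"Traditional meter, 5 yr:  ${traditional:,.0f}")
print(f"Carbon fiber meter, 5 yr: ${carbon_fiber:,.0f}")
```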
By delivering directly measured, continuous data on water cut and gas fraction in a discrete, packaged and valuable dataset, M-Flow has shifted the dialogue within the multiphase market away from the meter itself and onto the impact of accurate, reliable data in redefining upstream operations.
Reliable data form the foundation of the modern oil field. Unless the information that is derived at the wellhead is consistently reliable and replicable, the challenges of today and tomorrow will not be solved.
There is broad recognition that digging into a well-managed dataset reveals insights, trends and patterns that will help increase return on investment, decrease HSE incidents and create the foundation for future achievement. In this environment, data trust is a competitive advantage. Success is built upon actionable insights and that starts with credible data.