[Editor's note: This is the final part of a three-part series. A version of this story appears in the December 2019 edition of Oil and Gas Investor. Subscribe to the magazine here.]
Forecasts are now commonly made, often rhetorically and without evidence, that we should expect a rapid decline in the future costs of wind/solar/battery technologies, continuing the gains already achieved. The first two decades of commercialization, after the 1980s, saw a greater than tenfold reduction in the cost of solar and wind hardware. But the path for further improvements won’t emulate the past. Instead, it now follows what mathematicians call an asymptote; or, put in economic terms, improvements are subject to a law of diminishing returns, where every incremental gain yields less progress than in the past (Figure 1).
This is a normal phenomenon in all physical systems. Throughout history, engineers have achieved big gains in the early years of a technology’s development, whether wind or gas turbines, steam or sailing ships, internal combustion or photovoltaic cells. Over time, engineers manage to approach nature’s limits. Bragging rights for gains in efficiency—or speed, or other equivalent metrics such as energy density (energy per unit of weight or volume)—then shrink from double-digit percentages to fractional percentage changes. Whether it’s solar, wind tech or aircraft turbines, performance gains are now all measured in single-digit percentages. Such progress is economically meaningful, but it is not revolutionary.
RELATED:
Part One: Debunking The New Energy Economy
Part Two: Batteries Cannot Save The Grid Or The Planet
The physics-constrained limits of energy systems are unequivocal. Solar arrays can’t convert more photons than those that arrive from the sun. Wind turbines can’t extract more energy than exists in the kinetic flows of moving air. Batteries are bound by the physical chemistry of the molecules chosen. Similarly, no matter how much better jet engines become, an A380 will never fly to the moon. An oil-burning engine can’t produce more energy than contained in the physical chemistry of hydrocarbons.
Combustion engines have what’s called a Carnot Efficiency Limit, which is anchored in the temperature of combustion and the energy available in the fuel. The limits are long established and well understood. In theory, at a high-enough temperature, 80% of the chemical energy in the fuel can be turned into power. Using today’s high-temperature materials, the best hydrocarbon engines convert about 50% to 60% of it to power. There’s still room to improve, but it’s nothing like the tenfold to nearly hundredfold revolutionary advances achieved in the first couple of decades after their invention. Wind/solar technologies are now at the same point on that asymptotic technology curve.
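For readers who want to see the arithmetic, a minimal sketch of the Carnot relationship follows, using assumed round-number temperatures for illustration (not figures from this article):

```python
# Illustrative sketch of the Carnot efficiency ceiling: eta = 1 - T_cold / T_hot.
# Temperatures are assumed round numbers for illustration, not figures from the article.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of heat convertible to work between two temperatures (kelvin)."""
    return 1.0 - t_cold_k / t_hot_k

# A very hot combustion process rejecting heat to roughly ambient air:
print(carnot_efficiency(1500.0, 300.0))   # 0.80, the ~80% theoretical ceiling
# A more modest operating range:
print(carnot_efficiency(900.0, 300.0))    # ~0.67, closer to today's best 50%-60% engines
```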
For wind, the boundary is called the Betz Limit, which dictates how much of the kinetic energy in air a blade can capture; that limit is about 60%. Capturing all the kinetic energy would mean, by definition, no air movement and thus nothing to capture. There needs to be wind for the turbine to turn. Modern turbines already exceed 45% conversion. That leaves some real gains to be made but, as with combustion engines, nothing revolutionary. Another tenfold improvement is not possible.
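A similar back-of-the-envelope check for wind, using the Betz coefficient of 16/27 and the roughly 45% conversion figure cited above:

```python
# The Betz limit on wind-turbine capture is 16/27 of the wind's kinetic energy (~59.3%).
# The "modern turbine" figure uses the ~45% conversion cited in the text.

BETZ_LIMIT = 16 / 27          # ~0.593, the physics ceiling for any blade design
modern_turbine = 0.45         # approximate conversion of today's large turbines

headroom = BETZ_LIMIT / modern_turbine
print(f"Betz limit: {BETZ_LIMIT:.1%}")         # 59.3%
print(f"Remaining headroom: {headroom:.2f}x")  # ~1.3x -- real, but not another 10x
```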
For silicon photovoltaic (PV) cells, the physics boundary is called the Shockley-Queisser Limit: a maximum of about 33% of incoming photons are converted into electrons. State-of-the-art commercial PVs achieve just over 26% conversion efficiency—in other words, near the boundary. While researchers keep unearthing new non-silicon options that offer tantalizing performance improvements, all have similar physics boundaries, and none is remotely close to manufacturability—never mind at low cost. There are no tenfold gains left.
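Pulling the three ceilings together, a short sketch using only the figures quoted above shows how little multiplier is left in each case:

```python
# Ratio of each physics ceiling to today's best practice, using the figures cited above.
# The point: every remaining multiplier is well under 2x, not 10x.

limits = {
    # technology: (best practice today, physics ceiling)
    "combustion engine (Carnot)":     (0.55, 0.80),
    "wind turbine (Betz)":            (0.45, 0.593),
    "silicon PV (Shockley-Queisser)": (0.26, 0.33),
}

for name, (today, ceiling) in limits.items():
    print(f"{name}: at most {ceiling / today:.2f}x left before the physics limit")
```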
Future advances in wind turbine and solar economics are now centered on incremental engineering improvements: economies of scale in making turbines enormous, taller than the Washington Monument, and in building similarly massive, square-mile utility-scale solar arrays. For both technologies, the underlying key components—concrete, steel and fiberglass for wind; silicon, copper and glass for solar—are already in mass production and well down asymptotic cost curves in their own domains.
While there are no surprising gains in economies of scale available in the supply chain, that doesn’t mean that costs are immune to improvements. In fact, all manufacturing processes experience continual improvements in production efficiency as volumes rise. This experience curve is called Wright’s Law. (That “law” was first documented in 1936, as it related then to the challenge of manufacturing aircraft at costs that markets could tolerate. Analogously, while aviation took off and created a big, worldwide transportation industry, it didn’t eliminate automobiles or the need for ships.) Experience leading to lower incremental costs is to be expected; again, that’s not the kind of revolutionary improvement that could make a new energy economy even remotely plausible.
As for modern batteries, there are still promising options for significant improvements in their underlying physical chemistry. New non-lithium materials in research labs offer as much as a 200% and even 300% gain in inherent performance. Such gains nevertheless don’t constitute the kinds of tenfold or hundredfold advances seen in the early days of combustion chemistry. Prospective improvements will still leave batteries miles away from the real competition: petroleum.
There are no subsidies and no engineering from Silicon Valley or elsewhere that can close the physics-centric gap in energy densities between batteries and oil (Figure 2). The energy stored per pound is the critical metric for vehicles and, especially, aircraft. The maximum potential energy contained in oil molecules is about 1,500% greater, pound for pound, than the maximum in lithium chemistry. That’s why aircraft and rockets are powered by hydrocarbons. And that’s why a 20% improvement in oil propulsion (eminently feasible) is more valuable than a 200% improvement in batteries (still difficult).
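That comparison is simple arithmetic on the densities cited above; here is a sketch, normalizing the battery figure to 1 for illustration:

```python
# Why a 20% gain in oil propulsion can outweigh a 200% gain in batteries:
# the text puts oil's maximum energy per pound at roughly 16x lithium chemistry
# ("about 1,500% greater"). Units below are arbitrary; only the ratio matters.

battery_density = 1.0
oil_density = 16.0            # ~1,500% greater than the battery baseline

oil_gain = 0.20 * oil_density          # 20% improvement -> +3.2 units per pound
battery_gain = 2.00 * battery_density  # 200% improvement -> +2.0 units per pound

print(oil_gain > battery_gain)  # True: the smaller percentage wins on the bigger base
```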
Finally, when it comes to limits, it is relevant to note that the technologies that unlocked shale oil and gas are still in the early days of engineering development, unlike the older technologies of wind, solar and batteries. Tenfold gains are still possible in terms of how much energy can be extracted by a rig from shale rock before approaching physics limits. That fact helps explain why shale oil and gas have added 2,000% more to U.S. energy production over the past decade than have wind and solar combined.
Digitalization won’t uberize the energy sector
While there are no new physics on the horizon to offer 10x gains in any energy technology, a lot of hope and hype have been attached to what analytics and artificial intelligence could do to optimize things. Digital tools are already improving, and can further improve, all manner of efficiencies across entire swaths of the economy, and it is reasonable to expect that software will yet bring significant improvements both in the underlying efficiency of fabricating and using wind/solar/battery machines and in the efficiency with which such machines are integrated into infrastructure. Silicon logic has improved, for example, the control and thus the fuel efficiency of combustion engines, and it is doing the same for wind turbines. Similarly, software epitomized by Uber has shown that optimizing the use of expensive physical assets lowers costs. Uberizing all manner of capital assets is inevitable.
Uberizing the electric grid without hydrocarbons is another matter entirely.
The peak demand problem that software can’t fix
In the energy world, one of the most vexing problems is in optimally matching electricity supply and demand (Figure 3). Here the data show that society and the electricity-consuming services that people like are generating a growing gap between peaks and valleys of demand. The net effect for a hydrocarbon-free grid will be to increase the need for batteries to meet those peaks.
All this has relevance for encouraging electric vehicles (EVs). In terms of managing the inconvenient cyclical nature of demand, shifting transportation fuel use from oil to the grid will make peak management far more challenging. People tend to refuel when it’s convenient; that’s easy to accommodate with oil, given the ease of storage. EV refueling will exacerbate the already-episodic nature of grid demand.
To ameliorate this problem, one proposal is to encourage or even require off-peak EV fueling. The jury is out on just how popular that will be or whether it will even be tolerated.
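To see why the timing of charging matters so much, here is a toy sketch with an invented 24-hour load profile; the numbers are illustrative assumptions, not grid data:

```python
# Illustrative only: a toy 24-hour load profile (GW) with an evening peak,
# plus the same EV charging energy placed either at the evening peak or overnight.
# The shapes and numbers are assumptions for illustration, not grid data.

base_load = [60, 58, 57, 56, 56, 58, 65, 72, 78, 80, 82, 83,
             84, 84, 85, 86, 90, 95, 98, 96, 90, 82, 72, 64]  # GW by hour

ev_energy = 48.0  # GWh of charging to place somewhere in the day

def peak_with_charging(hours: list) -> float:
    """Peak load if the EV energy is spread evenly over the given hours."""
    per_hour = ev_energy / len(hours)
    load = [gw + (per_hour if h in hours else 0) for h, gw in enumerate(base_load)]
    return max(load)

print(peak_with_charging([17, 18, 19, 20]))     # charge at the evening peak -> ~110 GW
print(peak_with_charging([0, 1, 2, 3, 4, 5]))   # charge overnight -> peak stays ~98 GW
```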
Although kilowatt-hours and cars—key targets in the new energy economy prescriptions—constitute only 60% of the energy economy, global demand for both is centuries away from saturation. Green enthusiasts make extravagant claims about the effect of Uber-like options and self-driving cars. However, data show that the economic efficiencies from uberizing have so far increased the use of cars and urban congestion. Similarly, many analysts now see autonomous vehicles amplifying, not dampening, that effect.
That’s because people, and thus markets, are focused on economic efficiency and not on energy efficiency. The former can be associated with reducing energy use; but it is also, and more often, associated with increased energy demand. Cars use more energy per mile than a horse, but the former offers enormous gains in economic efficiency. Computers, similarly, use far more energy than pencil-and-paper.
Uberizing improves energy efficiencies but increases demand
Every energy conversion in our universe entails built-in inefficiencies—converting heat to propulsion, carbohydrates to motion, photons to electrons, electrons to data and so forth. All entail a certain energy cost, or waste, that can be reduced but never eliminated. But, in no small irony, history shows—as economists have often noted—that improvements in efficiency lead to increased, not decreased, energy consumption.
If, at the dawn of the modern era, steam engines had remained as inefficient as those first invented, they would never have proliferated, nor would the attendant economic gains and the associated rise in coal demand have happened. We see the same thing with modern combustion engines. Today’s aircraft, for example, are three times as energy-efficient as the first commercial passenger jets of the 1950s. That didn’t reduce fuel use; it propelled air traffic to soar and, with it, a fourfold rise in jet fuel burned.
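Those two figures imply how much flying had to grow; a one-line check of the arithmetic:

```python
# Rough implication of the aviation figures in the text:
# fuel_burned ~ traffic / efficiency, so traffic ~ fuel_burned * efficiency.
efficiency_gain = 3.0   # today's aircraft vs. 1950s jets (per the text)
fuel_growth = 4.0       # rise in jet fuel burned (per the text)
print(efficiency_gain * fuel_growth)  # ~12x growth in air traffic implied
```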
Similarly, it was the astounding gains in computing’s energy efficiency that drove the meteoric rise in data traffic on the internet— which resulted in far more energy used by computing. Global computing and communications, all told, now consumes the energy equivalent of 3 billion barrels of oil per year, more energy than global aviation.
The purpose of improving efficiency in the real world, as opposed to the policy world, is to reduce the cost of enjoying the benefits from an energy-consuming engine or machine. So as long as people and businesses want more of the benefits, declining cost leads to increased demand that, on average, outstrips any “savings” from the efficiency gains. Figure 4 shows how this efficiency effect has played out for computing and air travel.
Of course, the growth in demand for a specific product or service can subside in a (wealthy) society when limits are hit: the amount of food a person can eat, the miles per day an individual is willing to drive, the number of refrigerators or lightbulbs per household, etc. But a world of 8 billion people is a long way from reaching any such limits.
The macro picture of the relationship between efficiency and world energy demand is clear (Figure 5). Technology has continually improved society’s energy efficiency. But far from ending global energy growth, efficiency has enabled it. The improvements in cost and efficiency brought about through digital technologies will accelerate, not end, that trend.
Energy revolutions are still beyond the horizon
When the world’s poorest 4 billion people increase their energy use to just 15% of the per-capita level of developed economies, global energy consumption will rise by the equivalent of adding an entire United States’ worth of demand. In the face of such projections, there are proposals that governments should constrain demand, and even ban certain energy-consuming behaviors. One academic article proposed that the “sale of energy-hungry versions of a device or an application could be forbidden on the market, and the limitations could become gradually stricter from year to year, to stimulate energy-saving product lines.” Others have offered proposals to “reduce dependency on energy” by restricting the sizes of infrastructures or requiring the use of mass transit or car pools.
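As a rough plausibility check of that first claim, using assumed round numbers (roughly 200 gigajoules per person per year in developed economies and about 100 exajoules per year of total U.S. primary energy; both are approximations, not figures from the article):

```python
# Rough plausibility check with assumed round numbers (not figures from the article):
# developed-economy per-capita primary energy ~200 GJ/yr; U.S. total ~100 EJ/yr.

people = 4e9
developed_per_capita_gj = 200.0     # assumed GJ per person per year
share = 0.15                        # rising to 15% of that level
us_total_ej = 100.0                 # assumed U.S. primary energy, EJ per year

added_ej = people * share * developed_per_capita_gj / 1e9  # GJ -> EJ
print(added_ej)                     # ~120 EJ: on the order of one United States
print(added_ej / us_total_ej)       # ~1.2x
```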
The issue here is not only that poorer people will inevitably want to—and will be able to—live more like wealthier people but that new inventions continually create new demands for energy. The invention of the aircraft means that every $1 billion in new jets produced leads to some $5 billion in aviation fuel consumed over two decades to operate them. Similarly, every $1 billion in data centers built will consume $7 billion in electricity over the same period. The world is buying both at the rate of about $100 billion a year.
The inexorable march of technological progress for things that use energy creates the seductive idea that something radically new is also inevitable in ways to produce energy. But sometimes the old or established technology is the optimal solution and nearly immune to disruption. We still use stone, bricks and concrete, all of which date to antiquity. We do so because they’re optimal, not “old.” So are the wheel, water pipes, electric wires … the list is long. Hydrocarbons are, so far, the optimal way to power most of what society needs and wants.
More than a decade ago, Google focused its vaunted engineering talent on a project called “RE<C,” seeking to develop renewable energy cheaper than coal. After the project was canceled in 2014, Google’s lead engineers wrote: “Incremental improvements to existing [energy] technologies aren’t enough; we need something truly disruptive. … We don’t have the answers.” Those engineers rediscovered the physics and scale realities of energy systems.
An energy revolution will come only from the pursuit of basic sciences. Or, as Bill Gates has phrased it, the challenge calls for scientific “miracles.” These will emerge from basic research, not from subsidies for yesterday’s technologies. The internet didn’t emerge from subsidizing the dial-up phone, or the transistor from subsidizing vacuum tubes, or the automobile from subsidizing railroads.
However, 95% of private-sector R&D spending and the majority of government R&D are directed at “development” and not basic research. If policymakers want a revolution in energy tech, the single most important action would be to radically refocus and expand support for basic scientific research.
Hydrocarbons—oil, natural gas and coal—are the world’s principal energy resource today and will continue to be so in the foreseeable future. Wind turbines, solar arrays and batteries, meanwhile, constitute a small source of energy, and physics dictates that they will remain so. Meanwhile, there is simply no possibility that the world is undergoing—or can undergo—a near-term transition to an entirely “new energy economy.”
Mark P. Mills is a senior fellow at the Manhattan Institute and a faculty fellow at Northwestern University’s School of Engineering and Applied Science. He is also a strategic partner with Cottonwood Venture Partners, an energy tech venture fund. He holds a degree in physics from Queen’s University in Ontario, Canada.