Jonathan Koomey has spent decades studying the energy demands of computing. His work began at Lawrence Berkeley National Lab during the dot-com boom, and it’s spanned the rise of data centers as economic juggernauts and AI as an unprecedented power sink.
These days, he’s trying to separate hype from reality in how AI intersects with power infrastructure.
When people talk about computing efficiency over time, they often point to a phenomenon named after him—an observation he’s quick to downplay.
The gist of “Koomey’s Law,” as it’s been dubbed, is that the peak-output efficiency of computing, measured in computations per kilowatt-hour, doubled roughly every 1.6 years from the 1940s until 2000. But after 2000, that rate slowed, with the doubling now taking closer to 2.6 years.
That doesn’t mean efficiency gains have stalled—it means the low-hanging fruit has been picked, and what remains requires more sophisticated engineering.
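To make the slowdown concrete, here’s a back-of-the-envelope sketch (the doubling times come from the law as stated above; the ten-year horizon and the code itself are our illustration): efficiency compounds as 2^(years / doubling time), so a 1.6-year doubling works out to roughly a 76-fold gain per decade, while a 2.6-year doubling yields only about 14-fold.

```python
# Back-of-the-envelope comparison of the two doubling regimes in
# "Koomey's Law": efficiency multiplies by 2**(years / doubling_time).

def efficiency_gain(years: float, doubling_time: float) -> float:
    """Cumulative efficiency multiplier after `years` at a given doubling time."""
    return 2 ** (years / doubling_time)

DECADE = 10
pre_2000 = efficiency_gain(DECADE, 1.6)   # ~76x per decade
post_2000 = efficiency_gain(DECADE, 2.6)  # ~14x per decade

print(f"Pre-2000 regime:  {pre_2000:.0f}x more computations per kWh per decade")
print(f"Post-2000 regime: {post_2000:.0f}x more computations per kWh per decade")
```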
“People think data centers use a lot of electricity just because they’re economically important, but that’s not the case,” Koomey says. “They use more, sure, but not as much as people assume.”
In 2000, data centers consumed roughly 1% of global electricity. By 2024, that figure had climbed to 1.5%, and a chunk of that increase is directly attributable to AI.
But the real energy story isn’t just in those data centers; it’s in the potential for AI to optimize other sectors, as when Google used AI to cut its data centers’ cooling costs by 30–40%. That’s the efficiency paradox AI creates: it consumes more power to create systems that consume less power elsewhere.
Koomey’s skepticism about the blanket narrative of AI efficiency runs deep.
He knows the industry’s cycle: build as much as possible as fast as possible, efficiency be damned. That was the story in the dot-com boom, and it’s the story now.
"In the last few years, we’ve been in crisis mode. People are just installing as much AI equipment as they can get their hands on. Efficiency isn’t top of mind right now,” he said. But the more pressing question isn’t how much AI consumes—it’s what it enables.
“If that direct use makes the other 98% of energy consumption more efficient, then it could have a much bigger effect.”
The problem, of course, is that efficiency is a double-edged sword. Economists call it the Jevons Paradox: making something more efficient makes it cheaper, and people use more of it. Koomey frames it more simply: “If the cost goes down, people use more. That’s not an energy issue. It’s a cost issue.”
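A stylized way to put numbers on that rebound (our illustration, not a model Koomey presented): suppose an efficiency gain f cuts the cost per unit of computing to 1/f, and demand responds with a price elasticity ε. Demand then grows by f^ε, so total energy use scales as f^(ε−1). Below an elasticity of 1, some savings survive; above it, the Jevons case, total energy use actually rises.

```python
# Stylized rebound-effect model (an illustration, not Koomey's own math):
# an efficiency gain f cuts the cost per unit of service to 1/f, demand
# grows as f**elasticity, so net energy use scales as f**(elasticity - 1).

def energy_multiplier(efficiency_gain: float, price_elasticity: float) -> float:
    """Net change in energy use after an efficiency gain, given demand elasticity."""
    demand_growth = efficiency_gain ** price_elasticity
    return demand_growth / efficiency_gain

f = 2.0  # hardware gets twice as efficient
for eps in (0.0, 0.5, 1.0, 1.5):
    print(f"elasticity={eps:.1f}: energy use x{energy_multiplier(f, eps):.2f}")
# elasticity 0.0 -> x0.50 (full savings kept), 1.0 -> x1.00 (savings erased),
# 1.5 -> x1.41 (Jevons territory: cheaper compute means more total energy)
```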
This disconnect between technical capability and physical infrastructure is where the AI story starts to fray. Tech companies can spin up a new data center in under a year, but upgrading a power grid or installing new transformers? That’s a five- to ten-year investment.
“Utility transformers used to take a year and a half to acquire,” he said. “Now it’s four years.”
That supply chain lag is just one crack in the foundation. Another is the myth that nuclear power is a viable, quick fix.
China and South Korea have managed to deploy reactors at reasonable costs. The US hasn’t. “If you want to build fast, focus on solar, wind, and batteries,” he noted. That’s the harsh reality: zero-emission infrastructure exists now, and it’s faster and cheaper than nuclear.
And Koomey doesn’t think the AI sector should get a pass on paying for it. “If data centers are imposing costs on the system, they should pay them. There’s no reason my mom should be subsidizing their power bills.”
The conversation pivoted to synthetic data, a scenario that Koomey finds more speculative than people realize.
If AI systems generate their own data for training, what happens to data center energy use? “How do you model something when you’ve run out of human data?” he asked.
The concern isn’t just the data volume—it’s the quality. Koomey questions whether synthetic data can capture the nuances of human behavior, and whether AI trained on AI-generated data could become untethered from reality.
Koomey’s skepticism isn’t just technical. It’s economic.
The forecasts he doubts assume effectively infinite demand for AI, as if every AI system were inherently valuable.
“It’s a mistake to assume that just because AI becomes more efficient, it will be used limitlessly,” he said. He points to the auto industry as a cautionary tale: make cars more fuel efficient, and people drive more, but not infinitely more. “There are saturation effects. People eventually want to be with their families, not driving endlessly.”
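Koomey’s saturation point can be folded into the earlier rebound sketch (the demand ceiling and curve shape here are purely illustrative assumptions): let demand grow as costs fall but level off toward a cap, so early efficiency gains unlock large demand growth while later ones barely move the needle.

```python
import math

# Rebound with saturation (illustrative assumptions: demand follows a
# saturating curve toward a ceiling D_MAX instead of growing without bound).

D_MAX = 100.0  # hypothetical ceiling on demand for the service
BASE = 10.0    # demand at the original cost

def demand(cost_reduction: float) -> float:
    """Demand after costs fall by `cost_reduction`x; saturates at D_MAX."""
    unconstrained = BASE * cost_reduction  # naive Jevons-style growth
    return D_MAX * (1 - math.exp(-unconstrained / D_MAX))

for f in (1, 2, 10, 100, 1000):
    print(f"{f:>5}x cheaper -> demand {demand(f):6.1f} (cap {D_MAX:.0f})")
# Early cost reductions drive big demand growth; later ones barely move it.
```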
Koomey ends on a cautiously optimistic note. The industry’s cycle is predictable. Build, overshoot, panic, optimize. “Once the frenzy calms down, the industry will do what it always does—find a smarter way to solve the problem.”
And as he knows better than most, the history of computing is the history of doing more with less. The question is whether AI will follow the same arc or break it entirely.