For most of the AI conversation, the assumption has been that the limiting factor is software — better models, smarter algorithms, more capable agents. The actual ceiling is showing up somewhere most business leaders aren't watching: the power grid. Every new data center being built to host AI workloads needs an enormous amount of electricity, on a schedule, in specific places. The infrastructure to deliver that power was not built with this scenario in mind. The shortage is starting to show, and it has knock-on effects that reach further than you'd expect.
The math is uncomfortable
A single large AI training run can use as much electricity as a small town for weeks. The data centers being built to run inference at scale are routinely sized at 500 megawatts to a gigawatt apiece — comparable to a nuclear reactor's output. Forecasts from utilities and industry analysts suggest U.S. data center electricity demand could double or triple by the end of the decade, driven almost entirely by AI. The grid was built to grow at single-digit percentages per year. It is not configured for this kind of step change.
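To make the scale concrete, here is a rough back-of-envelope sketch. Both inputs are illustrative assumptions, not measured figures: the 500 MW facility size comes from the low end of the range above, and the household figure assumes roughly 10 MWh per year for an average U.S. home.

```python
# Back-of-envelope: how many households' worth of electricity does a
# 500 MW data center running around the clock consume?
# All inputs are illustrative assumptions, not measured data.

datacenter_mw = 500               # assumed facility draw, low end of the range
hours_per_year = 24 * 365         # continuous operation, ignoring downtime

# Annual energy: megawatts * hours = megawatt-hours
datacenter_mwh_per_year = datacenter_mw * hours_per_year

# Assumed average U.S. household consumption: ~10 MWh per year
household_mwh_per_year = 10
equivalent_households = datacenter_mwh_per_year / household_mwh_per_year

print(f"{datacenter_mwh_per_year:,} MWh/year "
      f"≈ {equivalent_households:,.0f} households")
```

Under these assumptions the single facility works out to the annual consumption of several hundred thousand homes — consistent with the "small town, or larger" comparison, and a useful sanity check on why utilities treat each new site as a major planning event.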
Transformers are the unsexy bottleneck
The single most overlooked piece of equipment in the AI buildout is the large power transformer — the steel-and-copper devices that step voltage up and down between generation and distribution. Lead times on these have stretched from months to years. Companies like GE Vernova, Siemens Energy, and Hitachi Energy have order books full through the end of the decade. You cannot accelerate a transformer factory the way you can a software release. The implication: even with land, money, and chips in hand, you may not be able to plug a new data center in for two or three years.
Why this matters for your business
You don't run a data center, so why care? Three reasons. First, AI compute pricing is going to reflect the constraint — expect flat or rising prices for heavy AI workloads, not the steady decline we've seen over the last two years. Second, regional availability is going to matter more. Some markets will have AI capacity; others will be waiting in line. If your business depends on a specific cloud region, that's worth checking. Third, your own electricity costs are not isolated from this. Where data centers concentrate, residential and commercial rates have started to rise. That's a budget line item that will get attention from anyone watching utility bills closely.
The honest caveat
Grids do adapt. Permitting reform, new generation, transformer manufacturing investment, and demand-shaping by hyperscalers will all chip away at the constraint over the next five to ten years. This isn't a permanent ceiling. But it is a real one for the next several years, and it shapes the rate at which AI gets deployed and how much it costs. The story of the next phase of AI isn't going to be told only in research papers. It's going to be told in substations, transformer factories, and electricity contracts. Worth paying attention to.