The AI infrastructure conversation has moved. Six months ago, the constraint story was about geographic distribution: where data centers were being built. Now it’s about whether those facilities can get power once they’re built.
Gartner’s projection is specific: 40% of existing AI data centers will face operational power constraints by 2027. That’s not a long-range forecast. It describes facilities that exist or are under construction right now. The accompanying demand estimate, per Gartner, is 500 TWh per year for generative AI by 2027, approximately 2.6 times the 2023 level. The grid hasn’t grown at that pace. It can’t.
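A quick sanity check on those figures. The 2023 baseline is implied rather than stated in the brief, so the values below are back-of-envelope derivations from the 500 TWh and 2.6x numbers above, not independent data:

```python
# Back-of-envelope check of the Gartner figures cited above.
projected_2027_twh = 500          # Gartner: GenAI demand by 2027
growth_multiple = 2.6             # vs. the 2023 level

implied_2023_twh = projected_2027_twh / growth_multiple
annual_growth = growth_multiple ** (1 / 4) - 1   # 2023 -> 2027, 4 years

print(f"Implied 2023 baseline: {implied_2023_twh:.0f} TWh")   # ~192 TWh
print(f"Implied annual growth rate: {annual_growth:.0%}")     # ~27%
```

A sustained ~27% annual growth rate is the kind of compounding that grid build-out, on 4-6 year timelines, cannot track.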
The Timeline Problem
Per RBC Wealth Management’s analysis, power infrastructure takes roughly twice as long to build as the data centers it serves. Data centers are going up in 18-24 months. Substations, high-voltage transmission lines, and grid interconnections take 4-6 years. That gap is structural, not solvable through capital deployment alone. You can’t build your way out of a permitting and interconnection queue that stretches years into the future.
The pricing signal is already visible. RBC estimates wholesale electricity prices near major data center hubs have risen by up to 267%. Northern Virginia, the densest AI data center market in the US, is the primary example. Analysts project data centers could account for up to 12% of US electricity demand by 2028, a level that would stress grid capacity in concentrated markets regardless of how much capital is committed.
The Epoch AI Context
The scale of the largest facilities gives this constraint concrete form. Per Epoch AI’s compute tracking data, the Anthropic-Amazon New Carlisle facility represents approximately 1.1 gigawatts of planned capacity at a reported construction cost of approximately $35 billion. One facility. 1.1 GW. For context, a large natural gas power plant generates 400-500 MW. The New Carlisle site alone would require the equivalent of two to three major power plants – plants that take years to permit and build, not months.
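The plant-equivalent comparison above can be checked the same way. The full-load annual energy figure below is an added assumption (continuous ~100% utilization, which real facilities won’t hit), included only to put 1.1 GW on the same TWh scale as the demand estimates:

```python
# Sanity check on the New Carlisle figures cited above.
facility_gw = 1.1
plant_mw_low, plant_mw_high = 400, 500   # large natural gas plant range

facility_mw = facility_gw * 1000
plants_high = facility_mw / plant_mw_low    # 1100 / 400 = 2.75
plants_low = facility_mw / plant_mw_high    # 1100 / 500 = 2.2

# Annual energy if run at full load (illustrative assumption only):
annual_twh = facility_gw * 8760 / 1000      # GW * hours/year -> TWh

print(f"Plant equivalents: {plants_low:.1f}-{plants_high:.1f}")
print(f"Annual energy at full load: {annual_twh:.1f} TWh")   # ~9.6 TWh
```

One site drawing on the order of 9-10 TWh per year at full load is roughly 2% of the 500 TWh projection on its own.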
What This Means for the Market
The power constraint is now a market pricing signal. Real estate in power-abundant locations commands a premium, a dynamic TJS covered in the AI Data Center Inland Growth brief, where Synergy Research data showed the geographic shift already underway. The Blackstone data center REIT brief captured the investment thesis. This brief captures what’s driving both: operational constraint is creating geographic and financial arbitrage opportunities for investors positioned ahead of it, and cost and availability risk for enterprises that aren’t.
What to Watch
Watch nuclear power licensing timelines. Small modular reactors are the most-cited solution for dedicated data center power, but the US licensing pathway runs 5-7 years. Watch hyperscaler announcements about dedicated power infrastructure: Microsoft, Google, and Amazon have all made nuclear offtake commitments, and each new announcement narrows the supply available to mid-tier operators. Watch wholesale electricity prices in secondary markets like the Midwest and Southeast: those are the markets absorbing demand that Virginia can no longer take.
TJS Synthesis
Power is the new GPU. When Gartner says 40% of AI data centers face constraints by 2027, it’s describing a market where energy access has become a first-order competitive variable – alongside compute, model access, and talent. Organizations planning AI infrastructure deployments in the next 24 months need power availability assessments as part of their site selection process, not as an afterthought. The firms that treat energy as an input variable now will have better options than those that treat it as a given.