The number from Data Center World 2026 is 300 gigawatts. That’s the projected aggregate power gap facing US AI infrastructure by 2030, according to analysis presented at the conference and reported by Data Center Knowledge. The figure breaks into two components: approximately 200 GW of new AI-driven electricity load that will need to come online, and roughly 104 GW of existing generation capacity that is expected to retire over the same period.
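As a quick sanity check on the arithmetic, the two reported components actually sum to slightly more than the headline figure, which suggests the 300 GW number is a rounded planning aggregate. A minimal sketch, using only the component figures reported from the conference:

```python
# Sanity check on the headline gap figure. Both inputs are analyst
# projections reported from Data Center World 2026, not confirmed data.
new_ai_load_gw = 200   # projected new AI-driven electricity load by 2030
retirements_gw = 104   # existing generation capacity expected to retire

gap_gw = new_ai_load_gw + retirements_gw
print(gap_gw)  # 304 -- reported as an aggregate of roughly 300 GW
```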
Those aren’t rounding errors. Together they describe a structural mismatch that will define AI infrastructure economics for the rest of the decade.
IEA analysis provides directional support for the demand-side projections, though the specific figures originate with analysts at the conference. The IEA has tracked data center electricity demand as one of the fastest-growing components of global energy consumption, and AI workloads are driving the acceleration.
The more operationally urgent number is the cancellation rate. Some energy analysts estimate that interconnection delays may be forcing 30-50% of 2026 planned data center builds into cancellation or delay, according to Sightline Climate analysis. That figure warrants monitoring against actual groundbreaking data; it's a projection from a newsletter-format report, not a confirmed construction survey. But the directional concern is consistent with what developers and grid operators have said publicly throughout 2025 and into this year.
The reason cancellations are happening isn't a shortage of capital or demand. It's that connecting new large-load customers to the grid takes years in most US markets. The interconnection queues in PJM, MISO, and the Western Interconnection have stretched from months to years as AI-scale loads have compounded on top of renewable energy project backlogs. In many markets, a data center that breaks ground today can't expect grid power for three to five years.
Industry participants at the conference cited two emerging responses, on-site natural gas generation and small modular reactors, as "speed-to-power" strategies that bypass the interconnection queue, according to conference coverage. Neither is a near-term solution at the scale the 300 GW gap requires. Both signal that developers are treating the queue as a given constraint, not a solvable problem.
For infrastructure investors, the practical implication is that power-ready sites (land with a permitted, available grid connection) now carry a structural premium that has nothing to do with location or construction cost. This is the same thesis that EQT Group formalized in its dedicated AI Infrastructure strategy announced April 21: power access, not capital, is the binding constraint.
What to watch: FERC's interconnection queue reform proceedings are the most consequential near-term policy variable. Any acceleration in queue processing changes the 30-50% cancellation estimate significantly. Separately, watch utility earnings calls over the next 30 days; grid operators will provide the most reliable near-term data on actual connection timelines.
The TJS read: The 300 GW figure is a planning number, not a crisis declaration. But it crystallizes something that infrastructure planners have been working around informally: AI’s power problem isn’t about money or technology. It’s about the gap between how fast AI scales and how slowly grid infrastructure gets permitted. That gap is the defining market constraint for AI infrastructure through the decade.