Markets Deep Dive

AI's Energy Bill Is Coming Due: What IEA, Rhodium, and Goldman Sachs' Data Actually Says

The numbers don't fully agree with each other, and that disagreement is the story. Independent research organizations have produced materially different projections of AI data center energy demand, each using different scope and methodology. Investors, infrastructure operators, and enterprise AI buyers need to understand what those numbers actually measure before treating any of them as settled consensus.

Eleven gigawatts of planned data center capacity is sitting in announced status, without a shovel in the ground. Not because the capital isn’t there. Because the grid isn’t ready.

That figure, drawn from Sightline Climate data cited in Tech Insider’s April 2026 reporting, puts a number on what the industry has been describing in qualitative terms for months: AI infrastructure’s real constraint isn’t compute, funding, or even regulatory approval. It’s power. And the gap between what AI’s buildout demands and what existing grid infrastructure can supply is now large enough to halt construction.

This piece synthesizes what the leading independent research organizations (the International Energy Agency, Rhodium Group, and Goldman Sachs) have actually measured, what they disagree on, and what the collective picture means for infrastructure investors, enterprise AI buyers, and the companies whose climate commitments are now colliding with their capital spending plans.


The Scale: What the Projections Actually Claim

Two figures appear frequently in coverage of AI data center energy demand, and they don’t match.

Tech Insider, citing IEA data, reported in early 2026 that AI data centers are projected to consume 1,000 terawatt-hours in 2026, a figure the article noted is comparable to Japan's annual electricity consumption. Climate Change News, in a March 3, 2026 explainer on AI and the energy transition, cited a separate projection in the range of 945 TWh.

These figures are not contradictory. They’re likely measuring different things.

Energy demand projections for data centers vary depending on scope (AI workloads only vs. all data center activity), geography (U.S.-only vs. global), and methodology (nameplate capacity vs. actual utilization). A 55 TWh gap between two credible projections is entirely consistent with different methodological choices, not a sign that one source has the math wrong.

The practical implication: don’t anchor on either number as the definitive figure. The range, roughly 945 to 1,000 TWh for 2026 AI-linked data center demand, is the more useful planning input. Both figures suggest the same directional conclusion: AI infrastructure’s energy appetite is already operating at the scale of a mid-sized national grid.
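The spread between the two projections is small enough to check in a few lines. A minimal sketch, using only the reported figures above as illustrative inputs:

```python
# Rough arithmetic on the 2026 projection range discussed above.
# Both inputs are figures from the cited reporting, not primary-source data.

iea_reported_twh = 1_000   # Tech Insider, citing IEA
alt_projection_twh = 945   # Climate Change News explainer

gap_twh = iea_reported_twh - alt_projection_twh
gap_pct = gap_twh / iea_reported_twh * 100

print(f"Gap between projections: {gap_twh} TWh ({gap_pct:.1f}%)")
# → Gap between projections: 55 TWh (5.5%)
```

A roughly 5.5% spread is well within what differing scope choices (AI-only vs. all data center activity, global vs. U.S.-only) can produce, which is why the range, not either endpoint, is the useful planning input.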

The IEA’s full report, the primary source behind the 1,000 TWh figure, should be accessed directly for methodology detail. Tech Insider is an intermediary (T4) source; its headline claim is confirmed, but the underlying IEA documentation warrants direct review before this number anchors investment decisions.


The Climate Consequence: What Rhodium Group Found

Energy demand of this scale has a carbon footprint. Rhodium Group, an independent economic research firm with a track record in energy and emissions analysis, reportedly found that AI-driven energy demand contributed to a roughly 2.4% increase in U.S. fossil fuel emissions, according to Fortune’s March 29, 2026 reporting. That figure should be confirmed against Rhodium’s original publication before it is cited in investment documentation.

The Fortune piece, filed by the Associated Press and confirmed as a working source, frames the finding around a specific institutional consequence: major technology companies have revised or softened their 2030 clean energy targets in response to AI infrastructure buildout. The story isn’t that data centers use electricity. The story is that companies that made binding public commitments to clean energy timelines are now adjusting those commitments because AI’s energy demands weren’t in the original model.

This creates ESG exposure that isn’t fully priced into many infrastructure investment theses. A company that committed to 100% renewable energy by 2030 and is now building power-hungry data centers at scale faces a gap between its public commitments and its operational trajectory. That gap is now documented in independent research, not just advocacy reporting.

The reference year for Rhodium’s emissions figure, the “last year” in Fortune’s framing, is likely 2025 but has not been confirmed. Investors should access the Rhodium report directly to confirm the measurement period before drawing conclusions about trend trajectory.


The Supply-Side Constraint: Grid Capacity vs. Demand

The Sightline Climate figure, 11 GW of stalled capacity, deserves more attention than it typically receives in AI infrastructure coverage.

Eleven gigawatts represents a substantial portion of planned 2026 data center buildout sitting idle not because of permitting delays, not because of chip shortages, but because grid connection timelines and power equipment shortages have outrun construction planning. The same Tech Insider report notes that approximately 50% of global data center projects are facing delays from power limitations.

This is the infrastructure investor’s signal. If capital is flowing into AI data center construction (and per Crunchbase’s Q1 2026 data, it is, at record levels), but a meaningful share of that buildout cannot proceed because the grid isn’t ready, then the constraint on AI infrastructure returns isn’t on the compute side. It’s on the energy side. That shifts the investment thesis.

Grid interconnection queue timelines in the U.S. have extended significantly in recent years, a trend documented by energy sector analysts independent of AI. AI’s demand surge has collided with a grid infrastructure that was already operating at capacity in key markets. The result is the 11 GW backlog Sightline tracks.

For enterprise AI buyers evaluating total cost of ownership, the grid constraint adds a site selection dimension that wasn’t a primary concern in earlier AI infrastructure planning cycles. Data center availability in markets with sufficient grid capacity is now a genuine scarcity, not a logistics footnote.


The Investment Angle: Goldman Sachs’ 175% by 2030

Goldman Sachs is reported to have projected a 175% increase in data center power consumption by 2030, per TechCrunch reporting, though this figure could not be verified against Goldman Sachs’ original research note from the available source content. Treat it as a reported projection, not a confirmed Goldman Sachs publication, until the primary note is accessed directly.

If accurate, a 175% increase from current consumption levels by 2030 represents a demand trajectory that grid infrastructure, even with aggressive expansion, will struggle to absorb without structural investment in generation and transmission capacity. That’s not an energy policy observation. It’s a market signal.
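The scale of that trajectory is easy to make concrete. A back-of-envelope sketch, assuming (as an illustrative simplification, not a Goldman Sachs input) that the ~1,000 TWh 2026 figure discussed above serves as the base:

```python
# Back-of-envelope check on what a 175% increase implies.
# The 1,000 TWh base is an assumption for illustration; the 175%
# figure is reported, not confirmed against Goldman's original note.

base_twh = 1_000
growth = 1.75  # a 175% increase means 2.75x the base

projected_2030_twh = base_twh * (1 + growth)
print(f"Implied 2030 demand: {projected_2030_twh:,.0f} TWh")
# → Implied 2030 demand: 2,750 TWh
```

On those assumptions, 2030 demand lands near 2,750 TWh, close to three mid-sized national grids' worth of annual consumption, which is why generation and transmission capacity, not compute, becomes the binding constraint.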

The investment implication isn’t subtle. Companies positioned in grid infrastructure, energy generation (particularly modular nuclear, which several hyperscalers are actively exploring), and power equipment manufacturing sit upstream of the AI infrastructure buildout in a way that data center operators themselves do not. The constraint creates value at the point of constraint.


What This Means for Infrastructure Buyers and Investors

Three practical implications follow from the collective picture these research organizations have produced.

First, energy cost is an underappreciated total cost of ownership factor for enterprise AI deployments. Power purchase agreements, grid connection costs, and cooling infrastructure are now material inputs to AI infrastructure economics, not overhead line items.

Second, grid availability is a real site selection constraint. Markets with near-term grid capacity, whether through existing headroom or committed generation projects, carry a premium that wasn’t in most infrastructure valuation models two years ago.

Third, the ESG exposure created by the gap between hyperscalers’ clean energy commitments and their actual energy trajectory is documented and growing. Rhodium Group’s emissions data, if the 2.4% figure holds under direct source review, represents independent quantification of a gap that publicly traded hyperscalers will face in ESG reporting requirements.
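The first implication, energy cost as a total cost of ownership input, can be roughed out with simple arithmetic. A minimal sketch; every number here is a hypothetical planning assumption, not a figure from the cited research:

```python
# Minimal sketch of the energy line in an AI data center TCO model.
# All inputs below are hypothetical planning assumptions.

def annual_power_cost(capacity_mw: float,
                      utilization: float,
                      price_per_mwh: float) -> float:
    """Annual electricity cost for a facility of given IT capacity."""
    hours_per_year = 8_760
    mwh_consumed = capacity_mw * utilization * hours_per_year
    return mwh_consumed * price_per_mwh

# Hypothetical 100 MW facility at 80% utilization under a $60/MWh PPA:
cost = annual_power_cost(capacity_mw=100, utilization=0.80, price_per_mwh=60)
print(f"${cost / 1e6:.1f}M per year")
# → $42.0M per year
```

Even at these modest illustrative rates, power runs into the tens of millions of dollars annually for a single large facility, which is why power purchase agreement terms and grid connection costs belong in the core model rather than in overhead.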

The numbers from IEA, Rhodium, and Goldman don’t perfectly agree. The range they collectively define is narrower than the disagreement might suggest, and the direction they point is the same. AI’s energy demand is already large enough to strain grid infrastructure, alter corporate climate commitments, and create investment value at the point of the constraint.

That’s the picture the research shows. The gaps in what’s been confirmed from primary sources are noted throughout, and they’re worth closing before any of these figures anchor a capital allocation decision.
