Two figures are circulating about OpenAI’s infrastructure plans. They don’t contradict each other, but they’re not the same thing, and the gap between them matters.
TechCrunch reports that Sam Altman has said OpenAI holds approximately $1.4 trillion in data center commitments — a figure describing third-party infrastructure build-outs tied to OpenAI's operations, not a capital expenditure budget. Separately, CNBC reports that OpenAI has reset its spending expectations, now targeting approximately $600 billion in compute spend by 2030. That figure appears to reflect a more recent, narrower projection for direct spending.
Correction: an earlier version of this story attributed the $1.4 trillion figure to OpenAI co-founder Greg Brockman. Per TechCrunch's reporting, the figure comes from CEO Sam Altman.
The strategic framing behind both numbers: Altman has described AI as eventually being delivered "like a utility, metered, on demand," according to reporting from The420.in. Infrastructure at this scale is the prerequisite for that model. AI data centers are widely reported to require substantial electricity — some facilities draw power comparable to a small municipality — though consumption varies significantly with facility size and configuration.
The deep-dive on this page examines what each figure means competitively, and who stands to be left out of the infrastructure race.