Markets Daily Brief

Anthropic Expands TPU Deal With Google and Broadcom as AI Compute Demand Accelerates

Anthropic expanded its compute arrangement with Google and Broadcom in early April, securing additional Google TPU capacity to meet accelerating model demand. The deal deepens Anthropic's infrastructure dependency on its primary strategic partner just months after a separate compute agreement with CoreWeave.

Anthropic has expanded its compute arrangement with Google and Broadcom, securing additional Tensor Processing Unit capacity as demand for its Claude models continues to climb. TechCrunch reported the deal on April 7, describing Anthropic as upping its compute deal “amid skyrocketing demand.” A Yahoo Finance wire corroborates the arrangement, confirming Broadcom has expanded chip supply relationships with both Google and Anthropic.

The structure here matters. This is not equity financing. It is an infrastructure supply arrangement: Broadcom manufactures custom TPU silicon for Google, and Anthropic draws on that capacity through its existing cloud relationship with Google. Anthropic does not own the chips. It buys access to compute time on hardware it cannot fully control. That distinction is easy to miss in coverage that frames these deals as straightforward investments.

Why does this deal warrant attention now? Three reasons.

First, it follows a pattern. Anthropic reached a separate compute agreement with CoreWeave earlier this year; that deal covered a different provider and a different infrastructure architecture. Two major compute partnerships in the same quarter suggest Anthropic is not simply expanding headroom; it is building redundancy across providers. That’s a deliberate infrastructure strategy, not opportunistic deal-making.

Second, the timing aligns with what Anthropic’s model roadmap implies. Frontier model training runs are growing in compute intensity with each generation. If Anthropic is preparing for the next Claude training cycle, or is already running one, guaranteed TPU access becomes a competitive necessity, not an option. Companies that cannot secure compute capacity cannot ship frontier models on competitive timelines.

Third, the deal signals something about Google’s strategic positioning. Google is simultaneously building its own Gemini models and supplying the compute that powers a competitor’s training. That arrangement persists because Anthropic’s continued independence and growth is, for Google, a valuable hedge, one that also locks Anthropic into Google’s infrastructure stack. The deeper Anthropic goes into TPU capacity, the harder it becomes to migrate to alternative compute providers.

What to watch: Bloomberg reported that Anthropic’s annualized revenue run rate has reached $30 billion, up from approximately $9 billion at the end of 2025. TechJacks was unable to independently verify this figure because the Bloomberg source was inaccessible. If confirmed, it would indicate revenue growth that substantially outpaces infrastructure cost scaling. Until that figure is verified through an accessible primary source, treat it as reported but unconfirmed. Anthropic has not published official revenue disclosures.

For infrastructure investors and enterprise AI strategists, the signal to track is whether Anthropic’s compute concentration in Google infrastructure deepens further, or whether the CoreWeave agreement represents the beginning of deliberate diversification. One deal with a single hyperscaler is dependency. Two deals across different providers is a strategy. The direction of the next announcement will clarify which this is.
