Markets Daily Brief

AI Chip Costs Doubled to $52B in 2025: Epoch AI's New Tracker Reveals HBM as the Price Driver

Total AI chip component spending reached approximately $52 billion in 2025, more than double the $22 billion recorded in 2024, per Epoch AI's newly launched Chip Components Explorer. High-Bandwidth Memory accounted for roughly $20 billion of that $30 billion increase.
[Chart: AI chip spend, 2024–2025, +136%]

Key Takeaways

  • AI chip component spending more than doubled from $22B (2024) to $52B (2025), per Epoch AI's Chip Components Explorer, the fastest single-year hardware cost escalation on record
  • High-Bandwidth Memory accounted for ~$20B of the ~$30B year-over-year increase, making it the single largest cost driver in the AI hardware stack
  • HBM supply is concentrated among three manufacturers (SK Hynix, Samsung, Micron), creating a narrow supply chain dependency for the fastest-growing AI cost category
  • Nvidia B300 reportedly features ~288GB HBM3E, roughly double the H200's capacity, signaling further HBM demand growth in next-generation hardware
AI chip component spend, 2025: $52B, up from $22B in 2024 (+136% year-over-year), per Epoch AI

AI Chip Component Spend: 2024 vs. 2025 (Epoch AI)

Year  Total Spend  YoY Change      HBM Contribution to Increase  Source
2024  $22B         -               -                             Epoch AI
2025  $52B         +$30B (+136%)   ~$20B (~67% of increase)      Epoch AI

$22 billion to $52 billion. One year.

Per Epoch AI’s Chip Components Explorer, total spending on AI chip components more than doubled between 2024 and 2025, a $30 billion year-over-year increase that represents the fastest single-year cost escalation in the hardware layer of the AI stack. Epoch AI is the primary source for this data; no comparable tracker exists at this granularity.

HBM is the bottleneck. High-Bandwidth Memory accounted for approximately $20 billion of that $30 billion increase. That's not a supporting character in the cost story; it's the lead. HBM is the interface between a chip's compute cores and the data those cores need to process. As AI models have grown larger and inference workloads have intensified, HBM capacity has become the binding constraint on what chips can actually do in production.
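The arithmetic behind those headline figures is simple and worth making explicit; a minimal sketch, using the rounded approximations reported above (all inputs are the article's figures, not independent data):

```python
# Year-over-year arithmetic behind Epoch AI's reported figures,
# rounded to the billions cited in the article.
spend_2024 = 22e9    # total AI chip component spend, 2024 (USD)
spend_2025 = 52e9    # total AI chip component spend, 2025 (USD)
hbm_increase = 20e9  # HBM's reported share of the increase (USD)

increase = spend_2025 - spend_2024
yoy_pct = increase / spend_2024 * 100
hbm_share = hbm_increase / increase * 100

print(f"Increase: ${increase / 1e9:.0f}B")            # $30B
print(f"YoY change: +{yoy_pct:.0f}%")                 # +136%
print(f"HBM share of increase: ~{hbm_share:.0f}%")    # ~67%
```

The ~67% share is why the article treats HBM as "the lead" in the cost story: two of every three new dollars in the component budget went to memory, not compute.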

The Nvidia B300 is reported to feature approximately 288GB of HBM3E memory, roughly double the capacity of the H200 predecessor, per Epoch AI’s component tracking data. That specification reflects where the market is heading: more HBM per chip, at higher cost per unit, with supply concentrated among a small number of manufacturers.

HBM Capacity (flagship chip)

Nvidia H200: ~141GB HBM3
Nvidia B300 (reported): ~288GB HBM3E
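A quick check of the "roughly double" claim, using the approximate capacities cited above (the B300 figure is reported, not confirmed):

```python
# Capacity ratio between the reported B300 and the H200,
# per the approximate figures cited in the article.
h200_hbm = 141  # GB, HBM3
b300_hbm = 288  # GB, HBM3E (reported)

ratio = b300_hbm / h200_hbm
print(f"B300/H200 HBM capacity: {ratio:.2f}x")  # 2.04x, i.e. roughly double
```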

Why this matters for infrastructure investors. HBM is produced by three companies: SK Hynix, Samsung, and Micron. Nvidia’s roadmap depends on HBM delivery schedules that these manufacturers control. When $20 billion of a $30 billion cost increase flows through three supply chain points, infrastructure cost planning becomes a function of semiconductor manufacturing capacity, not just AI model demand.

This is the third data point in the current reporting cycle connecting to AI infrastructure capital concentration, following reported hyperscaler backlog commitments and Challenger workforce attribution data. The direction is consistent: capital is concentrating in the infrastructure layer, costs are rising, and supply chain dependency is narrowing. Epoch AI's component-level data is the most granular published evidence of that concentration to date.

The practical implication. Enterprise buyers pricing AI deployment over a multi-year horizon are now working with a hardware cost baseline that rose 136% in a single year. Whether 2025’s rate of increase continues into 2026 is the open question. The Chip Components Explorer gives infrastructure planners a public tool to track that answer as new data arrives.
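One way to frame that open question for planning purposes: the 2026 baseline diverges sharply depending on which growth path holds. A purely illustrative sketch; the growth rates below are hypothetical scenarios for comparison, not forecasts from the article or from Epoch AI:

```python
# Illustrative 2026 baseline scenarios (hypothetical growth rates,
# NOT forecasts): what component spend looks like if 2025's +136%
# rate repeats, halves, or flattens.
baseline_2025 = 52e9  # reported 2025 spend (USD)

scenarios = [("repeats (+136%)", 1.36),
             ("halves (+68%)", 0.68),
             ("flattens (0%)", 0.00)]

for label, growth in scenarios:
    projected = baseline_2025 * (1 + growth)
    print(f"2026 if growth {label}: ${projected / 1e9:.0f}B")
```

The spread between the scenarios ($52B to $123B) is the uncertainty band a multi-year deployment budget has to absorb, which is why the tracker's 2026 update matters.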

What to Watch

SK Hynix / Samsung HBM production announcements, 2026 capacity expansion (H2 2026)
Nvidia next-gen roadmap, HBM capacity figures for post-B300 chips (2026)
Epoch AI 2026 component spend update: does HBM share hold above 60%? (early 2027)

SK Hynix and Samsung HBM production announcements are the leading indicator for whether the $20 billion HBM cost concentration of 2025 expands, stabilizes, or contracts in 2026. Watch also for whether Nvidia's next-generation roadmap announcements include HBM capacity figures; those disclosures will give a forward view on whether component costs are still accelerating or beginning to plateau.

TJS synthesis. Epoch AI's tracker gives AI infrastructure analysis something it has lacked: component-level spending data that doesn't originate from vendor earnings calls. The $22B-to-$52B figure is the first independently sourced evidence of how fast the hardware build is actually moving. Watch the 2026 edition of this data: if HBM's share of component spend stays above 60%, the supply chain concentration risk isn't resolving. If it drops, that signals either manufacturing expansion or a shift in chip architecture that reduces HBM dependency.
