HBM Can Keep Micron Out Of The Next Memory Bust Cycle

  • 24.03.2025 18:00
  • nextplatform.com
  • Keywords: AI, Market Growth

Micron Technology benefits from supplying HBM memory for AI systems, particularly in Nvidia GPUs, which helps shield it from the memory market's bust cycle. Its Compute and Networking group saw significant revenue and profit growth due to high demand for HBM and LPDDR5X memory solutions.

Estimated market influence

Micron Technology

Sentiment: Positive
Analyst rating: Buy

Micron's AI-related products, such as HBM3E and LPDDR5X memory, are driving significant revenue growth.

Intel

Sentiment: Negative
Analyst rating: Neutral

Intel lacks a strong AI play, leaving its financial performance well behind that of AI-exposed peers such as Micron.

Context

Analysis of Micron Technology's HBM Memory Business and Market Implications

Overview

  • Micron's Position: Micron has emerged as a significant supplier of High Bandwidth Memory (HBM) for AI accelerators, particularly for Nvidia's GPU systems. Its eight-high and twelve-high HBM3E stacks are critical components in Nvidia's GB200 and GB300 systems.

Financial Highlights

  • Q2 2025 Results:
    • Revenue: $8.05 billion (up 38.3% YoY).
    • Operating Income: $1.77 billion (up nearly tenfold YoY).
    • Net Income: $1.58 billion (doubled YoY).
    • Cash and Investments: $9.59 billion.
    • Capital Expenditure: $3.1 billion in Q2, with a $14 billion annual target.
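
As a rough back-of-envelope check on the figures above, the reported revenue and its 38.3% YoY growth imply a year-ago quarter of roughly $5.8 billion (illustrative arithmetic only, not official Micron data):

```python
# Implied year-ago revenue from the reported Q2 FY2025 results above.
# Figures in $ billions; rounded, illustrative only.
revenue_q2 = 8.05      # reported Q2 revenue
revenue_yoy = 0.383    # reported YoY growth (38.3%)

implied_prior_revenue = revenue_q2 / (1 + revenue_yoy)
print(f"Implied year-ago revenue: ${implied_prior_revenue:.2f}B")  # ≈ $5.82B
```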

Compute and Networking Group Performance

  • Revenue: $4.56 billion (up 3.8% sequentially, more than doubled YoY).
  • Operating Income: $1.92 billion (up 68.5X YoY, 12.2% sequentially).
  • HBM Memory:
    • Revenue: $1.14 billion in Q2 (up 52% sequentially, 19X YoY).
    • Market Share Projection: Micron expects its HBM market share to align with its overall DRAM market share (~20%-25%) by end of 2025.
      • TAM Forecast: $35 billion in 2025 (raised from an earlier $30 billion estimate), growing to $100 billion by 2030.
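
Growing a $35 billion TAM in 2025 to $100 billion by 2030 implies a compound annual growth rate of roughly 23% over the five-year span, which can be checked directly:

```python
# Implied CAGR for the HBM TAM forecast cited above:
# $35B in 2025 growing to $100B by 2030 (five years).
tam_2025, tam_2030, years = 35.0, 100.0, 5

cagr = (tam_2030 / tam_2025) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 23% per year
```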

Competitive Dynamics

  • Nvidia Dependency: Micron's HBM sales are heavily tied to Nvidia's AI accelerator demand. LPDDR5X memory usage in Nvidia's Grace and Blackwell systems is a key growth driver.
  • Complexity and Yield: HBM production complexity poses challenges, but Micron expects twelve-high HBM3E to outperform eight-high variants in terms of yield and margins.

Market Trends

  • AI Impact: AI-related products (HBM, high-capacity server DRAM, LPDDR5X) contributed $2.19 billion in Q2 revenue.
  • Core Memory Business:
    • DRAM: Sequential decline of 4.3% to $6.12 billion.
    • NAND Flash: Sequential decline of 17.2% to $1.86 billion.
    • Non-AI Core DRAM: Sequential drop of 26.4% to $3.94 billion.
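
The segment figures above are internally consistent: the AI-related products listed (HBM, high-capacity server DRAM, LPDDR5X) are all DRAM parts, so non-AI core DRAM should roughly equal total DRAM minus the AI-related bucket. A quick cross-check, assuming that split:

```python
# Cross-check of the reported DRAM split, figures in $ billions.
# Assumes the AI-related revenue bucket is entirely DRAM, as the
# listed products (HBM, server DRAM, LPDDR5X) suggest.
dram_total = 6.12
ai_related = 2.19
non_ai_core_dram_reported = 3.94

derived = dram_total - ai_related
print(f"Derived non-AI core DRAM: ${derived:.2f}B "
      f"vs reported ${non_ai_core_dram_reported}B")
# The ~$0.01B gap is rounding in the reported figures.
```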

Long-Term Effects

  • HBM Growth: Sequential HBM revenue growth expected in Q3 and beyond, with Micron poised to benefit from AI-driven demand.
  • Market Cycles: The broader memory market remains cyclical, but HBM's premium positioning may mitigate the bust cycle effects.

Strategic Considerations

  • Investment in Capex: Micron is aggressively investing in U.S. and global foundries to scale HBM production.
  • Regulatory Impact: Capital investment supported by CHIPS Act funding, though specifics were not detailed.

Conclusion

Micron's strategic focus on AI-enabled HBM memory positions it as a key player in the next-gen compute market. While challenges remain with core DRAM and NAND businesses, the long-term outlook for HBM is promising, driven by hyperscaler and cloud demand.