
Micron Sold All of Its High Bandwidth Memory Supply for 2024 and Most of 2025

The high-bandwidth memory market is exploding thanks to the memory's use in AI accelerators.
By Josh Norem
Micron HBM3e
Credit: Micron

We all know Nvidia is enjoying life as the belle of the AI ball, thanks to its hardware being the gold standard for training AI models. Now, it appears the company will be bringing its hardware partners along for the ride as well. Memory maker Micron has been tapped to provide the latest high-bandwidth memory (HBM3e) for Nvidia's upcoming H200 accelerator, and it says its HBM is completely sold out for the rest of 2024. Not only that, but most of its supply for 2025 has already been allocated, too.

Micron executives recently shared the results of a successful quarter for its memory products, which include HBM for servers, NAND for SSDs, and DDR5 for client PCs. The quarterly earnings statement is jubilant, with CEO Sanjay Mehrotra calling the results "well above the high end of guidance." Turning to the company's HBM portfolio, Mehrotra said, "Our HBM is sold out for calendar 2024, and the overwhelming majority of our 2025 supply has already been allocated." Thanks to its contract to supply HBM3e chips for Nvidia's H200 accelerator, which begins shipping this year, Micron expects business to be rosy for some time.

Micron roadmap
Micron is expected to roll out 12-layer stacks of HBM3e in 2025, eventually pushing bandwidth to 2TB/s and beyond in 2026. Credit: Micron

Micron says its HBM3e memory, the fastest server memory available until HBM4 arrives in 2026, consumes 30% less power than its competitors' offerings. It competes with both Samsung and SK Hynix in this market and is seen as the underdog, given its small market share. However, its status is on the upswing thanks to its partnership with Nvidia and what appears to be a competitive memory portfolio. Its HBM3e memory "cubes" offer 24GB of capacity in an eight-layer design and up to 1.2TB/s of bandwidth per cube. The H200 will feature six of these modules for 141GB of usable HBM3e memory.
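For readers checking the math: six 24GB cubes work out to 144GB raw, slightly above the 141GB Nvidia advertises as usable, and the 1.2TB/s per-cube figure is a peak rather than the shipping spec (Nvidia lists 4.8TB/s of total bandwidth for the H200). Here's a minimal back-of-the-envelope sketch, assuming only the per-cube figures quoted above:

```python
# Back-of-the-envelope check on the H200's HBM3e configuration,
# assuming six eight-layer cubes at 24GB and up to 1.2TB/s each.
stacks = 6
capacity_per_stack_gb = 24      # eight-layer (8-high) HBM3e cube
peak_bw_per_stack_tbps = 1.2    # per-cube peak, per Micron

raw_capacity_gb = stacks * capacity_per_stack_gb      # 144GB raw
peak_bw_tbps = stacks * peak_bw_per_stack_tbps        # 7.2TB/s theoretical

print(f"Raw capacity: {raw_capacity_gb}GB (Nvidia lists 141GB usable)")
print(f"Aggregate peak bandwidth: {peak_bw_tbps:.1f}TB/s "
      f"(Nvidia's published H200 spec is 4.8TB/s)")
```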

Tom's Hardware notes Micron beat its rivals to the HBM3e market, which likely earned it the coveted position of supplying Nvidia's second-generation H200 product. The company is readying its second salvo into the HBM3e market with a 12-layer design that increases capacity per cube to 36GB (the same dies, stacked 12-high instead of eight). In the earnings call, Mehrotra said Micron expects to ramp this version throughout 2025, so the eight-layer design will be its moneymaker in 2024 as it rides the gravy train that is Nvidia's H200 accelerator.

