Samsung Electronics has secured a key supply deal with AMD, with its fifth-generation 12-layer HBM3E memory selected for the chipmaker’s upcoming MI350 AI accelerators. The win marks a breakthrough for Samsung, which had repeatedly lost out to rivals in supplying major customers such as Nvidia, and helps ease concerns about the reliability of its HBM technology.

With the deal confirmed, Samsung gains fresh momentum, especially as expectations grow that it will also supply HBM4 (sixth-generation HBM) for AMD’s MI400 series, set to launch next year. Industry observers also believe Samsung may begin supplying HBM3E to Nvidia as early as this month, potentially accelerating the company’s HBM business in the second half of the year.

AMD announced at its Advancing AI 2025 event in San Jose, California, on June 12 that Samsung and Micron’s 12-layer HBM3E will be used in its new MI350X and MI355X AI accelerators. While Samsung’s supply to AMD had been widely assumed, this marks the first time AMD has confirmed it publicly.

The HBM3E used in the MI350 series is believed to be Samsung’s 36GB 12-layer DRAM, whose development was completed last year. The chip vertically stacks twelve 24Gb DRAM dies using through-silicon via (TSV) technology, yielding 36GB per package.
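The per-package figure follows directly from the stack arithmetic (24Gb per die, 12 dies, 8 bits per byte); a minimal sanity check in Python:

```python
# Capacity check for a 12-layer HBM3E stack built from 24Gb DRAM dies.
die_capacity_gbit = 24   # gigabits per DRAM die (from the article)
layers = 12              # dies stacked vertically via TSV

stack_gbit = die_capacity_gbit * layers  # 288 Gb in total
stack_gbyte = stack_gbit // 8            # 8 bits per byte

print(f"{stack_gbit} Gb = {stack_gbyte} GB per package")
# Output: 288 Gb = 36 GB per package
```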

Compared to the previous eight-layer version, the new 12-layer HBM3E delivers over 50% improvement in both performance and capacity. It supports bandwidth of up to 1,280GB/s and I/O speeds of up to 10Gbps across its 1,024-bit interface, enough to transmit about 40 UHD movies per second.
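The two headline figures are mutually consistent: the per-pin speed multiplied by the 1,024-bit interface width, divided by 8 bits per byte, reproduces the quoted bandwidth. A quick check (the per-movie size it implies is an inference, not stated in the article):

```python
# Bandwidth check: per-pin I/O speed times interface width, bits -> bytes.
io_speed_gbps = 10       # gigabits per second per I/O pin
interface_bits = 1024    # width of the HBM3E interface in bits

bandwidth_gbs = io_speed_gbps * interface_bits / 8
print(f"{bandwidth_gbs:.0f} GB/s")               # 1280 GB/s, matching the quoted figure

# The "40 UHD movies per second" claim implies roughly 32 GB per movie
# (an assumption; the article gives no file size).
print(f"{bandwidth_gbs / 40:.0f} GB per movie")  # 32 GB
```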

Samsung has applied its advanced thermal compression non-conductive film (TC NCF) technology to keep the package at the same height as the eight-layer version while raising vertical density by more than 20%, achieved by narrowing the gap between chips to 7 micrometers. Bumps of different sizes were also strategically placed to separate signal and thermal paths.

AMD’s MI400 series, expected next year, may also feature Samsung’s HBM4. At the same event, AMD revealed plans to equip each MI400 GPU with 432GB of HBM4. The Helios server rack, comprising 72 MI400 GPUs, will hold 31TB of HBM4 and deliver 10 times the AI processing power of the current MI355X rack.
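The rack-level total is simply the per-GPU capacity scaled across the rack; a quick check (using decimal terabytes, 1TB = 1,000GB, an assumption about the convention being used):

```python
# Rack-level HBM4 capacity for the Helios configuration described above.
hbm4_per_gpu_gb = 432  # GB of HBM4 per MI400 GPU
gpus_per_rack = 72     # MI400 GPUs in one Helios rack

rack_gb = hbm4_per_gpu_gb * gpus_per_rack  # 31,104 GB
rack_tb = rack_gb / 1000                   # decimal terabytes

print(f"{rack_gb:,} GB -> {rack_tb:.1f} TB")
# Output: 31,104 GB -> 31.1 TB, matching the quoted 31TB
```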

HBM4 is seen as a crucial battleground among Samsung, SK Hynix, and Micron for dominance in the AI memory market. With JEDEC recently finalizing the standard, preparations for full-scale production are underway. Samsung and SK Hynix aim to start mass production by the end of this year. Having lost ground in the current HBM generation, Samsung hopes to regain leadership through HBM4.

An industry source said, “Samsung is betting big on HBM4 to reclaim its standing against SK Hynix and Micron,” adding, “While those rivals are using fifth-generation 10nm-class (1b) process technology, Samsung plans to use more advanced sixth-generation (1c) processes. If successful, this could give Samsung a significant competitive edge.”