Samsung to supply HBM4 for AMD's next-gen AI chips


Samsung Electronics co-CEO Jun Young-hyun, in charge of the company's semiconductor business, left, and AMD CEO Lisa Su pose for a photo after a signing ceremony held at Samsung Electronics' Pyeongtaek campus in Gyeonggi on March 18. [SAMSUNG ELECTRONICS]

 
Samsung Electronics will supply its sixth-generation high bandwidth memory, or HBM4, to U.S. chip designer Advanced Micro Devices (AMD) to power the company’s next-generation AI chips, the companies announced during AMD CEO Lisa Su's two-day visit to Korea.
 
The announcement came just over a month after the Korean chipmaker began the world’s first shipments of the advanced memory.
 
“Samsung Electronics has been named the preferred supplier of HBM4 for AMD’s next-generation AI accelerator, the Instinct MI455X GPU,” Samsung said in a statement on Wednesday.
 
The MI455X belongs to AMD’s Instinct family of data-center accelerators designed for AI and high-performance computing workloads.
 
 
“Based on industry-leading performance, reliability and power efficiency delivered by HBM4, our memory will provide an optimal solution for AMD GPUs in AI model training and inference,” Samsung said.
 
AMD CEO Lisa Su, left, and Samsung Electronics Executive Chairman Lee Jae-yong pose for a photo after a signing ceremony at a banquet hosted at Seungjiwon, Samsung Group's guesthouse in Seoul, on March 18. [SAMSUNG ELECTRONICS]

 
The deal is part of an expanded partnership between Samsung Electronics and AMD to collaborate on next-generation AI memory and computing technologies announced on Wednesday. AMD CEO Lisa Su, Samsung Electronics co-CEO Jun Young-hyun, who oversees the company’s semiconductor business, and other senior executives attended the memorandum of understanding ceremony held at Samsung’s Pyeongtaek campus in Gyeonggi.
 
Under the agreement, the two companies will also collaborate on improving the performance of AMD’s Helios rack-scale AI system, which integrates dozens of MI455X accelerators per rack for large-scale AI workloads. They will also work together on double data rate 5 (DDR5) memory upgrades to maximize the performance of sixth-generation EPYC processors, AMD’s next-generation data-center CPUs.
 
AMD CEO Lisa Su tours Samsung Electronics’ Pyeongtaek campus in Gyeonggi on March 18. [SAMSUNG ELECTRONICS]


 
"Samsung and AMD share a commitment to advancing AI computing, and this agreement reflects the growing scope of our collaboration," Jun said. "From industry-leading HBM4 and next-generation memory architectures to cutting-edge foundry and advanced packaging, Samsung is uniquely positioned to deliver unrivaled turnkey capabilities that support AMD’s evolving AI roadmap."
 
"Powering the next generation of AI infrastructure requires deep collaboration across the industry," Su said. "We are thrilled to expand our work with Samsung, bringing together their leadership in advanced memory with our Instinct GPUs, EPYC CPUs and rack-scale platforms. Integration across the full computing stack, from silicon to system to rack, is essential to accelerating AI innovation that translates into real-world impact at scale.”
 
The relationship between Samsung and AMD dates back nearly two decades, when Samsung supplied graphics double data rate 4 memory to AMD’s early Radeon graphics cards in 2007. Since then, the partnership has expanded to include memory supply, chip design collaboration and IP licensing. Samsung also supplied 12-layer HBM3E memory for AMD’s MI350X and MI355 AI accelerators in 2025.  
 
Samsung recently unveiled its seventh-generation HBM4E at Nvidia’s GTC 2026 conference in California on Monday. The chips are built using the company’s 10-nanometer-class 1c dynamic random access memory (DRAM) and a 4-nanometer logic process, delivering processing speeds of up to 11.7 gigabits per second and maximum bandwidth of 3.3 terabytes per second.
 
AMD CEO Lisa Su, second from right at the front, and Naver CEO Choi Soo-yeon, far right, tour Naver's Seongnam headquarters in Gyeonggi on March 18. [NAVER]

 
Earlier the same day, Su also met with Choi Soo-yeon, CEO of Naver, at the company’s headquarters in Seongnam, Gyeonggi. The two companies signed a separate MOU to collaborate on deploying AMD’s AI chips for Naver’s AI models and cloud services.
 
 
The two companies also plan to support broader AI research by providing AI computing resources to academic researchers and pursuing joint research projects across diverse infrastructure platforms.

BY LEE JAE-LIM [lee.jaelim@joongang.co.kr]