Samsung in February unveiled HBM3E 12H, the industry’s largest-capacity HBM with a 12-layer stack (Courtesy of Samsung Electronics)

Samsung Electronics Co., the world’s No. 1 memory chipmaker, has recently set up a high bandwidth memory (HBM) team within its memory chip division to increase production yields as it develops HBM4, the sixth-generation AI memory, and the Mach-1 AI accelerator.
The new team sits within the memory business division, which is in charge of the development and sales of DRAM and NAND flash memory, according to industry sources on March 29.
Hwang Sang-joon, corporate executive vice president and head of DRAM Product and Technology at Samsung, will lead the new team. It has not yet been decided how many employees will work for the team.
It is Samsung’s second HBM-dedicated team, following the HBM taskforce the company launched in January this year with 100 specialists drawn from its device solutions division.
Samsung is ramping up efforts to overtake its local rival SK Hynix Inc., the dominant player in the advanced HBM segment. In 2019, Samsung disbanded its then-HBM team after concluding that the HBM market would not grow significantly, a painful mistake it now regrets.
TWO-TRACK STRATEGY
To grab the lead in the AI chip market, Samsung will pursue a “two-track” strategy of simultaneously developing two types of cutting-edge memory chips: HBM and Mach-1.
HBM3E, the fifth-generation HBM, is currently the best-performing DRAM for AI applications, succeeding the previous generations: HBM, HBM2, HBM2E and HBM3.
“Customers who want to develop customized HBM4 will work with us,” Kyung Kye-hyun, head of Samsung's semiconductor business, said in a note posted on a social media platform on Friday.
“HBM leadership is coming to us thanks to the dedicated team’s efforts,” he added.
Samsung Electronics' annual general meeting on March 20, 2024
HBM is a high-performance memory chip that stacks multiple DRAM dies vertically; it is an essential component of AI chips, which must process vast volumes of data.
According to Yole Group, a French IT research firm, the HBM market is forecast to expand to $19.9 billion in 2025 and $37.7 billion in 2029, compared to an estimated $14.1 billion in 2024.