SK Hynix executives discuss AI memory leadership and future HBM market trends at a roundtable discussion in May 2024

South Korea’s SK Hynix Inc. said on Thursday it is working on next year’s supply plans for its high-bandwidth memory (HBM) chips as clients bring forward their product launch schedules to ride the artificial intelligence boom.
At a recent roundtable discussion of newly appointed executives responsible for HBM chips, Kim Ki-tae, vice president and head of HBM Sales & Marketing at SK Hynix, said: “Looking at the current market situation, Big Tech customers are accelerating the timing of new product launches to secure leadership in the AI market. Accordingly, we are also discussing plans in advance for this year and next year to ensure timely supply of next-generation HBM products.”
SK Hynix is the world’s second-largest memory chipmaker after Samsung Electronics Co. but is the dominant supplier of HBM, a high-performance stacked DRAM chip vital for generative AI devices.
Kim Ki-tae, vice president of SK Hynix HBM Sales & Marketing

The company was the first memory vendor to develop the first-generation HBM chip in 2013 and unveiled its successors – HBM2, HBM2E and the fourth-generation HBM3 – in the following years.
In August 2023, it unveiled HBM3E, the industry’s best-performing fifth-generation HBM DRAM for AI applications, and provided samples to its client Nvidia Corp. for performance evaluation.
In March this year, SK Hynix began producing HBM3E chips in large quantities – the industry’s first to do so – and said it would advance mass production of sixth-generation HBM4 chips to 2025.
HIGH-CAPACITY NAND GETS INDUSTRY ATTENTION
“We have been able to build up a solid competitiveness by preemptively securing technology and mass production know-how,” said Kwon Un-oh, vice president and head of HBM Process Integration (PI) at SK Hynix.
Oh Hae-soon, vice president of NAND Advanced Process Integration at SK Hynix

Son Ho-young, vice president and head of Advanced Package Development, urged the company to prepare for the convergence of different types of memory and system chips.
Buoyed by growing demand for high-end chips for AI learning and inference, the global DRAM market is forecast to rise 65% on-year to 117 trillion won ($85 billion) this year, according to SK Hynix.
Earlier this month, Chief Executive Kwak Noh-jung said at a press conference that the company’s HBM chip production capacity is almost fully booked through next year.
“As demand for large-capacity AI servers increases, NAND solutions such as eSSDs have begun to receive industry attention,” said Oh Hae-soon, vice president of NAND Advanced Process Integration.
SK Hynix's HBM3E, the extended version of the HBM3 DRAM chip

EMERGING MEMORY CHIPS
Yi Jae-yun, vice president of SK Hynix’s Revolutionary Technology Center (RTC), said the company is also paying close attention to emerging memory chips such as selector-only memory (SOM), spin memory and synaptic memory, which offer ultra-high speed, high capacity and low power consumption, as well as magnetic RAM (MRAM), resistive RAM (RRAM) and phase-change memory (PCM) chips.
Analysts said among memory chipmakers, SK Hynix is the biggest beneficiary of the explosive increase in AI adoption, as it is the top supplier of AI chips to Nvidia Corp., which controls 80% of the AI chip market.