LAS VEGAS – Samsung Electronics Co., the world’s largest memory chipmaker, plans to more than double its high bandwidth memory (HBM) chip production volume as it aims to take the lead in the artificial intelligence chip segment.
Han Jin-man, executive vice president responsible for Samsung's US semiconductor business, said on Thursday the company is pinning high hopes on high-capacity memory chips, including the HBM series, to lead the fast-growing AI chip segment.
“We will raise our HBM chip production volume by 2.5 times this year compared to last year’s output. The pace will continue with another twofold increase next year,” he told reporters during a media session at CES 2024.
“Memory chips will play the leading role in the AI era. Samsung will not be influenced by the industry’s ups and downs. We will steadily expand our investment in the growth sector,” he said.
Han is the highest-level Samsung executive to unveil the company’s HBM chip production plans for this year and next.
Han Jin-man, executive VP of Samsung's US semiconductor business, outlines the chipmaker's HBM chip business plans at CES 2024
HBM is a high-capacity, high-performance memory chip, demand for which is soaring as it is used to power generative AI services such as ChatGPT, high-performance data centers and machine learning platforms.
The HBM series of DRAM is the talk of the town these days as electronics makers are unveiling products equipped with on-device AI technology, which enables customized and personalized AI functions on smartphones and other smart gadgets.
HBM3, one of the most advanced such chips currently available, is said to offer 12 times the capacity and 13 times the bandwidth of GDDR6, a current-generation graphics DRAM.
Samsung's advanced DRAM chips
According to market tracker TrendForce, the global HBM market is forecast to grow to $8.9 billion by 2027 from an estimated $3.9 billion this year.
Samsung said it aims to raise its HBM competitiveness by offering its clients a turnkey service, in which the company packages a graphics processing unit (GPU) made by Samsung Foundry and HBM chips into a single chipset.
“We’re positively considering producing next-generation HBM chips not at the memory process but at the foundry process to maximize business efficiency as we do both memory and foundry,” Han said at CES 2024.
Samsung Electronics Chairman Jay Y. Lee (third from left) visits a Samsung chip packaging line in Korea on Feb. 17, 2023
Leading chipmakers such as foundry leader Taiwan Semiconductor Manufacturing Co. (TSMC) and Intel Corp. are competing fiercely in advanced packaging, which enhances chip performance without shrinking transistor dimensions through ultra-fine processing, a technologically challenging and more time-consuming route.
At this year’s electronics show, Samsung is showcasing several of its latest memory chips, currently under development or already being supplied to clients.
To meet growing demand from generative AI chip users, the company has put on display 12-nanometer 32-gigabyte double data rate 5 (DDR5) DRAM chips; Shinebolt, its HBM3E chip; and CMM-D, a Compute Express Link (CXL) DRAM module.
For on-device AI functions, Samsung is showcasing LPDDR5X-PIM, an advanced processing-in-memory DRAM chip that can process data within the memory itself, much like a central processing unit (CPU).
Samsung's H-Cube chip packaging solution
The company is also showing off its 2.5D packaging technologies, the H-Cube and I-Cube series, at CES.
“From 2025, chip demand will exceed supply. Barring the unexpected, client orders will rise significantly,” Han said.
Write to Jeong-Soo Hwang at hjs@hankyung.com
In-Soo Nam edited this article.