South Korea's two chip giants Samsung Electronics Co. and SK Hynix Inc. are expected to further deepen ties with rivals and customers around the world in the new year to advance their chiplet, compute express link (CXL) and customized chip technologies to win the artificial intelligence chip battle.
They are the core chip technologies whose demand is set to rise at an exponential rate following the arrival of generative AI, which has ushered in a new AI era.
Earlier this month, Advanced Micro Devices Inc. (AMD) unveiled its new AI accelerator, the Instinct MI300X, which adopts chiplet designs.
CHIPLET MARKET TO BURGEON
A chiplet design combines different functional units, such as a graphics processing unit (GPU), a central processing unit (CPU) and an I/O die, in a single package. This allows the assembly of third-party chips produced on different nodes, for example a GPU made on a 7-nanometer node, a CPU built on a 4-nm node and an I/O die from a 14-nm node.
This is similar to the system-on-chip (SoC) design, which integrates the different functional blocks on a single die.
The SoC approach has been popular until recently, but fitting multiple functions into one SoC is becoming costlier as miniaturization advances to sub-7 nm nodes.
As the die sizes of server CPUs and GPUs grow, production yields fall accordingly, further driving up SoC manufacturing costs.
This is why chipmakers have shifted their focus to chiplets, which allow components from different producers to be mixed and matched, offering chip suppliers better yields and lower costs.
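The yield argument can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only: it uses the textbook Poisson yield model (yield = exp(-defect density × die area)), an assumed defect density, a hypothetical 8 cm² server processor, and it assumes defective chiplets can be screened out before packaging; packaging costs are ignored.

```python
import math

# Illustrative only: Poisson yield model with an assumed defect density and a
# hypothetical large server processor. Bad chiplets are assumed to be screened
# out before packaging (known-good-die testing).

DEFECTS_PER_CM2 = 0.1  # assumed defect density for an advanced node

def die_yield(area_cm2, d0=DEFECTS_PER_CM2):
    """Fraction of defect-free dies of a given area under the Poisson model."""
    return math.exp(-d0 * area_cm2)

def silicon_cost_per_good_package(total_area_cm2, num_chiplets):
    """Relative wafer area consumed per good package: discarded bad dies
    are paid for by the good ones, so cost scales with 1 / yield."""
    area_each = total_area_cm2 / num_chiplets
    return num_chiplets * area_each / die_yield(area_each)

TOTAL_AREA = 8.0  # cm^2, hypothetical large server processor

mono = silicon_cost_per_good_package(TOTAL_AREA, num_chiplets=1)
quad = silicon_cost_per_good_package(TOTAL_AREA, num_chiplets=4)

print(f"monolithic die yield : {die_yield(TOTAL_AREA):.1%}")      # ~44.9%
print(f"2 cm^2 chiplet yield : {die_yield(TOTAL_AREA / 4):.1%}")  # ~81.9%
print(f"silicon cost ratio   : {mono / quad:.2f}x more for the monolithic die")
```

Under these assumptions, splitting the design into four chiplets nearly halves the wafer area spent per good package, which is the basic economics behind the shift.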
As chiplets have become a megatrend in the global chip industry, Samsung Electronics allied last year with rivals and customers such as AMD, Intel Corp. and Taiwan Semiconductor Manufacturing Company Ltd. (TSMC) to establish a chiplet ecosystem.
According to MarketsandMarkets Research, the global chiplet market is forecast to jump to $148.0 billion in 2028 from $6.5 billion in 2023 at a compound annual growth rate (CAGR) of 86.7%.
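As a quick sanity check, the quoted growth rate is consistent with the two market-size figures above; the snippet below uses only those numbers.

```python
# Sanity check using only the two market-size figures quoted above.
start_2023, end_2028, years = 6.5, 148.0, 5  # $ billion, 2023 -> 2028
cagr = (end_2028 / start_2023) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # ~86.8%, in line with the quoted 86.7%
```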
CXL
Competition to take the lead in CXL technology is also getting fiercer.
CXL is a unified interface standard that connects processors such as CPUs and GPUs with memory devices.
It is considered one of the next-generation memory solutions because it enables high-speed, low-latency communication between the host processor and devices, according to Samsung Electronics.
Samsung Electronics' CXL memory expander (Courtesy of Samsung Electronics)
As the generative AI boom has sharply increased the amount of data to process, demand for massive compute scale-up and highly responsive data communication is growing rapidly.
In response, Intel has formed a coalition with Samsung Electronics and SK Hynix to develop CXL, which can speed up data processing as much as two-fold.
As CXL devices are built on dynamic random-access memory (DRAM) modules, their adoption is expected to increase memory demand.
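How a host might use such an expander can be sketched with a toy memory-tiering model. Everything below is an illustrative assumption, not any vendor's implementation: the latency figures are rough placeholders, and the placement rule simply keeps frequently accessed data in local DRAM while spilling colder data to CXL-attached capacity.

```python
# Toy model (illustrative assumptions only, not a real CXL driver): a host
# with limited local DRAM plus a CXL memory expander keeps hot data close
# and pushes colder data to the expander.

LOCAL_DRAM_NS = 100    # assumed load latency from CPU-attached DRAM (ns)
CXL_MEMORY_NS = 250    # assumed load latency from a CXL-attached expander (ns)
HOT_THRESHOLD = 1_000  # accesses/sec above which a region counts as "hot"

def place(regions):
    """Map each memory region to a tier based on how often it is accessed."""
    return {
        name: ("local_dram" if rate >= HOT_THRESHOLD else "cxl_expander")
        for name, rate in regions.items()
    }

def weighted_latency_ns(regions, placement):
    """Access-weighted average load latency under the chosen placement."""
    latency = {"local_dram": LOCAL_DRAM_NS, "cxl_expander": CXL_MEMORY_NS}
    total = sum(regions.values())
    return sum(rate * latency[placement[name]] for name, rate in regions.items()) / total

if __name__ == "__main__":
    # Hypothetical workload: access rates per memory region (accesses/sec).
    regions = {"model_weights": 50_000, "kv_cache": 20_000, "cold_dataset": 200}
    tiers = place(regions)
    print(tiers)
    print(f"average latency: {weighted_latency_ns(regions, tiers):.0f} ns")
```

The point of the sketch is that the expander adds DRAM capacity behind the CXL link at only a modest average-latency cost, which is why wider adoption is expected to lift overall memory demand.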
HBM
HBM3 chips by SK Hynix (Courtesy of SK Hynix)
HBM, or high bandwidth memory, is high-value, high-performance memory that vertically interconnects multiple DRAM chips, dramatically increasing data processing speed compared with earlier DRAM products.
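A rough calculation shows where that speed-up comes from: stacking DRAM dies lets a single package expose a far wider interface than a conventional module. The figures below (a 1024-bit HBM3 interface at 6.4 Gb/s per pin versus a 64-bit DDR5 module) are commonly cited reference values used here for illustration, not numbers from the article.

```python
# Illustrative comparison of peak bandwidth for an HBM3 stack and a standard
# DDR5 module at the same per-pin data rate.

def peak_bandwidth_gb_per_s(bus_width_bits, pin_rate_gbps):
    """Peak bandwidth of a memory stack or module in GB/s."""
    return bus_width_bits * pin_rate_gbps / 8

hbm3_stack  = peak_bandwidth_gb_per_s(bus_width_bits=1024, pin_rate_gbps=6.4)
ddr5_module = peak_bandwidth_gb_per_s(bus_width_bits=64,   pin_rate_gbps=6.4)

print(f"HBM3 stack : ~{hbm3_stack:.0f} GB/s")   # ~819 GB/s
print(f"DDR5 module: ~{ddr5_module:.0f} GB/s")  # ~51 GB/s
```

At the same per-pin speed, the 16-times-wider interface alone gives one stack roughly 16 times the bandwidth of a standard module, which is why AI accelerators pair their processors with HBM.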
AI accelerator producers such as US fabless chip designer Nvidia Corp., AMD and Intel are actively seeking to secure HBM chips for their processors. Their combined pre-orders are already estimated at about 1 trillion won ($770 million).
Foundry players have also joined the race, seeking to advance their fabrication processes to win customized high-performance chip orders from big customers such as Google and Amazon.com.
Write to Jeong-Soo Hwang at hjs@hankyung.com
Sookyung Seo edited this article.