Korean chipmakers

Samsung to supply $752 million in Mach-1 AI chips to Naver, replacing Nvidia

Samsung is also in talks with Big Tech firms such as Microsoft and Meta to supply its new AI accelerator

By Jeong-Soo Hwang, Chae-Yeon Kim and Eui-Myung Park | Mar 22, 2024 (GMT+09:00)

4 min read

A researcher at a Samsung Electronics chip cleanroom

Samsung Electronics Co., the world’s top memory chipmaker, will supply its next-generation Mach-1 artificial intelligence chips to Naver Corp. by the end of this year in a deal worth up to 1 trillion won ($752 million).

With the contract, Naver will significantly reduce its reliance on Nvidia Corp. for AI chips.

Samsung’s System LSI business division has agreed with Naver on the supply deal and the two companies are in final talks to fine-tune the exact volume and prices, people familiar with the matter said on Friday.

Samsung, South Korea’s tech giant, hopes to price the Mach-1 AI chip at around 5 million won ($3,756) apiece and Naver wants to receive between 150,000 and 200,000 units of the AI accelerator, sources said.
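
For a rough sense of how those numbers relate to the headline figure, the sketch below multiplies the reported target price by the reported volume range. The won-dollar rate is inferred from the article's own conversion, and every input is an unconfirmed figure cited above, not a number disclosed by either company.

```python
# Back-of-the-envelope check on the reported deal size (all inputs are the
# unconfirmed figures cited above, not numbers disclosed by Samsung or Naver).

UNIT_PRICE_KRW = 5_000_000          # ~5 million won (~$3,756) per Mach-1 chip
VOLUME_RANGE = (150_000, 200_000)   # units Naver reportedly wants to receive
KRW_PER_USD = 1330                  # rate implied by "1 trillion won ($752 million)"

for units in VOLUME_RANGE:
    total_krw = units * UNIT_PRICE_KRW
    print(f"{units:,} units -> {total_krw / 1e12:.2f} trillion won "
          f"(~${total_krw / (KRW_PER_USD * 1e9):.2f} billion)")

# 150,000 units -> 0.75 trillion won (~$0.56 billion)
# 200,000 units -> 1.00 trillion won (~$0.75 billion), the reported ceiling of the deal
```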

Naver's headquarters

Naver, a leading Korean online platform giant, is expected to use Mach-1 chips in its servers for AI inference, replacing chips it has been procuring from Nvidia.

Leveraging its sale of Mach-1 chips to Naver as a stepping stone, Samsung plans to expand its client base to Big Tech firms. Samsung is already in supply talks with Microsoft Corp. and Meta Platforms Inc., sources said.

An AI accelerator is a special-purpose hardware device that combines multiple chips designed for data processing and computation.

MACH-1, COMPETITIVE IN PRICING, PERFORMANCE & EFFICIENCY

Kyung Kye-hyun, head of Samsung's semiconductor business, said during the company’s annual general meeting on Wednesday that the Mach-1 AI chip is under development and the company plans to produce a prototype by year-end.

Nvidia is the world's top AI chip designer

Mach-1 is an AI accelerator in the form of a system-on-chip (SoC) that reduces the bottleneck between the graphics processing unit (GPU) and high bandwidth memory (HBM) chips, according to Samsung.

Kyung said Mach-1 is designed specifically for transformer-based AI models.

“By using several algorithms, it can reduce the bottleneck phenomenon that occurs between memory and GPU chips to one-eighth of what we are witnessing today and improve the power efficiency by eight times,” he said. “It will enable large language model inference even with low-power memory instead of power-hungry HBM.”

Unlike Nvidia's AI accelerator, which consists of GPUs and HBM chips, Mach-1 combines Samsung’s proprietary processors and low-power (LP) DRAM chips.

With that design, Mach-1 suffers fewer data bottlenecks and consumes less power than Nvidia's products, industry sources said.

The Mach-1 chip is also priced at about one-tenth of Nvidia's comparable product, they said.
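
Taken together, these are claims rather than benchmarks, but the ratios can be read off directly. The sketch below restates them and derives the Nvidia price implied by the one-tenth figure, using the reported Mach-1 target price; the 1,330 won-per-dollar rate is again inferred from the article's own conversion.

```python
# Restating the comparisons attributed to Samsung and industry sources above.
# These are claims, not measured benchmarks; the Nvidia price is merely implied.

MACH1_PRICE_KRW = 5_000_000   # reported target price per Mach-1 chip
BOTTLENECK_RATIO = 1 / 8      # memory<->GPU bottleneck said to fall to one-eighth
POWER_EFFICIENCY_GAIN = 8     # power efficiency said to improve eightfold
PRICE_RATIO = 1 / 10          # Mach-1 said to cost one-tenth of Nvidia's chip

implied_nvidia_price_krw = MACH1_PRICE_KRW / PRICE_RATIO
print(f"Claimed bottleneck vs. today's GPU+HBM setups : {BOTTLENECK_RATIO:.1%}")
print(f"Claimed power-efficiency gain                 : {POWER_EFFICIENCY_GAIN}x")
print(f"Implied price of the Nvidia chip compared     : "
      f"{implied_nvidia_price_krw:,.0f} won (~${implied_nvidia_price_krw / 1330:,.0f})")
# -> 50,000,000 won, or roughly $37,600 -- broadly in line with prices widely
#    reported for top-end AI GPUs at the time.
```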

SK Hynix developed the industry's first HBM3 DRAM chip

NAVER TO WEAN ITSELF OFF NVIDIA

Nvidia, the world’s largest chip design firm and AI chip provider, posted an operating profit margin of 62% in the November-January quarter. Some $18.8 billion, or 40% of its server business revenue, came from AI inference chip sales last year.

Sources said Naver will use Samsung’s Mach-1 chips to power servers for its AI map service, Naver Place. Additional Mach-1 chip supply to Naver is possible if the first batch shows “good performance,” they said.

Naver has been reducing its reliance on Nvidia for AI chips.

Last October, Naver replaced Nvidia’s GPU-based server with Intel Corp.’s central processing unit (CPU)-based server.

Intel's fourth-generation Sapphire Rapids Xeon scalable processors

Naver’s AI server switch comes as global information technology firms are increasingly disgruntled with Nvidia’s GPU price hikes and a global shortage of its GPUs.

For Samsung, its deal with Naver would help it compete with crosstown rival SK Hynix Inc., the dominant player in the advanced HBM segment.

Kyung, chief executive of Samsung’s Device Solutions (DS) division, which oversees its chip business, said with the Mach-1 chip Samsung aims to catch up to SK Hynix, which recently started mass production of its next-generation HBM chip.

HBM has become an essential part of the AI boom, as it offers much faster data processing speeds than conventional memory chips.

A laggard in the HBM chip segment, Samsung has been investing heavily in HBM to rival SK Hynix and other memory players.

Last month, Samsung said it had developed HBM3E 12H, the industry's first 12-stack HBM3E DRAM and the highest-capacity HBM product to date, adding that it will start mass production of the chip in the first half of this year.

According to market research firm Omdia, the global inference AI accelerator market is forecast to grow from $6 billion in 2023 to $143 billion by 2030.
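
Those endpoints imply a very steep compound growth rate; the one-liner below makes the pace explicit (the 2023 and 2030 figures are Omdia's, the rate calculation is not).

```python
# Implied compound annual growth rate of Omdia's inference-accelerator forecast.
start, end, years = 6e9, 143e9, 2030 - 2023   # $6 billion (2023) to $143 billion (2030)
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR over {years} years: {cagr:.0%}")   # roughly 57% per year
```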

Write to Jeong-Soo Hwang, Chae-Yeon Kim and Eui-Myung Park at hjs@hankyung.com

In-Soo Nam edited this article.