[Photo caption: SK Hynix's HBM3E DRAM chip]
SK Hynix Inc., the world’s second-largest memory chipmaker after Samsung Electronics Co., said on Monday it has developed HBM3E, the industry’s best-performing DRAM chip for artificial intelligence applications, and has provided samples to its client Nvidia Corp. for performance evaluation.
HBM3E, the extended version of HBM3, or high bandwidth memory 3, is fifth-generation DRAM memory, succeeding the previous generations – HBM, HBM2, HBM2E and HBM3.
HBM is high-value, high-performance memory that vertically interconnects multiple DRAM chips, dramatically increasing data processing speed compared with earlier DRAM products.
SK Hynix said it plans to mass-produce the latest DRAM chip in the first half of next year to solidify its “unrivaled leadership in the AI memory market.”
Industry sources said Nvidia will likely use SK Hynix's HBM3E in its next-generation AI accelerator, the GH200, due out later next year.
[Photo caption: Nvidia’s advanced graphics chips]
The South Korean chipmaker said the latest product not only meets the industry’s highest standards of speed – the key specification for AI memory products – but also offers better performance than rival products in terms of capacity, heat dissipation and user-friendliness.
The HBM3E chip can process data at up to 1.15 terabytes (TB) a second, equivalent to processing more than 230 full-HD movies of 5 gigabytes (GB) each in a single second.
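The movie comparison follows directly from the stated bandwidth. A minimal sanity check of the arithmetic, assuming decimal (SI) units where 1 TB = 1,000 GB:

```python
# Verify the article's claim: at 1.15 TB/s, how many 5 GB
# full-HD movies' worth of data can be transferred per second?
bandwidth_gb_per_s = 1.15 * 1000  # 1.15 TB/s in GB/s (decimal units assumed)
movie_size_gb = 5                 # size of one full-HD movie per the article

movies_per_second = bandwidth_gb_per_s / movie_size_gb
print(movies_per_second)  # 230.0
```

The result matches the article's figure of 230 movies per second.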
SK Hynix said the product comes with a 10% improvement in heat dissipation by adopting technology called advanced mass reflow molded underfill (MR-MUF).
The latest chip is also backward compatible, allowing it to be adopted on systems designed for HBM3 chips without design or structural modifications.
“We have a long history of working with SK Hynix on high bandwidth memory for leading-edge accelerated computing solutions,” said Ian Buck, vice president of Hyperscale and HPC Computing at Nvidia. “We look forward to continuing our collaboration with HBM3E to deliver the next generation of AI computing.”
SK HYNIX COMPETES WITH SAMSUNG FOR HBM3 CHIPS
The HBM series of DRAM is in growing demand as the chips power generative AI devices that operate on high-performance computing systems.
Such chips are used in high-performance data centers as well as machine learning platforms that boost AI and supercomputing performance.
“By increasing the supply share of the high-value HBM products, SK Hynix will also seek a fast business turnaround,” said Ryu Sung-soo, head of DRAM Product Planning at SK Hynix.
SK Hynix was the first memory vendor to mass-produce HBM3, beginning in June 2022.
Samsung is said to be unveiling its fifth-generation HBM3P, under the product name Snowbolt, by the end of this year, followed by its sixth-generation HBM product next year.
According to market research firm TrendForce, the global HBM market is forecast to grow to $8.9 billion by 2027 from $3.9 billion this year. HBM chips are at least five times more expensive than commoditized DRAM chips.
(Updated with a source comment on the possibility of Nvidia using the HBM3 chip in its next-generation AI accelerator and HBM market growth forecasts)
Write to Jeong-Soo Hwang at hjs@hankyung.com In-Soo Nam edited this article.