Jensen Huang: "We Are Currently Testing Samsung's New HBM Product and Will Use It Soon"
Samsung Electronics "Will Regain Global Semiconductor No.1 Within 2-3 Years"

Jensen Huang, CEO of NVIDIA, revealed that they are currently testing Samsung Electronics' new high-bandwidth memory (HBM) product. This has sparked evaluations that competition in the AI chip memory market is heating up.


NVIDIA CEO Jensen Huang answers reporters' questions at a global media briefing held on the 18th (local time) at the Signia by Hilton hotel in San Jose, California, USA. [Photo by Yonhap News]


At a media briefing held on the 19th (local time) at the Signia by Hilton hotel in San Jose, California, Huang was asked whether NVIDIA is using Samsung's HBM. He replied, "Not yet," adding, "We are currently qualifying it and will be able to use it soon."


The remark drew attention as it suggested that Samsung Electronics may supply HBM to NVIDIA in the future. The HBM market is currently dominated by SK Hynix and Samsung Electronics, with the U.S. company Micron holding a small share. The day before, SK Hynix announced that it had begun mass production of the HBM3E 8-stack, the world's first ultra-high-performance AI memory of its kind, and that it plans to supply the product to NVIDIA starting at the end of this month. Micron likewise announced last month that it had begun mass production of a 24GB 8-stack HBM3E to be installed in NVIDIA's 'H200'. Samsung Electronics, which has succeeded in developing an HBM3E 12-stack, has not yet disclosed specific supply plans. Against that backdrop, NVIDIA, a prospective customer, signaled on this day that it may purchase Samsung Electronics' products.


Accordingly, competition for the high-bandwidth memory (HBM) market is expected to intensify. HBM is a memory product that vertically stacks multiple DRAM chips to sharply increase data transfer speed, accelerating the memory access that AI computation depends on. High-performance memory like HBM is essential for running generative AI, which must process vast amounts of data quickly and continuously. HBM has evolved through five generations: 1st generation (HBM), 2nd generation (HBM2), 3rd generation (HBM2E), 4th generation (HBM3), and 5th generation (HBM3E), with HBM3E being an extended version of HBM3.
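The bandwidth gains across those generations can be sketched with a back-of-envelope calculation. The 1024-bit per-stack interface and the approximate per-pin data rates below are commonly cited public figures, not from this article, and vary by vendor and speed grade; this is an illustrative sketch only.

```python
# Back-of-envelope HBM peak bandwidth per stack.
# Assumed figures (NOT from the article): a 1024-bit interface per stack
# and typical per-pin data rates for each generation.

HBM_GENERATIONS = [
    # (generation, name, approx. per-pin data rate in Gb/s)
    ("1st", "HBM",   1.0),
    ("2nd", "HBM2",  2.4),
    ("3rd", "HBM2E", 3.6),
    ("4th", "HBM3",  6.4),
    ("5th", "HBM3E", 9.6),
]

BUS_WIDTH_BITS = 1024  # each HBM stack exposes a 1024-bit wide interface

def stack_bandwidth_gbps(pin_rate_gbps: float) -> float:
    """Peak bandwidth of one stack in GB/s: bus width x pin rate / 8 bits."""
    return BUS_WIDTH_BITS * pin_rate_gbps / 8

for gen, name, rate in HBM_GENERATIONS:
    print(f"{gen} gen {name:6s} ~{stack_bandwidth_gbps(rate):7.1f} GB/s per stack")
```

Under these assumptions, a single HBM3 stack delivers roughly 819 GB/s, and HBM3E pushes past 1.2 TB/s, which is why each new generation matters so much for AI accelerators.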



Samsung Electronics, SK Hynix, and Micron all set up exhibition booths at the NVIDIA developer conference to showcase their latest HBM3E memory.


This content was produced with the assistance of AI translation services.

© The Asia Business Daily(www.asiae.co.kr). All rights reserved.