HBM4 procurement brought forward
SK Hynix announced in October that it aimed to launch HBM4 chips in the second half of 2025. That schedule has now reportedly been brought forward by six months at Nvidia’s request. While SK Hynix officials declined to give details about the accelerated supply timeline, industry officials say the move is intended to further consolidate Nvidia’s strong position in the global AI chip market, of which Nvidia currently controls more than 80 percent.
High-bandwidth memory chips are vital to the development of artificial intelligence technologies, as they are needed to process the large data sets that AI workloads require. Only three companies produce HBM: SK Hynix, Micron and Samsung. SK Hynix currently holds the lead, but the competition is heating up. The company plans to supply 12-layer HBM3E chips to an unnamed customer this year and to ship samples of 16-layer HBM3E chips in early 2025.
Samsung said last week that, after earlier delays, it was making progress on a supply agreement with a major unidentified customer and that it would begin production of its “improved” HBM3E in the first half of next year. Samsung also plans to produce next-generation HBM4 products in the second half of next year. Micron has a similar roadmap.