Samsung’s High Bandwidth Memory (HBM) business looks set for a major breakthrough. The company is gearing up to supply its 6th-generation HBM, HBM4, to major clients such as Nvidia and AMD starting next month. If the plan pans out, the firm will solidify its position in the next-generation AI memory market.

Samsung is on the way to gaining a foothold in the HBM market

During the early HBM3E period, Samsung struggled to meet the quality standards of one of its major clients, Nvidia. In the fourth quarter of 2025, however, the Korean company passed the GPU maker’s quality tests for its 12-layer HBM3E and entered the supply chain. By then, SK Hynix had already secured the majority of orders, costing Samsung substantial market share. In the HBM4 era, that is no longer the case.

According to a report from Hankyung, Samsung has begun preparations for large-scale shipments of its HBM4 to key clients like Nvidia and AMD in February 2026. The move comes after the company passed the clients’ final quality tests. These are production HBM4 chips for next-generation AI accelerators, not just samples.

Samsung’s HBM4 runs at speeds of up to 11.7 gigabits per second (Gb/s) per pin, surpassing the 10 Gb/s its clients requested and making it one of the fastest memory chips currently available. The new memory is expected to feature in upcoming AI accelerators, including Nvidia’s Rubin and AMD’s MI450, slated for release in the second half of 2026.

To achieve that performance, Samsung relied on advanced manufacturing technology. The product is built on 10nm-class 6th-generation (1c) DRAM, and its logic die (which serves as the chip’s brain) uses a 4nm process, several generations ahead of its competitors.

It will be interesting to see how the HBM market shapes up in the coming months. Will Samsung lead this space, beating SK Hynix? Only time will tell.
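To put the per-pin speed in perspective, a quick back-of-the-envelope calculation shows what it means for a full HBM stack. The sketch below assumes the JEDEC HBM4 interface width of 2048 bits per stack, which the article does not state; the function name is illustrative.

```python
# Theoretical peak bandwidth of one HBM stack:
# per-pin speed (Gb/s) x interface width (bits), converted to gigabytes per second.
def stack_bandwidth_gbps(pin_speed_gbps: float, interface_width_bits: int) -> float:
    """Peak bandwidth of a single HBM stack in GB/s."""
    return pin_speed_gbps * interface_width_bits / 8  # 8 bits per byte

# Assumption: HBM4 uses a 2048-bit-wide interface per stack (per JEDEC).
WIDTH_BITS = 2048

print(stack_bandwidth_gbps(11.7, WIDTH_BITS))  # Samsung's reported speed
print(stack_bandwidth_gbps(10.0, WIDTH_BITS))  # the clients' requested baseline
```

Under that assumption, the jump from 10 Gb/s to 11.7 Gb/s per pin lifts a single stack from roughly 2.56 TB/s to nearly 3 TB/s of theoretical peak bandwidth.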