Samsung Electronics announced on Thursday that it has begun shipping its latest high-bandwidth memory (HBM) chips, HBM4, to unnamed customers as it works to catch up with competitors that supply major AI chip buyers. The company highlighted performance improvements and said it will follow with next-generation samples later in the year.
HBM is a form of dynamic random-access memory (DRAM) designed to handle the heavy data throughput required by complex artificial intelligence applications. Demand for this memory type has been driven by a global push to build AI data centers that can process vast volumes of data.
According to Samsung, HBM4 delivers a sustained data transfer speed of 11.7 gigabits per second (Gbps), a 22% increase over its predecessor, HBM3E. The company added that the new modules can reach a maximum speed of 13 Gbps, which it said helps alleviate growing data bottlenecks.
Samsung also disclosed plans to deliver samples of HBM4E chips in the second half of the year. The market reacted positively: Samsung shares closed up 6.4% on Thursday.
The broader memory industry is contesting the next-generation HBM market in earnest. SK Hynix said in January that it is in volume production of HBM4 and intends to preserve what it described as an "overwhelming" market share, while also aiming to bring HBM4 production yields in line with those of current-generation HBM3E. Micron's chief financial officer has likewise indicated the company is in high-volume production of HBM4 and has begun customer shipments.
Samsung acknowledged it had been slower to respond to the advanced AI chip market and had lagged rivals in supplying prior-generation HBM products. The company’s HBM4 shipments mark a step intended to close that gap, but competitors are already asserting production scale and yield targets for HBM4.
Market observers and industry participants will watch whether Samsung's HBM4 performance and the planned HBM4E samples shift customer sourcing decisions, particularly among companies scaling AI data center capacity. The development underscores the central role of memory suppliers in supporting AI infrastructure buildouts and the tight competition shaping the market for high-bandwidth DRAM.