Samsung on Tuesday showcased its sixth-generation HBM4 memory designed for Nvidia’s Vera Rubin platform, saying the chips operate at 11.7 gigabits per second and can potentially reach 13 Gbps, well above the 8 Gbps industry benchmark the company cited. Samsung also highlighted an upgraded variant, HBM4E, which it said runs at 16 Gbps.
The announcement was part of Samsung’s presence at Nvidia’s GPU Technology Conference, where the South Korean firm highlighted its AI computing technologies and its collaboration with Nvidia. Samsung said it would put its full range of AI computing products and services on display at the event.
In its remarks, Samsung described itself as the only semiconductor company offering an integrated AI solution that covers memory, logic, foundry and advanced packaging. "Samsung will exhibit its full suite of products and solutions that enable customers to design and build groundbreaking AI systems," the company said on Tuesday.
The demonstration follows Samsung’s statement last month that it was the first company to mass-produce and ship HBM4 products. The company also said it intends to provide HBM4E samples to customers during the second half of the year.
Samsung’s HBM4 reveal comes as the vendor seeks to strengthen its position in the emerging HBM4 market amid growing demand for AI-optimized memory. Despite being the world’s largest memory-chip maker, Samsung acknowledged it had trailed smaller rivals SK Hynix and Micron Technology in supplying earlier-generation HBM3 and HBM3E chips to Nvidia.
The Nvidia conference itself featured a broader slate of product introductions. Nvidia Chief Executive Jensen Huang outlined new hardware and software items and said the company expects to generate $1 trillion in sales from Blackwell and Rubin chips by the end of 2027.
Samsung’s public presentation at the conference focused on performance figures for its HBM4 family and the company’s ability to supply both existing and next-generation memory components to AI system builders. The firm reiterated its strategy of offering an end-to-end semiconductor stack that spans multiple technology domains.
Sectors impacted: Semiconductors, AI hardware, data center infrastructure, and suppliers to GPU and server markets.