Stock Markets March 17, 2026

Samsung Unveils High-Speed HBM4 Memory for Nvidia’s Vera Rubin Platform

Company highlights faster HBM4 performance and HBM4E variant at Nvidia GPU Technology Conference while positioning itself as an end-to-end AI semiconductor supplier

By Marcus Reed

At Nvidia’s GPU Technology Conference, Samsung introduced its sixth-generation HBM4 memory chips optimized for Nvidia’s Vera Rubin platform, reporting operational speeds of 11.7 Gbps with headroom toward 13 Gbps. The firm also promoted an enhanced HBM4E chip running at 16 Gbps and reiterated its role across memory, logic, foundry and advanced packaging.

Key Points

  • Samsung announced HBM4 chips operating at 11.7 Gbps with potential to reach 13 Gbps, above an industry benchmark of 8 Gbps.
  • An upgraded HBM4E variant was shown running at 16 Gbps; Samsung plans to supply HBM4E samples in the second half of the year.
  • Samsung said it was the first to mass-produce and ship HBM4 products last month and positioned itself as an integrated AI semiconductor supplier covering memory, logic, foundry and advanced packaging.

Samsung on Tuesday showcased its sixth-generation HBM4 memory designed for use with Nvidia’s Vera Rubin platform, saying the chips operate at 11.7 gigabits per second and can potentially hit 13 Gbps, well above the 8 Gbps industry benchmark the company cited. Samsung also highlighted an upgraded variant, HBM4E, which it said runs at 16 Gbps.

The announcement was part of Samsung’s presence at Nvidia’s GPU Technology Conference, where the South Korean firm underlined its AI computing technologies and collaboration with Nvidia. Samsung said it would put its full range of AI computing products and services on display at the event.

In its remarks, Samsung described itself as the only semiconductor company offering an integrated AI solution that covers memory, logic, foundry and advanced packaging. "Samsung will exhibit its full suite of products and solutions that enable customers to design and build groundbreaking AI systems," the company said on Tuesday.

The demonstration follows Samsung’s statement last month that it was the first company to mass-produce and ship HBM4 products. The company also said it intends to provide HBM4E samples to customers during the second half of the year.

Samsung’s HBM4 reveal comes as the vendor seeks to strengthen its position in the emerging HBM4 market amid growing demand for AI-optimized memory. The company noted that despite its status as the world’s largest memory-chip maker, it had previously lagged behind smaller competitors when it came to supplying earlier-generation high-bandwidth memory to Nvidia. Specifically, Samsung said it had trailed rivals such as SK Hynix and Micron Technology in providing HBM3 and HBM3E chips for Nvidia.

The Nvidia conference itself featured a broader slate of product introductions. Nvidia Chief Executive Jensen Huang outlined new hardware and software items and said the company expects to generate $1 trillion in sales from Blackwell and Rubin chips by the end of 2027.

Samsung’s public presentation at the conference focused on performance figures for its HBM4 family and the company’s capability to supply both existing and next-generation memory components to AI system builders. The firm reiterated its strategy of offering an end-to-end semiconductor stack that spans multiple technology domains.
Sectors impacted: Semiconductors, AI hardware, data center infrastructure, and suppliers to GPU and server markets.

Risks

  • Samsung acknowledged it had trailed smaller rivals such as SK Hynix and Micron in supplying earlier-generation HBM3 and HBM3E chips to Nvidia, underscoring the competitive risk of losing supplier slots with large AI customers across the semiconductor and memory sectors.
  • Timelines for customer sampling of HBM4E (planned in the second half of the year) carry execution risk tied to customer validation cycles and production ramp-up, which could influence supply for AI hardware and data center markets.
