Samsung’s HBM4 Memory Chips: Faster Performance for AI

Samsung has finished development of its sixth-generation HBM4 memory chips and passed Production Readiness Approval, the final step before mass production. SamMobile reports the company has sent final samples to Nvidia for testing. Nvidia's approval would let Samsung begin producing the chips immediately for next-generation AI hardware such as Nvidia's Rubin accelerator, due out next year.
What Makes HBM4 Better
- 60% faster than HBM3E chips, according to SamMobile.
- Samsung is already working on a follow-up version 40% faster than this HBM4, with a possible reveal by mid-February 2026.
Seeking Alpha confirms Samsung has completed HBM4 development and put systems in place to move quickly into mass production once partners sign off.
Why These Chips Matter
HBM4 powers top AI accelerators from Nvidia and Google. KED Global says Samsung supplies 60% of the HBM for Google's AI chips, making it the main provider there. Samsung recently lost the overall HBM lead to SK Hynix but is fighting hard to reclaim it. DigiTimes notes a major internal restructuring aimed at boosting AI work and capturing more HBM market share.
Expect more details soon: another DigiTimes piece reports that Samsung and SK Hynix will present HBM4 advances at ISSCC 2026 in February.