
Samsung ships priciest AI memory Nvidia ever ordered as HBM4 exceeds key specifications

Samsung starts HBM4 memory shipments. (Image source: Samsung)
The next-gen HBM4 stacks exceed JEDEC and Nvidia specifications with speeds of up to 13 Gbps and 3.3 TB/s of bandwidth, while delivering higher energy efficiency, improved thermals, and up to 36 GB of capacity per 12-layer stack. It is also the priciest memory Nvidia has ever ordered for its AI data center cards.

Samsung has announced the first commercial shipments of its next-gen HBM4 memory for applications like Nvidia GPUs and the AI data centers built around them.

Nvidia and other clients are reportedly paying close to $500 apiece to Samsung, double what they paid for the previous HBM3E high-bandwidth memory generation. As a result, Samsung's shares are at an all-time high, and its management expects another blockbuster year on the back of the general memory shortage.

Samsung HBM4 memory specs

While memory makers are currently charging an arm and a leg for every unit they produce, Samsung says that its new HBM4 AI memory has managed to surpass both the Joint Electron Device Engineering Council (JEDEC) standard and Nvidia's requirements.

The HBM4 operating speed reaches a whopping 13 Gbps, 46% higher than the JEDEC requirement, and total memory bandwidth tops out at 3.3 TB/s per stack, comfortably exceeding the 3 TB/s that customers like Nvidia asked for.
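The quoted 3.3 TB/s figure follows directly from the per-pin speed, assuming the 2048-bit-wide interface that the JEDEC HBM4 standard defines per stack. A minimal sketch of that arithmetic:

```python
# Per-stack bandwidth from pin speed. The 2048-bit interface width is
# an assumption taken from the JEDEC HBM4 standard, not from Samsung's
# announcement itself.
PIN_SPEED_GBPS = 13    # Gbps per pin, Samsung's quoted operating speed
INTERFACE_BITS = 2048  # data pins per HBM4 stack (JEDEC spec width)

bandwidth_gbs = PIN_SPEED_GBPS * INTERFACE_BITS / 8  # GB/s per stack
print(f"{bandwidth_gbs / 1000:.2f} TB/s")  # prints "3.33 TB/s"
```

Rounded down, that is the 3.3 TB/s per stack Samsung is advertising.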

Samsung achieved this breakneck throughput by preemptively moving to its 10nm-class, 6th-generation 1c DRAM process, with the base die manufactured on a 4nm node, as opposed to the 14nm-class 1a DRAM it used for the HBM3E predecessor. This gives the HBM4 generation plenty of headroom for process and performance improvements. To manage the heat generated at these speeds, Samsung designed the core die and the data transmission paths with low-power, low-voltage technologies, resulting in 30% better heat dissipation and 40% higher energy efficiency compared to the HBM3E memory that Nvidia currently uses in its Blackwell-series AI graphics cards.

For now, Samsung ships a maximum of 36 GB of HBM4 capacity per 12-layer stack, but it can also build 16-layer stacks totaling 48 GB whenever Nvidia is ready with the respective GPU design and purchase orders. Samsung promises ample supply of the expensive HBM4 chips, saying it will "continue to secure stable supply capabilities to meet the growing mid- to long-term demand, particularly from AI and data centers," as it starts sampling its next-gen HBM4E memory in the second half of 2026.
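Both capacity points are consistent with one die density: assuming 3 GB (24 Gb) DRAM dies, which is an inference from the quoted figures rather than a detail Samsung confirms here, the 12- and 16-layer totals fall out directly:

```python
# Stack capacity from layer count, assuming 3 GB (24 Gb) per DRAM die.
# The per-die figure is inferred from the quoted 36 GB / 12-layer config.
DIE_CAPACITY_GB = 3

print(12 * DIE_CAPACITY_GB)  # prints 36 (12-layer stack, shipping now)
print(16 * DIE_CAPACITY_GB)  # prints 48 (16-layer stack, planned)
```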


The Samsung HBM4 AI memory chip. (Image source: Samsung)



Daniel Zlatev, 2026-02-12 (Update: 2026-02-12)