
SK Hynix announces launch of ~1TB/s HBM3E memory for next-gen AI purposes

More HBM3E is on the way. (Source: SK Hynix)
SK Hynix has announced that it is ready to sample its latest form of high bandwidth memory (or HBM), pitched at advanced applications such as research or AI, to its customers. HBM3E is rated to run up to 10% cooler thanks to the new production technique used to make it. It is also compatible with systems already set up to use the preceding HBM3.

NVIDIA's Hyperscale and HPC Computing unit oversees the company's efforts to enhance demanding applications ranging from "weather forecasting and energy exploration to computational fluid dynamics and life sciences" with its AI hardware. According to the group's vice president Ian Buck, it will be doing so with SK Hynix's take on HBM3E (or extended HBM3) going forward.

This new iteration of the "highest-specification DRAM for AI applications currently available" is reportedly capable of processing data at "up to 1.15 terabytes" (TB) per second, the equivalent of 230+ FHD videos of 5GB each, according to SK Hynix. It also brings a "10%" improvement in "heat dissipation" (although the baseline for that comparison is not given), thanks to the latest Mass Reflow Molded Underfill (MR-MUF) process technology.
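As a rough sanity check on SK Hynix's comparison, here is a minimal sketch that reproduces the arithmetic, assuming decimal units (1 TB = 1,000 GB) and the 5GB-per-video framing used in the announcement:

```python
# Rough check of the "230+ FHD videos per second" claim (assumes 1 TB = 1,000 GB).
bandwidth_tb_per_s = 1.15   # quoted peak HBM3E bandwidth, TB/s
video_size_gb = 5           # size of one FHD video, per the article

videos_per_second = (bandwidth_tb_per_s * 1000) / video_size_gb
print(f"{videos_per_second:.0f} videos per second")  # -> 230
```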

SK Hynix's HBM3E is also rated for backward compatibility, so it can act as a direct memory upgrade in systems whose CPUs and GPUs were originally set up to use HBM3. The company projects that mass production can start in the first half of 2024 (1H2024), while sampling to customers such as NVIDIA is already underway.



Deirdre O'Donnell, 2023-08-22 (Update: 2023-08-24)