Epic CEO Tim Sweeney recently warned that exorbitant graphics memory prices will jeopardize the premium gaming PC market for the foreseeable future.
PC and laptop makers simply can't match the prices that AI juggernauts like Nvidia, Google, and Meta are willing to pay to stock their high-end GPU and AI data center projects, cautioned Sweeney. His warning came as a comment on the ongoing RAM price hikes, with one user complaining that the same 64GB Crucial RAM kit they bought for $240 a month ago has since more than doubled in price to nearly $500. Amazon currently has the two 32GB modules on sale at a better price that is still way above what the kit cost in October.
Samsung and SK Hynix HBM4 memory price for Nvidia
Coincidentally, $500 is roughly what Nvidia is reportedly preparing to pay both Samsung and SK Hynix for their next-generation HBM4 graphics memory in 2026. According to industry insiders, the memory makers are charging Nvidia up to 100% more simply because they can. SK Hynix's HBM4 production costs will rise 50% since it has to source the base die from TSMC, but all of that increase will be passed on to Nvidia. Currently, SK Hynix sells its 12-layer HBM3E memory modules to Nvidia for about $350 apiece, while Samsung prices them $100 lower, as it was late to the party due to certification issues.
In 2026, the high-end HBM4 memory for Nvidia's AI chips will be priced in the mid-$500 range, which for Samsung would be more than double what it charged for the HBM3E predecessor. "Nvidia's demand for HBM4 is so high that Samsung Electronics has no choice but to secure its supply at a high price," tip the insiders. Needless to say, this could translate into higher Nvidia prices, too, as there is no shortage of demand for its GPUs.
Samsung HBM4 and SK Hynix GDDR7 and LPDDR6 memory specs
Besides the Samsung HBM4 memory price, industry sources have detailed its revised specifications. Samsung has apparently redesigned the interface and stacking architecture to hit 3.3 TB/s of bandwidth with the 36GB module. The upgrades include "improved signal accuracy in high-speed sections by applying automatic compensation for the alignment signal (TDQS) of the channel-specific through-silicon via (TSV) path," that is, the path that carries the AI accelerator and LLM-specific traffic. For comparison, the current HBM3E modules that Samsung sells to Nvidia top out at 1.2 TB/s, so the HBM4 module will offer more than double the bandwidth at double the price.
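The 3.3 TB/s figure lines up with HBM4's 2,048-bit stack interface. A quick back-of-the-envelope check (the per-pin data rate below is an inference to make the math work, not a number stated by the sources):

```python
# Sanity check of the reported 3.3 TB/s Samsung HBM4 stack bandwidth.
# The 2,048-bit interface width is part of the JEDEC HBM4 standard;
# the ~12.9 Gb/s per-pin rate is an assumption, not from the report.
HBM4_BUS_WIDTH_BITS = 2048
PER_PIN_GBPS = 12.9  # assumed per-pin data rate in Gb/s

bandwidth_gbps = HBM4_BUS_WIDTH_BITS * PER_PIN_GBPS  # total throughput in Gb/s
bandwidth_tbs = bandwidth_gbps / 8 / 1000            # Gb/s -> GB/s -> TB/s

print(f"{bandwidth_tbs:.1f} TB/s")  # prints "3.3 TB/s"
```

At that interface width, even a modest per-pin speed yields the reported figure, which is why HBM stacks so comfortably outrun GDDR despite slower individual pins.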
Apart from the new HBM4 specification details, an SK Hynix representative has reiterated the specs of its LPDDR6 and GDDR7 memory, too. The LPDDR6 mobile DRAM modules offer 14.4 Gb/s of throughput per pin, with new low-voltage regulator technology that keeps the signal stable at the increased speeds. The 24GB GDDR7 graphics memory modules, on the other hand, will target high-end gaming and AI inference tasks with speeds of 48 Gb/s per pin, or triple the bandwidth of SK Hynix's current GDDR6 modules.
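To put the 48 Gb/s per-pin figure in perspective, here is what it implies per module and per graphics card. The 32-bit device interface is standard for GDDR7; the 256-bit card configuration is a hypothetical example, not something the sources specify:

```python
# Bandwidth implied by SK Hynix's 48 Gb/s per-pin GDDR7 figure.
# 32 data pins per device is the standard GDDR7 interface width;
# the 256-bit card bus below is an illustrative assumption.
PER_PIN_GBPS = 48
PINS_PER_MODULE = 32   # data pins on one GDDR7 device
CARD_BUS_BITS = 256    # hypothetical high-end card memory bus

module_gbs = PER_PIN_GBPS * PINS_PER_MODULE / 8       # GB/s per module
card_gbs = module_gbs * (CARD_BUS_BITS // PINS_PER_MODULE)

print(f"{module_gbs:.0f} GB/s per module")  # prints "192 GB/s per module"
print(f"{card_gbs:.0f} GB/s per card")      # prints "1536 GB/s per card"
```

In other words, eight such modules on a 256-bit bus would push roughly 1.5 TB/s, still well short of a single HBM4 stack's 3.3 TB/s.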
The next-gen HBM4, LPDDR6, and GDDR7 memory technologies of Samsung and SK Hynix will be showcased at the International Solid-State Circuits Conference (ISSCC) in San Francisco in February. Samsung is then expected to start supplying HBM4 modules to Nvidia in Q2 on an expedited schedule at double the current price, which would likely mean more expensive Nvidia GPUs in 2026.
Get the ASUS ROG Astral GeForce RTX 5080 OC Edition gaming GPU on Amazon
Source(s)
The Elec & Dealsite via TrendForce