Nvidia (rightfully) got a lot of flak for launching its 8 GB RTX 5060 and RTX 5060 Ti and then preventing reviewers from benchmarking them at launch. So it was a bit surprising when AMD showed off the 8 GB Radeon RX 9060 XT variant at Computex 2025, although it got only a few seconds of screen time and no benchmarks. AMD's Frank Azor has now justified the GPU's existence on X. Azor says:
Majority of gamers are still playing at 1080p and have no use for more than 8GB of memory. Most played games WW are mostly esports games. We wouldn't build it if there wasn't a market for it. If 8GB isn't right for you then there's 16GB. Same GPU, no compromise, just memory options.
Steam's hardware survey from April 2025 agrees with the first part of his statement. 1080p continues to be the dominant resolution on the platform, and while Steam doesn't represent all PC gamers, it accounts for a good chunk of them. And yes, e-sports titles don't require much VRAM because most of them, like Counter-Strike 2, Dota 2 and Valorant, are more CPU-bound than anything else. Heck, they can even run on an iGPU. A not-insignificant number of e-sports gamers will buy an 8 GB RTX 5060/RX 9060 XT just to play a handful of games for 5,000+ hours. However, the proliferation of Xbox Game Pass makes high-quality AAA titles like The Elder Scrolls IV: Oblivion available to everyone.
Bethesda recently confirmed that Doom: The Dark Ages crossed 3 million players just a week after launch. Its comparatively modest peak Steam player count indirectly tells us that plenty of those people are playing it on Game Pass, and circling back to the Steam hardware survey once again, most of them are on an 8 GB GPU. Implying that 1080p gamers have 'no use' for more than 8 GB of VRAM is incorrect. Time and again, we've seen 12 GB and 16 GB graphics cards outperform their 8 GB counterparts. The 8 GB RTX 3060 was found to be up to 35% slower than its 12 GB variant, and even the newly released RTX 5060 Ti 8 GB struggled to keep up with its 16 GB counterpart.
In games that have ray/path tracing baked in, the situation is far worse. This is apparent in Indiana Jones and the Great Circle, where GPUs with 8 GB of VRAM struggle to maintain even 30 FPS at 1080p. And it doesn't help that big-budget games like The Last of Us Part I often launch as unoptimized, buggy messes. The argument that 8 GB GPUs aren't meant to play AAA games doesn't hold water either: what's the point of building a gaming PC if you can't play games on it? No 9060 XT buyer is expecting an 'Ultra' experience for $300, but if you can't even get to 'Medium/High' settings without leaning on FSR/DLSS, that's a problem.
Given that it is yet to launch, one can't say with certainty whether the 9060 XT 8 GB will suffer from the same woes as its Blackwell counterpart. Still, AMD's decision to overlook the 12 GB segment is nothing short of baffling. A 9060 XT with 12 GB of VRAM at the same $300 price would have offered far better value and further cemented AMD's position in the mid-range GPU market. Plus, it would have made for a great talking point, something along the lines of, “Look, we offer more VRAM than our main competitor”. Even Intel, a new player in the dGPU market, had the good sense to throw in 10 GB of VRAM on its Arc B570.
Is there a market for 8 GB graphics cards? Absolutely. Gamers in emerging markets often favour 8 GB GPUs because graphics cards there cost more than they do in the US/EU, thanks to import duties and local taxes. For example, the Asrock Steel Legend Radeon RX 9070 XT costs ₹74,899 ($849) in India, while the non-XT version of the same card (the Asrock Steel Legend RX 9070) costs ₹63,533 ($745). The price difference of about ₹11,366 (roughly $104) might not seem like much in other regions, but it's more than what an average Indian family spends on groceries every month.
The xx60 class of GPUs is the proverbial workhorse of every generation, and keeping these cards fundamentally the same as their last-gen counterparts will eventually come back to haunt PC gaming. Why bother dropping ~$1,000 on a PC when you can get a PlayStation 5 Pro and four games for the same price? Selling a $200 graphics card might not be economically viable any more because of rising TSMC wafer costs, tariffs, and inflation. But a paltry $50 price difference between a 16 GB GPU and an 8 GB GPU makes the latter look much worse in comparison, regardless of who makes it.
Until now, one could understand why OEMs kept their mid-range offerings limited to 8 GB: VRAM modules maxed out at 2 GB, and designing a PCB to fit more than four of them, especially on an affordable model, would not have been cost-effective. But now, 3 GB (and higher-capacity) GDDR7 modules have been shown off by Samsung and others. Nvidia will probably use them on the upcoming GeForce RTX 5080 Super, which is tipped to arrive with 24 GB of VRAM, and we can only hope that they trickle down to other Super-branded SKUs.
Source(s)
Own