
CheckMag | Nvidia's DLSS 3 was the biggest loser of CES 2024

However controversial this article might be, it probably pales in comparison to Jeff Fisher pronouncing "Ti" as "Tie" in yet another keynote. (Image: NVIDIA)
NVIDIA showed off a lot at their CES keynote back on the 8th of January. They showed off their Super series of GPUs. They showed off their renewed belief in saying "AI" a lot to boost their stock market performance. But they also showed off the fact that their vaunted performance-boosting frame generation technology is pretty performance-hungry, too.
Views, thoughts, and opinions expressed in the text belong solely to the author.

The RTX 4080 Super was introduced with the claim of a 1.4x performance increase over the RTX 3080 Ti, and it has a litany of impressive specifications to back up that claim. But shortly after, gameplay footage of Alan Wake 2 showed the incoming 4080 Super's framerates sitting at approximately double those of the outgoing 3080 Ti. While that might seem like an impressive uplift, it's less so once you factor in Frame Generation: with every frame the 4080 Super renders being padded by another generated by DLSS 3, the two graphics cards come out scoring exactly the same non-generated framerate!

And this isn't a freak outlier making the 4080 Super look bad – though the presence of one in NVIDIA's own presentation would be pretty terrible too – as more marketing material from the green team shows the same outcome. Some examination of their graphs finds that the average performance improvement between the two cards in games without Frame Generation sits at 41%, consistent with NVIDIA's claims, but the gains in games with it average out at 97%, again confirming that once you take the generated frames out of the equation the 4080 Super is only matching the 3080 Ti.

Some statistics I pulled from NVIDIA's performance charts, using the highly sophisticated cutting-edge technique known as "counting how long the bars are in pixels". (Image: NVIDIA, edited; own)

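In the interest of showing the method as well as the result, here's a minimal sketch of that bar-measuring exercise in Python – the pixel lengths below are made-up placeholders rather than my actual measurements from NVIDIA's charts.

```python
# Minimal sketch of the bar-measuring method. The pixel lengths are
# hypothetical placeholders, not the actual measurements from NVIDIA's charts.

bars_px = {
    # game: (3080 Ti bar length, 4080 Super bar length, frame generation used?)
    "Game A": (210, 298, False),
    "Game B": (180, 352, True),
    "Game C": (240, 476, True),
}

def uplift_pct(old_px: float, new_px: float) -> float:
    """Percentage improvement implied by two bar lengths."""
    return (new_px / old_px - 1) * 100

without_fg = [uplift_pct(o, n) for o, n, fg in bars_px.values() if not fg]
with_fg = [uplift_pct(o, n) for o, n, fg in bars_px.values() if fg]

print(f"average uplift without frame generation: {sum(without_fg) / len(without_fg):.0f}%")
print(f"average uplift with frame generation:    {sum(with_fg) / len(with_fg):.0f}%")
```
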
To add insult to injury, this number is skewed upwards by the worst performers – if you want at least a 40 FPS "native framerate" to keep input latency in check, that average improvement falls to 81% (and, for those who might call this cherrypicking, you might want to note that AMD also suggest a native 60 FPS for their tech, and the dynamic duo at Hardware Unboxed recommend double that again!).
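
Applying that kind of cutoff before averaging is straightforward; a sketch with placeholder FPS figures, assuming frame generation doubles the displayed framerate, might look like this:

```python
# Sketch of the 40 FPS "native framerate" filter. FPS values are hypothetical
# placeholders; frame generation is assumed to double the displayed framerate.

fg_games = {
    # game: (3080 Ti FPS, 4080 Super FPS with DLSS 3 frame generation)
    "Game B": (55, 104),
    "Game C": (32, 64),
}

filtered = [
    (fg_fps / base_fps - 1) * 100
    for base_fps, fg_fps in fg_games.values()
    if fg_fps / 2 >= 40  # rendered ("native") framerate must stay at or above 40 FPS
]

print(f"average uplift, 40+ FPS native only: {sum(filtered) / len(filtered):.0f}%")
```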

What this seems to reveal is an inconvenient truth that NVIDIA have left unsaid – that DLSS 3 is pretty dang computationally expensive. Looking at frame times instead of frame rates – that is, (milli)seconds per frame rather than frames per second – makes this clearer. The RTX 4080 Super does, of course, take less time to render each frame than the RTX 3080 Ti, but once the generated frames are accounted for it takes more time per rendered frame than it would if it delivered a uniform 41% uplift over the 3080 Ti. The difference between those times – how much longer the 4080 Super takes than it should – is just how long DLSS 3 takes to work its magic.

That difference is... 5.32 ms.

Hey, let it never be said I don't show my working! (Image: Own)

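If you'd rather re-run the working yourself, here's a minimal sketch of the arithmetic, assuming frame generation exactly doubles the displayed framerate and that the 4080 Super would otherwise deliver a uniform 41% uplift – the 60 and 120 FPS figures are placeholders, not NVIDIA's numbers.

```python
# Sketch of the frame-time reasoning above. Assumes DLSS 3 frame generation
# doubles the displayed framerate (one generated frame per rendered frame).

UPLIFT = 1.41  # average non-framegen uplift of the 4080 Super over the 3080 Ti

def framegen_cost_ms(fps_3080ti: float, fps_4080s_fg: float) -> float:
    """Estimate the per-frame cost of DLSS 3 in milliseconds."""
    rendered_frame_time = 2 / fps_4080s_fg            # seconds per *rendered* frame
    expected_frame_time = 1 / (fps_3080ti * UPLIFT)   # what a uniform 41% uplift predicts
    return (rendered_frame_time - expected_frame_time) * 1000

# e.g. a game where the 3080 Ti hits 60 FPS and the 4080 Super shows 120 FPS with DLSS 3:
print(f"{framegen_cost_ms(60, 120):.2f} ms")  # ~4.85 ms of frame generation overhead
```
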
Five milliseconds might not sound like a lot, but it's added fairly uniformly across every game that NVIDIA picked to showcase the RTX 4080 Super at its best. It's an overhead, a parasitic drag, a performance ceiling, and in games like Starfield it makes up nearly half of the time the 4080 Super spends on each frame.

Frame generation appears to be here to stay, and it seems like the processing time spent on it is going to be the next battleground. AMD's keynote presentation introduced the RX 7600 XT to supplement the existing RX 7600 (which is still a great value buy on Amazon), and in those charts it went from beating the RTX 4060 by 31% when rendering natively in Call of Duty: Modern Warfare 3 to more than double that lead with FSR 3 and DLSS 3 turned on, while the same comparison in Avatar: Frontiers of Pandora turned a 9.1% lead for NVIDIA's card into a neck-and-neck tie. Either FSR 3 gained ground or DLSS 3 was losing it, and that points to one solution being a good bit heavier than the other.
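
As a rough illustration of why that overhead can reshuffle the standings, here's a sketch of two cards running the same framerate-doubling frame generation but paying different per-frame costs – every figure in it is a hypothetical placeholder, not AMD's or NVIDIA's published data.

```python
# Rough model: displayed FPS with frame generation, where each rendered frame
# produces two displayed frames and pays the framegen overhead once.
# All FPS and overhead figures are hypothetical placeholders.

def fg_fps(native_fps: float, overhead_ms: float) -> float:
    return 2 / (1 / native_fps + overhead_ms / 1000)

rx_native, rtx_native = 72, 55          # RX card ahead by ~31% natively
rx_fg = fg_fps(rx_native, 1.5)          # lighter (assumed) FSR 3 overhead
rtx_fg = fg_fps(rtx_native, 5.3)        # heavier (assumed) DLSS 3 overhead

print(f"native lead:   {(rx_native / rtx_native - 1) * 100:.0f}%")
print(f"framegen lead: {(rx_fg / rtx_fg - 1) * 100:.0f}%")  # the gap widens
```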

Yes, these are first-party benchmarks and shouldn't be taken as gospel. But so were the NVIDIA ones - and they still contained a good amount of useful information. (Image: AMD)

With the red team's FSR 3 and FMF technologies potentially carrying much less overhead than DLSS 3 – despite the latter running on dedicated optical-flow-accelerator silicon – and Intel starting to make noises in the framegen space too, NVIDIA no doubt has some hard choices to make right now. Gamers trying to squeeze more power out of a low-end graphics card will probably take substantially improved performance over a cleaner image, while those already at high framerates aren't as bothered by artefacts when each individual generated frame is only on-screen for a split second; a good-looking but slow solution doesn't win the cost-benefit analysis in either situation.

So. How much image quality is really worth five milliseconds?

Matthew Lee, 2024-01-23 (Update: 2024-01-23)