Both the PlayStation 5 and Xbox Series X were heavily promoted as true 4K consoles capable of 60 FPS experiences, yet another AAA title recently arrived set to run at a meagre 1440p/30 FPS. When it launched in 2019, Control was a revelation on PC. Top-end Turing cards like the GeForce RTX 2080 Ti could deliver a 4K/60 experience (with DLSS), paired with one of the first full-featured implementations of hardware ray-tracing. This game was supposed to give us a glimpse of what ninth-gen titles would look and play like.
And while Control is certainly impressive on PC, the console outings raise a lot of questions. Much of that has to do with a single aspect of the game's technical pipeline: hardware-accelerated ray tracing. On PC, Control ran like a charm with ray-tracing disabled, dishing out a native 4K experience on a range of hardware configurations.
The ray-traced reflections and ambient occlusion were noticeable, but enabling RT wasn't a (pardon the pun) game-changer that justified the performance hit. With RT off, Control was still a fine-looking game, but little about its model poly-counts, textures, or materials was truly "next-gen."
Therein lies the problem. On current-gen platforms, and even on top-end PCs with the GeForce RTX 3080 and GeForce RTX 3090, ray-tracing "works," but only with a profound hit to performance. DLSS 2.0 alleviates this on NVIDIA's latest graphics cards, delivering image quality that's often as good as or better than native rendering. AMD, however, has yet to release its Super Resolution solution. This means that developers have just one way of getting ray-tracing to work on the consoles: drop resolution and framerate, and scale back their ambitions in terms of core asset quality.
Ray-tracing is incredibly taxing, and current-gen platforms are simply not up to the task without compromises being made elsewhere. While it's bad enough that ninth-gen "4K/60" consoles are running AAA games at 1440p/30 FPS or lower, the real problem lies with developer ambitions.
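To see why RT so often forces that 1440p/30 compromise, it helps to look at raw frame-time arithmetic. The sketch below is a minimal illustration; the per-pass costs are assumed round numbers chosen for the example, not measured figures from any real game or engine.

```python
# Illustrative frame-time budget arithmetic: at a given FPS target, every
# render pass must fit inside the frame budget, so an expensive RT pass
# crowds out everything else. All pass costs below are assumptions.

def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available per frame at a given framerate target."""
    return 1000.0 / target_fps

# Hypothetical per-frame costs (ms) for a 4K hybrid-RT frame.
raster_passes_ms = 14.0   # geometry, shading, post-processing (assumed)
rt_reflections_ms = 9.0   # ray-traced reflections pass (assumed)

total_ms = raster_passes_ms + rt_reflections_ms

print(f"60 FPS budget: {frame_budget_ms(60):.1f} ms")   # 16.7 ms
print(f"30 FPS budget: {frame_budget_ms(30):.1f} ms")   # 33.3 ms
print(f"Frame cost with RT: {total_ms:.1f} ms")         # 23.0 ms

# 23 ms blows past the 16.7 ms needed for 60 FPS but fits inside 33.3 ms,
# which is why RT-enabled console modes so often land at 30 FPS.
assert total_ms > frame_budget_ms(60)
assert total_ms < frame_budget_ms(30)
```

Under these assumed numbers, the only ways to claw back 60 FPS are to shrink the raster workload (lower resolution, simpler assets) or drop the RT pass entirely, which is exactly the trade-off described above.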
There's so much room to improve, as far as pure raster graphics are concerned. Take a look at the Unreal Engine Paris apartment demo, created without the help of ray-tracing. Asset quality is incredible. The polygon counts, even on incidental details like bathroom towel hangers, are extremely high. Material quality is impeccable and scene lighting is accurate, even if it is plain old global illumination.
Indie horror game Visage delivers photorealistic visuals without ray-tracing, all while running at over 100 FPS at 4K on the GeForce RTX 3080. Developer SadSquare focused on core asset quality, using techniques like photogrammetry to recreate real-world objects with extremely high fidelity.
In stark contrast, next-gen showcases like Control and The Medium (which drops to 900p on the Series X) feature assets that are only marginally better than the eighth-gen standard. Characters, objects, and animations don't look much better than what we've been seeing for the past seven years. While ray-tracing obviously enhances in-game scenes, it's evident that core asset quality was pared back to make room for it.
If games are already dropping to 1440p/30 and lower on the ninth-gen consoles because of ray-tracing, things do not bode well for the future of ninth-gen asset quality. The profound performance hit of ray-tracing makes it something of an either-or choice: developers could potentially double or triple asset quality and scene complexity, or they could add ray-traced reflections, while working with eighth-gen equivalent assets.
A lot of this likely has to do with the hype ray-tracing has received since it debuted with Turing cards in 2018, and with widespread misunderstanding of how profound the performance impact of hybrid ray-tracing is.
Full path-tracing, the approach seen in Quake II RTX and Minecraft RTX, absolutely is the future of video game graphics, though likely not for another decade or two. "Hybrid ray-tracing," where only some parts of the render pipeline utilize RT, offers somewhat better visuals in specific use cases: somewhat better lighting and shadows, and significantly better reflections than current raster techniques.
But because ray-tracing isn't used in all aspects of the rendering pipeline, it isn't a panacea: it won't magically improve low-poly models; it won't improve environmental destructibility; and, with the exception of reflective surfaces, it won't have a major impact on material quality. Put briefly, hybrid ray-tracing does some things a bit better than rasterization, but comes with a performance hit that, in many cases, doesn't remotely justify the visual improvement.
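The division of labor described above can be sketched as a toy frame loop. The pass names and structure are illustrative assumptions, not any real engine's API: rasterization still handles geometry, materials, and most lighting, while RT swaps in for only a handful of passes.

```python
# Toy model of a hybrid-RT frame. Pass names are hypothetical: ray tracing
# only replaces specific passes, so asset quality (poly counts, textures,
# materials) is untouched by the RT toggle.

def render_hybrid_frame(rt_enabled: bool) -> list[str]:
    """Return the ordered render passes for one frame."""
    passes = [
        "raster: geometry / G-buffer",   # asset detail is decided here
        "raster: direct lighting",
    ]
    if rt_enabled:
        # Hybrid RT: only these passes are ray-traced.
        passes += ["rt: reflections", "rt: ambient occlusion"]
    else:
        # Raster fallbacks for the same effects.
        passes += ["raster: screen-space reflections", "raster: SSAO"]
    passes.append("raster: post-processing")
    return passes

# The geometry pass, where model and material quality live, is identical
# either way; toggling RT changes reflections and AO, nothing more.
assert render_hybrid_frame(True)[0] == render_hybrid_frame(False)[0]
print(render_hybrid_frame(True))
```

The point of the sketch is simply that the RT toggle never touches the geometry pass, which is why hybrid RT can't compensate for eighth-gen-quality assets.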
Because consumer audiences equate ray-tracing with "good graphics," developers implement ray-traced shadows, reflections, and AO to tout next-gen visuals in games with mediocre asset quality. When developers try to do both things, as with Cyberpunk 2077, performance flatlines, regardless of the platform.
Why does this matter, though? If the market continues to prioritize ray-tracing, developers will continue to add expensive hybrid RT effects to games that ship on the PlayStation 5 and Xbox Series X. This will result in sub-native-resolution, 30 FPS experiences. It will also prevent developers from meaningfully improving core asset quality, since those ray-traced effects eat up the performance overhead they'd need. In contrast, when ninth-gen developers choose to prioritize assets over RT, the results are phenomenal. The Demon's Souls remake on PlayStation 5 stands head and shoulders above just about any ray-traced title on Sony's console. Bluepoint prioritized assets over unnecessary RT effects, and the results speak for themselves. Performance does, too, with the game running at a native 4K/30.
Will developers continue to add ray-traced effects to games at the expense of other visual elements? It's too early to tell. But with the cross-gen period gradually drawing to a close, we should know soon enough.