The Last of Us Part I review: Laptop and desktop benchmarks
Technical aspects
A waiting simulator with absurd VRAM consumption: that is how the PC version of The Last of Us can currently be summed up. No wonder the action-adventure game is being bombarded with negative reviews on Steam. The confusion starts within the first few seconds: shader compilation takes at least half an hour - even on top-of-the-range PCs. That is a new record, and not in a good way. Yet the long loading times feel harmless compared to what comes next.
The exorbitant VRAM consumption turns out to be just as much of a nuisance as the shader compilation time. Graphics cards with 6 GB of VRAM or less, in particular, quickly run into serious trouble. Our test device with a GeForce RTX 3060 consistently crashed when trying to run the game at 1,920 x 1,080 pixels with the ultra preset. It doesn't help that Nvidia has equipped its graphics cards with too little VRAM for many years now - although, to be fair, The Last of Us is simply badly optimised in this respect.
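If you want to check whether VRAM is the bottleneck on your own system before lowering settings, Nvidia's `nvidia-smi` tool can report memory usage. The query flags below are standard `nvidia-smi` options; the wrapper and parsing helper are our own sketch, not part of any official API:

```python
# Sketch: read VRAM usage on an Nvidia GPU via nvidia-smi.
# The --query-gpu/--format flags are standard nvidia-smi options;
# error handling is kept minimal for brevity.
import subprocess


def parse_vram_line(line):
    """Parse one 'used, total' line (in MiB) from nvidia-smi's CSV output."""
    used, total = (int(x.strip()) for x in line.split(","))
    return used, total


def vram_usage_mb():
    """Return (used, total) VRAM in MiB for the first GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_vram_line(out.strip().splitlines()[0])
```

A card like the RTX 3060 Laptop GPU with 6,144 MiB sitting near its limit at the ultra preset would explain the crashes described above.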
At first glance, this poor performance comes as a surprise, as Sony's previous PC ports (e.g. Spider-Man: Miles Morales, Days Gone and God of War) all made quite good impressions. The problem seems to be Iron Galaxy, the studio responsible for the PC port, which hasn't always delivered clean work in the past (keyword: Batman: Arkham Knight). But regardless of who is responsible, in our opinion The Last of Us should never have been released on PC in this state. We can only hope that future updates will iron out these problems.
When it comes down to it, the game definitely has the potential to be enjoyed - as long as you have a strong graphics card with plenty of VRAM. The remake's graphics look thoroughly modern: the textures, the characters and, at higher settings, the effects (smoke, fire, reflections, ...) are all great. The game scores real points with its level of detail and dense atmosphere. We loved its cinematic production and the mostly seamless transitions between cutscenes and gameplay. Its captivating storytelling is another plus.
The developers definitely went overboard with the graphics menu. The Last of Us offers so many options (see screenshots) that you have to scroll for ages and quickly lose track of what is what. Thankfully, there are some practical presets that take the hard work out of it and adjust the game's quality in one click. Changes apply instantly, without restarting the game. As expected from a modern title, various upscaling options are supported, such as FSR and the Nvidia-exclusive DLSS (ray tracing, however, is missing).
Unlike in the console version, the frame rate is only capped if you choose to cap it. Another positive is the detailed setting descriptions, which indicate the CPU, GPU and VRAM impact and sometimes even include handy comparison images. The only thing we missed was a "classic" exclusive full-screen mode. The high CPU load at moderate resolutions is probably why lowering the settings barely improves performance while visibly worsening the graphics.
Benchmark
We used a mix of in-game cutscenes and "real" gameplay for our benchmark measurements. You can see the exact passage, which takes place near the beginning of the game and features lots of passers-by and effects, in the video below. We recorded the frame rate for roughly one minute using the tool CapFrameX.
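Tools like CapFrameX record per-frame render times, from which average FPS and percentile lows are derived. A minimal sketch of that arithmetic (the function and the sample numbers are illustrative, not CapFrameX's actual implementation):

```python
# Sketch: derive average FPS and "1% low" FPS from per-frame render
# times in milliseconds, as captured over a roughly 60-second run.

def fps_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS) from a list of frame times in ms."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # 1% low: average FPS over the slowest 1% of frames,
    # a common measure of stutter severity
    slowest = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_fps


# Example: 99 smooth 16.7 ms frames (60 FPS) plus one 50 ms stutter
avg, low = fps_stats([16.7] * 99 + [50.0])
# avg is still close to 59 FPS, but the 1% low drops to 20 FPS
```

This is why a single number can hide the stutter caused by shader compilation or VRAM swapping: the average barely moves, while the 1% low collapses.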
Results
FHD (1,920 x 1,080)
When it comes to iGPUs, you can pretty much forget The Last of Us. Even the Radeon 680M failed to hit the 30 FPS mark in Full HD at minimum settings (1,280 x 720 pixels was just about playable). You will need at least a mid-range device to play the game at 1,920 x 1,080 pixels. A Radeon RX 6600M reaches almost 60 FPS with the medium and high presets.
The Last of Us - 1,920 x 1,080 (Low, Medium, High and Ultra presets)
[Benchmark chart - FPS values are shown in the interactive graph]
NVIDIA GeForce RTX 4080, i9-12900K
AMD Radeon RX 7900 XTX, i9-12900K
AMD Radeon RX 7900 XT, i9-12900K
NVIDIA GeForce RTX 3080, i9-12900K
NVIDIA GeForce RTX 3080 Ti Laptop GPU, i9-12900HX
NVIDIA GeForce RTX 3070, i9-12900K
AMD Radeon RX 6800M, R9 5900HX
NVIDIA GeForce RTX 3060 Laptop GPU, i7-12700H
AMD Radeon RX 6600M, R7 5800H
AMD Radeon 680M, R9 6900HS
QHD (2,560 x 1,440)
A combination of 2,560 x 1,440 pixels and the ultra preset already makes the hardware and especially the VRAM break a sweat. A GeForce RTX 3070 was the first to successfully reach over 40 FPS.
The Last of Us - 2,560 x 1,440 (Ultra preset)
[Benchmark chart - FPS values are shown in the interactive graph]
NVIDIA GeForce RTX 4080, i9-12900K
AMD Radeon RX 7900 XTX, i9-12900K
AMD Radeon RX 7900 XT, i9-12900K
NVIDIA GeForce RTX 3080, i9-12900K
NVIDIA GeForce RTX 3080 Ti Laptop GPU, i9-12900HX
NVIDIA GeForce RTX 3070, i9-12900K
AMD Radeon RX 6800M, R9 5900HX
AMD Radeon RX 6600M, R7 5800H
UHD (3,840 x 2,160)
If you are playing the game at 3,840 x 2,160 pixels and maximum settings, even top-of-the-range chips such as the GeForce RTX 3080 Ti really struggle. Here, it makes sense to utilise DLSS for example, as laid out in the table below.
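DLSS eases the load because the game renders internally at a lower resolution and upscales to the target. The per-axis scale factors below are the commonly documented values for DLSS/FSR quality modes; treat them, and the helper itself, as an illustrative assumption rather than the game's exact implementation:

```python
# Sketch: internal render resolution for common upscaler quality modes.
# Per-axis scale factors are the commonly documented DLSS/FSR values
# (assumption - individual games may deviate).
SCALE = {
    "quality": 1 / 1.5,      # ~66.7% per axis
    "balanced": 1 / 1.724,   # ~58% per axis
    "performance": 1 / 2.0,  # 50% per axis
}


def render_resolution(width, height, mode="quality"):
    """Return the internal (pre-upscale) resolution for a given mode."""
    s = SCALE[mode]
    return round(width * s), round(height * s)


# Example: 4K output with the Quality mode renders internally at 2560 x 1440
```

In other words, "4K Ultra + Quality DLSS" asks the GPU for roughly QHD-level work, which is why the upscaled results land much closer to the native 1440p numbers than to native 4K.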
The Last of Us - 3,840 x 2,160 (Ultra preset, with and without Quality DLSS)
[Benchmark chart - FPS values are shown in the interactive graph]
NVIDIA GeForce RTX 4080, i9-12900K
AMD Radeon RX 7900 XTX, i9-12900K
AMD Radeon RX 7900 XT, i9-12900K
NVIDIA GeForce RTX 3080, i9-12900K
NVIDIA GeForce RTX 3080 Ti Laptop GPU, i9-12900HX
NVIDIA GeForce RTX 3070, i9-12900K
AMD Radeon RX 6800M, R9 5900HX
AMD Radeon RX 6600M, R7 5800H
Note
Because gaming tests are very time-consuming and are often constrained by installation or activation limits, we are only able to provide you with part of the benchmark results at the time of publishing this article. We will be adding more graphics cards over the coming days and weeks.
Overview
Test systems
We currently use these laptops for our gaming benchmarks. Clicking on the photos will take you to the respective manufacturer's page. All other test systems (tower PCs, mini PCs, ...) can be found in the following list.
Device | Graphics card | Processor | RAM |
---|---|---|---|
Laptops | |||
MSI Titan GT77 12UHS | Nvidia GeForce RTX 3080 Ti @175 W TGP (16 GB GDDR6) | Intel Core i9-12900HX | 2 x 16 GB DDR5 |
Lenovo Legion 5 Pro | Nvidia GeForce RTX 3060 @140 W TGP (6 GB GDDR6) | Intel Core i7-12700H | 2 x 8 GB DDR5 |
Asus ROG Strix G15 | AMD Radeon RX 6800M (12 GB GDDR6) | AMD Ryzen 9 5900HX | 2 x 8 GB DDR4 |
Lenovo Legion 5 | AMD Radeon RX 6600M (8 GB GDDR6) | AMD Ryzen 7 5800H | 2 x 8 GB DDR4 |
Asus ROG Zephyrus G14 | AMD Radeon 680M | AMD Ryzen 9 6900HS | 2 x 16 GB DDR5 |
Tower PCs | |||
Custom I | AMD Radeon RX 7900 XTX (24 GB GDDR6), AMD Radeon RX 7900 XT (20 GB GDDR6), MSI GeForce RTX 3080 (10 GB GDDR6X), MSI GeForce RTX 3070 (8 GB GDDR6) | Intel Core i9-12900K | 2 x 16 GB DDR4 |
Custom II | Palit GeForce RTX 4090 GameRock OC (24 GB GDDR6X), Nvidia GeForce RTX 3090 FE (24 GB GDDR6X), Nvidia GeForce RTX 3060 Ti (8 GB GDDR6X), Nvidia Titan RTX (24 GB GDDR6), Nvidia GeForce RTX 2070 Super (8 GB GDDR6), Nvidia GeForce RTX 2060 Super (8 GB GDDR6), KFA2 GeForce GTX 1660 Super (6 GB GDDR6), PNY GeForce GTX 1660 (6 GB GDDR5), KFA2 GeForce GTX 1650 Super (4 GB GDDR6), AMD Radeon RX 6950 XT (16 GB GDDR6), AMD Radeon RX 6800 (16 GB GDDR6), AMD Radeon RX 6700 XT (12 GB GDDR6), AMD Radeon RX 6650 XT (8 GB GDDR6), AMD Radeon RX 6600 (8 GB GDDR6), AMD Radeon RX 5700 XT (8 GB GDDR6), AMD Radeon RX 5700 (8 GB GDDR6), AMD Radeon RX 5600 XT (6 GB GDDR6), AMD Radeon RX 5500 XT (8 GB GDDR6) | AMD Ryzen 9 7950X | 2 x 16 GB DDR5 |
Custom III | Nvidia GeForce RTX 2080 Super FE | Intel Core i9-11900K | 2 x 8 GB DDR4 |
Mini PCs | |||
Zotac ZBOX CI665 Nano | Intel Iris Xe Graphics G7 (96 EUs) | Intel Core i7-1165G7 | 2 x 8 GB DDR4 |
Morefine S500+ | AMD Radeon RX Vega 8 | AMD Ryzen 9 5900HX | 2 x 16 GB DDR4 |
Minisforum NUCXi7 | Nvidia GeForce RTX 3070 @125 W TGP (8 GB GDDR6) | Intel Core i7-11800H | 2 x 8 GB DDR4 |
Minisforum HX99G | AMD Radeon RX 6600M @100 W TGP (8 GB GDDR6) | AMD Ryzen 9 6900HX | 2 x 16 GB DDR5 |
4K monitors | Operating system | Nvidia driver | AMD driver |
---|---|---|---|
Philips Brilliance 329P9H, Gigabyte M32U | Windows 11 | ForceWare | Adrenalin |