God of War in test: Notebook and desktop benchmarks
Technology
With its Norse mythology setting, God of War is at first glance very reminiscent of Assassin's Creed Valhalla. While the Ubisoft counterpart relies mainly on its open world (which can quickly become repetitive due to a lot of generic content), God of War scores with detailed character design and, by gaming standards, intense storytelling.
In contrast to many other PC ports, which often give plenty of reason for criticism, Sony delivers a surprisingly mature product. We did not encounter a single crash or other inconsistency during the test. Besides the well-designed menu controls, the crisp combat system also deserves a positive mention.
The developers also deserve praise for the graphics options. Although the graphics menu contains only seven quality settings, they can be conveniently adjusted via presets and do not require a restart when changed. The middle preset, called "Original", is supposed to be on par with the console version. The High and Ultra presets deliver even more graphical splendor. All settings are accompanied by explanatory texts and, in some cases, example pictures.
Most of the picture options are located in the display menu. Here, you can not only adjust the resolution (unfortunately, there is no classic full-screen mode), but also motion blur, film grain and render scaling. In addition, there are V-Sync, an optional FPS limit and DLSS or FidelityFX. The package is rounded off by a VRAM display. Like most current games, God of War requires quite a lot of video memory. In Full HD, it uses around 4 GB with high details and around 5 GB with maximum details (even minimum settings consume almost 3 GB).
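To put the memory figures into context, here is a minimal sketch (our own illustration, not a tool shipped with the game) that checks whether a card's VRAM covers the Full HD usage mentioned above; the rounded per-preset values and the headroom margin are assumptions:

```python
# Approximate Full HD VRAM usage per preset in GB, rounded from the
# in-game VRAM display described above (illustrative values).
VRAM_USAGE_GB = {"Low": 3.0, "High": 4.0, "Ultra": 5.0}

def fits_in_vram(preset: str, card_vram_gb: float, headroom_gb: float = 0.5) -> bool:
    """Check whether a card's VRAM covers a preset plus a safety margin.

    The 0.5 GB headroom is our assumption to leave room for the OS and
    other applications, not a figure from the article.
    """
    return card_vram_gb >= VRAM_USAGE_GB[preset] + headroom_gb

# A 6 GB GTX 1660 Ti comfortably covers the High preset in Full HD...
print(fits_in_vram("High", 6.0))   # True
# ...while a 4 GB card gets tight at maximum details.
print(fits_in_vram("Ultra", 4.0))  # False
```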
Benchmark
Since God of War does not include a built-in benchmark function, we had to find our own sequence for the speed measurements. As you can see in the following video, we spend about 30 seconds running after the main character's son, who learns to hunt and track deer shortly after the game starts.
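For readers curious how such a manual run turns into an FPS number: capture tools record a time for every rendered frame, and the average is simply frames divided by elapsed time. A minimal sketch of that calculation, with made-up frame times:

```python
# Generic sketch: derive average FPS from a list of logged frame times
# in milliseconds, as a frame-time capture tool would record them.
def average_fps(frame_times_ms):
    """Average FPS over a run = rendered frames / elapsed seconds."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# A steady 16.7 ms per frame corresponds to roughly 60 FPS.
print(round(average_fps([16.7] * 100), 1))  # 59.9
```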
Results
For the benchmarks, we mainly use rental devices from our partners at XMG (Schenker Technologies) and MSI.
FHD (1,920 x 1,080)
Assuming that around 40 FPS is sufficient for a third-person title, even mid-range GPUs can handle God of War properly at 1,920 x 1,080 pixels. A GeForce GTX 1650 (Ti), for example, displays the Low and Original presets smoothly. The High preset, meanwhile, calls for a GeForce GTX 1660 Ti, RTX 3050 Ti or Radeon RX 5600M. For the Ultra preset, only a GeForce RTX 3060 or better manages more than 40 FPS.
[Benchmark chart: God of War at 1,920 x 1,080 in the Low, Original, High and Ultra presets. GPUs tested: GeForce RTX 3090, RTX 3080 and RTX 3070 (each with a Core i9-9900K); GeForce RTX 3080, RTX 3070, RTX 3060 and RTX 3050 Ti Laptop GPUs (each with a Core i7-11800H); Radeon RX 5600M (Ryzen 7 4800H); GeForce GTX 1660 Ti Mobile (Core i7-9750H); GeForce GTX 1650 Ti Mobile and GTX 1650 Mobile (each with a Core i7-10750H).]
QHD (2,560 x 1,440)
When using a QHD display (2,560 x 1,440), the system requirements only increase moderately compared to Full HD. Again, we would recommend a GeForce RTX 3060 or better for the Ultra Preset.
[Benchmark chart: God of War at 2,560 x 1,440 in the Ultra preset. GPUs tested: GeForce RTX 3090, RTX 3080 and RTX 3070 (each with a Core i9-9900K); GeForce RTX 3080, RTX 3070 and RTX 3060 Laptop GPUs (each with a Core i7-11800H); Radeon RX 5600M (Ryzen 7 4800H); GeForce GTX 1660 Ti Mobile (Core i7-9750H).]
UHD (3,840 x 2,160)
God of War only becomes really demanding at 4K. The combination of 3,840 x 2,160 pixels and maximum details pushes even the current notebook front-runner, the GeForce RTX 3080 Laptop GPU, to its limits.
[Benchmark chart: God of War at 3,840 x 2,160 in the Ultra preset. GPUs tested: GeForce RTX 3090, RTX 3080 and RTX 3070 (each with a Core i9-9900K); GeForce RTX 3080, RTX 3070 and RTX 3060 Laptop GPUs (each with a Core i7-11800H); Radeon RX 5600M (Ryzen 7 4800H); GeForce GTX 1660 Ti Mobile (Core i7-9750H).]
Note
Since gaming tests are very time-consuming and often hampered by installation or activation limits, we can only provide a portion of the benchmarks at the time of publication. More graphics cards will be added in the coming days and weeks.
Overview
Test systems
| Device | Graphics card | Processor | Memory | Operating system |
|---|---|---|---|---|
| XMG Neo 15 | Nvidia GeForce RTX 3080 @ 165 W TGP (16 GB GDDR6) | Intel Core i7-11800H | 2 x 16 GB DDR4 | Windows 10 64 Bit |
| XMG Neo 17 | Nvidia GeForce RTX 3070 @ 140 W TGP (8 GB GDDR6) | Intel Core i7-11800H | 2 x 16 GB DDR4 | Windows 10 64 Bit |
| XMG Core 15 | Nvidia GeForce RTX 3060 @ 130 W TGP (6 GB GDDR6) | Intel Core i7-11800H | 2 x 16 GB DDR4 | Windows 10 64 Bit |
| XMG Focus 17 | Nvidia GeForce RTX 3050 Ti @ 75 W TGP (4 GB GDDR6) | Intel Core i7-11800H | 2 x 16 GB DDR4 | Windows 10 64 Bit |
| MSI GP65 | Nvidia GeForce GTX 1660 Ti (6 GB GDDR6) | Intel Core i7-9750H | 2 x 8 GB DDR4 | Windows 10 64 Bit |
| MSI GP75 | Nvidia GeForce GTX 1650 Ti (4 GB GDDR6) | Intel Core i7-10750H | 2 x 8 GB DDR4 | Windows 10 64 Bit |
| MSI GL75 | Nvidia GeForce GTX 1650 (4 GB GDDR6) | Intel Core i7-10750H | 2 x 8 GB DDR4 | Windows 10 64 Bit |
| Dell G5 15 SE | AMD Radeon RX 5600M (6 GB GDDR6) | AMD Ryzen 7 4800H | 2 x 8 GB DDR4 | Windows 10 64 Bit |
| MSI Prestige 14 Evo | Intel Iris Xe (96 EUs) | Intel Core i7-1185G7 | 1 x 16 GB DDR4 | Windows 10 64 Bit |
| MSI Prestige 14 Evo | Intel Iris Xe (96 EUs) | Intel Core i7-1195G7 | 2 x 8 GB DDR4 | Windows 11 |
| Asus ROG Strix G15 | AMD Radeon RX 6800M (12 GB GDDR6) | AMD Ryzen 9 5900HX | 2 x 8 GB DDR4 | Windows 10 64 Bit |
| Acer Swift 3 SF314-42 | AMD Radeon Vega 7 | AMD Ryzen 7 4700U | 2 x 4 GB DDR4 | Windows 11 |
| Desktop PC I | MSI GeForce RTX 3090 (24 GB GDDR6X), MSI GeForce RTX 3080 (10 GB GDDR6X), MSI GeForce RTX 3070 (8 GB GDDR6) | Intel Core i9-9900K | 4 x 8 GB DDR4 | Windows 10 64 Bit |
| Desktop PC II | Nvidia GeForce RTX 3090 FE (24 GB GDDR6X), Nvidia GeForce RTX 3060 (12 GB GDDR6), Nvidia Titan RTX (24 GB GDDR6), Nvidia GeForce RTX 2070 Super (8 GB GDDR6), Nvidia GeForce RTX 2060 Super (8 GB GDDR6), KFA2 GeForce GTX 1660 Super (6 GB GDDR6), PNY GeForce GTX 1660 (6 GB GDDR5), KFA2 GeForce GTX 1650 Super (4 GB GDDR6), KFA2 GeForce GTX 1650 (4 GB GDDR5), AMD Radeon RX 6700 XT (12 GB GDDR6), AMD Radeon RX 6600 XT (8 GB GDDR6), AMD Radeon RX 6600 (8 GB GDDR6), AMD Radeon RX 5700 XT (8 GB GDDR6), AMD Radeon RX 5700 (8 GB GDDR6), AMD Radeon RX 5600 XT (6 GB GDDR6), AMD Radeon RX 5500 XT (8 GB GDDR6) | AMD Ryzen 9 5900X | 2 x 32 GB DDR4 | Windows 10 64 Bit |
| Desktop PC III | Nvidia GeForce RTX 2080 Super FE | Intel Core i9-11900K | 2 x 8 GB DDR4 | Windows 11 |
| 4K monitors | Nvidia driver | AMD driver |
|---|---|---|
| Asus PB287Q, Philips Brilliance 329P9H, Acer Predator XB321HK | ForceWare 511.23 | Adrenalin 22.1.1 |