Nvidia’s Project G-Assist AI assistant is now available via the Nvidia app. First showcased at Computex 2024, the assistant is meant to optimize game and system settings for RTX desktop users, apply overclocks, launch games, and control various peripheral settings, all through basic voice and text commands. However, it is only available for RTX GPUs with 12GB of VRAM or more, and it comes at the cost of some GPU performance, though only temporarily.
Available only for RTX 30, 40, and 50 series desktop GPUs, the Project G-Assist AI assistant uses a third-party SLM (Small Language Model) that runs locally. Because of its small scale, it is not meant for broad conversations but for very specific tasks. Since it runs locally, it briefly takes up GPU resources, during which users may notice “a short dip in render rate or inference completion speed.” In other words, when gaming or running GPU-heavy apps, performance may drop for a few seconds while the AI assistant is active.
Speaking of activation, G-Assist can be woken up with the Alt+G shortcut and asked to do things like optimize graphics settings or check temperatures. Other supported functions include providing information about Nvidia technologies, GPU overclocking, launching games, saving clips, checking for driver updates, and more. Beyond these, the assistant can also control peripheral and room lighting through a plugin, provided the devices are supported.
This SLM is already capable of quite a lot of functions, and the full list can be found here.
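For readers curious how a plugin like the lighting one mentioned above might hook into the assistant, here is a minimal, hypothetical sketch of a plugin process that accepts JSON commands and reports results. The message format, function names, and dispatch loop are illustrative assumptions for the sake of example, not Nvidia’s actual plugin API.

```python
import json
import sys

# Hypothetical example: a plugin loop that reads one JSON request per line
# from stdin (e.g. {"function": "set_lighting", "params": {"color": "red"}})
# and writes a JSON response to stdout. The schema is assumed, not official.

def set_lighting(params):
    color = params.get("color", "white")
    # A real plugin would call the peripheral vendor's SDK here.
    return {"success": True, "message": f"Lighting set to {color}"}

HANDLERS = {"set_lighting": set_lighting}

def main():
    for line in sys.stdin:
        if not line.strip():
            continue
        request = json.loads(line)
        handler = HANDLERS.get(request.get("function"))
        if handler is None:
            response = {"success": False, "message": "Unknown function"}
        else:
            response = handler(request.get("params", {}))
        sys.stdout.write(json.dumps(response) + "\n")
        sys.stdout.flush()

if __name__ == "__main__":
    main()
```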
For now, G-Assist is only available for desktop RTX GPUs, with laptop GPU support coming in a future update. Nvidia has not said whether it will be possible to support GPUs with less than 12GB of VRAM, a notable question since the upcoming RTX 5060 and 5060 Ti have 8GB variants.