
Nvidia Optimus Review

by Klaus Hinum 02/09/2010

Switchable Graphics 3.0

Nvidia Optimus switches between graphics cards automatically, depending on the application, and theoretically bridges the gap between performance and battery life. On top of that, Optimus is easier and cheaper for manufacturers to implement. A detailed review of Optimus on an Asus UL50Vf follows in the article below.

Nvidia Optimus

Dilemma: Performance versus battery life

Without switchable graphics, a potential laptop buyer usually has to choose between performance (a powerful dedicated graphics card) and long battery life (a graphics unit integrated into the processor or chipset). To combine both advantages, buyers have so far had the option of choosing a laptop with switchable graphics.

Up to now, switchable graphics solutions (from Nvidia and ATI) have used multiplexers to route the signals between the graphics card outputs and the display ports. These multiplexers take up a lot of space on the mainboard, lengthen the signal paths and always draw a small amount of power. In addition, these solutions rely on a proxy driver (required by Windows XP and Vista) that unites the Intel and Nvidia graphics drivers. For each laptop, this proxy driver had to be built anew with considerable effort and was never updated afterwards. As a result, buyers had to live with the included drivers and couldn't benefit from the performance improvements and bug fixes of newer driver versions.

The "switchable graphics" have improved over the generations but a few grave system-related disadvantages remained. Switching the video outputs always produced an annoying flickering on the display. Aside from that, the graphic solution couldn't be switched when a DirectX application was running (e.g. Windows 7's Solitary was already enough).

Current and previous implementations of switchable graphics use multiplexers (MUX) to route the signals from the graphics cards to the displays.
A proxy driver united the Nvidia and Intel/AMD drivers to overcome the limits of Windows XP and Vista (which is why Optimus won't be released for XP and Vista).

Nvidia Optimus – The solution?

Nvidia has now presented the third generation of its switchable graphics solution, called Optimus. It allows dynamic switching to the dedicated graphics card without rebooting, closing applications or intervening manually. At the moment, Optimus only works under Windows 7 with Nvidia graphics cards manufactured in 40 nm (most of the latest GeForce 200M and 300 series cards – see our graphics card comparison) in combination with Intel's integrated graphics solutions (GMA 4500MHD, 4500M, Intel HD Graphics, GMA 3150). Nvidia's newer graphics cards integrate a dedicated piece of hardware that is necessary for Optimus (which is why Optimus doesn't work on the G92-based GeForce GTX 285M, 260M,… models). Theoretically, the integrated graphics isn't limited to Intel, but Nvidia currently supports only Intel's integrated graphics.

Depending on the performance requirements (defined in the driver), either Nvidia's graphics card or the slower, more frugal Intel graphics renders the window contents. The output is always handled by the Intel IGP.

How does it work? – The Technology

Technically, only the integrated graphics is attached to the display connections (internal and external) and always outputs the displayed image. As soon as an application needs the performance of the discrete Nvidia graphics card, the driver enables it and renders the images with it. The result isn't output directly (as it usually would be) but is simply copied into the output memory (frame buffer) of the integrated graphics via the PCI-E bus. In other words, the integrated graphics simply displays a window whose content originates from the dedicated graphics card. The PCI-E bus's return channel is normally not used much and should therefore easily cope with this task. According to Nvidia's engineers, the copying creates a latency of about 0.2 frames, so the delay shouldn't be a major problem. As soon as the stronger graphics card isn't needed any more, the driver simply disables it again completely and leaves all the work to the Intel graphics. Nvidia showed us in a demonstration that the dedicated graphics card can theoretically even be unplugged while it's not in use.
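
To make the flow described above easier to picture, here is a minimal, purely illustrative Python sketch. All class and function names are our own invention and do not correspond to any real Nvidia API; the actual logic lives inside the driver.

    # Illustrative sketch of the Optimus render path (hypothetical names).
    class IntegratedGPU:
        """Always wired to the internal and external displays."""
        def __init__(self):
            self.framebuffer = None

        def render(self, scene):
            self.framebuffer = f"IGP-rendered: {scene}"

        def scan_out(self):
            # Only the IGP ever drives the display outputs.
            return self.framebuffer

    class DiscreteGPU:
        """Powered up only while an application needs its performance."""
        def __init__(self):
            self.powered = False

        def render(self, scene):
            self.powered = True
            return f"dGPU-rendered: {scene}"

        def power_off(self):
            self.powered = False

    def present_frame(scene, needs_performance, igp, dgpu):
        if needs_performance:
            frame = dgpu.render(scene)
            # Copy the finished frame over the PCI-E bus into the IGP's
            # frame buffer (the ~0.2 frame latency mentioned above).
            igp.framebuffer = frame
        else:
            dgpu.power_off()       # the dGPU is switched off completely
            igp.render(scene)
        return igp.scan_out()      # output always comes from the Intel IGP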

All output devices are only connected to the integrated graphic card.
The Nvidia graphic card uses the PCI-E bus to copy the data into the frame buffer.

A new driver architecture

The user can manually define the preferred graphics card in the driver profile (and thus override the default).
In Windows 7's Device Manager, both installed graphics cards show up.

One problem that turned up with previous switchable graphics solutions was "the user". Only a few actually switched graphics cards when it made sense. Some customers weren't even aware of the feature, and dissatisfaction with the laptop was therefore often caused by the complicated handling of switchable graphics. To solve this problem and always provide the best-suited graphics solution automatically, Nvidia had to introduce a new driver architecture.

The driver uses a list of profiles that specify the best graphics solution for a given application. At the start of every application, the driver looks for a matching profile and then automatically decides which graphics card should be used. Obviously, this system only works well if the list is always up to date. But because only a few users update their graphics drivers, and notebook drivers are released less frequently anyway, such a list would hardly improve over time. That's why Nvidia's driver automatically updates the profile list from the internet. Even an older driver should therefore theoretically recognize the newest applications and run them with the best graphics solution. Additionally, the user can alter the profiles manually or select the graphics card to be used via the context menu. Neither worked entirely reliably in the early version we had, but that is probably down to the early beta stage of the driver. The context menu method, for example, had problems with shortcuts on the desktop, and the profiles couldn't be changed permanently by the user.
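
Conceptually, the profile mechanism boils down to a lookup table that maps applications to a preferred GPU, with a power-saving default for everything unknown. The following Python sketch is purely illustrative; the profile entries, names and fallback behaviour are our assumptions, not Nvidia's actual implementation:

    # Hypothetical illustration of profile-based GPU selection.
    # Real profiles are maintained and auto-updated by Nvidia's driver.
    GPU_PROFILES = {
        "game.exe":       "dGPU",  # demanding 3D title -> GeForce
        "transcoder.exe": "dGPU",  # CUDA / DirectX Compute workload
        "browser.exe":    "IGP",   # everyday work stays on Intel graphics
    }

    USER_OVERRIDES = {}            # manual choices take precedence

    def choose_gpu(executable_name):
        if executable_name in USER_OVERRIDES:
            return USER_OVERRIDES[executable_name]
        # Unknown applications fall back to the frugal IGP until an
        # automatically downloaded profile update adds an entry for them.
        return GPU_PROFILES.get(executable_name, "IGP")

    # Example: the user forces a new game onto the GeForce via the context menu.
    USER_OVERRIDES["newgame.exe"] = "dGPU"
    print(choose_gpu("newgame.exe"))   # -> dGPU (manual override)
    print(choose_gpu("game.exe"))      # -> dGPU (from the profile list)
    print(choose_gpu("unknown.exe"))   # -> IGP  (no profile yet)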

The new driver architecture allows automatic profile updates over the internet using an encrypted connection. Therefore, Optimus should be able to recognize new applications and games.

When will Optimus find its way into notebooks?

According to Nvidia, more than 50 Optimus-equipped laptops will be available by summer. The systems range from small Pine Trail netbooks (with ION 2 graphics) up to full-fledged desktop replacements / gaming notebooks (with a GTS 360M, for example). Asus will be a launch partner and will shortly bring the UL50Vf, N82Jv, U30Jc, N71Jv and N61Jv notebooks onto the market.

Furthermore, Nvidia showed us a demonstration of a working Lenovo ThinkPad T model, but Lenovo is still assessing the use of Optimus and will launch without this technology for now.

Optimus could theoretically also be used with desktop graphics cards. The upcoming Fermi-based high-end cards in particular could profit from this feature (in terms of energy savings and noise levels).

Asus N61 with Optimus
Asus N82 with Optimus
Asus U30 with Optimus

Practical test with Asus' UL50Vf

For a practical test, Nvidia provided us with an Asus UL50Vf notebook (basically identical to the UL50Vt, but with Optimus) featuring a discrete Nvidia GeForce G210M and an integrated GMA 4500MHD graphics chipset. The G210M is an entry-level graphics card, but it delivers at least three times the gaming performance of the old GMA 4500MHD. Many demanding games only run smoothly with the G210M in the first place (although only at minimum detail settings). Moreover, the G210M supports CUDA and DirectX Compute, which allow the GPU shader cores to be used for other computing tasks (e.g. video transcoding).

To see which graphics card is currently active, Nvidia provided us with a still unofficial tool that shows the status of the GeForce graphics. This tool isn't intended for consumers, but a similar tool is planned for end users later (possibly as part of a new driver release).

All games and applications recognized by the driver ran as expected in our test. As soon as a game like World in Conflict was started, the G210M was automatically enabled and took over the rendering. The graphics card was also disabled immediately after the application was closed. During enabling and disabling we didn't notice any image flicker or longer waiting times. We could only detect the switching with Nvidia's test tool and by the expected performance. Optimus also already works well with Flash 10.1 and enables the Nvidia graphics card only when a Flash video is started (Intel's graphics remains responsible for other Flash applications).

The driver is therefore also able to switch the graphics card on triggers other than the start of an application (if the application supports Optimus).

Because Optimus isn't based on any driver "hacks", you don't have to expect unusual errors in games or applications. We tested 23 of the most common games on the Asus UL50Vf (see gaming list) and didn't encounter a single error. The switching also always went well, except for one time (this error is probably due to the early beta driver). If a game wasn't listed in the driver, it had to be assigned to the G210M manually via a context menu entry or permanently via a game profile in the driver. Since the driver architecture updates its profiles automatically, this should hardly ever be necessary in the future. The only problem we sometimes had was deliberately starting a game with the integrated graphics (overruling the driver). However, this is probably easy to live with, because users are unlikely to start a 3D application with the slower GMA graphics on purpose.

A special case is the built-in HDMI port of the Asus UL50Vf. It appears to be connected not to the integrated graphics but to Nvidia's GeForce G210M (perhaps to integrate the S/PDIF audio into the signal). Thus, the G210M always starts up when an HDMI monitor or TV is connected. This is a bit annoying if you want to use an external monitor via HDMI permanently (because of the higher power consumption). The VGA port, which is also available, worked well with both the GMA 4500MHD and the G210M (including switching).

By the way, the dedicated hardware switch for changing the graphics card (known from the UL50Vt) has no function under Windows, because manual switching via the switch is not yet supported by the driver.

The battery test of the UL50Vf proved that the GeForce graphics really can be disabled. An excellent 11 h 47 min was reached in the BatteryEater Reader's test. With the G210M enabled, the UL50Vf ran only 8 h 20 min in the same benchmark. Under full GPU and CPU load, the runtime was almost 3 hours, compared to just short of 4 hours with the GMA under load. However, most gains are to be expected in everyday mixed use, as the automatic switching should only activate the G210M when it is really necessary. This is where the automatic switching can prove its strength and bridge the gap between performance and battery life.

The advantage of Optimus also shows in the power consumption. Without load we measured a difference of about 4 watts with the GeForce G210M enabled (10.8 watts when using the GMA). Under high load the difference was about 6 watts (Furmark + Prime95: max. 53.7 W using the G210M).
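
Purely for context, a quick back-of-the-envelope calculation based on the figures above (our own arithmetic, not additional measurements):

    # Rough arithmetic using the measured values quoted above.
    idle_gma_w  = 10.8                 # idle consumption with the Intel GMA
    idle_diff_w = 4.0                  # measured idle difference with G210M on
    print(idle_gma_w + idle_diff_w)    # ~14.8 W idle with the G210M active

    load_g210m_w = 53.7                # Furmark + Prime95 maximum with G210M
    load_diff_w  = 6.0                 # measured difference under high load
    print(load_g210m_w - load_diff_w)  # ~47.7 W under load with the GMA

    # Battery runtimes in the BatteryEater Reader's test:
    gma_min   = 11 * 60 + 47           # 11 h 47 min with the GeForce disabled
    g210m_min = 8 * 60 + 20            # 8 h 20 min with the G210M active
    print(gma_min / g210m_min)         # ~1.41 -> roughly 41 % longer runtime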

Asus UL50Vf with Nvidia Optimus

Verdict

In short, we were thrilled by Optimus. Switchable graphics solutions are finally growing up and offer evident advantages for customers in all notebook categories. On average, a system with a switchable Optimus graphics card runs more economically, cooler and quieter, and achieves a longer battery life than a comparable notebook equipped with only a discrete graphics card. Compared to current switchable graphics solutions, Optimus is more convenient and cheaper. The new driver architecture and the concept of automatic switching already work astonishingly well in this first release and usually don't require any manual intervention.

Of course, the optimum would be a single graphics card that works in all load states with minimum power consumption. But as such a solution is currently not available (and not really on the horizon), Optimus is the weapon of choice for users who need more graphics performance than an integrated graphics card can offer.

Advantages of Optimus

  • Dynamic starting and stopping of the discrete GPU when required (if the driver recognizes the application)
  • No display flickering during switching
  • Starting the Nvidia GPU is also possible if a 3D application is already running.
  • Very fast switching
  • Inexpensive to implement compared to other switchable graphics solutions
  • Based on standards (no proxy driver needed)
  • Driver updates from Nvidia are planned
  • No increased power consumption when only the integrated graphics card is in use
  • New driver architecture with automatic profile updates from the internet

Disadvantages

  • Slight latency due to the display content being copied when Nvidia's GPU is used (about 0.2 frames)
  • Driver doesn't yet recognize all games / demos / applications (should improve considerably due to the new driver architecture)
  • Currently no hardware switch implemented in first notebooks
  • Optimus currently only supports Windows 7 (Windows XP and Vista won't be supported)
  • Integrated graphics chip can't be disabled
  • The HDMI port of the Asus UL50Vf notebook can only be used with Nvidia's graphics card (not an Optimus problem)

 

As this question has already come up frequently: Optimus can't be retrofitted to current laptops that weren't shipped and advertised with this feature.
