NVIDIA Optimus Technology: Performance and Battery Life for your Notebook
NVIDIA Optimus Technology
NVIDIA has come up with a solution that they think answers all of these problems with a combination of hardware, software and driver work. The idea behind this new Optimus technology is easy enough to envision: use the discrete NVIDIA GPU when the application or task being run would see performance or quality gains from it, and let the IGP handle everything else. Things aren't always that easy, though, as enthusiast PC users are well aware, so the completeness of what NVIDIA has done should impress anyone.
For all intents and purposes, NVIDIA's goals for Optimus appear to have been met. On our Optimus-powered system the GPU was in its OFF state when we booted it up and sat idle at the desktop. It is important to note that by off, NVIDIA truly means off: the GPU is completely powered down, as are the PCI Express lanes from the chipset or CPU to the GPU. The system is effectively running in the exact same fashion as a system without a discrete graphics solution at all. When we open an application that utilizes the GPU, for example a Firefox window playing Flash video, the GPU is powered on and takes over the processing of that application. When we close that tab or navigate to a different web page, the GPU powers back down (again, completely OFF) and the IGP takes over once again.
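The power-state behavior described above can be sketched as a simple policy: the driver keeps the discrete GPU (and its PCIe link) fully off until a workload that benefits from it starts, then powers it back down when that workload ends. The class and workload names below are purely illustrative assumptions, not NVIDIA's actual driver API:

```python
# Hypothetical sketch of the Optimus routing policy; names are illustrative,
# not part of any real NVIDIA driver interface.
class OptimusRouter:
    # Workload types the driver's profile list might flag for the dGPU.
    DGPU_WORKLOADS = {"3d_game", "cuda", "flash_video_decode"}

    def __init__(self):
        self.dgpu_on = False   # dGPU (and its PCIe lanes) start powered off
        self.active = set()    # currently running GPU-relevant workloads

    def start(self, workload):
        self.active.add(workload)
        self._update_power()

    def stop(self, workload):
        self.active.discard(workload)
        self._update_power()

    def _update_power(self):
        needed = bool(self.active & self.DGPU_WORKLOADS)
        if needed and not self.dgpu_on:
            self.dgpu_on = True    # power up the dGPU and its PCIe link
        elif not needed and self.dgpu_on:
            self.dgpu_on = False   # completely off again; IGP handles everything

router = OptimusRouter()
assert not router.dgpu_on              # idle at the desktop: dGPU off
router.start("flash_video_decode")     # open a tab playing Flash video
assert router.dgpu_on                  # dGPU powers on to accelerate decode
router.stop("flash_video_decode")      # close the tab
assert not router.dgpu_on              # dGPU is completely off once more
```

The key point the sketch captures is that the transition is driven entirely by the workload set, with no user action required in either direction.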
The custom NVIDIA utility tells us if the discrete GPU is on or off. Here the GPU is off since we aren't doing anything the GPU can really improve on.
Because the technology is so seamless, a user will likely never know that the GPU is switching on or off. For that reason, NVIDIA built a custom application for demonstration purposes (one they will eventually release to the public) that indicates the state of the GPU to the user. Of course, for most users, that seamlessness makes Optimus all the more appealing - they never have to be aware of the background processes going on at all.
Now that we have started playing our H.264 HD Flash video the GPU turns on (seamlessly) and begins to accelerate the decode process.
Our test system for Optimus was built on the Intel Core 2 architecture, though NVIDIA claims that it works identically with the new Arrandale processors (even with their integrated GPU) and with the Atom processor as well. This means that notebooks of every size, from the smallest netbooks to the largest gaming rigs, will be able to take advantage of Optimus. NVIDIA also claims to have integrated the necessary Optimus hardware into its existing 40nm GPUs, including the GeForce 200M and 300M series, and of course into the upcoming Fermi-based derivatives. So not only will notebooks of any CPU class be able to use Optimus, but so should notebooks with any class of discrete GPU.
How does Optimus address the other complaints from hardware vendors? For one, notebooks no longer need to integrate hardware multiplexers to switch the display between the two graphics solutions. Instead, Optimus uses the IGP nearly full time as the display controller, so the integrated GPU is the only one that needs to be connected to the displays. It also means the discrete and integrated GPUs share a frame buffer in system memory: NVIDIA has added a new copy engine that moves the contents of the discrete GPU's frame buffer into system memory, from which the IGP drives the output. I have some more details on that a bit later.
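The display path described above can be sketched in three steps: the discrete GPU renders into its own VRAM, the copy engine transfers the finished frame into a buffer in system memory, and the IGP, the only GPU wired to the panel, scans that buffer out. The code below is a toy illustration of that flow under those assumptions, not NVIDIA's actual implementation; all function names and sizes are made up for the sketch:

```python
# Toy model of the Optimus display path: dGPU render -> copy engine -> IGP scanout.
WIDTH, HEIGHT, BPP = 4, 2, 4                # a tiny "frame" for illustration
FRAME_BYTES = WIDTH * HEIGHT * BPP

def dgpu_render(frame_id):
    """Pretend the discrete GPU rendered a frame into its local VRAM."""
    return bytearray([frame_id % 256] * FRAME_BYTES)

def copy_engine(vram_frame, system_buffer):
    """Copy the dGPU's frame buffer into IGP-visible system memory."""
    system_buffer[:] = vram_frame           # DMA-style block transfer

def igp_scanout(system_buffer):
    """The IGP reads the shared system-memory buffer and drives the panel."""
    return bytes(system_buffer)

system_fb = bytearray(FRAME_BYTES)          # shared frame buffer in system memory
vram_fb = dgpu_render(7)                    # 1) render on the discrete GPU
copy_engine(vram_fb, system_fb)             # 2) copy engine moves the frame over
panel = igp_scanout(system_fb)              # 3) IGP displays it

assert panel == bytes([7] * FRAME_BYTES)    # the panel shows the dGPU's frame
```

Because the IGP always owns the panel in this scheme, no display multiplexer is needed and the dGPU can power off without the screen ever changing hands.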
This design is much simpler for hardware vendors to implement than the previous switchable graphics method.
If you didn't see our full video review of NVIDIA Optimus Technology (shame!), I have at least included some video screencasts that demonstrate Optimus at work on our test notebook.