NVIDIA Optimus Technology: Performance and Battery Life for your Notebook
Current Switchable Graphics Options
The current generation of switchable graphics was originally introduced in 2007 by both NVIDIA and AMD. At the time it was labeled as the "best of both worlds", just as Optimus is being called today, so obviously we need to scrutinize any and all claims. Before 2007, switchable graphics was even less impressive: it relied on a physical hardware switch on the notebook itself and required a full system reboot. The switch actually changed a BIOS option to select the default graphics adapter, and only a reboot would allow the operating system to see the new graphics solution and utilize it.
Things did improve with the 2007 iteration as the switch process became a software-only operation. That did not mean it was seamless, however; switching between the two graphics modes (integrated or discrete) required the user to consciously decide to make the change. This might seem like an easy task for most PC Perspective readers, but keep the others in your life in mind when considering it. Would your mom, dad or sister really know that in order to play Sims 3 with the best experience they would need to adjust something called a graphics card?
Once you made that decision, the next step was to find the method for doing so, which varied from manufacturer to manufacturer and was called wildly different things. On a Mac, you had to open System Preferences, go into the Energy Saver options and change between "high performance" and "longer battery" settings, while on the ASUS UL50 we are comparing today you need to force the power profile from "Balanced" to "High Performance". The confusion amongst these options was definitely a problem for everyday consumers and for those of us who are supposed to be teaching them.
After both deciding to switch and actually switching the hybrid graphics mode, you might still not be done. After some screen flickering (which will likely scare even the most hardcore users) and a 5-10 second wait, if you happen to have applications open that were using resources on the GPU, you'll have to close them. If you had a few browser windows open with a few tabs each and maybe were running Excel and sneaking some Solitaire, all of them would have to be shut down in order to actually enable the discrete GPU.
Let's take a look at this process on the current generation switchable graphics in what I would consider a "best case" scenario.
Here you can see the system in IGP graphics mode, called "Save Power" - only one of the NVIDIA logo arrows is illuminated. There is an indication that you can increase performance, but not much more information is provided at a glance.
As we mentioned above, with this ASUS machine you can switch to discrete GPU mode by enabling the high performance profile.
However, because we had some programs open (as most consumers tend to do) we are notified that we haven't actually switched into high performance mode. By clicking the exclamation mark we get this information:
Well, isn't that intuitive. We need to close Solitaire and my YouTube video in order to run at the higher performance setting.
After closing those applications and clicking the appropriate button in the dialog box, we get a flashing screen and this briefly visible indicator that we are moving into the world of the discrete graphics solution.
The change is now complete, so we can re-open our applications and take advantage of the GPU. I don't think it's a stretch to say that this is not the most user-friendly of processes. NVIDIA claims that less than 1% of users with switchable graphics actually take advantage of the feature - a surprising statistic that I somehow believe: I have two machines with switchable graphics that I never switched out of IGP mode until the testing for this article required it.
Here is a video that goes over the annoying process of changing the GPU in a switchable graphics solution:
Current Switchable Graphics Architecture
The current array of switchable graphics options is a mish-mash of hardware and software that appears to have been hacked together. Take the software overview, for example:
Because Windows Vista only allowed a single GPU driver to exist in the system at one time, it required cooperation between the two graphics vendors to produce the one driver the system would use. The "Display Driver Interposer" essentially interpreted information from the operating system and passed it on to the correct GPU depending on which mode the system was in - balanced or high performance. This meant that drivers for each independent GPU (the IGP and the discrete) could not be updated on their own, and consumers who wanted the latest NVIDIA driver had to wait for their notebook vendor (and NVIDIA and Intel) to play nice and release it. Good luck with that, right?
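The interposer idea boils down to a dispatcher: the OS talks to one driver, and that driver forwards work to whichever GPU is currently active. Here is a minimal conceptual sketch in Python of that routing pattern; it is not actual driver code, and all class and method names are hypothetical illustrations:

```python
# Conceptual sketch only (not real driver code): how a "display driver
# interposer" might route OS requests to the active GPU's driver.
# Class names, method names and version strings are illustrative.

class GpuDriver:
    def __init__(self, name, version):
        self.name = name
        self.version = version  # both versions ship together in one package

    def draw(self, command):
        return f"{self.name} executed {command}"

class DisplayDriverInterposer:
    """The single driver the OS sees; forwards work to the active GPU."""
    def __init__(self, igp, discrete):
        self.drivers = {"power_saving": igp, "high_performance": discrete}
        self.mode = "power_saving"

    def switch_mode(self, mode):
        # In the real system this step also required closing GPU-bound
        # apps and electrically re-muxing the display outputs.
        self.mode = mode

    def draw(self, command):
        # OS calls hit the interposer, which dispatches to one driver.
        return self.drivers[self.mode].draw(command)

igp = GpuDriver("Intel IGP", "15.x")
dgpu = GpuDriver("NVIDIA GPU", "186.x")
interposer = DisplayDriverInterposer(igp, dgpu)
print(interposer.draw("render frame"))   # handled by the IGP
interposer.switch_mode("high_performance")
print(interposer.draw("render frame"))   # handled by the discrete GPU
```

The sketch also makes the update problem obvious: because both drivers live behind one interposer, neither vendor can ship a new version independently without rebuilding the combined package.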
The hardware situation wasn't any nicer; in fact, notebook designers had to deal with additional hardware multiplexers between the displays (both the integrated panel and external connections) and the two GPU options.
These muxes electrically changed which graphics solution was connected to the displays based on the current configuration of the system. This is what caused the screen flickering for the user, and it also forced additional design considerations and added cost for the OEMs.
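Logically, each mux is just a selector that points every display output at one GPU or the other. A tiny sketch, assuming hypothetical output names, of what the mux configuration amounts to:

```python
# Conceptual sketch: the display mux as a selector. In real hardware this
# is an electrical switch on the video signal paths, which is why the
# screen blanks/flickers during the handover. Output names are illustrative.

def mux_select(active_gpu, outputs=("internal panel", "VGA", "HDMI")):
    """Route every display output to the currently active GPU."""
    return {out: active_gpu for out in outputs}

# In "Save Power" mode every output is wired to the IGP...
routing = mux_select("IGP")
# ...and switching to high performance re-routes all of them at once,
# briefly interrupting the signal while the source changes.
routing = mux_select("discrete GPU")
```

Every one of these selectable signal paths is an extra physical component on the motherboard, which is where the added OEM cost comes from.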
This design is a hassle for both the consumer and the hardware vendor, and people on both sides could really use an improvement in the technology.