NVIDIA and Havok Bring SLI Physics to Life
Possible Issues, ATI and Final Thoughts
Some Possible Problems
Life is not all roses for NVIDIA in physics-land. This implementation of physics processing raises some issues that don't exist when a separate card is used, as in AGEIA's implementation. Most notable is the issue of balance: how do NVIDIA's driver, the game developer, and the end user control how much GPU processing power is devoted to graphics and how much to physics? With AGEIA's single PPU product, the developer can assume the level of physics processing in each system is the same, but NVIDIA's and Havok's solution supports a range of GPUs as well as different modes of operation (single, SLI, Quad, etc.). This means developers can't just flip a physics on/off switch in their games; the solution has to be much more fluid than that.
We can't just leave it to the game developer, as we have seen time and again that the 'optimal' settings a game detects for a given piece of hardware are often uninspiring and don't utilize the hardware fully. Adding more profiles to the NVIDIA driver is probably the best option, and the high-end power user would surely want some detailed control as well. It could get confusing quickly, but it isn't a problem so big that it can't be addressed with careful programming and engineering on both the game developers' and NVIDIA's ends.
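To make the balancing problem concrete, here is a minimal, entirely hypothetical sketch. Nothing here reflects NVIDIA's actual driver or the Havok FX API; the profile names, preset fractions, and split logic are all assumptions, used only to illustrate how a per-game profile might apportion GPU time between graphics and physics across different SLI modes.

```python
# Hypothetical illustration only -- not NVIDIA driver code. It shows the kind
# of trade-off a driver profile would have to encode: what fraction of total
# GPU time goes to physics versus graphics, depending on how many GPUs are
# installed and which (made-up) profile the user or developer picks.

def physics_budget(gpu_count: int, profile: str = "balanced") -> dict:
    """Return the fraction of GPU time given to physics and to graphics."""
    # Assumed preset fractions of a single GPU's time spent on physics.
    presets = {
        "graphics-first": 0.10,  # power user who favors frame rate
        "balanced": 0.25,
        "physics-heavy": 0.40,   # effects-rich scenes
    }
    per_gpu = presets[profile]
    if gpu_count >= 2:
        # In an SLI or Quad setup, physics work could be confined to one
        # card, capping its cost relative to the whole system.
        physics_share = min(per_gpu, 1.0 / gpu_count)
    else:
        physics_share = per_gpu
    return {"physics": physics_share, "graphics": 1.0 - physics_share}

split = physics_budget(2, "balanced")  # e.g. {'physics': 0.25, 'graphics': 0.75}
```

The point of the sketch is that the "right" split is not a single switch: it depends on the GPU configuration, the game's needs, and the user's preference, which is exactly why driver profiles plus user overrides seem like the workable approach.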
It should be noted that though NVIDIA is the first to form a business partnership to get physics on the GPU rolling, they weren't the first to talk publicly about competing against AGEIA with graphics cards. During the X1800 and X1900 launches, ATI demonstrated physics calculations being done on their GPUs and even brought in a researcher from Stanford to talk about the GPGPU (general-purpose GPU) era.
We don't yet have any information on a similar partnership between ATI and another developer, or on whether the Havok FX API will work with both NVIDIA and ATI GPUs, though it seems quite possible that it will. In the spirit of open standards and competition, I hope both ATI and NVIDIA are able to work with Havok on accelerating physics for everyone. Not only would end users get the best of it, but we would also be able to accurately test physics processing on NVIDIA's GPUs versus ATI's and determine performance leaders for this and future GPU generations.
This week at GDC, both Havok and NVIDIA will be showing technical demos of Havok FX in action, running on NVIDIA GPUs. Though I have not seen the demos yet, NVIDIA gave a brief synopsis of them and told us not to expect anything dramatic quite yet. There probably won't be any game announcements, or any titles far enough into development to show off the physics technology, as the Havok FX API has not yet been finalized.
While we don't have many details on SLI Physics quite yet, I imagine we will learn much more during the week at GDC in San Jose. We should also hear much more from AGEIA about their PhysX processor during GDC, and it will be interesting to see how they respond to NVIDIA's attempt to overshadow them today. All that is known for sure is that NVIDIA has just raised the entry fee for the physics processing market, and any competitor had better come with a strong product and solid support on all sides.
It looks like physics might be a whole lot of fun in 2006!
Be sure to use our price checking engine to find the best prices on the NVIDIA 7900 GTX, and anything else you may want to buy!