
The State of NVIDIA: For better or for worse


A week's worth of thoughts on NVIDIA

No other company fields more questions about its future than NVIDIA.  Since the first scent of the AMD/ATI merger, NVIDIA has essentially been on a death clock as media, analysts and consumers tried to figure out when the once-dominant graphics chip designer would find itself swallowed by an industry moving in directions it simply could not follow.  Since then NVIDIA has fought diligently against those preconceptions and has recently made considerable changes in fundamental corporate direction to prevent any such culling. 

During last week's NVIDIA GPU Technology Conference, we talked with a host of NVIDIA executives, designers and, more importantly, developers to get a sense of where NVIDIA stands in the current market, where the upcoming Fermi architecture will take it in 2010 and where the company will be in 5-10 years. 

A brief history...

Before we start looking at the future, NVIDIA has some issues to deal with today.  As it stands now, its primary competitor has the best performing GPU, the most power efficient GPU and arguably the most interesting features on a graphics card.  The GT200-based GPUs available today are not bad parts - the GTX 285 in particular can stand up against AMD's Radeon HD 5870 and HD 5850 in some regards - but PC Perspective's overall opinion is that NVIDIA's options are the second pick out of two.  Obviously NVIDIA doesn't like being in that position, but until its next generation of parts arrives in early 2010, that will likely remain the case.

NVIDIA does have some differentiating features for its GPUs, including PhysX, CUDA and 3D Vision.  While each of these is interesting, they have faults and issues that keep them from having enough pull with gamers to overcome the power of the Radeon HD 5000-series. 



NVIDIA 3D Vision glasses

3D Vision is probably my personal favorite of these three features - the ability to get a very smooth and eye-catching 3D gaming experience from the PC is impressive.  The problems with the technology lie in price and screen size: as of this writing, only 22-in monitors are available with 3D Vision support, and many gamers have moved past that size to 24-in, 27-in and even 30-in displays.  Once you are used to a larger screen it is VERY unlikely that you will want to move backwards for any reason.  Price is also prohibitive - a 3D Vision monitor and glasses kit will cost you about $500, and considering you can get a standard 22-in monitor for well under $200, that is a significant premium for the ability to play in 3D.  We are hearing rumors that larger screens will be available around the CES time frame, but if cost doesn't decrease as well, I still don't think 3D Vision will get the user base NVIDIA is hoping for.

The story of CUDA is another beast completely - we have spent quite a few articles looking at various GPU computing applications and the advantages they provide to users.  In truth, the landscape today is still very limited, and only in select circumstances are GPU-based applications available and recommended.  This is likely to change as the Fermi architecture announced at the GPU Technology Conference arrives and as OpenCL and DirectCompute become pervasive.

PhysX is probably the most controversial feature in NVIDIA's war chest.  PhysX was originally created by AGEIA, then acquired by NVIDIA and ported to run on NVIDIA GPUs.  NVIDIA faces the same problem AGEIA did, albeit to a lesser degree: getting developers to implement PhysX support and hardware acceleration in ways interesting enough to get gamers' attention.  Without diving too deeply into the debate over its success, let's just say we have not been convinced. 

Some recent controversy has been stirred up around PhysX thanks to Windows 7 and its ability to run multiple graphics drivers on the same system - a feature that existed in Windows XP but was not available in Windows Vista.  Since NVIDIA never officially supported PhysX on Windows XP, it never had to deal with the dilemma of running PhysX in a multi-vendor system.  With Windows 7, a user could in theory have a Radeon HD 4890 installed for graphics rendering and a GeForce 9800 GT for PhysX hardware acceleration - and in fact, with early drivers this was possible.  Now, however, NVIDIA has disabled that capability as of the 185.85 driver set and caused an uproar among consumers. 



Batman: Arkham Asylum with PhysX effects

NVIDIA claims the reason for the change is to keep the PhysX user experience as high as possible: the company doesn't have the ability to test all permutations of NVIDIA GPU, AMD GPU, NVIDIA driver and AMD driver to ensure correct communication between the PhysX computation and the graphics rendering.  While that may be a valid point, and everyone I have talked to at NVIDIA believes they are doing the right thing for the gamer, I would argue that most gamers disagree - as do I.  By not letting the feature continue as an "unsupported configuration" (much like early overclocking registry mods), NVIDIA is not only damaging its reputation with the community but is keeping PhysX from a significant audience that might eventually have been convinced to buy more NVIDIA cards in the future.

Another recent controversy around NVIDIA (the company seems to find them frequently) has to do with its The Way It’s Meant to Be Played (TWIMTBP) program and its relationship with game developers.  Recently a marketing rep from AMD claimed that some recently released titles were "proprietary" and that NVIDIA's involvement in development caused those software teams to unfairly tweak their games for NVIDIA's hardware.  The titles in question?  Batman: Arkham Asylum, Resident Evil 5 and Need for Speed: Shift.  Batman was especially controversial because it enabled in-game antialiasing ONLY on NVIDIA hardware, while AMD cards were forced to use control-panel-based AA.  The difference matters: there are definite performance advantages to letting the game engine itself decide how and where to apply antialiasing rather than "brute forcing" the entire scene.

In reality, I think this claim from AMD is largely unfounded - NVIDIA has long been accused of such things, but AMD has similar relationships with developers; see games like BattleForge, DiRT 2 and Tom Clancy's H.A.W.X.  The truth is that both companies work as closely as possible with developers to make sure the latest titles run as well as possible on their own hardware.  But without a doubt, NVIDIA's efforts in this area are much more extensive: its developer relations team is significantly larger, has a significantly larger budget and in general works with more developers than AMD's.  In the case of Batman's AA support, NVIDIA essentially built the AA support explicitly for Eidos - AA didn't exist in the game engine before that.  NVIDIA knew the title was going to be a big seller on the PC and spent the money and time to get it working on its hardware.  Eidos told us in an email conversation that AMD was offered the chance to send engineers to the studio and do the same work NVIDIA did for its own hardware, but AMD declined. 

Unfortunately for NVIDIA, though, public perception has been clouded, and all that corporate negativity will add up over time. 

Looking solely at the status of NVIDIA today, the company is in a very tough position.  The GeForce line of graphics cards isn't competing well against AMD's new HD 5800-series, and NVIDIA's prices will have to drop significantly to move product off the shelves.  The secondary features NVIDIA hopes will convince gamers to sacrifice raw performance are themselves underperforming at best - PhysX, CUDA and 3D Vision each have the problems discussed above.  And, probably worst of all, the general trend in community opinion about NVIDIA and its attitude toward gamers is on the decline.  Whether or not that opinion shift is warranted isn't important - NVIDIA needs to do something to fix it if it wants to earn back the complete trust of gamers.
