NVIDIA Reiterates Dedication to Gaming on PC, Working with Developers

Manufacturer: NVIDIA

A necessary gesture

NVIDIA views the gaming landscape as a constantly shifting medium that starts with the PC, but the company also sees mobile gaming, cloud gaming and even console gaming as part of the overall ecosystem.  All of that is tied together by an investment in content – the game developers and game publishers that make the games we play on PCs, tablets, phones and consoles.

The slide above shows NVIDIA's targets for each segment – except for consoles, obviously.  NVIDIA GRID will address cloud gaming infrastructure, GeForce and the GeForce Experience will continue to serve PC systems, and NVIDIA SHIELD and the Tegra SoC will get the focus for the mobile and tablet spaces.  I find it interesting that NVIDIA has specifically called out Steam under the PC – maybe a hint at the future of the upcoming Steam Box?

The primary point of focus for today’s press meeting was the commitment NVIDIA has to the gaming world and to developers.  AMD has been talking up its four-point attack on gaming, which really starts with its dominance in the console market.  But NVIDIA has been the leader in the PC world for many years and doesn’t see that changing.

With several global testing facilities, the most impressive of which exists in Russia, NVIDIA tests more games, more hardware and more settings combinations than you can possibly imagine.  They tune drivers and find optimal playing settings for more than 100 games that are now wrapped up into the GeForce Experience software.  They write tools for developers to find software bottlenecks and test for game streaming latency (with the upcoming SHIELD). They invest more in those areas than any other hardware vendor.

This is a list of technologies that NVIDIA claims they invented or developed – an impressive list that includes things like programmable shaders, GPU compute, Boost technology and more. 

Many of these turned out to be very important in the development and advancement of gaming – not just for PCs but for ALL gaming. 

Continue reading our editorial on NVIDIA's stance on its future in PC gaming!

NVIDIA looked at 143 of the most recent PC games from the last 12 months and wanted to find out how much in-house developed technology was actually adopted in real-world titles.  The slide above shows what NVIDIA thinks is the result – a 40-56% adoption rate for NVIDIA technologies but a much smaller result for AMD.  Tony Tomasi, NVIDIA Senior VP of Content and Technology, did mention that they removed things like multi-monitor support that didn’t “change the game” the way FXAA and TressFX do.  I would argue wholeheartedly, though, that simply removing technologies like Eyefinity, the FIRST consumer multi-display PC gaming technology, is a PR move and puts these results in question. 

The right half of the slide tells us that NVIDIA sees impressive performance even in games that are part of the AMD Gaming Evolved program.  Based on results from four different websites (one for each game), the GTX 680 runs faster than the HD 7970 GHz Edition in Far Cry 3, Crysis 3, Tomb Raider and Bioshock Infinite.  They aren’t showing what settings, resolutions or test segments were used, but because these results come from third parties they can kind of skirt around the details.

Along with technologies is market share – something that NVIDIA is proud of.  Despite the impressive push of the Radeon HD 7000 series of cards, NVIDIA claims an increase in discrete GPU market share, to 66% today.  I’m not sure if this counts only DESKTOP discrete graphics or if it might include mobile graphics as well – I’ll get more details soon.  UPDATE: NVIDIA did confirm that this slide was representative of both desktop and mobile discrete.

We have given NVIDIA a lot of flak in the past year or so about its lack of effort in game development.  AMD has seemingly dominated that field with games like Far Cry 3, Sleeping Dogs, Bioshock Infinite and many more.  Even taking the bundles out of the picture, AMD has been more active than it has ever been.  NVIDIA hasn’t been totally idle though, working with titles like Metro: Last Light, Borderlands 2 and Assassin’s Creed III.  Even with this slide in our faces, it is hard not to see AMD as claiming the gaming mantle as its own over the last 12 months.

But maybe the future will change?  NVIDIA showed titles like Assassin’s Creed IV: Black Flag, Watch_Dogs, Splinter Cell: Blacklist, The Witcher 3 and the upcoming Batman: Arkham Origins as NVIDIA-sponsored and integrating NVIDIA technologies.  All five of them use DX11, tessellation and even TXAA – a highly efficient NVIDIA-only anti-aliasing technology. 

Another important factor NVIDIA is touting is integration of some of its core technology with engines like CryEngine 3, Unreal Engine 4, Frostbite and more.  This is incredibly important for NVIDIA as dominance in an engine will span multiple games throughout the technology generation.  Those engines are going to be targeting the new PS4 and Xbox One that obviously use AMD APUs.  NVIDIA will be working much harder in the future to maintain the same level of dominance in engine support. 

Even though NVIDIA isn’t in any of the new consoles, Tomasi thinks that this is good for NVIDIA in the long run.  The new consoles will raise the performance baseline of gaming; with the gap between console and high-end PC performance shrinking from 13x to something more manageable like 2.5x, games will ship to PCs with more features and better performance.  Reinforcing that change is the move to the same x86 architecture the PC uses – though AMD will drastically benefit from being the core technology in the Xbox One and the PlayStation 4, NVIDIA will benefit tangentially. 

And because software developers like Epic build games ON PCs and then publish them on consoles – as they have for years – higher-performance PCs will benefit from how easily those titles translate from console back to PC. 


NVIDIA then brought up quite a few game developers to talk to the press about their relationships with NVIDIA and how they are using NVIDIA technologies to improve gaming on the PC.  We got to see some gameplay from the upcoming Witcher 3 title as well as Assassin’s Creed IV and Splinter Cell.

The Witcher 3 demo did include integration of a physics-based fur technology.  It was pretty damn impressive but is hard to demonstrate in static photos.  You should try to find some videos of the title online if you can – it is more than worth the effort!

Another big partner brought on stage was Epic showing the Infiltrator demo running on a single NVIDIA graphics card (likely a Titan).  This is the video of what we saw:

No, the demo isn’t new, but that doesn’t make it any less impressive that it was running on a single graphics card, in real time.


Obviously NVIDIA has a lot of work to do in the coming months, through no fault of its own really.  The instinctive response from the media to AMD’s dominance in the console market is that it will carry over to the PC as well in due time.  NVIDIA has thus taken a hit by not at least winning a spot in one of the consoles to preserve the 50/50 split that existed in the previous generation.  From a PR perspective, NVIDIA will be getting game developers up on stages for the foreseeable future to make sure gamers and the industry know that it still values the PC gaming market and is dedicated to it with the GeForce brand. 

Being somewhat on the inside of this business though, I believe that NVIDIA continues to be committed to the PC ecosystem, regardless of the direct success of projects like Tegra and SHIELD.  The GeForce Experience software is meant to address one of the PC-specific complaints from casual gamers – that it’s too hard to set up – and does so with a grace I didn’t initially expect.  Don’t be surprised if later in 2013 you see some campaigns from NVIDIA telling everyone that PC gaming can be just as easy, just as “plug and play” as the consoles, as long as you are using GeForce graphics. 

For the green team, the next 12 months are going to be interesting - that includes the lead up to and then the first wave of next-gen console games.  How will developers react?  How will gamers?  More importantly, how will NVIDIA?

June 11, 2013 | 11:05 PM - Posted by Anonymous (not verified)

They need to fix their drivers, with 320.18 WHQL bricking cards left and right. It reminds me of the Mac problem where it was their GPU failures and it took years for them to come to a settlement, only to pay up with a replacement that was 1/3 of the original product price.

As for Shield, it's a lame attempt to counter their failed attempts at securing any next-gen consoles. Going by their own charts, out of 77% Nvidia only has 30%: 6% in consoles (PS3) and say half of PC, 24%. That leaves 47% to AMD. It's comical that they are trying to bash the Xbox One and PS4 for using AMD Jaguar while they hype Shield up to no end. Where are the comparisons between Tegra 4 Wayne and the AMD Jaguar SoC?

Technology leadership? The 600 & 700 GeForce series cards don't even support full DirectX 11.1. I'm wondering if Nvidia hired all the marketing and driver team AMD let go last year.

June 12, 2013 | 03:23 AM - Posted by DeadOfKnight

Your entire post is erroneous.

June 12, 2013 | 01:01 PM - Posted by ddg4005

I'm using the 320.18 drivers right now and they are as solid as the previous version. You're either mistaken on this point or trolling.

June 13, 2013 | 12:51 AM - Posted by Tri Wahyudianto (not verified)

i'm using the 320.18 drivers too, very stable so far, not to mention increased performance and stability in many new games.

June 13, 2013 | 01:48 AM - Posted by Anonymous (not verified)

Seems a lot of people are trolling on various forums then.

Nvidia Forums

EVGA Forums

Guru3D Forums

Just browse Nvidia AIB partner forums and well-trafficked tech/GPU forums and you'll find lots of examples.

June 13, 2013 | 07:44 PM - Posted by renz (not verified)

there is no perfect driver, even a WHQL one. do you ever wonder why nvidia has a driver feedback thread on their forum for each WHQL and beta driver they release? it is so users can report their problems and the driver team can investigate and reproduce them in their lab. remember the TDR problem with the gtx560 ti that made nvidia ask users with problems to send their cards directly to nvidia? it is because they care to solve the problem

June 13, 2013 | 08:51 PM - Posted by Anonymous (not verified)

Actually that's bad quality control. The article and slide tout "With several global testing facilities". What good are these global testing facilities if they don't improve the next driver but instead make it worse? One would think they'd have a variety of systems to test different variables before a driver makes its way to end users and causes havoc if it's bad.

"it is because they care to solve the problem" A problem they've created. Any company would want to fix this issue so as not to have angry customers that experience a bricked card, BSODs and game artifacts when playing.

They care to solve the problem so they won't lose money.

June 14, 2013 | 09:01 AM - Posted by Parge (not verified)

Actually, he's telling the truth, the 320.18 drivers are totally borked for a great many people. I've suffered major artifacting, but luckily, unlike my friend, not a bricked card.

June 14, 2013 | 06:27 PM - Posted by renz (not verified)

i have been using the driver since it came out and so far it works well with my 660 SLI. anyway how did the driver physically break the card?

June 15, 2013 | 07:48 AM - Posted by Ytazbddj (not verified)

Actually, a driver can damage a physical card. Example: if a driver somehow provides wrong current-flow information, it might damage a capacitor and break the card.

June 17, 2013 | 10:22 AM - Posted by renz496 (not verified)

i mean specifically for the 320.18 branch driver itself. what kind of damage does it cause to the card, and if possible which part of the driver is causing the problem? (before this nvidia once messed up fan control, which caused some cards to be damaged by overheating)

June 12, 2013 | 12:55 AM - Posted by cjflex1

This is laughable... How dare Nvidia rest on their laurels? This disgusts me! Showing all this nonsense. Nvidia needs to stop living in the past, look at what's happening now, take a look at the bigger picture – and do something about it. This whole charade only shows how desperate they are!

June 12, 2013 | 01:53 PM - Posted by Klimax (not verified)

AMD would love to be this desperate... (Look at how much NVidia earns)

June 13, 2013 | 01:35 AM - Posted by cjflex1

The winds of time are changing, my friend. Sooner or later, even a king/queen gets overthrown. There's only one almighty; everything else are lower creatures on equal ground. You take two steps ahead of me, and I'll find a way to get past you.

June 12, 2013 | 02:14 AM - Posted by pdjblum

The numbers from their latest strong quarterly report are primarily due to the sale of discrete GPUs for PCs, as stated in their results presentation. Their more recently trumpeted efforts in ARM SoCs contributed little. PC gaming is their bread and butter, and they have every reason to give that business all the attention and effort it deserves. Too bad they had to set the MSRP on the 780 so high.

June 12, 2013 | 10:04 AM - Posted by mAxius

ah poor nvidia, i see they are feverishly trying to spin themselves into a good position against the amd onslaught... sad

June 12, 2013 | 10:39 AM - Posted by Irishgamer01

GeForce Experience failed to ID any of my games (all legit) on two different machines.....

Me thinks they need to do some more QC.

June 12, 2013 | 11:29 PM - Posted by ThorAxe

Are your games on different drives? You can manually add the game to GeForce Experience.

June 12, 2013 | 10:57 AM - Posted by Anonymous (not verified)

Nvidia will have more to worry about when Kaveri is out. If the chips can perform well enough then the requirement for discrete GPUs would decrease even further at lower price points. 35W TDP for the GPU+CPU for decent gaming performance equivalent to mid-level dGPU sounds good if rumors are true.

They will continue having problems selling their Tegra chips after they failed to meet expectations several times in the past.

The market for high-end gaming is still strong and their dominance there will not be tested for some time. They just make'em drivers better than AMD.

The PS4 console market outside the US will grow if Sony comes good with their promises. A healthy gaming market at lower price points would be a much better deal for the consumers here who don't really have the dough to toss around on a good PC. Looks to me like Microsoft has shot itself in the foot this time around.

June 12, 2013 | 11:29 AM - Posted by renz496 (not verified)

uh, isn't there a rumor that Watch Dogs and the new Splinter Cell will be under the AMD Gaming Evolved banner?

June 12, 2013 | 07:22 PM - Posted by Ryan Shrout

It looks like things are still up in the air...?

June 13, 2013 | 08:01 PM - Posted by renz (not verified)

well, anything can change over time, just like Crysis 3. i was sure the game was a TWIMTBP title. if not, why did nvidia get the right to distribute the alpha keys and not amd, if the title was under Gaming Evolved from the very beginning? also the use of TXAA, which is only available in TWIMTBP titles

June 12, 2013 | 11:24 PM - Posted by Anonymous (not verified)

After reading about AMD`s framerating results on here my next GPU is Nvidia.

June 13, 2013 | 12:12 AM - Posted by Anonymous (not verified)

Unless you're buying 2 GPUs, that makes sense.

Since those same Frame Rating results show that AMD cards are better in single-card setups.

June 12, 2013 | 11:31 PM - Posted by Anonymous (not verified)

PC/Nvidia FTW !

June 13, 2013 | 01:05 AM - Posted by Tri Wahyudianto (not verified)

you're all missing the point here: while everybody else leaves the PC platform to optimize for console or mobile – e.g. Intel processors that don't increase peak performance but lower power consumption (look at the new MacBook Air with Haswell), and AMD, which isn't launching its 8000 series discrete GPUs but is launching consoles –
you and me as PC gamers should thank Nvidia for their attention to PC gaming. it will trigger good competition in PC gaming, and in the end we as consumers will benefit.

June 13, 2013 | 04:43 AM - Posted by Panta

damn fanboys, you should ban them all :-)

i think the fanboy phenomenon & the encouragement of it
is more something bred in Nvidia/AMD/hardware forums
than a naturally occurring phenomenon.

guys, don't fall into that pit;
it's silly & does not serve us, the consumers.

June 13, 2013 | 07:48 AM - Posted by Metwinge (not verified)

Well said budski.

I have to agree fanboyism is a pile of crap; I've been let down by Nvidia, AMD and Intel on occasion when using certain components.

Right now my 4.8GHz 2600K, Trifire 7970 setup is doing me proud; when they show their age I'll do my research and buy what's best for my budget.

June 13, 2013 | 11:48 AM - Posted by TwinShadow (not verified)

Only thing I see here is marketing, marketing, marketing. Nvidia knows they dropped the ball with the console thing and are now trying to play damage control. The PC market is declining; that's why they created the Shield.

June 13, 2013 | 01:15 PM - Posted by Anon (not verified)

I see a lot of butthurt over AMD fanboys.

June 14, 2013 | 02:03 PM - Posted by Anonymous (not verified)

i have 2 geforce cards but they're both finished, some of the caps burnt! i'm going to try amd now. my friend says it's more durable, can withstand 24/7 ops. i hope so.

June 17, 2013 | 08:07 AM - Posted by nobody special (not verified)

NV doesn't make these cards... that manufacturer can just as easily make an AMD piece of crap. NV just makes the chips, get it? MSI, Asus, XFX etc. make the cards. If they suck it's their fault, not AMD's or NV's....ROFL. Fanboys... I digress...

Buy a better AIB maker!

June 17, 2013 | 10:25 AM - Posted by renz496 (not verified)

it reminds me about the black screen problem that happen to early batch of Sapphire 7870

June 15, 2013 | 07:35 AM - Posted by amadsilentthirst

Being an Nvidia guy myself, I cringe at the classic misrepresentation of data in that graph image.
They arbitrarily cut it off at a point that looks better for them.
It should start at 0, not 0.8.

Imagine if they cut it off at 0.99 – how would that look visually?

Respect was lost here.

June 21, 2013 | 03:18 AM - Posted by Anonymous (not verified)

Anyway, i'm more confident in future developer support of nVIDIA tech than ATI's. Plus, i play in 3D Vision. It's a great step for visuals: playing Tomb Raider this way is what you can proudly call almost photorealistic 3D animation.

June 27, 2013 | 08:03 PM - Posted by ezjohny

The new Xbox One and PS4 are not that beefy on the hardware side for the future of gaming; I would have liked 2.5 teraflops in the consoles.
I'm wondering if game developers could target 2.5 teraflops on the PC and port that down to consoles, so PC gamers would get the next generation in gameplay!!! The Samaritan demo.
The Radeon HD 7870 GHz Edition could pull off the Samaritan demo! And this graphics card is on the cheap end too!!!

I'm a PC gamer.

December 20, 2014 | 05:31 PM - Posted by Anonymous (not verified)

Well I'm never coming back to this site...
