NVIDIA 337.50 Driver Analysis - Single GPU and SLI Tested

Manufacturer: NVIDIA

SLI Testing

Let's see if I can start this story without sounding too much like a broken record compared to the news post I wrote late last week on NVIDIA's new 337.50 driver. In March, while attending the Game Developers Conference to learn about the upcoming DirectX 12 API, I sat down with NVIDIA to talk about changes coming to its graphics driver that would affect current users of shipping DX9, DX10, and DX11 games.

As I wrote then:

What NVIDIA did want to focus on with us was the significant improvements that have been made on the efficiency and performance of DirectX 11.  When NVIDIA is questioned as to why they didn’t create their Mantle-like API if Microsoft was dragging its feet, they point to the vast improvements possible and made with existing APIs like DX11 and OpenGL. The idea is that rather than spend resources on creating a completely new API that needs to be integrated in a totally unique engine port (see Frostbite, CryEngine, etc.) NVIDIA has instead improved the performance, scaling, and predictability of DirectX 11.

NVIDIA claims that these fixes are not game specific and will improve performance and efficiency for a lot of GeForce users. Even if that is the case, we will only really see these improvements surface in titles that have addressable CPU limits or very low end hardware, similar to how Mantle works today.

In truth, this is something that both NVIDIA and AMD have likely been doing all along, but NVIDIA has renewed purpose with the pressure that AMD's Mantle has placed on it, at least from a marketing and PR point of view. It turns out that the driver that starts to implement all of these efficiency changes is the recent 337.50 release, and on Friday I wrote up a short story that tested a particularly good example of the performance changes, Total War: Rome II, with a promise to follow up this week with additional hardware and games. (As it turns out, the results from Rome II make for an interesting story. More on that on the next page.)


Today I will be looking at a seemingly random collection of gaming titles, running on a reconfigured test bed we had in the office, in an attempt to get some idea of the overall robustness of the 337.50 driver and its advantages over the 335.23 release that came before it. Does NVIDIA have solid ground to stand on when it comes to the capabilities of current APIs versus what AMD is offering today?


In an ideal world, we would have a dozen editors able to run an assortment of hardware and software tests on these two drivers (335.23 and 337.50) to give you 100% of the available data. I do not have those editors, so instead I relied on myself and my weekend to gather the data you see here. I have included graphics cards that range from the extreme enthusiast level (GTX 780 Ti) to the high end (GTX 770) and even the mainstream (GTX 750 Ti). Most tests were run on a Core i7-3960X Sandy Bridge-E platform, though I did run a handful on the Core i7-4770K + Z87 platform with the GTX 780 Ti cards in SLI as a sanity check.

Test System Setup
CPU: Intel Core i7-3960X Sandy Bridge-E / Intel Core i7-4770K Haswell
Motherboard: ASUS P9X79 Deluxe / ASUS Z87 Pro
Memory: Corsair Dominator DDR3-1600 16GB
Hard Drive: OCZ Agility 4 256GB SSD
Sound Card: On-board
Graphics Cards: NVIDIA GeForce GTX 780 Ti 3GB / GeForce GTX 770 2GB / GeForce GTX 750 Ti 2GB
Graphics Drivers: NVIDIA 335.23 WHQL, 337.50 Beta
Power Supply: Corsair AX1200i
Operating System: Windows 8 Pro x64

The games used are also somewhat varied as I attempted to find cases where the NVIDIA 337.50 driver excelled while also giving the other side of the story with games that see little to no benefit to the changes. Tested titles include Batman: Arkham Origins, Battlefield 4, Crysis 3, Hitman: Absolution, Bioshock Infinite, Borderlands 2, Sniper Elite V2, Call of Duty: Ghosts, Sleeping Dogs, Metro: Last Light and GRID 2. Not every game was tested with every hardware configuration, but we'll explain as we go.

Our first page of results is going to focus on multi-GPU performance between the two drivers, starting with the GeForce GTX 780 Ti on the Core i7-3960X + X79 platform.

With the high-end processor and graphics card combination, the biggest performance gains were seen with Hitman: Absolution and Batman: Arkham Origins.


Our Hitman testing scaled a healthy 30% at 1080p and Ultra settings while at 2560x1440 we saw scaling at 36%! Batman benchmarks actually showed a better gain at 2560x1440 than at 1920x1080, hitting 5.2% in the better result. Battlefield 4 and Crysis 3 saw less than a 2% frame rate increase.
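The scaling percentages quoted throughout are simple relative gains from the old driver to the new one. As a quick illustration of the arithmetic (the FPS figures below are made-up placeholders, not our benchmark data):

```python
def driver_gain(old_fps: float, new_fps: float) -> float:
    """Percent frame-rate change going from the old driver to the new one."""
    return (new_fps - old_fps) / old_fps * 100.0

# Hypothetical numbers for illustration only: 60 FPS on 335.23, 78 FPS on 337.50
old, new = 60.0, 78.0
print(f"{driver_gain(old, new):.1f}% scaling")  # 30.0% scaling
```

The same formula is behind every gain figure in this article: the new-driver average frame rate divided by the old-driver average, minus one.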

Next, we took the same GPU configuration and plugged it into a different platform, this time based on the Core i7-4770K. Call of Duty: Ghosts scores increased by a significant amount, hitting 21% at 1920x1080 on the Max preset and with 4xMSAA enabled.


Bioshock Infinite at 25x14 was able to improve by more than 6% with the 337.50 driver and Sniper Elite V2 improved by 4.39% at 1920x1080 on its highest quality settings. Hitman: Absolution again sees impressive increases in frame rate, a jump of 35% at 1920x1080 on the Ultra preset. 

Even Thief, a game that AMD has been using to help promote Mantle, sees a sizeable performance gain in SLI with a pair of GTX 780 Ti cards at 1920x1080 on the Very High preset. A scaling rate of 31% is close to the performance gains we have seen with AMD's Mantle API on its Radeon graphics cards, but NVIDIA was able to do this inside the existing DX11 API.

Our third set of SLI results moves back to the Core i7-3960X platform but replaces the GeForce GTX 780 Ti cards with GeForce GTX 770 models.


These cards retail for close to $330, compared to the $699 of the flagship offerings, which should give us a better look at configurations that more of our readers likely have in their PCs. 

This time the biggest scaling is seen with Sleeping Dogs, an older DX11 title that still impresses visually. Improvements over 22% at 1920x1080 and 14% at 2560x1440 are definitely worthwhile performance advantages for a single driver release and indicate there is some magic under the hood of 337.50. Batman: Arkham Origins scales by about 3.5% at both 1080p and 25x14, not nearly as big of an improvement but it is something. 

On the next page we'll take a look at a couple of single GPU configurations and compare 337.50 and 335.23 driver results.


April 16, 2014 | 05:40 PM - Posted by Humanitarian

"Well, as it turns out, NVIDIA was incorrect"

Well, it's nice that got officially cleared up, that was a pretty shoddy move by them.

April 16, 2014 | 06:06 PM - Posted by nashathedog (not verified)

I haven't seen much to shout about myself. I'm on a 780 and a 4770k @ 1080p and I've tested quite a few games, and a lot, like Arkham Origins and Thief, are scoring no different in their benchmarks than before. There's a few games with around 2 extra frames on the averages, like Sleeping Dogs and Last Light, but that's it for me.

Here's my Thief benchmark scores

Old driver: 46.3/ 59.5/ 62.2.
New driver: 49/ 59/ 63.4

April 17, 2014 | 05:41 PM - Posted by Daniel Masterson (not verified)

Well from what I have read you don't get big advantages at 1080p; it's when you move to 1440 and up that you start to see the gains.

April 16, 2014 | 06:16 PM - Posted by JohnGR

What Nvidia wanted was an anti-Mantle driver buzz. They succeeded. Any mention of Mantle here and comparison of this driver with Mantle is a victory for Nvidia's marketing department. No matter the conclusion.

April 16, 2014 | 06:20 PM - Posted by Jeremy Hellstrom

Sorry, I forget which of the tested games support Mantle?

Wouldn't this be more a moral victory for AMD as NVIDIA's knee jerk reaction more or less ended up lodging their own foot in their mouth?

April 17, 2014 | 05:19 AM - Posted by JohnGR

You missed the paragraph about Thief in the article. The fact that Mantle is mentioned a dozen times and Nvidia's slide in the second page.

Anyway I didn't really had problem with this article. The prior article was another story and I had made a few comments there. In this article I was just talking about Nvidia's marketing. That marketing was aiming it's loyal customers and who ever is fighting for Nvidia on the message boards to have something to say against the Mantle buzz. In those people's minds, no one's foot ended in any mouth. For them Mantle was obliterated with just DX11 optimizations. Someone quoted me yesterday in the prior article saying to me how fanatic and with sh!t in my mouth I am, because I couldn't see that these optimizations where in ALL DX11 games. The marketing campaign was aiming people like the one that quoted me there.

PS Didn't liked the "The 7850K is better than i3 in gaming" mini videos either. I am living in a fantasy world where ALL articles are objective.

April 16, 2014 | 06:26 PM - Posted by Anonymous (not verified)

The GeForce R337.61 Hotfix Display Driver is available for download. This driver is being released by NVIDIA as a hotfix driver to resolve the following issues:

• Half display or no display on Dell UltraSharp UP3214Q/UP2414Q LCD monitor - [1453685]
• Code 43 error message after installing driver 337.50 on a PC with Hyper-V enabled - [1495541]

April 16, 2014 | 07:17 PM - Posted by Anonymous (not verified)

Thanks for info on Dell UltraSharp UP2414Q. Think I'll wait till drivers come out of beta.

April 16, 2014 | 08:42 PM - Posted by Ryan Shrout

Thanks for the heads up!

April 16, 2014 | 07:08 PM - Posted by Edkiefer (not verified)

The problem with this kind of optimization , standard BM generally are very GPU limited , not much going on on CPU side .

BF4 in singleplayer not much, but with MP should show gains but problem for testing is repeatability .

Ryan, did you notice any added frame variance with 337 . In your last review of the 290x2 there was a game that looked bad , worse than older tests, this was SLI .

April 16, 2014 | 08:42 PM - Posted by Ryan Shrout

Nothing substantial yet but we have a lot more testing to do on that.

April 16, 2014 | 07:10 PM - Posted by N3n0 (not verified)

Looks like the performance gains from any other driver release.

Not buying it Nvidia.

April 16, 2014 | 07:33 PM - Posted by Anonymous (not verified)

still with 3960X cpu, and lower end gpu, now it's definitly on purpose, trying to push gpu to be the bottleneck to avoid showing cpu overhead, hoped ryan would take some of the old comments ppl made into consideration, it's pointless.
looks like we will never get at the bottom of this driver with pcper.

April 16, 2014 | 08:43 PM - Posted by Ryan Shrout

I'm sorry...what?

April 16, 2014 | 09:18 PM - Posted by Edkiefer (not verified)

I think what he means is they wanted tests on slower, lesser CPU than a 4770 .

I disagree with these being normal driver improvements (though I am sure there are some in there too) .
If it was normal drivers fixes, it would scale different, it would scale towards fastest CPU+vid .
but we are seeing across the board improvements, even on 750 .
It would be good to see even lesser CPU, maybe i3 if Ryan has any .

April 16, 2014 | 09:26 PM - Posted by Ryan Shrout

Let me see what I can do the rest of this week. Thank your for wording it so I could grok it! lol

April 17, 2014 | 07:15 AM - Posted by Anonymous (not verified)

All the talk seems to be focused on DX11, is there any indication as to gains on DX10 (Fermi) architecture? A thorough test that I would like to see if time and money were no object:

Core 2-era quad
Phenom-era x4
Modern Pentium/Celeron
Modern i5
Modern i7

GTX 480
GTX 580
GTX 680
GTX 750
GTX 780

April 17, 2014 | 09:39 AM - Posted by Klimax (not verified)

Fermi is DX11 too. (Also will get DX12)
200 was DX10.

April 16, 2014 | 09:47 PM - Posted by Anonymous (not verified)

with Mantle release alot of benchs were made, and it showed that there isnt much cpu overhead, on high end cpu, or high resolutions or settings, where the gpu is the bottleneck, where gains from mantle becomes low in these situations, because there is no cpu overhead.
now it's weird to see these test claiming cpu overhead focus where there is none, either cpu is powerfull enough to not have it, or gpu is low end or settings of the game are too high so that the gpu is the bottleneck, and in the end well, absolutly nothing usefull in term of cpu overhead claim.
the simple fact that gains would lower when cpu is low end means there is no overhead optimisation.
still the negative point of the article is blending multi-gpu scalling and regular driver optimisation for a claim about cpu overhead optimisation.
this is becoming weird for me, if i see this, i am pretty sure you see it too.

April 16, 2014 | 07:41 PM - Posted by Anonymous (not verified)

total war Rome II SLI gain 60%, single Gpu 2%, guess that settles the fact that there is a bug in SLI solved with this driver, makes all commensts flaming at you with the previous article legitimate.
still sad though for pcper sale out to nvidia's propaganda, could have made a very in depth article about this driver to answer cpu overhead claim, but you sticked with the 3960X, really sad.
*removes pcper from bookmark, add anandtech*

April 16, 2014 | 08:44 PM - Posted by Ryan Shrout

Yeah, dog, you should totally RTFA because it addresses the previous story.

April 17, 2014 | 04:16 AM - Posted by Pete (not verified)

Dont let the trolls get to you Ryan. Your doing a great job!

April 16, 2014 | 07:59 PM - Posted by Searching4Sasquatch (not verified)

I guess "Anonymous" skipped the article and just came here to shill for AMD. Maybe he gets a free AMD T-shirt or something from that AMD forum program.
If he had read it, he'd see these guys ran the 3960 and the 4770K. Anytime I can get 30% perf for free, I'm all for it! I wonder if the WHQL version of this Beta will look exactly the same or better.

April 16, 2014 | 09:32 PM - Posted by snook

listen, they lied, simple.
ryan called them on it, and they used a half-truth
to cover their ass. now people can take a breath and kinda believe Nv doesn't pay for ryan's articles? you still should have crushed them for lying, but I get it.
this is the reason Nv is BS, they have pulled this stuff since
quake III arena days.

before it's said, i sold my soul to AMD, fanboi 4lyfe: EVGA 780SC card in my pc though.

April 16, 2014 | 09:34 PM - Posted by Searching4Sasquatch (not verified)

So this guy is lying too?

April 16, 2014 | 09:54 PM - Posted by snook

yep. if you are asking me. plus, I didn't say ryan was lying, Nv is, not a debate. his results are no better than the ones here.

April 16, 2014 | 09:57 PM - Posted by snook

for what it's worth 5-10 FPS more isn't a godsend unless you are on the edge of 30/60 FPS, maybe 120. ex: 60Hz monitor you get 70 FPS min, new driver 80 FPS literally adds nothing to the experience.

April 16, 2014 | 10:04 PM - Posted by collie (not verified)

WHY THE FUCK ARE THERE SO MANY HATERS ON THIS POST?!?!? Ryan, we your people love you, we love your sight, keep up the great work. Good article, hope you have the chance to test this on a few different configurations, even on that AMD based budget gaming system you built, but no, clearly you are not a shill. if you were you'd clearly be playing both sides from week to week and mastering way to much payoffs and who has the time to worry about all that and puppys and game.... like seriously

April 16, 2014 | 10:17 PM - Posted by snook

two words: anger management.

April 17, 2014 | 09:08 AM - Posted by diggs (not verified)

It's easy to hate when you're anonymous.

April 16, 2014 | 11:50 PM - Posted by Anonymous (not verified)

Update the article with the newfound info, Ryan. Just add a line and a link for this article.

April 16, 2014 | 11:55 PM - Posted by Trey Long (not verified)

Great work as always, Ryan. Nvidia's driver teams are obviously really motivated and that's good for the industry!

April 17, 2014 | 12:07 AM - Posted by Anonymous (not verified)

Nvidia SLIes

April 17, 2014 | 02:14 AM - Posted by hodgoes (not verified)

I wish someone would benchmark these drivers with a high end GPU and a AMD 8150 or something like that.

that would really show if the driver has fixed some CPU bottlenecking..

I know personally i have noticed a 40-60% performance improvement with my 3 way GTX 480 SLI setup with my FX-8150

April 17, 2014 | 04:17 AM - Posted by Anonymous (not verified)

you know that the driver is only for GTX700 series ?

April 17, 2014 | 08:51 AM - Posted by Edkiefer (not verified)

No, it's not for 7xx only. I believe cards from 4xx, 5xx, 6xx (ones that are full DX11) should see improvements. The older ones might not see as much since they're slower, so they would be more GPU bound.

April 17, 2014 | 02:58 AM - Posted by Hameedo (not verified)

Please do a benchmark where you test the 290X and 290 vs the 780 Ti and 780 in the 3 Mantle applications: BF4, Thief, and Star Swarm. The results should be enlightening. Are DX11 optimizations better than a new API?

April 17, 2014 | 04:44 AM - Posted by Wijkert

Very nice work Ryan! I appreciate the work you put into this article and the free time you sacrificed to release it relatively soon.

I started using this driver right after it came out and noticed increased amount of (micro)stuttering in Diablo 3, Battlefield 4 and Titanfall (didn’t test/played any other games). The Fraps counter stayed at 60 fps while playing, but it intermittently felt like 20-30fps. I am using a slightly overclocked GTX Titan, a 2500k at 4.4 and playing at 2560x1080. This problem went away when I rolled back to 335.23.

I know it takes a lot more work, but I would really like to see some FCAT testing to verify this issue. Any other users having the same problem?

April 17, 2014 | 08:33 AM - Posted by Anonymous (not verified)

Very little gains if any here with my 780 sli
TBH im really disappointed when playing with older games.

Nvidia needs to go back and update sli profiles for lot of older demanding. Feel like im jumping through hoops trying to keep consistent 120fps with all eye candy and GPUs only running 50-60 percent

April 17, 2014 | 09:29 AM - Posted by Anonymous (not verified)

I don't think that the explanation from Nvidia is correct and that the new driver does reduce the CPU load.

Other sites have done tests directly focusing on NVIDIA's claims (that the new driver reduces draw calls, or at least the load created by them) by testing mainly in CPU-bottleneck scenarios. One example was testing modern games at 1280x720 with a Core 2 Quad and a GTX 780 Ti - the results were interesting, because the framerates only increased by less than 3%, and in some cases they even decreased.

Using the same testing conditions, but an i7-3770K instead of the Core 2 Quad, the increase in fps became more noticeable, but still nowhere near NVIDIA's claim, and still most games did not gain any frames from the new driver.

April 17, 2014 | 09:46 AM - Posted by Searching4Sasquatch (not verified)

Seems like their tweaks work best when the CPU is a moderate CPU bottleneck. If you're using an ancient CPU and severely bottlenecked, even zombie baby Jesus himself can't magically turn that into a $1k Core i7.

April 17, 2014 | 01:10 PM - Posted by akaronin (not verified)

Greetings Ryan,

On Batman: AO (19x10 - Max) with 2 GTX 780Ti in SLI changing the 4770K with a 3960X will produce 72 more fps !?!?

Are you sure about this ?...

April 17, 2014 | 05:09 PM - Posted by Anonymous (not verified)

So this is just a perfectly normal driver update, doing what perfectly normal driver updates do (improving a few games hotspots here and there) + a slight higher temp to get a slight 1-2% across the board because hell, you never know.

All this surrounded with bullshit slides, and open lies to draw attention away from what is actually a real thing.

All this with the help of websites blinded by their "nVidia can't lie to us because they're so great ppl" mantra.

Well they lied, and disrespected ppl even more than with the GTX 680, 780 and Titan rip offs / jokes.

April 17, 2014 | 07:20 PM - Posted by Anonymous (not verified)


Update the article with the newfound info, Ryan. Just add a line and a link for this article.

April 18, 2014 | 10:07 AM - Posted by Ryan Shrout

I added it this morning!

April 18, 2014 | 10:03 AM - Posted by BassZealoth

I've done a simple FarCry 2 benchmarks on my PC - Core2duo E4300, GF 8600GT, 4gb DDR2, Win7SP1-64bit.

Settings: Demo(Ranch Small), 1280x800 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(None), VSync(No), Overall Quality(Custom), Vegetation(High), Shading(High), Terrain(High), Geometry(High), Post FX(High), Texture(Medium), Shadow(High), Ambient(High), Hdr(Yes), Bloom(No), Fire(Medium), Physics(Medium), RealTrees(Medium)

Average Framerate:
335.23 - 28,93
337.50 beta - 29,24

April 25, 2014 | 07:00 PM - Posted by Anonymous (not verified)

Reading forums like this makes me wonder if some people just need to get a life.

PC Per already explained the mistake about the DX11 improvements versus the SLI Profile on Total Rome 2... they retracted their previous story and posted an update explaining the circumstance. This is what I would expect from a professional journalist. Even large news organizations occasionally make a mistake and they have MUCH larger research staffs than any site about hard ware news will ever be able to afford.

And yet you see crap like this... "still sad though for pcper sale out to nvidia's propaganda".


How on earth does anyone reach this conclusion from this situation? And yet I see this kind of outrage about tiny, basically irrelevant details all the time.

Keep up the good work PC Per, despite these people that apparently need to find new hobbies.
