NVIDIA GeForce Driver 337.50 Early Results are Impressive

Subject: Graphics Cards | April 11, 2014 - 03:30 PM |
Tagged: nvidia, geforce, dx11, driver, 337.50

UPDATE: We have put together a much more comprehensive story based on the NVIDIA 337.50 driver that includes more cards and more games, while also disputing the Total War: Rome II results seen here. Be sure to read it!

When I spoke with NVIDIA after the announcement of DirectX 12 at GDC this past March, a lot of the discussion centered around a pending driver release that promised impressive performance advances with current DX11 hardware and DX11 games. 

What NVIDIA did want to focus on with us was the significant improvements that have been made to the efficiency and performance of DirectX 11. When NVIDIA is asked why it didn't create its own Mantle-like API if Microsoft was dragging its feet, the company points to the vast improvements that are possible, and have now been made, with existing APIs like DX11 and OpenGL. The idea is that rather than spend resources creating a completely new API that has to be integrated into each engine individually (see Frostbite, CryEngine, etc.), NVIDIA has instead improved the performance, scaling, and predictability of DirectX 11.


NVIDIA claims that these fixes are not game specific and will improve performance and efficiency for a lot of GeForce users. Even if that is the case, we will only really see these improvements surface in titles that have addressable CPU limits or on very low-end hardware, similar to how Mantle works today.

Lofty goals to be sure. This driver was released last week and I immediately wanted to test and verify many of these claims. However, a certain other graphics project kept me occupied most of the week and then a short jaunt to Dallas kept me from the task until yesterday. 

To be clear, I am planning to look at several more games and card configurations next week, but I thought it was worth sharing our first set of results. The test bed in use is the same as our standard GPU reviews.

Test System Setup
CPU: Intel Core i7-3960X Sandy Bridge-E
Motherboard: ASUS P9X79 Deluxe
Memory: Corsair Dominator DDR3-1600 16GB
Hard Drive: OCZ Agility 4 256GB SSD
Sound Card: On-board
Graphics Cards: NVIDIA GeForce GTX 780 Ti 3GB / NVIDIA GeForce GTX 770 2GB
Graphics Drivers: NVIDIA 335.23 WHQL, 337.50 Beta
Power Supply: Corsair AX1200i
Operating System: Windows 8 Pro x64

The most interesting claims from NVIDIA were spikes as high as 70%+ in Total War: Rome II, so I decided to start there. 

First up, let's take a look at SLI results from the GTX 780 Ti, NVIDIA's flagship gaming card.


With this title running at the Extreme preset, the average frame rate jumps from 59 FPS to 88 FPS, an increase of 48%! Frame rate variance does increase a bit at the faster average frame rate, but it stays within the limits of smoothness, if only barely.

Next up, the GeForce GTX 770 SLI results.


Results here are even more impressive: the pair of GeForce GTX 770 cards running in SLI jumps from a 29.5 FPS average to 51 FPS, an increase of 72%! Even better, this occurs without any increase in frame rate variance; in fact, the blue line of the 337.50 driver actually performs better in that respect.
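As a quick sanity check on the percentages quoted above, the relative gains can be recomputed directly from the average frame rates; the exact arithmetic comes out slightly higher than the rounded figures in the text:

```python
def pct_gain(old_fps: float, new_fps: float) -> float:
    """Relative frame-rate improvement, in percent."""
    return (new_fps - old_fps) / old_fps * 100.0

# GTX 780 Ti SLI, Extreme preset: 59 -> 88 FPS average
print(f"780 Ti SLI: {pct_gain(59.0, 88.0):.1f}%")   # 49.2%
# GTX 770 SLI: 29.5 -> 51 FPS average
print(f"770 SLI:    {pct_gain(29.5, 51.0):.1f}%")   # 72.9%
```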

All of these tests were run with the latest patch on Total War: Rome II and I did specifically ask NVIDIA if there were any differences in the SLI profiles between these two drivers for this game. I was told absolutely not - this just happens to be the poster child example of changes NVIDIA has made with this DX11 efficiency push.

Of course, not all games are going to see performance improvements like this, or even improvements that are measurable at all. Just as we have seen with other driver enhancements over the years, different hardware configurations, image quality settings and even scenes used to test each game will shift the deltas considerably. I can tell you already that based on some results I have (but am holding for my story next week) performance improvements in other games are ranging from <5% up to 35%+. While those aren't reaching the 72% level we saw in Total War: Rome II above, these kinds of experience changes with driver updates are impressive to see.

Even though we are likely looking at the "best case" for NVIDIA's 337.50 driver changes with the Rome II results here, clearly there is merit behind what the company is pushing. We'll have more results next week!

April 11, 2014 | 03:46 PM - Posted by BBMan (not verified)

RTW II is GPU-memory intensive, and the game will complain if you don't have enough for all the battle objects that have to be rendered at your settings. I'm betting the improvements revolve around that. Still, I'm impressed - this was done with cards under 4GB, when 4GB is what I would recommend for this title.

Thanks for the heads-up- will try it out soon.

April 11, 2014 | 04:19 PM - Posted by Anonymous (not verified)

Are you sure these Total War: Rome II gains aren't simply due to this being the first driver to ship an SLI profile for the game?

If previous drivers didn't support SLI for Total War: Rome II, there wouldn't be any SLI performance gains until this driver.


• SLI Technology
  ◦ Total War: Rome II – added profile
  ◦ War Thunder – added profile
  ◦ Watch Dogs – updated profile
  ◦ Diablo III – updated profile

Did you check whether the temperature threshold was increased in these drivers?

April 11, 2014 | 06:47 PM - Posted by Rick (not verified)

I agree with this. Nvidia is trying to pull a fast one on us, and Ryan seems to be pushing their spin.

SLI was broken and they fixed it. Now they say, "Look at the low-level improvements we've made with this new driver."

April 11, 2014 | 04:34 PM - Posted by BagFullOfSharts (not verified)

This driver is a load of crap. All they did was enable proper SLI scaling for Total War: Rome II. I have a Gigabyte GTX 770 WindForce, and the only thing the driver accomplished was driving up the temps. It would seem that most of what this driver does is allow the GPU to pull more power so it can try to boost higher.

In Battlefield 4 I lock my frame rate at 59.9 in the user.cfg. Before this driver my card stayed in the high 60's and low 70's. After installing this driver my temps jumped into the low 80's. I feel like Nvidia is lying about their vaunted DX11 optimizations.
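For context, frame caps like the one described here are set through Frostbite console variables in a user.cfg file placed in the Battlefield 4 install folder. A minimal sketch of such a cap follows; the exact variable names are my recollection, not from the comment, so treat them as assumptions:

```
GameTime.MaxVariableFps 59.9
PerfOverlay.DrawFps 1
```

The second line simply draws an on-screen FPS counter to confirm the cap is in effect.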

April 16, 2014 | 01:30 AM - Posted by Anonymous Gerbil (not verified)

If the optimizations unblock a previously CPU-limited scenario (by reducing draw call overhead or whatever), then yes, higher temps are to be expected.

More stuff for the GPU to work on > GPU works harder > GPU gets hotter.

April 16, 2014 | 04:35 PM - Posted by Anonymous (not verified)

If the GPU were simply working harder and reaching its thermal threshold faster, it would throttle sooner and more often.

Nvidia would have had to raise the thermal threshold - something that was in fact noticed during testing with the latest drivers.

- AMD Radeon R9 295X2 Video Card Review

On the temps, we have noticed with these latest drivers the 780 Ti is able to go up to 87c now, versus 84c with previous drivers. The observed clock speed also seemed slightly higher than we've experienced before, 1019MHz sustained clock speed on the GTX 780 Ti's while gaming. In the past, we have seen about 1006MHz clock speeds on these cards. This new 337.50 driver may have changed the thermal profile slightly to allow slightly higher clock speeds.

All these small percentage increases could be a result of the threshold being raised rather than any optimization at all.
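Anyone wanting to test the thermal-threshold theory could log temperature and clocks under each driver and compare where boost levels off. A hedged one-liner sketch using nvidia-smi (field names per NVIDIA's nvidia-smi documentation; the log filename is arbitrary):

```shell
# Log GPU temperature and SM clock once per second while gaming;
# repeat under each driver version and compare the steady-state values.
nvidia-smi --query-gpu=timestamp,temperature.gpu,clocks.sm \
           --format=csv -l 1 > clocks_337.50.csv
```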


April 11, 2014 | 04:38 PM - Posted by Searching4Sasquatch (not verified)

NVIDIA shared a TON of data in their article on GeForce.com. http://www.geforce.com/whats-new/articles/nvidia-geforce-337-50-beta-per...

If you look at the leaked presentation they gave the press, there are a ton of details there too.

BOTH drivers were tested with the latest Rome II patch. NVIDIA said the SLI profiles were identical between the two drivers. The only difference is that the new beta has the tweaks that relieve the CPU bottleneck so the two GPUs can be fed faster.

Thx for the free performance NVIDIA!

April 11, 2014 | 05:09 PM - Posted by Anonymous (not verified)

Yes, thank you Nvidia for only making us wait 4yrs for a DX11 driver improvement.

DX11 came out in 2009; Nvidia realizes they can update their DX11 driver in 2014.

I look forward to their DX12 enhanced driver come 2019. I can't wait!!! Free performance 4yrs later!!! Wooohooooo!!!!

April 12, 2014 | 11:32 AM - Posted by HeavyG (not verified)

Well, you can always use the old driver. Nobody is making you use this driver.

Sorry, but I get sick of gamers that complain when things get better for them. It is like nothing is ever good enough, and all you do is troll and look for something to complain about.

April 12, 2014 | 01:34 PM - Posted by Anonymous (not verified)

Good bench. Definitely this driver is kicking Mantle's ass on CPU overhead... on a 3960X, where the CPU is almost never the bottleneck. Just epic.

April 11, 2014 | 05:11 PM - Posted by Anonymous (not verified)

Seriously, a Total War: Rome II benchmark??? The one game that has a bug that kills 20 FPS, and guess what the gain from the driver is in this bench: 20 FPS. Epic. "Based on our synthetic results the majority of these numbers are logical, but a few may leave you scratching your head. I'll try to fill in the gaps. A bug has been discovered with the latest Total War: Rome II update which kills CrossFire support. This means that 20 frames per second average was being produced on a single GPU. Hopefully AMD can work with the developers on a fix since that graphics engine gets very resource hungry at higher resolutions."

The full review can be found here: http://www.forbes.com/sites/jasonevangelho/2014/04/08/radeon-r9-295x2-re...

AMD said there was a glitch in Total War: Rome II, and you picked specifically this game. It makes me wonder: was it your pick or Nvidia's pick? Honest question, Ryan.

April 11, 2014 | 05:18 PM - Posted by Anonymous (not verified)

Just a reminder: that review was published 4 days ago and the glitch was known - any dual-GPU CrossFire or SLI setup gains 20 FPS from fixing the bug. What bothers me is that this is used to create the illusion that CPU overhead optimization produced the gain.
I really want to know if any other games were benched, and whether this game was suggested by Nvidia for the test or was just a random pick.

April 11, 2014 | 06:38 PM - Posted by Anonymous (not verified)

I tested a few games with my 980X and two 780s in SLI with boost disabled, and didn't get anything close to what they claimed. Maybe 2-5 FPS max.

Someone in another post made the good point that the driver seems to raise the threshold on Kepler boost speeds, resulting in higher clocks/FPS. Nvidia is talking out of its ass: the driver is pretty much making the GPU work harder to get more frames, not actually optimizing anything. This would explain why SLI is seeing the biggest gains over single GPU.

It would also explain why I didn't see any improvement with my cards, since I have boost disabled.

April 11, 2014 | 06:13 PM - Posted by Buyers

The most interesting claims from NVIDIA were spikes as high as 70%+ in Total War: Rome II, so I decided to start there.

NVidia made a performance claim about this game specifically, so Ryan tested that game to verify said claim. The answer to your question was right up there in the article.

April 11, 2014 | 07:04 PM - Posted by Anonymous (not verified)

Yeah, I saw that, but I guess it was just a happy accident.
Although I'm still wondering about single-GPU performance results, far away from the multi-GPU bug, and about a lower-end setup that isn't GPU-bound, to see the miracle of CPU overhead reduction from this wonder driver.

April 11, 2014 | 06:20 PM - Posted by Ryan Shrout

The fixed version of the game was used in both test sets here - I used the same version of the game with 335.23 and 337.50.

April 11, 2014 | 06:57 PM - Posted by Anonymous (not verified)

Sorry Ryan, this feels to me like a sponsored review, no offense.
I mean, having the results hide behind multi-GPU scaling and a bug, on a rig with a 3960X, at very high resolutions where the GPU is the bottleneck, and then inserting a paragraph about the Mantle API vs. Nvidia's driver-level bottleneck eraser...
Nothing in this article makes sense, sorry. If only you had waited for next week's results, or avoided bringing Mantle into the comparison. This feels weirdly steered in a specific direction, without anything convincing, and sketchy at best.

April 11, 2014 | 07:28 PM - Posted by Anonymous (not verified)

It's a common theme with Ryan.

Check out the latest podcast. In the 295X2 review discussion, Josh had to point out that Nvidia's frame variance has gotten worse. I think he said, "Someone threw up on your graphs and it wasn't orange."

Ryan was quick to say, "Oh, we talked to Nvidia and they are working on it." Not one article about it or anything. Compare that to the weekly updated articles about AMD.

April 11, 2014 | 07:49 PM - Posted by Rick (not verified)

It didn't always seem this way. Since frame pacing came out, PCPer has turned green. It's sad too, because this used to be a good, unbiased site. And this seems like another example.

April 12, 2014 | 12:49 AM - Posted by Allyn Malventano

Every time some new test method points out a flaw with a specific vendor, sites are accused of siding with their competition. You guys do realize that we reported on frame pacing issues with Nvidia cards *before* we reported on that (worse and longer-running) issue with AMD cards, right?

If Ryan were biased, I would not be working for him.

April 12, 2014 | 01:23 AM - Posted by Anonymous (not verified)

It's not about a new testing methodology. It's about consistency.

In order to have integrity you have to be consistent. If Ryan were to brush off the AMD pacing problems as quickly as he brushed off the results Josh pointed out to him, then fine - at least he would be consistent.

It seems he is more willing to take Nvidia's word than to follow up on it, unlike what he did with AMD.

That's neither integrity nor consistency; it reeks of bias and hypocrisy.

April 12, 2014 | 03:45 AM - Posted by JohnGR

This is typical of most sites lately: they are quick to be unforgiving when talking about AMD, but happy to go easier on Nvidia. To tell you the truth, that isn't strange. You have to be careful with Nvidia, because tomorrow they will still be here; but AMD, who knows? On the other hand, we all know by now, I guess, that Nvidia plays dirty while AMD takes a more naive approach. Say something bad about AMD and tomorrow the sun will shine again. Say something bad about Nvidia and then search for a deep hole to hide in.

April 14, 2014 | 08:29 AM - Posted by Anonymous (not verified)

I think you hit the nail on the head. But are these sites really to blame? I think AMD is too slack for not wielding its stick the way Nvidia does.

I also think favouritism is at play here. Most sites get caught out by not being consistent ("sorry, was busy") enough.

What shocks me is that he had time for SLI but not single-GPU. ROFLMAO

April 12, 2014 | 03:40 AM - Posted by JohnGR

I have told Ryan in the past that they are making it WAY TOO OBVIOUS. Maybe this is a necessary game you HAVE to play with big companies, either to grow bigger or just to survive the competition between hardware sites. But it is really easy to spot the articles that are biased, or that look more like press releases/paid advertising than objective coverage. There have been just too many of them lately.

For example, all those little videos promoting the 7850K as a gaming APU. Then, to balance the situation a little, you go out and propose gaming platforms for Titanfall where AMD is nowhere - absolutely nowhere - for a game that isn't CPU-dependent, if I'm not wrong, which means you could lower the cost of the platform by using AMD hardware and invest more in the GPU (AMD or Nvidia). Now you play Nvidia's marketing game with the "wonder driver", the "Mantle killer" driver, showing just one game and calling it impressive. We weren't born yesterday. I remember 40%+ gains from drivers a decade back, and generally there are double-digit gains in about every new driver out there in one or more titles. And are we really comparing this driver with Mantle using a 12-thread monster? This is ridiculous for a site that has proved many times that the people behind it are real experts. And of course the WHOLE article, with ALL the wonderful and amazing charts showing the massive performance improvements in Rome II, is visible without even needing to open it.

April 12, 2014 | 01:02 AM - Posted by Allyn Malventano

"It didn't always seem this way. Since frame pacing came out, PCPer has turned green. It's sad too, because this used to be a good, unbiased site. And this seems like another example."

Yeah, even our front page is just so green.

April 14, 2014 | 08:33 AM - Posted by Anonymous (not verified)

Is it me, or has the colour of the site gotten a bit greener to match Nvidia? ROFLMAO

April 13, 2014 | 06:19 PM - Posted by Jeremy Hellstrom

That was my comment about the orange barf. Then again, if our pointing out an issue with a driver we reported on in an article is proof we are covering up for NVIDIA, then I suppose your observation is not too surprising.

April 13, 2014 | 10:07 PM - Posted by Anonymous (not verified)

The proof is plain and simple - Nvidia provided it themselves:

NO SLI support for Total War: Rome II

SLI support for Total War: Rome II

Other tech sites have already pointed this out.

ExtremeTech - Nvidia’s questionable GeForce 337.50 driver, or why you shouldn’t trust manufacturer-provided numbers


Funnily enough, you see that huge performance boost for Total War: Rome II under SLI? That’s because Rome II didn’t support SLI until this version of the driver. Yes, that’s how Nvidia claimed an “up to 71%” performance boost for SLI in the 337.50 driver. Sneaky, eh? Kind of like saying “our new driver boosts performance by 71%… if you also throw in a second graphics card!”

So Ryan is either too lazy to even read the driver release notes, or incompetent enough to just take Nvidia's word for it.

April 11, 2014 | 05:44 PM - Posted by Anonymous (not verified)

Thanks Ryan for the article - disappointing, but I still appreciate the work.
To explain why I was disappointed, I will share a few reasons:
1. You picked a game with a bugged SLI/CrossFire profile (you probably didn't know about it).
2. The article focuses on the CPU overhead claim, yet the CPU is a 3960X instead of a lower-end CPU, which would show how efficient the driver really is (many people commented on this specific point on the first release news; I was hoping you would have taken them into consideration).
3. Reducing CPU overhead doesn't require SLI, and all the other benchmarks have shown that this driver is basically about multi-GPU scaling, nothing else.
So if I were you and wanted to investigate CPU overhead, I would have started by benching a single GPU with a low/mid-range CPU, and used those results to draw a conclusion about CPU overhead in this driver. Then I would have used SLI in a few games (you probably didn't know about the bug in Total War: Rome II) to draw a conclusion about multi-GPU scaling.
Honestly, I think this article is misleading on too many points, which strips it of any value as research or objectivity, and in the end I still have the same questions as the day Nvidia announced this driver (although other, more in-depth reviews have shown no CPU overhead reduction and good multi-GPU scaling, while this article addresses neither). I was hoping for an in-depth review from PCPer; I hope Ryan will still take my humble opinion into consideration and maybe investigate this further (and maybe take the article down for further work, because this one seems like a waste, sorry).

April 12, 2014 | 12:52 AM - Posted by Allyn Malventano

1. The newest version of the game was used for both sides of the comparison.

2. The game does not use all threads of the CPU; more specifically, the driver/API is usually a single thread.

3. CPU overhead is greater in SLI.
