NVIDIA GeForce Driver 337.50 Early Results are Impressive

Subject: Graphics Cards | April 11, 2014 - 03:30 PM
Tagged: nvidia, geforce, dx11, driver, 337.50

UPDATE: We have put together a much more comprehensive story based on the NVIDIA 337.50 driver that includes more cards and more games while also disputing the Total War: Rome II results seen here. Be sure to read it!!

When I spoke with NVIDIA after the announcement of DirectX 12 at GDC this past March, a lot of the discussion centered around a pending driver release that promised impressive performance advances with current DX11 hardware and DX11 games. 

What NVIDIA did want to focus on with us was the significant improvements that have been made to the efficiency and performance of DirectX 11. When NVIDIA is questioned as to why they didn't create their own Mantle-like API if Microsoft was dragging its feet, they point to the vast improvements that are possible, and have now been made, with existing APIs like DX11 and OpenGL. The idea is that rather than spend resources on creating a completely new API that needs to be integrated into a totally unique engine port (see Frostbite, CryEngine, etc.), NVIDIA has instead improved the performance, scaling, and predictability of DirectX 11.
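NVIDIA has not published what actually changed inside the driver, so as background only, here is a minimal C++ sketch of the standard DX11 mechanism for spreading submission cost across threads: deferred contexts recorded into command lists. This illustrates where DX11's CPU-side cost lives; it is not NVIDIA's implementation.

    // Sketch: DX11 multithreaded command submission (illustrative only).
    #include <d3d11.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    // Worker thread: record draw work into a deferred context.
    ComPtr<ID3D11CommandList> RecordWork(ID3D11Device* device)
    {
        ComPtr<ID3D11DeviceContext> deferred;
        device->CreateDeferredContext(0, &deferred);

        // ... bind state and issue Draw/DrawIndexed calls here ...

        ComPtr<ID3D11CommandList> cmdList;
        deferred->FinishCommandList(FALSE, &cmdList);  // FALSE: discard recording state
        return cmdList;
    }

    // Main thread: replay the prerecorded work on the immediate context.
    void Submit(ID3D11DeviceContext* immediate, ID3D11CommandList* cmdList)
    {
        immediate->ExecuteCommandList(cmdList, TRUE);  // TRUE: restore context state
    }

How much this model helps depends heavily on how efficiently the driver translates and parallelizes that submitted work, which is exactly the kind of CPU-side cost a driver update can attack without touching the game.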


NVIDIA claims that these fixes are not game specific and will improve performance and efficiency for a lot of GeForce users. Even if that is the case, we will only really see these improvements surface in titles that have addressable CPU limits or on very low-end hardware, similar to how Mantle works today.
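To make "addressable CPU limits" concrete: a frame is CPU-limited when the wall-clock frame time is much longer than the time the GPU actually spends rendering. Below is a rough C++ sketch of measuring GPU frame time with D3D11 timestamp queries; the structure and names are mine, not part of our test harness.

    // Sketch: measure GPU time per frame; compare against wall-clock frame time.
    #include <d3d11.h>

    double GpuFrameMs(ID3D11Device* dev, ID3D11DeviceContext* ctx)
    {
        D3D11_QUERY_DESC qd = {};
        ID3D11Query *disjoint, *tStart, *tEnd;
        qd.Query = D3D11_QUERY_TIMESTAMP_DISJOINT;
        dev->CreateQuery(&qd, &disjoint);
        qd.Query = D3D11_QUERY_TIMESTAMP;
        dev->CreateQuery(&qd, &tStart);
        dev->CreateQuery(&qd, &tEnd);

        ctx->Begin(disjoint);
        ctx->End(tStart);
        // ... render the frame here ...
        ctx->End(tEnd);
        ctx->End(disjoint);
        ctx->Flush();  // real code would poll a frame later instead of spinning

        D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj;
        while (ctx->GetData(disjoint, &dj, sizeof(dj), 0) == S_FALSE) {}
        UINT64 a = 0, b = 0;
        while (ctx->GetData(tStart, &a, sizeof(a), 0) == S_FALSE) {}
        while (ctx->GetData(tEnd, &b, sizeof(b), 0) == S_FALSE) {}
        disjoint->Release(); tStart->Release(); tEnd->Release();
        return dj.Disjoint ? 0.0 : double(b - a) * 1000.0 / double(dj.Frequency);
    }

If the CPU-side frame time dwarfs this GPU number, a driver that cuts API overhead has headroom to raise the frame rate; if the GPU number dominates, it doesn't.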

Lofty goals to be sure. This driver was released last week and I immediately wanted to test and verify many of these claims. However, a certain other graphics project kept me occupied most of the week and then a short jaunt to Dallas kept me from the task until yesterday. 

To be clear, I am planning to look at several more games and card configurations next week, but I thought it was worth sharing our first set of results. The test bed in use is the same as our standard GPU reviews.

Test System Setup

CPU: Intel Core i7-3960X Sandy Bridge-E
Motherboard: ASUS P9X79 Deluxe
Memory: Corsair Dominator DDR3-1600 16GB
Hard Drive: OCZ Agility 4 256GB SSD
Sound Card: On-board
Graphics Cards: NVIDIA GeForce GTX 780 Ti 3GB, NVIDIA GeForce GTX 770 2GB
Graphics Drivers: NVIDIA 335.23 WHQL, 337.50 Beta
Power Supply: Corsair AX1200i
Operating System: Windows 8 Pro x64

The most interesting claims from NVIDIA were spikes as high as 70%+ in Total War: Rome II, so I decided to start there. 

First up, let's take a look at results from the GTX 780 Ti, NVIDIA's flagship gaming card, running in SLI.

(Charts: Total War: Rome II, GTX 780 Ti SLI, 335.23 vs. 337.50.)

With this title, running at the Extreme preset, the GTX 780 Ti SLI setup jumps from an average frame rate of 59 FPS to 88 FPS, an increase of 48%! Frame rate variance does increase a bit with the faster average frame rate, but it stays within the limits of smoothness, if only barely.
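For readers wondering how "frame rate variance" is quantified: one simple way is the gap between a high-percentile frame time and the median across a captured run. A minimal sketch in that vein (not our exact Frame Rating pipeline):

    // Sketch: variance band from per-frame times (FCAT/FRAPS-style capture).
    #include <algorithm>
    #include <vector>

    // Gap between the 95th-percentile and median frame time, in ms.
    // The smaller the gap, the smoother the delivery at a given average FPS.
    double VarianceBandMs(std::vector<double> frameTimesMs, double pct = 0.95)
    {
        std::sort(frameTimesMs.begin(), frameTimesMs.end());
        double median = frameTimesMs[frameTimesMs.size() / 2];
        double high   = frameTimesMs[size_t(pct * (frameTimesMs.size() - 1))];
        return high - median;
    }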

Next up, the GeForce GTX 770 SLI results.

(Charts: Total War: Rome II, GTX 770 SLI, 335.23 vs. 337.50.)

Results here are even more impressive, as the pair of GeForce GTX 770 cards running in SLI jumps from 29.5 average FPS to 51 FPS, an increase of 72%!! Even better, this occurs without any increase in frame rate variance; in fact, the blue line of the 337.50 driver is actually performing better in that respect.

All of these tests were run with the latest patch on Total War: Rome II and I did specifically ask NVIDIA if there were any differences in the SLI profiles between these two drivers for this game. I was told absolutely not - this just happens to be the poster child example of changes NVIDIA has made with this DX11 efficiency push.

Of course, not all games are going to see performance improvements like this, or even improvements that are measurable at all. Just as we have seen with other driver enhancements over the years, different hardware configurations, image quality settings and even scenes used to test each game will shift the deltas considerably. I can tell you already that based on some results I have (but am holding for my story next week) performance improvements in other games are ranging from <5% up to 35%+. While those aren't reaching the 72% level we saw in Total War: Rome II above, these kinds of experience changes with driver updates are impressive to see.

Even though we are likely looking at the "best case" for NVIDIA's 337.50 driver changes with the Rome II results here, clearly there is merit behind what the company is pushing. We'll have more results next week!


April 11, 2014 | 03:46 PM - Posted by BBMan (not verified)

RTWII is GPU memory intensive and the game will complain if you don't have enough for all the battle objects that have to be rendered at your settings. I'm betting the improvements revolve around that. Still, I'm impressed, since this was done with cards < 4GB, which is what I would recommend for this title.

Thanks for the heads-up; will try it out soon.

April 11, 2014 | 04:19 PM - Posted by Anonymous (not verified)

Are you sure these Total War: Rome II gains aren't attributable to this being the first driver to include an SLI profile for the game?

If previous drivers didn't support SLI for Total War: Rome II, there wouldn't be SLI performance gains until this driver.

http://www.geforce.com/drivers/results/74636

• SLI Technology
  ◦ Total War: Rome II – added profile
  ◦ War Thunder – added profile
  ◦ Watch Dogs – updated profile
  ◦ Diablo III – updated profile

Did you check if the temperature threshold was increased in these drivers?

April 11, 2014 | 06:47 PM - Posted by Rick (not verified)

I agree with this. Nvidia is trying to pull a fast one on us. Ryan seems to be pushing their spin.

SLI was broken and they fixed it. Now they say, look at the low-level improvements we've made with this new driver.

April 11, 2014 | 04:34 PM - Posted by BagFullOfSharts (not verified)

This driver is a load of crap. All they did was enable proper SLI scaling for Total War: Rome II. I have a Gigabyte GTX 770 WindForce and the only thing it accomplished was driving up the temps. It would seem that most of what this driver does is allow the GPU to pull more power so it can try to boost higher.

In Battlefield 4 I lock my frame rate at 59.9 in user.cfg. Before this driver my card stayed in the high 60s and low 70s (Celsius); after installing it my temps jumped into the low 80s. I feel like Nvidia is lying about their vaunted DX11 optimizations.
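For reference, a cap like that is a single line in user.cfg; the exact variable name below comes from community documentation rather than this comment, so treat it as an assumption:

    gameTime.maxVariableFps 59.9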

April 16, 2014 | 01:30 AM - Posted by Anonymous Gerbil (not verified)

If the optimizations unblock a previously CPU-limited scenario because of draw calls or whatever, then yes, higher temps are to be expected.

More stuff for the GPU to work on > GPU works harder > GPU gets hotter.

April 16, 2014 | 04:35 PM - Posted by Anonymous (not verified)

If the GPU is working harder and reaching the threshold faster, it would throttle sooner and more often.

Nvidia would have to raise the thermal threshold, something that was noticed during testing with the latest drivers.


[H]ardOCP - AMD Radeon R9 295X2 Video Card Review:


On the temps, we have noticed with these latest drivers the 780 Ti is able to go up to 87c now, versus 84c with previous drivers. The observed clock speed also seemed slightly higher than we've experienced before, 1019MHz sustained clock speed on the GTX 780 Ti's while gaming. In the past, we have seen about 1006MHz clock speeds on these cards. This new 337.50 driver may have changed the thermal profile slightly to allow slightly higher clock speeds.

All these small percentage increases could be a result of the threshold being increased rather than any optimization at all.

335.23: 84C (normal, all previous drivers)

337.50: 87C (new thermal threshold)
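Back-of-the-envelope, the sustained-clock difference [H]ardOCP observed works out to (1019 - 1006) / 1006 ≈ 1.3%, which is the scale of gain the raised threshold alone would account for.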

April 11, 2014 | 04:38 PM - Posted by Searching4Sasquatch (not verified)

NVIDIA shared a TON of data in their article on GeForce.com. http://www.geforce.com/whats-new/articles/nvidia-geforce-337-50-beta-per...

If you look at the leaked preso they gave press, they give a ton of details there too.

BOTH drivers were tested with the latest Rome II patch. NVIDIA said the SLI profiles were identical between the two drivers. The only difference is that the new beta has the tweaks that relieve the CPU bottleneck so the two GPUs can be fed faster.

Thx for the free performance NVIDIA!

April 11, 2014 | 05:09 PM - Posted by Anonymous (not verified)

Yes, thank you Nvidia for only making us wait 4 years for a DX11 driver improvement.

DX11 came out in 2009. Nvidia realizes they can update their DX11 driver in 2014.

I look forward to their DX12-enhanced driver come 2019. I can't wait!!! Free performance 4 years later!!! Wooohooooo!!!!

April 12, 2014 | 11:32 AM - Posted by HeavyG (not verified)

Well, you can always use the old driver. Nobody is making you use this driver.

Sorry, but I get sick of gamers that complain when things get better for them. It is like nothing is ever good enough, and all you do is troll and look for something to complain about.

April 12, 2014 | 01:34 PM - Posted by Anonymous (not verified)

Good bench. Definitely this driver is kicking Mantle's ass on CPU overhead... on a 3960X, where the CPU is almost never the bottleneck. Just epic.

April 11, 2014 | 05:11 PM - Posted by Anonymous (not verified)

Seriously, a Total War: Rome II benchmark??? The only game that has a bug that kills 20 FPS, and guess what the gain from the driver is in this bench: 20 FPS. Epic. "Based on our synthetic results the majority of these numbers are logical, but a few may leave you scratching your head. I’ll try to fill in the gaps. A bug has been discovered with the latest Total War: Rome II update which kills CrossFire support. This means that 20 frames per second average was being produced on a single GPU. Hopefully AMD can work with the developers on a fix since that graphics engine gets very resource hungry at higher resolutions."

The full review can be found here: http://www.forbes.com/sites/jasonevangelho/2014/04/08/radeon-r9-295x2-re...

AMD said there was a glitch in Total War: Rome II, and you picked specifically this game. It makes me wonder: was it your pick or Nvidia's pick? Honest question, Ryan.

April 11, 2014 | 05:18 PM - Posted by Anonymous (not verified)

Just a reminder, this review was made 4 days ago and the glitch was known: any dual-GPU CrossFire or SLI setup gains 20 FPS by fixing the bug. What bothers me is that it's used to give the illusion that CPU overhead optimization produced it.
I really want to know if any other games were benched, and if this game was suggested by Nvidia for a test, or was just a random pick.

April 11, 2014 | 06:38 PM - Posted by Anonymous (not verified)

I tested a few games with my 980X and two 780s in SLI with speed boost off and didn't get anything close to what they claimed.
Maybe 2-5 FPS max.

Someone mentioned in another post, and had a good point, that the driver seemed to up the threshold on Kepler boost speeds, resulting in higher clocks/FPS.
Nvidia is talking out of their ass; the driver is pretty much making the GPU work harder to get more frames, not actually being optimized.
This would explain why SLI is seeing the biggest gains over single GPU.

That would explain why I didn't see any improvements with my cards, because I have speed boost disabled.

April 11, 2014 | 06:13 PM - Posted by Buyers

The most interesting claims from NVIDIA were spikes as high as 70%+ in Total War: Rome II, so I decided to start there.

NVidia made a performance claim about this game specifically, so Ryan tested that game to verify said claim. The answer to your question was right up there in the article.

April 11, 2014 | 07:04 PM - Posted by Anonymous (not verified)

Yeah, I saw that, but I guess it was just a happy accident.
Although I'm still wondering about single-GPU performance results, far away from the multi-GPU bug, and also about a lower-end setup that isn't GPU-bound, to see the miracle of CPU overhead reduction from this wonder driver.

April 11, 2014 | 06:20 PM - Posted by Ryan Shrout

The fixed version of the game was used in both test sets here. I used the same version of the game with 335.23 and 337.50.

April 11, 2014 | 06:57 PM - Posted by Anonymous (not verified)

Sorry Ryan, this feels to me like a sponsored review, no offense.
I mean, having the results hiding behind multi-GPU scaling and a bug, on a rig with a 3960X, at very high resolutions where the GPU is the bottleneck, then inserting a paragraph about the Mantle API vs. Nvidia's driver bottleneck eraser...
Nothing in this article makes sense, sorry. If only you had waited for more results next week, or avoided bringing Mantle into the comparison. This feels weirdly wheeled in a specific direction, without anything convincing, and just sketchy at best.

April 11, 2014 | 07:28 PM - Posted by Anonymous (not verified)

It's a common theme with Ryan.

Check out the latest podcast. In the 295X2 review discussion Josh had to point out that Nvidia's frame variance has gotten worse. I think he said, "Someone threw up on your graphs and it wasn't orange."

Ryan was quick to say, "Oh, we talked to Nvidia and they are working on it." Not one article about it or anything. Compare that to the weekly updated articles on AMD.

April 11, 2014 | 07:49 PM - Posted by Rick (not verified)

It didn't always seem this way. Since frame pacing came out, PCPer has turned green. It's sad too because this used to be a good unbiased site. And this seems like another example.

April 12, 2014 | 12:49 AM - Posted by Allyn Malventano

Every time some new test method points out a flaw with a specific vendor, sites are accused of siding with their competition. You guys do realize that we reported on frame pacing issues with Nvidia cards *before* we reported on that (worse and longer-running) issue with AMD cards, right?

If Ryan was biased, I would not be working for him.

April 12, 2014 | 01:23 AM - Posted by Anonymous (not verified)

It's not about a new testing methodology. It's about consistency.

In order to have integrity you have to be consistent. If Ryan were to brush off the AMD pacing problems as quickly as he brushed off the results Josh pointed out to him, then fine. At least he would be consistent.

Seems he is more willing to take Nvidia's word than to follow up on it. Unlike what he did with AMD.

That's neither integrity nor consistency; it reeks of bias and hypocrisy.

April 12, 2014 | 03:45 AM - Posted by JohnGR

This is typical with most sites lately. They are unforgiving when talking about AMD, but happy to be less harsh towards Nvidia. This isn't strange, to tell you the truth. You have to be careful with Nvidia because tomorrow they will still be here, but AMD, who knows? On the other hand, we all know by now, I guess, that Nvidia plays dirty while AMD has a more naive policy. Say something bad about AMD and tomorrow the sun will shine again. Say something bad about Nvidia and then search for a deep hole to hide in.

April 14, 2014 | 08:29 AM - Posted by Anonymous (not verified)

I think you hit the nail on the head. But are these sites really to blame? I think AMD are too slack for not pulling out their stick the way Nvidia do.

I also think favouritism is at play here. Most sites get caught out by not being consistent ("Sorry, was busy") enough.

What shocks me is that he had time for SLI but not single GPU. ROFLMAO

April 12, 2014 | 03:40 AM - Posted by JohnGR

I have told Ryan in the past that you are doing it WAY TOO OBVIOUSLY. Maybe this is a necessary game you HAVE to play with big companies, either to become bigger or just to survive the competition between hardware sites. But it is really easy to spot the articles that are biased, or that look more like press releases/paid advertising than objective reporting. They have become just too many lately.

For example, all those little videos promoting the 7850K as a gaming APU. Then, to balance the situation a little, you go out and propose gaming platforms for Titanfall where AMD is nowhere. Absolutely nowhere, for a game that isn't CPU dependent if I am not wrong, which means you could lower the cost of the platform by using AMD hardware and invest more in the GPU (AMD or Nvidia). Now you play Nvidia's marketing game with the "wonder driver", the "Mantle killer" driver, showing just one game and calling this impressive. We weren't born yesterday. I remember 40%+ gains from drivers a decade back. Generally there are double-digit gains in about every new driver out there in one or more titles. And are we really comparing this driver with Mantle using a 12-thread monster? This is ridiculous for a site that has proved many times that the people behind it are real experts. And of course the WHOLE article, with ALL the wonderful and amazing charts that show the massive performance improvements in Rome II, is visible without needing to open it.
Seriously????................

April 12, 2014 | 01:02 AM - Posted by Allyn Malventano

It didn't always seem this way. Since frame pacing came out, PCPer has turned green. It's sad too because this used to be a good unbiased site. And this seems like another example.

Yeah, even our front page is just so green:

April 14, 2014 | 08:33 AM - Posted by Anonymous (not verified)

Is it me or has the colour of the site also got a bit greener to match Nvidia? ROFLMAO

April 13, 2014 | 06:19 PM - Posted by Jeremy Hellstrom

That was my comment about the orange barf; then again, if our pointing out an issue with a driver we reported on in an article is proof we are covering up for NVIDIA, then I suppose your observation is not too surprising.

April 13, 2014 | 10:07 PM - Posted by Anonymous (not verified)

The proof is plain and simple. Nvidia provided the proof.

335.23: NO SLI support for Total War: Rome II

337.50: SLI support for Total War: Rome II

Other tech sites have already pointed this out.

ExtremeTech - Nvidia’s questionable GeForce 337.50 driver, or why you shouldn’t trust manufacturer-provided numbers

http://www.extremetech.com/gaming/180088-nvidias-questionable-geforce-33...

Funnily enough, you see that huge performance boost for Total War: Rome II under SLI? That’s because Rome II didn’t support SLI until this version of the driver. Yes, that’s how Nvidia claimed an “up to 71%” performance boost for SLI in the 337.50 driver. Sneaky, eh? Kind of like saying “our new driver boosts performance by 71%… if you also throw in a second graphics card!”

So Ryan is either too lazy to even read the driver release notes or so incompetent that he just takes Nvidia's word for it.

April 11, 2014 | 05:44 PM - Posted by Anonymous (not verified)

Thanks Ryan for the article; disappointing, but I still appreciate the work.
So, to explain why I was disappointed, I will share a few reasons with you:
1. You picked a game with a bugged SLI/CrossFire path (you probably didn't know about it).
2. The article focuses on the CPU overhead claim, while the CPU is a 3960X instead of a lower-end CPU, so we can't see how efficient it really is (many people commented on this specific point on the first release news; I was hoping you would have taken them into consideration).
3. Reducing CPU overhead doesn't require SLI, and all other benches have shown that this driver is basically multi-GPU scaling, nothing else.
So if I were you, and wanted to investigate CPU overhead, I would have started by benching a single GPU and a low/mid-end CPU, then with those results moved on to a conclusion about CPU overhead in this driver.
Then I would have used SLI on a few games (you probably didn't know about the bug in Total War: Rome II) to move on and conclude about multi-GPU scaling.
But honestly I think this article is misleading on way too many points, which strips it of any value as research or objectivity, and in the end I still have the same questions as the day Nvidia announced this driver (although other more in-depth reviews showed no CPU overhead reduction, and good multi-GPU scaling, while this article addresses neither). I was hoping to get an in-depth review by PCPer; I hope Ryan will still take my humble opinion into consideration, and maybe investigate this further (and maybe take the article down for further work, because this one seems like a waste, sorry).

April 12, 2014 | 12:52 AM - Posted by Allyn Malventano

1. The newest version of the game was used for both sides of the comparison.

2. The game does not use all threads of the CPU. More specifically, the driver / API is usually a single thread.

3. CPU overhead is greater in SLI.

April 12, 2014 | 01:46 PM - Posted by Anonymous (not verified)

1. Not when the driver fixes the multi-GPU scaling bug, where 20 FPS were being generated on the first GPU instead of the second; single-GPU results would still have been more relevant, far away from the shady state of SLI in this game.
2. Why use a 3960X to test overhead reduction? Isn't a low-to-mid-range CPU better, and much more relevant to the topic of the article?
3. Not when you bench at high settings and ultra-high resolutions, and the R9 295X2 shows how the 780 Ti SLI reaches its limits due to the smaller memory. Everyone has tested Mantle's overhead on a single GPU much more accurately and clearly; why are you failing to do the same with this driver?

April 11, 2014 | 05:59 PM - Posted by JohnGR

Here we go again with a new driver that fixes a game and is hailed as revolutionary. Impressive, at least.

Oh, oh wait. I forgot. It also fixes CPU bottlenecks when using a 12-thread ultra-fast Intel CPU.

April 11, 2014 | 08:32 PM - Posted by Lou (not verified)

I fail to see the point of so much bitching. You all sound like a bunch of old ladies that didn't get their complimentary lemonade at a bingo tourney. Wait for the damn drivers and test them out yourselves. Nothing in this "review" reads like it's set in stone, nor do I think that was the intention.

April 11, 2014 | 09:21 PM - Posted by Rick (not verified)

The drivers are out and the benches look nothing like the nvidia slides.

"All of these tests were run with the latest patch on Total War: Rome II and I did specifically ask NVIDIA if there were any differences in the SLI profiles between these two drivers for this game. I was told absolutely not - this just happens to be the poster child example of changes NVIDIA has made with this DX11 efficiency push."

Now Ryan, do you really believe that?

April 11, 2014 | 09:24 PM - Posted by renz (not verified)

And do you have solid proof that Nvidia is lying in this regard?

April 11, 2014 | 09:41 PM - Posted by Rick (not verified)

Solid proof, no. Common sense, yes.

You really think they pulled that off without changing the SLI profile? If there were such great improvements, where are they in non-SLI setups?

April 11, 2014 | 08:34 PM - Posted by Cristian (not verified)

NVIDIA is insulting us, at the least. Think a little about this. If there were no pressure from the competition (Mantle etc.) they would never have released these. Again, insulting to all NVIDIA customers.

April 11, 2014 | 08:36 PM - Posted by Edkiefer (not verified)

Check this link on tests; it proves there's lower CPU utilization.

http://www.computerbase.de/2014-04/geforce-337.50-cpu-skalierung-benchma...

Google English translation

http://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=en&i...

April 11, 2014 | 09:44 PM - Posted by Anonymous (not verified)

Don't make me laugh. CPU utilization? On SLI with a 3960X, at 2.5K/4K/6K resolutions, all maxed out? On what planet does the GPU not bottleneck first, leaving the CPU as the bottleneck?

April 11, 2014 | 09:52 PM - Posted by Edkiefer (not verified)

The link I posted is a single 780 Ti at 1920x1080 using 2 cores, 4 cores, and the full 6 cores.
The K in the graph is the core count.
It shows the lower core counts gaining more performance than the higher ones.

Sure, if you're at settings and in a game that is totally GPU-loaded, you're not going to see improvement, just like a faster CPU won't help in that circumstance.

April 11, 2014 | 10:32 PM - Posted by Anonymous (not verified)

Why use a 3960X to test overhead reduction? Why not use a low/mid-end CPU? Or do these drivers only work on a 3960X? I don't get it.

April 12, 2014 | 12:54 AM - Posted by Allyn Malventano

The link I posted is a single 780 Ti at 1920x1080 using 2 cores, 4 cores, and the full 6 cores.

The K in the graph is the core count.

It shows the lower core counts gaining more performance than the higher ones.

This is because the threads that are limiting the game are just that: threads. A given thread only runs on a single core.

April 12, 2014 | 01:06 PM - Posted by Anonymous (not verified)

Yes, I got that from the answer above; I thought it was resolutions XD. Because to me it doesn't make sense to use a thousand-dollar CPU with cores disabled, when each core outperforms those of all the other CPUs and has way too much memory behind it; it just doesn't give you accurate multi-core scaling performance, which is pointless in the end.

April 14, 2014 | 08:42 AM - Posted by Anonymous (not verified)

That's because of the higher clocks on the "lower end CPU", and the game doesn't use all the cores on the 6-core.

April 12, 2014 | 12:41 AM - Posted by Allyn Malventano

On this planet - and with that game apparently.

April 12, 2014 | 01:28 PM - Posted by Anonymous (not verified)

I guess that's sarcasm XD. OK, I deserved it.
Now, I still stand by my opinion. This article introduces a wonder driver coming to fight Mantle on its own territory, CPU overhead reduction. We all saw how professional all the sites became when it was time to disprove AMD's claim: everyone used low-to-mid-end CPUs for benching.
But when it's Nvidia that makes a crazy claim, everyone becomes an amateur, benching on a 3960X, going as far as disabling cores on this specific CPU to get some results, while most of the other sites who used different CPUs got the average performance of any other driver, or nothing (along with some new SLI profile additions).
I know the Mantle phenomenon is a good selling point right now, and any article about it gets good views.
I also understand that Nvidia needs to take some of the focus away from Mantle and try to undermine its qualities, for lack of a proper reply while DX12 is two years out.
But Ryan doesn't have to play Nvidia's game by misleading viewers. The driver itself is good enough, with good SLI scaling; why did he have to go and bring up Mantle in his article, using absolutely no objective bench to back him up, with a bugged game (fixed by the latest driver for SLI, which gains exactly around 20 FPS, as AMD said), and not even single-GPU results (Nvidia claimed a 60% gain with a single GPU in Rome II)? What, he couldn't disable a GPU and rerun the same test?
Seriously, does nothing of what I said make sense to anyone else at PCPer?

April 14, 2014 | 08:40 AM - Posted by Anonymous (not verified)

English Please

April 11, 2014 | 11:21 PM - Posted by Edkiefer (not verified)

Well, you could run socket 1150 with a dual-core i3 vs. an i5-4670K vs. an i7-4770, but that's more work.
Disabling cores is easier to test, and the driver installed for the hardware and OS stays the same.
The results should still make a good test.

April 11, 2014 | 11:45 PM - Posted by Anonymous (not verified)

Not the same.

i7, i5, and i3 are different. By disabling cores you still have clock and cache differences.

You're going to compare a 3960X (3.3-3.9GHz, 15MB cache) to a 2130 (3.4GHz, no turbo, 3MB cache) by just disabling cores?

That's just lazy and gives inaccurate results.

April 12, 2014 | 12:09 AM - Posted by Anonymous (not verified)

Honestly, I'm starting to think that most of the optimization, if there is any, is exclusive to the 3960X. Why is every benchmark using it to test the CPU overhead reduction in this driver? Not only does it not make sense, it looks really fishy that this many are using it. Who expects people to have a $1000 CPU?

April 12, 2014 | 07:24 AM - Posted by Edkiefer (not verified)

OK, I see your point and agree; it was the only test I had come across so far.
Maybe Ryan will test on broader system specs, then we will see.

But there are many who have reported increases, and they're not all top-end CPUs; many were 35xx and even older quad cores (Qxxxx).

They're coming out with a WHQL release with supposedly even more optimizations, so we will see.

April 12, 2014 | 03:10 AM - Posted by trent (not verified)

ExtremeTech published a very interesting article about Nvidia's Total War: Rome II claims. In the article they state that there was no SLI profile before the new driver, and that the 71% performance improvement exists because only a single GPU was ever being utilized before, meaning that SLI didn't work properly until now. Nvidia are dirtbags and I'm glad I sold my 780 and went with AMD.

April 12, 2014 | 03:16 AM - Posted by trent (not verified)

If there was an oversight, I'm not so sure Ryan was aware of it. If Ryan did speak with someone from Nvidia about the results, I believe they would purposely deceive him if they thought they could get some positive press and get away with it at the same time. Nvidia know how to market their products. If you remember the performance slides they released comparing the 780 Ti to the 290X before the 780 Ti launched, they showed the 780 Ti around 30% faster in all situations but failed to clarify that they were using reference 290X cards in quiet mode. Turns out it didn't quite destroy the 290X.

April 12, 2014 | 03:57 AM - Posted by Klimax (not verified)

Will there also be Civilization V? Either I am getting strange results (observing a bug or regression) or I can't read the log file.

April 14, 2014 | 07:06 PM - Posted by Anonymous (not verified)

Funny that you would bring that up.
The new Civilization: Beyond Earth will be running on Mantle.
Source: http://www.hardware.fr/news/13653/mantle-nouveau-civilization.html

April 12, 2014 | 06:20 AM - Posted by diggs (not verified)

The problems of the first world. I wonder what the third world worries about.

April 12, 2014 | 11:44 PM - Posted by BCOW (not verified)

They worry about having more kids and when the USA is going to send them more food. So let us worry about our graphics cards. As long as we send them food.


April 12, 2014 | 07:57 AM - Posted by Rick (not verified)

It is really this one paragraph I can't get by...

"All of these tests were run with the latest patch on Total War: Rome II and I did specifically ask NVIDIA if there were any differences in the SLI profiles between these two drivers for this game. I was told absolutely not - this just happens to be the poster child example of changes NVIDIA has made with this DX11 efficiency push."

Does anyone really believe that?

Say that is exactly, word for word, what Nvidia told Ryan; he wrote it without even questioning whether it's true? There is no sentence saying we need to look into this further to make sure it is true.

It's like a news reporter publishing a story without checking their sources' facts. It is wrong and irresponsible.

April 12, 2014 | 08:20 AM - Posted by Relayer (not verified)

Notice that Ryan never said it? Can't call him a liar if he never made the claim. Pretty bad though calling an nVidia marketing claim a review. Call it an advertisement, like it is.


April 13, 2014 | 11:21 AM - Posted by Ryan Shrout

Trust me, I questioned it. I have seen some stuff that leads me to believe it is true.

That being said, how can I "check my sources" in this case? With this kind of driver-level issue on the debate table, you pretty much can only believe or not believe what the company says.

April 13, 2014 | 01:08 PM - Posted by Anonymous (not verified)

Well Ryan, let me disagree with you!
You apparently heard of the issue with Rome II multi-GPU beforehand, so why didn't you supply further results on a single GPU, since Nvidia claimed 60% gains with it? It seems weird to me that you would stop at SLI without disabling a GPU for a single-GPU test, since everything was up and running.
And no, you don't have to believe what they say. That's why people need websites like yours to trust, for you to run tests and prove or disprove these claims; otherwise what's the point?
Nvidia's BS claim isn't multi-GPU scaling, but CPU overhead reduction in DX11; that's why you started the article by comparing to Mantle.
Are you telling me you don't know how to test whether this is effective in CPU-limited situations?
Run a single GPU and a low-to-mid-range CPU at lower resolutions, and prove or disprove the claim.
Then run SLI for multi-GPU scaling, and give us a conclusion about what this driver is really doing: reducing CPU overhead effectively (I very much doubt it), scaling multi-GPU well, or just enabling new profiles (probably the case).
But saying we can't really know anything beyond what they tell us, that's BS, sorry; you could have made a more relevant article about the driver if you wanted.
No one is asking you to take sides, just to be consistent and neutral.

April 13, 2014 | 03:31 PM - Posted by Anonymous (not verified)

So much for running a tech site which reviews hardware and has a wide variety of CPUs at its disposal to test.

Might as well just be a bulletin board for press releases if your stance is to just believe what they tell you.

April 13, 2014 | 04:09 PM - Posted by Edkiefer (not verified)

How about you wait for the full review before making assumptions?
He didn't even post the full results yet.

Wait, it doesn't matter, as I bet most will refute the results whatever they are.

April 13, 2014 | 04:41 PM - Posted by Anonymous (not verified)

This is exactly one of the points for which I didn't like the review: you feel like they picked the only questionable game for an SLI bench on a $1000 CPU to validate Nvidia's wrong claim. If only they had waited for the full results. Why specifically these, and why bring up Mantle's overhead if this bench doesn't represent that in any way?
The point is this article shouldn't have been up at all until the results were finished, or it should have commented on the results without comparing to Mantle, but Ryan chose neither.

April 13, 2014 | 05:11 PM - Posted by Anonymous (not verified)

Apparently it doesn't matter to him either.

All of these tests were run with the latest patch on Total War: Rome II and I did specifically ask NVIDIA if there were any differences in the SLI profiles between these two drivers for this game. I was told absolutely not - this just happens to be the poster child example of changes NVIDIA has made with this DX11 efficiency push.

Everyone able to read driver release notes knows 335 didn't include an SLI profile for Total War: Rome II.

He goes on to say.

Even though we are likely looking at the "best case" for NVIDIA's 337.50 driver changes with the Rome II results here, clearly there is merit behind what the company is pushing. We'll have more results next week!

A best case that consists of one driver which supports SLI and one that doesn't.

What kind of testing is that?

Single-GPU improvements should be measured in Total War: Rome II's case. With one driver having SLI support over the other, you're not measuring the improvement of one over the other, but non-SLI support versus SLI support.

To then hold it up as an example of improvement is disingenuous at best. It's not like this is Ryan's first run at testing SLI, which makes it even more disheartening that he would do this.

April 13, 2014 | 07:42 PM - Posted by Rick (not verified)

You questioned it, but it took you being called out on it to admit it.

Shouldn't you be upfront with your readers?

April 12, 2014 | 09:45 PM - Posted by Alejandro (not verified)

Hey guys, I've seen big improvements with 3D Vision too.
The shader cache is also working to cut loading times.

Can you also test CPU-bound scenarios with single GPUs? Using lower resolutions, etc.?

3D Vision results:

GTX 660 @ i5-2500K 4.3GHz

BF4, SP Baku, after the helicopter explodes the elevator and the hunt starts. Hiding behind the pillars, with the smoke ahead:
1600x900, Ultra except for "effects" on High, MSAA off.

Driver = 3D off; 3D on (100 depth)

335.23 = 73; 31
337.50 = 71; 46
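(If those numbers hold, that's roughly a 48% gain with 3D enabled, 31 to 46 FPS, while the 3D-off result is flat within run-to-run noise.)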

April 12, 2014 | 11:41 PM - Posted by BCOW (not verified)

Some of you guys need to get a f###ing life. You would think you could build a better graphics card yourself. If you're all so f###ing smart, go to AMD or Nvidia and build some cards. If not, shut up.

April 12, 2014 | 11:46 PM - Posted by BCOW (not verified)

Ryan, I think you guys are doing a great job. Don't listen to these jacka##es. All they do is get on here and cry about something.

April 13, 2014 | 01:41 AM - Posted by Anonymous (not verified)

Yeah, these guys are dumb.

They should all be thankful when a company deceives them.

The nerve of these people, spending their hard-earned money on a product and not wanting to be deceived by marketing tactics.

How dare these consumers ask for accountability for a company's claims.

April 13, 2014 | 03:58 PM - Posted by Anonymous (not verified)

Dat open advertisement for nVidia again :D
Even more obvious than the awesome Titanfall system guide bullshit :D

Congratz!

April 13, 2014 | 08:49 PM - Posted by ThorAxe

It cracks me up when a former AMD fan site (do you know what the original name of PCPer was?) is called an Nvidia fan site.

Guys get a grip and wait for the full review.

April 14, 2014 | 06:38 AM - Posted by nobodyspecial (not verified)

http://www.geforce.com/whats-new/articles/nvidia-geforce-337-50-beta-per...
Star Swarm looks good on the 780 Ti also; isn't this a Mantle game? :) Note it's ONE card, and look at the change, same with others. The 780 Ti went from 55 to 70 FPS, which allows it to beat MANTLE as shown in the graph. That kind of defeats all the whining over SLI here. Gaining close to 30% from a driver in a game AMD should dominate, putting you on top, is a pretty clear victory for DX11 IMHO. Note you can see Mantle's effects too, as AMD's regular driver suffers as shown; Mantle gave it a HUGE boost (from 32 FPS to 57 FPS). But apparently that won't be enough to stop NV's new drivers.

I'd rather have better drivers affecting all cards and all games than one API for a few cards from one vendor who happens to have 1/3 of the market vs. the other guy who has 2/3. The quicker that API dies the better. It is merely taking resources away from what they SHOULD have been spending on.

http://international.download.nvidia.com/geforce-com/international/image...

Those are single-GPU scores, and all are better than you can get from raising clocks from 1006MHz to 1019MHz. Having said that, I'm not really impressed they can get this, when it was AMD basically taking a year and a half to catch them that let them hold back the FULL GK110, with no driver improvements for an entire year until AMD released the NEVER SETTLE drivers in Nov 2012. As HardOCP showed reviewing the drivers (around 3/2013 I think), NV hadn't done ANYTHING until Never Settle came. Why would they? Wait for the competition to catch you before revealing your answer (even if you've had that answer the entire time; I'd keep it until you caught me). This is good business, and how you maximize profits (which is a business goal, correct?).

April 14, 2014 | 06:51 AM - Posted by JohnGR

CPU with 12 threads.
A faster and much more expensive GPU used anyway.
Aggressive driver optimizations specifically for that title.
A little (too much) marketing in it.

Results:
Successful with loyal fanboys. They think that they are already running DX12.

April 16, 2014 | 04:26 AM - Posted by Wendigo (not verified)

You are the fanatical and loyal fanboy (a rabid AMD fanboy, of course), making his day with this shit from your mouth (speaking about things that you don't understand or haven't seen for yourself).

You don't know anything about this driver. The better performance is in ALL DX11 games (great or little improvements, depending on the DX11 API's CPU usage in each game); these aren't title-specific optimizations:

http://wwwendigo.blogspot.com.es/2014/04/rendimiento-de-los-nuevos-drive...

It's a very obvious set of optimizations to the CPU usage of the driver, because it does best with the weakest CPU configurations, and it's a general optimization because it spreads across many DX11 games that aren't in the 337.50 driver changelog.

April 14, 2014 | 09:45 AM - Posted by Mac (not verified)

So the addition of an SLI profile is now being sold as DX11 overhead reduction?
Wow, just wow!

April 15, 2014 | 09:11 AM - Posted by 1STANCESTOR (not verified)

I got a GTX 750 Ti and updated to the 337.50 Beta at the same time. I keep getting total computer crashes while gaming, a couple of times here and there. Probably because it's beta; we'll see. I am VERY pleased with the performance I do get, though. I get a very smooth 38-40 FPS in Skyrim with ENB and no DOF. Great performance in all my games.

April 15, 2014 | 05:03 PM - Posted by BagFullOfSharts (not verified)

This driver wouldn't really have anything to do with that. Skyrim is a DX9 game and the 337.50 driver "improves" DX11. Unless there are undocumented improvements for the Maxwell architecture, both drivers should perform almost identically.
