
Frame Rating: GTX 970 Memory Issues Tested in SLI

Manufacturer: NVIDIA

At the end of my first Frame Rating evaluation of the GTX 970 after the discovery of the memory architecture issue, I proposed the idea that SLI testing would need to be done to come to a more concrete conclusion on the entire debate. It seems that our readers and the community at large agreed with us in this instance, repeatedly asking for those results in the comments of the story. After spending the better part of a full day running and re-running SLI results on pairs of GeForce GTX 970 and GTX 980 cards, we have the answers you're looking for.

Today's story is going to be short on details and long on data, so if you want the full back story on what is going on and why we are taking a specific look at the GTX 970 in this capacity, read here:

Okay, are we good now? Let's dive into the first set of results in Battlefield 4.

Battlefield 4 Results

Just as I did with the first GTX 970 performance testing article, I tested Battlefield 4 at 3840x2160 (4K) and utilized the game's ability to linearly scale resolution to help me increase GPU memory allocation. In the game settings you can change that scaling option by a percentage: I went from 110% to 150% in 10% increments, increasing the load on the GPU with each step.
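To put those scaling steps in concrete terms, here is a minimal Python sketch (an illustration, not part of our test harness) that computes the effective render resolution and pixel count at each setting, assuming the percentage is applied to each axis of the 4K base resolution, which is how Battlefield 4's resolution scale slider behaves:

    # Effective render resolution at each BF4 resolution scale step,
    # assuming the percentage is applied per axis to the 3840x2160 base.
    base_w, base_h = 3840, 2160
    for scale in (1.10, 1.20, 1.30, 1.40, 1.50):
        w, h = int(base_w * scale), int(base_h * scale)
        print(f"{scale:.2f}x -> {w}x{h} "
              f"({w * h / (base_w * base_h):.2f}x the pixels of native 4K)")

At 150% that works out to 5760x3240, roughly 2.25x the pixels of native 4K, which is why this setting is often described as "6K" rendering.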


Memory allocation between the two SLI configurations was similar, but not as perfectly aligned with each other as we saw with our single GPU testing.


In a couple of cases, at 120% and 130% scaling, the GTX 970 cards in SLI are actually each using more memory than the GTX 980 cards. That difference is only ~100MB but that delta was not present at all in the single GPU testing.
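For readers who want to log this kind of memory allocation data themselves, a minimal sketch is below. It polls nvidia-smi once per second for per-GPU memory usage; this is not necessarily the tool used for the charts above, and note that it reports driver-level allocation rather than how much of that memory a game is actively touching.

    import subprocess, time

    # Sample per-GPU memory usage (in MB) once per second via nvidia-smi.
    def sample_gpu_memory():
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=index,memory.used",
             "--format=csv,noheader,nounits"], text=True)
        return {int(idx): int(used_mb) for idx, used_mb in
                (line.split(", ") for line in out.strip().splitlines())}

    while True:
        print(time.strftime("%H:%M:%S"), sample_gpu_memory())
        time.sleep(1)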


Our performance data is broken up into two sets: the GTX 980s in SLI running at all five of our scaling settings and, separately, the GTX 970s in SLI running at the same five scaling settings. Plotting 10 sets of data on a single graph proved to be a bit too crowded, so we'll show the graphs successively to help you compare them more easily.
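As a rough illustration of that layout decision, the hypothetical matplotlib sketch below draws one frame-time plot per SLI configuration instead of putting all ten runs on one axis; the file names and CSV layout are assumptions for the example, not our actual capture files.

    import csv
    import matplotlib.pyplot as plt

    def load_frame_times(path):
        # One frame time (ms) per row, as exported from a capture tool.
        with open(path) as f:
            return [float(row[0]) for row in csv.reader(f)]

    fig, axes = plt.subplots(2, 1, sharex=True, sharey=True, figsize=(10, 8))
    for ax, config in zip(axes, ("GTX 980 SLI", "GTX 970 SLI")):
        for scale in (110, 120, 130, 140, 150):
            name = f"{config.lower().replace(' ', '_')}_{scale}pct.csv"  # hypothetical file
            ax.plot(load_frame_times(name), linewidth=0.8, label=f"{scale}% scaling")
        ax.set_title(config)
        ax.set_ylabel("Frame time (ms)")
        ax.legend(fontsize=8)
    axes[-1].set_xlabel("Frame number")
    plt.tight_layout()
    plt.show()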


Unlike our first sets of results, the SLI numbers are ALMOST in a playable state, making them much more real-world than before. The first thing I noticed when compiling this data was that the GTX 980 cards in SLI actually had a couple more downward spikes in frame rate at 150% scaling than the GTX 970s did. I confirmed this was a consistent pattern by re-running tests on both sets of hardware about six times, and the bright green line you see in the first graph above is actually one of the better results for the 980s.

It appears, though, that moving from 110% to 150% scaling results in the expected frame rate decreases in both configurations.


Average frame rates are where we expect them to be: the GTX 980 SLI is faster than the GTX 970 SLI by fairly regular margins.

Resolution Scale    GTX 980 SLI    GTX 970 SLI    % Difference
1.10x               47.3 FPS       41.0 FPS       -15%
1.20x               41.1 FPS       35.8 FPS       -15%
1.30x               35.4 FPS       31.2 FPS       -13%
1.40x               31.0 FPS       27.7 FPS       -12%
1.50x               27.7 FPS       24.6 FPS       -13%

The percentage differences here are actually more consistent than in the single GPU results, which is a bit of a surprise to us. The GTX 970s in SLI run 12-15% slower than the GTX 980s in SLI, but as we know from our years of GPU evaluation, that isn't the whole story.
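For clarity on how that column is calculated, the percentages in the table appear to be expressed relative to the GTX 970 result, i.e. how much faster the GTX 980 SLI setup is than the GTX 970 SLI setup at each step. A quick check under that assumption:

    # Reproduce the "% Difference" column, assuming it is the GTX 980 SLI
    # advantage expressed relative to the GTX 970 SLI average frame rate.
    results = {  # scaling: (GTX 980 SLI FPS, GTX 970 SLI FPS)
        "1.10x": (47.3, 41.0), "1.20x": (41.1, 35.8), "1.30x": (35.4, 31.2),
        "1.40x": (31.0, 27.7), "1.50x": (27.7, 24.6),
    }
    for scaling, (fps_980, fps_970) in results.items():
        gap = (fps_980 - fps_970) / fps_970 * 100
        print(f"{scaling} scaling: GTX 970 SLI trails by {gap:.0f}%")  # 15, 15, 13, 12, 13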


It should be painfully obvious that at 150% scaling the GTX 970s in SLI have a significant amount of frame time variance that is largely absent with the GTX 980s in SLI. Even at 140% scaling, looking at the thinner gray line, you can see differences in the behavior of the frame time lines between the two graphs.


Before you start eyeballing these graphs, make sure you take note of the slightly different y-axis on the left-hand side of each - that's important. Focusing on the highest 10% of frame times in our testing, the GTX 970s clearly exhibit more issues than the GTX 980s. The flagship cards only see about 5ms of variance at the 90th percentile and only cross the 20ms mark somewhere around the 96th-97th percentile. The GTX 970s in SLI, though, reach those marks much sooner - at the 90th percentile we see as much as 18ms of frame variance, and at the 96th percentile that reaches as high as ~40ms.
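For anyone who wants to run a similar percentile analysis on their own frame time captures, here is a rough sketch. The exact variance metric used by the Frame Rating pipeline may differ; this version simply measures each frame's deviation from a short running average, which is one reasonable approximation.

    import statistics

    def frame_time_variance(frame_times_ms, window=20):
        # Deviation of each frame time from the mean of the preceding frames
        # (an approximation of a frame variance metric, not necessarily the
        # exact one used by the Frame Rating tools).
        variances = []
        for i, t in enumerate(frame_times_ms):
            recent = frame_times_ms[max(0, i - window):i] or [t]
            variances.append(abs(t - statistics.fmean(recent)))
        return variances

    def percentile(values, pct):
        ordered = sorted(values)
        return ordered[min(len(ordered) - 1, int(pct / 100 * len(ordered)))]

    # Example: report the 90th and 96th percentile variance for a capture.
    # times = load_frame_times("gtx970_sli_150pct.csv")  # hypothetical capture file
    # var = frame_time_variance(times)
    # print(percentile(var, 90), percentile(var, 96))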


February 2, 2015 | 07:53 PM - Posted by Amdbumlover (not verified)

Under closing thoughts, second paragraph.

February 2, 2015 | 08:31 PM - Posted by Angry

Hmmm.....im still really on the fence between a pair of GTX 970s and another GTX 770...for 1440p.

*sigh*

Thank you for the very informative article. Much appreciated.

April 10, 2015 | 01:32 PM - Posted by Nostrildamus (not verified)

I have 2x770 and a 1440p monitor. For some games, it's mostly okay, but for non stuttering excellence, it's just not enough. I will go for 2x970.

February 2, 2015 | 08:40 PM - Posted by Jappetto (not verified)

So basically you're telling us what we knew all along. The 970s perform worse than the 980s. Is there any chance we can see some frame time comparisons with the 290/290x in xfire? From a consumer standpoint, I think that's the most important question that needs to be answered.

February 2, 2015 | 09:43 PM - Posted by Ophelos

No need to compare AMD VRAM vs Nvidia.. Since AMD does have much better VRAM @ 4/8GB 512-bit (which is why AMD needs more power than Nvidia). Nvidia just has the better GPU, and you can really notice this difference in 4K res benchmarks.

February 2, 2015 | 09:56 PM - Posted by Jappetto (not verified)

Oh, sorry, I forgot. AMD doesn't have problems with stutter on dual card setups.

February 2, 2015 | 10:03 PM - Posted by Ophelos

stutter is always going to be a problem in games no matter what we do. So i don't worry about it much.

February 3, 2015 | 12:28 AM - Posted by arbiter

AMD doesn't have a problem? I guess you forgot about how many years AMD had a CF problem yet completely ignored all the people complaining about it, before a tool that was developed by Nvidia of all people was released and revealed the extent of the problem with an AMD CF setup. So AMD isn't exactly a clean slate on their side.

February 3, 2015 | 01:40 AM - Posted by ThorAxe

I think Jappetto was being sarcastic. :)

April 22, 2015 | 04:23 PM - Posted by Anonymous (not verified)

Right, when it works, Crossfire just runs like shit.

February 3, 2015 | 07:16 PM - Posted by Johnny Rook (not verified)

I completely agree.

These tests and results are really in a vacuum!

In light of the results above, I could very well state that my girlfriend's Gigabyte GTX 770 4GB SLI setup runs the same games under the same conditions better than a GTX 970 SLI setup because the GTX 770 in question has a REAL 4GB of VRAM, and everybody would believe me if they didn't know better!

We need AMD R9 290(X) frame-pacing results.

AMD is so confident about their 4GB of VRAM on a 512-bit bus, I don't see why not put them to the test, if only just to see R9 290(X) CF crush GTX 970 SLI in the same frame-pacing benchmarks.

February 4, 2015 | 09:55 PM - Posted by Anonymous (not verified)

All real-time gaming benchmarks I've seen at high res have shown 970 SLI and R9 290X to be within about 4 fps of each other at 4K. Hardly measurable. And you can run 970 SLI on a 600-watt PSU; you need a minimum of 800 for R9 290X, 1000 if you're using an AMD CPU as well. Let's be realistic, shall we.

February 2, 2015 | 08:59 PM - Posted by JohnGR

Someone orders a pizza and it comes with only 7 slices instead of the 8 as advertised. But 7 slices are more than enough for most people to satisfy their hunger, so we should forget that the pizzeria lied, and only remember that "it offers a tremendous level of performance, some amazing capabilities courtesy of the Maxwell GPU and runs incredibly efficient at the same time."

The review is excellent as usual and really informative. But the press keeps using the card's performance to cover up the real problem here. A company lied. If we were talking about that other poor company, no performance results would have been enough to cover the scandal. We all saw that when Hawaii came out and almost no one from the press was paying attention to the performance of the card. All were looking at the GPU speed and pointing fingers at AMD.

February 2, 2015 | 09:40 PM - Posted by Ophelos

Welcome to Nvidia AD supported reviews sites. They don't care about AMD at all really.

February 3, 2015 | 12:46 AM - Posted by Anonymous (not verified)

All the websites get ad revenue from all the makers of PC/laptop/mobile parts; it's the whole market that needs to be looked at by the FTC. The companies have much control, even more so than just advertising: they have the review samples, in addition. Some web pages are nothing more than generators of sponsored content, and certain writers do nothing other than write poorly reworded content from the marketing copy, straight from the marketing departments of the sponsors that provide them. When a writer reviews a product, really reviews a product, look for much comparison, and contrast, with that product's competition, be it GPU, CPU, or whatever.

A shining example of the control big companies have over the online "journalists" is the recent reviews of Intel's latest server SKU offerings. There were many enthusiast web sites mentioning Intel's server SKU, but there was no comparison of the Xeon with the SPARC, or the Xeon with the Power8, and very few to no server benchmarks were used to compare the competing products. Intel gets a lot of "mind share" by getting these friendly blurbs out there; it's easy to tell, because the article (Ad, really) makes no mention of the competition.

This type of sponsored content, that is not listed as such, is what the FTC needs to be on the lookout for, that and the tech writer hacks who do little more than repackage the marketing copy and word it so that the readers think they are getting an objective description. If you are reading a "review", look for the necessary comparison and contrast with the competing SKU/s, among other more complete information; the single item reviews are OK if links are always provided to a review of the competing product, preferably from more than one source. For sure these sites review products and have a business relationship with the makers of the products that they review, and it's this one-hand-washes-the-other relationship that will force readers to always have to read as many different articles on the product as possible, that and waiting on the newest products until there is enough user feedback on the blogs to be able to clearly see what the product really can do.

February 4, 2015 | 03:19 PM - Posted by Anonymous (not verified)

Well I made a bet and won. The bet was that PCPER, Techreport, TechPpowerUp and maybe even AnandTech will eventually test the GTX 970 SLI when it starts to get too obvious (Nvidia BIAS) but will exclude the R9 290X/290/295X, so the GTX 970 SLI don't look as bad as they really are.

So predictable its not even funny anymore.

February 6, 2015 | 11:08 AM - Posted by arbiter

Back when 970 was new they compared them to 290's and did SLI testing. So go search for them first before acting like a dumbass with no damn clue about anything.

February 3, 2015 | 02:27 AM - Posted by ThorAxe

The petition to Nvidia for a refund hasn't even reached 10,000 yet.

Given that Nvidia has sold well over 1 million of these cards I think it's safe to assume that the vast majority of people don't care and will likely never run into the issues requiring over 3.5GB of RAM or more ROPs.

February 4, 2015 | 03:21 PM - Posted by Anonymous (not verified)

Nope, most people/kids still don`t know

February 6, 2015 | 11:09 AM - Posted by arbiter

As the story reports, the card works fine until you start pushing really high resolutions with really high graphics settings. Generally, at the settings in question, the fps gets to a point that is pretty low and unplayable anyway.

March 10, 2015 | 10:50 AM - Posted by aenews (not verified)

Not if you're using SLI @4K. I actually have a 3-Way setup.

February 9, 2015 | 02:26 PM - Posted by ProMace (not verified)

My thought exactly. I visited that petition site a couple of times and the number of participants is completely negligible in a relative sense.

August 27, 2015 | 11:35 AM - Posted by Anonymous (not verified)

A couple of years ago, games would fit on a single DVD; now 15-16GB seems a common affair. So let me ask you a question: are you ready to invest that much money in a GTX 970, knowing full well that it is not future proof? Plus, had Nvidia limited the card to only 3.5GB and removed the slower memory altogether, then it would have been a much wiser move. But that's not what they did, and in a year or two when you see games with higher requirements (say Crysis 4 or Far Cry 5) and 4K becoming more common, you'll rue your decision to buy this blasted card, cause every time you exceed the 3.5GB limit it is gonna stutter!

February 8, 2015 | 07:42 PM - Posted by Anonymous (not verified)

What a lame statement. I have an AMD card and the video driver continues to crash. AMD sucks big time.

February 2, 2015 | 10:31 PM - Posted by DJF (not verified)

The pizza has 8 slices, it's just that the crust at the edges is thicker and takes longer to chew than what you expected; just like the 970 has 4 gigs, it's just that 1/8 of it is slower than expected.

February 3, 2015 | 12:10 AM - Posted by Topinio

The pizza has 8 slices, just one of them's no good for eating, it's really chewy and maybe tastes odd. You bought it instead of a pizza from a different place which is a bit more hit and miss but would've definitely given you the 8 slices you ordered, charged you a bit less, and thrown in some onion rings and a coke.

Some people might be pissed off. Others with less of an appetite mightn't care, they'd've only put the 8th slice in the fridge anyway.

February 3, 2015 | 08:26 PM - Posted by Anonymous (not verified)

In this case it's more like the pizza comes with the advertised 8 pieces but the 8th piece has nothing on it.

February 4, 2015 | 10:41 PM - Posted by Anonymous (not verified)

Haha, I like the analogy; I think it's off, but I like pizza... The problem with the analogy is you didn't get 7/8ths of a GPU. Better would be: you ordered a pepperoni pizza that advertised 8 pieces of pepperoni on each slice, and upon getting it you counted and maybe it only had 7 pieces per slice, or maybe it had 8 but one of the 8 was unusually small and it was a bit debatable whether or not it was fair to count it as a piece of pepperoni at all. Even though it may technically be a piece of pepperoni, it doesn't really add much to the pizza, being so tiny. So you're a bit annoyed, but even when eating the pizza you're having a hard time noticing the difference. Maybe you go make a fuss about it on principle, but for anyone who just wants some good pizza nothing has really changed.

February 13, 2015 | 03:29 PM - Posted by Thiago Sestini (not verified)

I think you are looking at this in a wrong light. I believe most people buy video cards based on performance reviews rather than on what the manufacturer states. I bought a card that produces X average frame rates in a given title and Y minimum frames in a certain title, for instance.

So I don't care if the pizza was cut in 7 or 8 slices, it is still the same amount of pizza.

Obviously, Nvidia publicised wrong information about their product, which is in itself a crime against its customers and should be pursued.

The reason many people don't really care about this is that the cards perform the way they knew they would before they bought them.

February 2, 2015 | 08:54 PM - Posted by Vertek (not verified)

Many users have been complaining for many months about GTX 970 stutter behavior in modded Skyrim, Shadow of Mordor and other games. PCPer was asked again and again to test the 970 with FCAT to prove it has a bad design for new high-texture games.

The big question is: when will PCPer bring in Nvidia executives, with live questions from users, to answer for the FRAUD and LIES about the "same memory subsystem as GTX 980", the cache size, and the ROP readings from GPU-Z?

When will Nvidia PAY refunds for telling lies to PCPer and to users?

February 2, 2015 | 09:09 PM - Posted by JohnGR

The worst for Nvidia and its partners is still in front of them. It seems that Nvidia doesn't understand that. They think that people will be persuaded that their cards are as good as they should be, no matter the false specs.

Well, there are cases where people returned their cards. But most people think differently. Why take the card and return it today, when the only real options are a 290X or paying more for a 980? Why not keep it for a few more months? In a few months from now the new AMD cards will be out and possibly the new Nvidias too. Then at $350 there are going to be options that will be clearly faster than the 970. The cards will still be under warranty and the fraud will still be fraud. Then I expect a second wave of people returning their cards, much bigger than the one we see today. That's when Nvidia will realize that trying to twist reality by giving directions to the press to focus on the cards' performance was a bad idea.

February 2, 2015 | 10:21 PM - Posted by Anonymous (not verified)

Yeah sure, because Nvidia will allow you to return your card forever. Either you are upset and return it now, or you aren't upset enough and keep the card. Once you keep the card you are stuck with it, and rightly so. You can't have your cake and eat it too.

P.S. Nvidia clearly made a mistake, perhaps with evil intention, probably not. This still is far from fraud. LOL. Now flame me if you must and call me an Nvidia employee.

February 2, 2015 | 11:01 PM - Posted by Anonymous (not verified)

You're under a misapprehension that it's up to Nvidia to accept a refund or not. In Europe and Australia, retailers have to accept those returns no matter what. There's no statute of limitations on returns for false advertising. I feel bad for them, but the one that screwed them was Nvidia, not the consumer.

There's a storm brewing, and believe me it'll hit landfall when AMD's 3xx series comes out. I should know, because I'm waiting for this exact situation to return my 970.

February 3, 2015 | 01:12 AM - Posted by JohnGR

Wrong. No one is forcing you to watch technology news all the time, and I bet Nvidia and the shops didn't contact all the customers to inform them about that "mistake" by email or phone. So someone can go to a shop in 3 months and say:

"I just found out. This product is still under warranty and in perfect condition. I haven't misused it in any way. But the specs are not as advertised. You lied to me. I want a refund".

February 3, 2015 | 02:28 AM - Posted by Mark D (not verified)

I tried making this argument to MSI and it went nowhere. They flat out refused to refund my card or trade up to a 980. Same with newegg. NVIDIA was the one that made the mistake here, and it's up to them to make good on this.

February 5, 2015 | 05:22 AM - Posted by Earnest Bunbury

I thought about a charge back on my credit card... dunno if it would work or if I would be able to order from Newegg ever again... which I probably won't as their return policy is crap compared to Amazon.

February 6, 2015 | 11:13 AM - Posted by arbiter

If you have the card in your possession and do that without even trying to send the card back, that would constitute fraud, and you could be looking at criminal charges, not to mention what they could do to your credit rating.

February 22, 2015 | 09:13 PM - Posted by Anonymous (not verified)

You have a card that was sold to you using fraudulent specs. Sue the retailer if they won't give you your money back.

February 3, 2015 | 09:40 PM - Posted by Klimax (not verified)

Not even in the EU. Unless you use the 2-week return period (or however long it is in a particular country, as this can vary between member countries) there is no chance with warranty, as there is no defect, and the majority of user-facing materials (unless listed explicitly by the shop or vendor) didn't carry the affected data (L2/ROP count), and thus there is no right to a warranty return. It's up to goodwill on the side of the shop or vendor.

Therefore there is no requirement to contact all buyers. Nothing relevant has changed or affected the hardware... (no defect like the Phenom's TLB bug, or the transactional memory bug, or, say, improper solder on GPUs)

February 4, 2015 | 12:00 AM - Posted by JohnGR

You do have 8 defective ROPs compared with the original specs, but anyway, you might be right. We will see.

February 5, 2015 | 07:47 PM - Posted by Anonymous (not verified)

At least in Sweden you can "reklamera" (make a consumer-law claim on) an electronic product for up to 3 years. If it fails before those 3 years are up because, for example, some capacitors fail, you can point out that it was faulty from the beginning; for the first 6 months it is up to the reseller to prove it was you who broke it, and for the last 2.5 years it's up to you to prove the problem was there to begin with (capacitors are an easy example for getting a refund or an equivalent product). So since Nvidia lied about the specs and the memory, you can "reklamera" the card, because under the consumer laws here it counts as a defective product: you did not get exactly what you paid for (false advertising). The reseller here has 3 options:

1. replace it with something that has the right specs (not really possible in this case, since all 970s suffer from this problem),

2. give a refund,

or 3. repair it (again not really an option).

Though I hear many have gotten the option to upgrade to a GTX 980 at a discount :) And while I wouldn't want to give Nvidia more money atm, I have tried out the new GTX 980 G1 Gaming edition and it overclocked like a beast, 1570 core clock and 7940 MHz memory. It was an extremely nice card, but I was too curious about the new memory system on the 380, so I decided to wait and sent my GTX 980 back since I want more powah!

So right now I'm waiting for the 380X/390X and Nvidia's GM200, though the GM200 will have to perform at least 30% better for me to even start looking at it, since I'm a bit apprehensive about giving Nvidia my money atm :)

February 2, 2015 | 09:49 PM - Posted by Anonymous (not verified)

How many days did it take to see 4GB 760GTX after launch?
Answer: 1 day

The GTX 960? Still waiting.

Where is our GTX 970 exchange process?

I haven't bought ATI for the past 15 years because i felt they screwed me over with their customer support.

Nvidia is playing with fire.

February 2, 2015 | 10:14 PM - Posted by Mark D (not verified)

Guys, this is fantastic data, but it would be so much better if you use more contrasting colors than slight variations of blue and green in future articles. I can hardly tell which line is which. It would be much easier to eyeball whats going on if the 970s were shades of red, the 980s shades of blue, etc.

As far as the meat of the article - I still have questions. So the conclusion here is basically that it doesn't matter. I guess what I'm wondering, is how is that even possible? There's no question that the last segment has less bandwidth. What if it were 1GB in the slower partition? 2GB? It can't be that bandwidth doesn't matter at all, so how are they basically breaking the rules like this?

Also, what happens with games like shadow of mordor that use the VRAM as a buffer for textures, etc? In some sense by inflating VRAM with a huge frame buffer you've created a scenario that doesn't reflect reality, especially in the single card tests. No one is running BF4 at 6K on a 970. No one. What I'm most curious about is what happens 1-2 years from now when we're dealing with games that run in 1080p and still easily soak up 4GB of VRAM with epic amounts of textures, shaders, etc... If this is based on an algorithm figuring out what to keep in fast memory and what to keep in slow memory, what happens when NVIDIA stops supporting that algorithm for new games?

February 2, 2015 | 10:29 PM - Posted by Anonymous (not verified)

The assumption is that in a few years, if games really take 4GB of VRAM at 1080p or some other resolution, the performance will be comparable to what is shown by these tests. What will actually happen, nobody knows.
Even now games handle their memory differently from each other and will keep doing that in the future. Since Nvidia supports their products for quite a long time with new drivers, I don't see why they should stop evolving the algorithm, if there truly is one. The hypothesis was that the OS handles the matter.

February 2, 2015 | 11:18 PM - Posted by Mark D (not verified)

In that case I think it's worth investigating what happens with the games that people have reported issues with. I think this article clearly shows that the memory segmentation can handle huge framebuffers, and can deal fairly effectively with large textures...in linear games that probably never refer back to cached textures. The games I'm most concerned with are open world games where there is a much larger degree of random access. In my recent experience with Far Cry 4, Watch Dogs and AC IV, there was a persistent, unavoidable stutter. Maybe that's just Ubisoft...but maybe it was my 970 too. I'm still very suspicious that this memory segmentation is causing ill effects in certain games.

I bought my 970 on launch day so I'm way outside of the retail return window. Had I known it had a bizarre memory setup I might have sprung for the 980, so I'm very, very frustrated with NVIDIA's lack of response on this. Whether or not it's a major issue, this wasn't something that should have been withheld from buyers, and common sense tells me you can't cut bandwidth and not suffer in some way. I stick with a single card because I'm completely turned off by the SLI microstutter - so I very much feel like I've been directly cheated by this misinformation if I'm stuck with a card that's prone to stuttering, even in rare situations.

February 2, 2015 | 11:08 PM - Posted by nevzim (not verified)

In 2 years most AAA games will require 8GB so 3.5 or 4GB makes no difference.

February 23, 2015 | 11:47 PM - Posted by Anonymous (not verified)

In 2 weeks you wont need any card! Think Grid.

February 2, 2015 | 10:25 PM - Posted by Fishbait

Thank you for the excellent review Ryan. I feel very well informed about the limitations of the 970GTX and do not think that I will foolishly put them in a configuration in which they would fail.

My only disappointment was for COD at 1440p, I could care less about the game but I'm wondering if it isn't badly optimized as well. I enjoy playing BF4 and the functionality and features of the Frostbite engine are great, then I see this review for COD performance and think "what could possibly be taking this many resources?"

Thanks again Ryan :D

February 3, 2015 | 12:03 AM - Posted by Anonymous (not verified)

Different game engine.

[Forbes] 'Call Of Duty: Advanced Warfare' Will Feature Brand-New Engine
http://www.forbes.com/sites/davidthier/2014/05/06/call-of-duty-advanced-...

Anyone who thought that the Call of Duty game engine has started to show its age will finally, at long last, be getting a reprieve. The series ran the same (albeit improved) engine throughout the entirety of the Xbox 360/ PS3 generation, and those fans that notice this sort of thing took that as one of the biggest problems with the series — that Activision was essentially re-skinning the same game over and over. Sledgehammer Games’ Call of Duty: Advanced Warfare is the first title in the series built specifically for the Xbox One and PS4, and it’s going to be powered by brand-new tech.

February 6, 2015 | 11:17 AM - Posted by arbiter

It's not foolish in the way it was done, in that they had to push the game's settings to such an extent to show where that fail limit is. Most people don't run BF4 at what is pretty much Ultra settings @ 6K resolution. It just shows that the issue is not as big as a lot of people have claimed, and most gamers who buy, say, a single 970 for 1080p gaming won't see an issue. 970 SLI, yeah, they are more likely to see a problem, but it still takes a fair bit, at least with current games, to make it fail.

February 2, 2015 | 11:46 PM - Posted by Anonymous (not verified)

The market for used gtx 970 gpu's remains very strong on eBay, so the "loss" seems to be in the 20% range including selling fees. The more difficult dilemma remains whether to spend $550 for a gtx 980. If you are currently experiencing issues, it may justify trading up, otherwise save your powder for newer technology down the road.

February 3, 2015 | 01:21 AM - Posted by Mandrake

Very informative article Ryan, thanks. Still on a GTX 780 here. I'm thinking of getting another for SLI while they're still available.

The alternative is to wait for a GM200 behemoth card like the rumored Titan X/II.

February 3, 2015 | 01:35 AM - Posted by Anonymous (not verified)

Some users reported "stuttering" (more than usual) in multi-GPU setups with the GTX 970.

I've read the entire review and there's "nothing" related to stuttering in a GTX 970 multi-GPU config.

My question is:

Does GTX 980 SLI present stutter?
Does GTX 970 SLI present stutter?

I don't know if some users just reported stutter in SLI setups because they got angry or whatever ...

The GTX 970 will probably have its cost reduced due to the memory issue, so I'm considering getting another one.

Can anyone help? This question is not clear in the review ...

February 6, 2015 | 11:20 AM - Posted by Abram730 (not verified)

Above both cards show stutter as they try to push over 4GB, but that is expected. The 980 actually has more severe stutters as it pushes over 4GB. The 970 has more of them.
Under 4GB there are no stutters, but they have the usual frame variance issues. The 970 has more variance, as cheaper cards tend to have. Less variance than AMD, I'll add.

Perhaps some imagined it, perhaps some are sock puppets, as a AMD sales pitch seems to follow. A lot of people got the 970 and a lot of people could be new to SLI. You need more CPU power for it as you need to double up the draw calls and double up transfers. It's very easy to start to have driver issues.
Windows isn't a realtime OS, so you get stutters from driver interrupts. There are a few tools to test for that. DPC latency checker and LatencyMon. There could even be issues with Nvidia's driver overhead. Those who were very insistent on having the cards, having issues and explaining it, did seem to have system stutter. That is the FPS wasn't dropping, but the game was stuttering. I had some check and they did have high latency.

I had a bitch of a time with that on this build. The Asus software and drivers were a no go. I needed to use the default windows stuff.
AI suite II was the worst offender. I couldn't even watch video or play audio with that installed.
This has been a growing issue with each new version of Windows. It's getting harder not to stutter. No BSODs now, but PCs can get Tourette's syndrome. This can put you into xperf hell as you dig through it.

February 3, 2015 | 01:44 AM - Posted by Greg (not verified)

This whole fiasco has really made me question the ethics of this site. I've used this site as a resource for years and never once questioned them. Now I realize that PCPER will defend Nvidia in order not to lose the close relationship they have with them. Pretty disappointed right now....

February 3, 2015 | 02:39 AM - Posted by Mark D (not verified)

Honestly, and this is coming from someone with a 970 that no longer wants it because of this....people are being totally unfair to PCPer with statements like this. I don't think they're defending NVIDIA or making excuses for them. They're presenting the data they independently collected, and calling it as they see it. If there's an issue here, its a small one and they're being realistic about it. If there was a bigger deal to be made they'd have made it, just like they did with the 840 EVOs.

They clearly have a friendly relationship with NVIDIA, but they do with AMD as well. As they should. I don't think it's their job to have a default antagonistic stance towards the industry and make a mountain out of a molehill. The 970 is still a great card, what this comes down to is whether or not this should have been marketed as a 4gb card vs a 3.5gb card. I personally think the latter but there's reasonable arguments to be made on either side. On the podcast they all agreed NVIDIA F-Ed up, they're diving deep to see how bad they F-Ed up...what more can you expect from them?

February 3, 2015 | 03:54 AM - Posted by Anonymous (not verified)

I somewhat agree with you, but that COD data looks absolutely atrocious. FPS is pretty good, in a playable range really, but those frametimes are just awful.

And? Hardly a negative thing to say about it. Again it is more excuses, not very impartial imo.

February 3, 2015 | 05:05 AM - Posted by Mark D (not verified)

Did you not read the closing comments? They're being pretty plain about there being an issue and they're directly attributing it to the VRAM. Even to the point where he's saying that the data doesn't even reflect the extent of the issues he was seeing in COD. That doesn't sound like making excuses to me, if anything it's the opposite. I mean the 970 is still a great card. It really is, I'm playing on one right now and it's every bit worth the money I paid for it, despite the fact that I'm not happy about the VRAM either and never would have bought it if I knew.

February 5, 2015 | 02:30 AM - Posted by ThorAxe

Of course they didn't read the closing comments Mark D. They hear what they want to hear.

Ryan could say that the GTX 970 was the worst card he had ever reviewed but the AMD Fanboys would still claim that he was being soft on Nvidia.

February 4, 2015 | 06:44 AM - Posted by The Stig (not verified)

I can't believe the writers of this site or their apologists. NVIDIA actually and intentionally fudged the card to make it slower so that they could sell 980s. Doesn't anyone see this? Does no-one understand oligopoly power? You can do it because there's nothing to compete. Simple. It's about money. Same as Coke and Coke Light. Why is Coke Light not cheaper than Coke when the largest cost (sugar) is left out? Because their marketing department convinced the public that the lack of sugar or substitute for it was worth the additional price. Result - profits soar.

When companies renege on ethics and are defended because "the product is good enough anyway" we lose the point. NVIDIA acted unethically - finished. They purposely dragged the performance down and sold it as a higher spec'd article. Did they lie completely? No, there is indeed 4GB on board, so no lie. But no mention of the split in performance between the 3.5GB and 512MB until some clever guys figured out what was happening.

Then the excuses and the tech jargon from NVIDIA and buddy-buddy with PCPerspective and other sites - because they have the money essentially to buy votes.

The 970 is a great card. No doubt. But it and NVIDIA are tainted now. This is a reputational issue - can we trust NVIDIA going forward. One thing NVIDIA has learnt is that the community aren't as stupid as thought. Scams will be found out and fingers pointed back and difficult questions asked.

I hope AMD makes a meal of this very stupid blunder NVIDIA made just to charge a higher price on the 980. If it hadn't purposely restricted the 970 you would have noticed no visible difference between the two, and gamers would have chosen the cheaper option on a dollar-for-performance basis.

February 4, 2015 | 01:23 PM - Posted by Mark D (not verified)

I dont understand what you're insinuating.

NVIDIA disabled failed blocks of a full 980 chip to make a 970. That's a normal and non-controversial practice across the chip industry, so that can't be it.

They developed a new process to salvage more of the chip than was possible with previous methods...so that's questionably good for users, depending on the impact and how it was done. Whether it's better to have a compromised 512MB partition, or just have a 3.5GB or 3GB card instead...that's the debate. Those are your choices. There's no physical way to make a 970 as it is with a full-speed 4GB. It was either have a compromised 4GB or less memory, period. Either way, everyone agrees they shouldn't have hidden this info.

But they did. They essentially inflated the specs of the 970, making it look closer to a 980 than it is. That's not going to sell more 980s, it'll sell more 970s.

So either you're saying something that doesn't make sense, or you're not understanding what's going on here.

February 5, 2015 | 01:55 AM - Posted by Greg (not verified)

Your Coke comparison is pretty garbage and takes away from your argument. Distribution is where the cost is, not the cost of raw materials.

February 6, 2015 | 11:22 AM - Posted by arbiter

Nvidia is tainted huh? AMD is not so squeaky clean in their history of things either. If you look back over last 3-4 years, AMD's history is pretty dirty. Prime example is their complete lack of trying to fix crossfire stuttering issue that people repeatedly complained about for years.

February 3, 2015 | 05:29 AM - Posted by Darth415 (not verified)

How so? My 329$ GTX 970 is 6.5 inches long, and can maintain a 1390 mhz oc all day at 75c with a quiet fan profile. At that point, I'm beating my friends much more expensive r9 290 and GTX 780 (bought just a few months ago). You would have to be an idiot to even try to hate in this thing. It's not like Nvidia fabricated some giant lie or something (like amd did when it posted official clocks for the blower cooler 290x). The small team who typed up the technical specs for the reviewers mistyped 2 small details, and no one noticed BECAUSE IT BARELY MATTERED. The benchmarks are perfectly accurate, and they show this card absolutely justifying it's cost all day. What's your problem?
Just because some issues arise in situations that are best described as stupid (it was well known at launch from benchmarks that if you thought 4gb cards this powerful were gonna be enough for 4k gaming, you were gonna have a bad time). As a 970 owner who sees vram usage above 3gb in a couple games with smooth framerate, I personally believe having 2 of these (which should mandate twice the vram to cover twice the power) is about as useful as running 4gb of system ram in a mainframe, and you are an idiot (or just have money to spend) if you are running two 970s in your system in the first place. My 1440p performance is excellent, though I plan on upgrading again in a few months when I make the jump to 4k since there really isn't a setup today aside from a couple 8gb 290x cards that can take the load.

February 3, 2015 | 07:22 AM - Posted by boot318

ROPs and cache were falsely advertised for five months. I doubt they only found out about that when things started hitting the fan. Selling the 970 with false specs was better than correcting it while they were selling 1+ million of them. Maybe only 2% of the people that have the card know about the problem. That was great for business. The 970 is the best card out there for the money (my opinion). I think people are questioning Nvidia's ethics more than the card.

February 4, 2015 | 03:03 AM - Posted by Anonymous (not verified)

Oh shut up already. You had your 15 mins and it's over. Less than 10k petition sigs out of over 1 mil cards sold- that means people are happy with the performance as is, and couldn't give a crap about .2 L2 cache and some goddam ROPS that are meaningless to everybody but weirdo geeks who want attention.

Nvidia will do fine. They will not be hurt by this one bit. When they release their new best ever card, the same idiots around on these forums for the past few weeks will be first in line to buy them. Let's be real.

February 6, 2015 | 11:29 AM - Posted by Abram730 (not verified)

I'd say that PCPer was tilting in the direction of you crazies. The data showed no issue with the VRAM. You are proved wrong over and over again. Your response is to simply make up another story and claim that people are conspiring against you. Reality is reality.

Please seek the help of a qualified mental health professional. They should be able to put you on an effective program of treatment.

February 3, 2015 | 04:53 AM - Posted by Lelek (not verified)

Typo on Closing Thoughts, "...differently. Yes the GTX 970 has fewer shader cores and runs at slightly lower clocks than the GTX 970, but..."

Nice work, keep it up, cheers.


February 3, 2015 | 05:14 AM - Posted by Anonymous (not verified)

"The 970 is still a great card, what this comes down to is whether or not this should have been marketed as a 4gb card vs a 3.5gb card."

You had me on your side until you said the above. 4GB on the 970 is not the same as 4GB on a 290/X. It just doesn't work that way. If this was simply a matter of density, we ALL still would have bought the card because the reviews didn't lie. What the reviews lacked were frame time tests. If they were used by more sites on all their titles, people would've been able to see how much difference the two memory pools affected performance a lot better.


February 3, 2015 | 05:28 AM - Posted by Anonymous (not verified)

Okay, well now I have to know how running CoD (and maybe other titles) with various AA modes affects frame times at high resolutions. Hehehe.

February 3, 2015 | 06:51 AM - Posted by obiwan-80 (not verified)

Test FAR CRY 4 and SHADOW OF MORDOR, please.

February 3, 2015 | 06:54 AM - Posted by Darryn Muller (not verified)

Ok, the only problem I have with this review is that only 2 games were looked at. We need to be looking at "next generation titles" like Shadow of Mordor, Dying Light, Metro Last Light or 2033 Redux - any of the games that are DirectX 11 capable and tax your system out of the box. That's where you will see even more issues, due to them pushing your system to the limit.

I own 2x GTX 970 G1 Gaming editions, and I have huge issues with the latest games and this memory issue. I have run many tests myself and, besides the terribly optimised games, most new games are causing issues in SLI on my system with stuttering and frame time variances, whereby it becomes extremely noticeable and irritating.

I am also running an Nvidia Surround setup of 4820 x 900. This compounds the problem and makes this issue more noticeable. My Battlefield runs great with everything maxed out and resolution scale at 130%. But every other next gen title struggles when it hits the 3.5 gig point or over.

Would it be at all possible to do more tests on some other titles? Would love to put this to bed once and for all.

Thanks for all the hard work by the way! Been waiting for a good SLI review on this issue since it broke.

February 3, 2015 | 07:36 AM - Posted by Searching4Sasquatch (not verified)

You're running a surround monitor setup (resolution of 4820x900) with GTX 970 SLI and wondering why the latest and greatest games don't run well?

Exactly what games and what settings?

I honestly feel like there are a ton of 970 owners who are now complaining because they don't have 980 performance after hearing about this "issue".

February 3, 2015 | 07:52 PM - Posted by Darryn Muller (not verified)

No, not at all. I had been experiencing these issues before it even broke about the vram issue. I kept wondering if it was just my SLI setup and poor profiles. I am running Shadow of Mordor with everything maxed. I used to have one gtx 970 and it would run the game flat out at around 30fps. But it felt like 30fps and was playable. I put in my second card and now it runs the game between 45 and 55fps, but it stutters to death and the game play feels like 15fps. Try to do a camera turn with my mouse 360 degrees and performance just tanks.

I do realize that Shadow of Mordor only just had an SLI profile added on this last update so i thought ok maybe its that.

So I tried Far Cry 4; it is a lot less apparent when it comes to stuttering, but I do get random frame time variances and it once again feels way slower than the 50-60fps I am receiving in game. I am running everything flat out with SMAA or FXAA.

I then got Dying Light. The same thing happened when running it maxed out. MSI Afterburner says 3.5 gigs of RAM and there is stuttering to all hell. I thought the new patch would help, and it did a bit, but it still doesn't feel like I am getting the fps Fraps says I am getting. Also get random slowdowns.

I am doing my best not to make this up in my mind, as I paid a lot of money here in South Africa to get these cards, and was almost broken when I heard about the VRAM issue, after having experienced these random problems before reading the full story.

Gtx 970 performance in all reviews are stellar! So i dont understand why any game that uses the "max" Vram of the card wouldn't feel like Sli gtx 970s.

Good example would be my sli Gtx 670 OC windoforce 2gig cards i used to have. Sure i ran out of Vram in a second but the performance i got out of them was rock solid! that's what made me go straight out and buy 2x gtx 970's.

I don't expect gtx 980 performance from these cards, because they are gtx 970's with different clock speeds and now Vram, however i do expect them to run without random anomalies in games that makes your actual game play feel far worse than your fps reader is telling you it should be.

February 6, 2015 | 11:25 AM - Posted by arbiter

Dying Light would be a terrible game to test on; it has some kind of bottleneck issue where it doesn't utilize SLI properly, so FPS is barely any better on 2 cards compared to 1. You can turn off most graphics options with minor fps impact, but if you mess with the draw distance slider it is death to fps, yet what you actually see is barely affected by it in most games.

February 6, 2015 | 11:42 AM - Posted by Abram730 (not verified)

Some of those games have been shown to have high draw calls. This is twice the problem for people with SLI/Xfire.
Any sort of driver problem will tend to show up as stutter without the FPS dropping. The game animation starts to stutter. Again, SLI can compound this.

I had this problem and Fuck Asus and their shitty drivers and software. I'm still mad and I still have some issues.

February 3, 2015 | 08:11 AM - Posted by Quubik (not verified)

Are you guys done with testing the 970 issue?
I feel the conclusion that problems only arise when effectively running at 6K is a bit dishonest, as neither Battlefield 4 nor CoD uses that much VRAM anyway compared to some other games. Shadow of Mordor might suffer from it already in some dual monitor/4K monitor cases, so I would love to see testing with more games that are demanding on VRAM.

February 3, 2015 | 09:11 AM - Posted by MONKEYpatch (not verified)

to remove stutter by g-sync!

February 3, 2015 | 09:33 AM - Posted by Anonymous (not verified)

sony vgp ac10v10 ac adaptor
lenovo ideapad miix 10 charger

February 3, 2015 | 12:09 PM - Posted by Carlo (not verified)

Really nice articles Ryan (all 4 parts), thanks.
I think that it could be really interesting to test the GTX 970 without(!) the 0.5GB RAM portion: to test the effective contribution of this "near line" cache between the video memory and the system (PCIe) memory, and to verify whether the constant struggle to keep memory under the 3.5GB limit and the heuristic shuffling of data between the two segments have an impact on frame rate consistency.

Just a thought.

February 3, 2015 | 02:15 PM - Posted by AnthonyShea

I find it hilarious that this was some grand conspiracy on nvidia's part. Not that the advertising dept screwed the pooch, besides as how the hardware and bridges are setup it is possible for it to use that extra .5GB everyone is throwing a tantrum over. knowing nvidia they will push out a driver update that better manages that .5GB

okay it doesn't like using the extra .5GB, it might 'feel' that using less and keeping at the higher bus spd returns a better result. they've even stated that the card will attempt to avoid the .5GB sector.

The other thing I have seen is the 'oh, AMD has 8GB' argument. Well, why do they need so much memory? Even running a title like Skyrim all textured out, you're only pushing around 4GB. So that makes me think they actually have problems handling frame buffers. Having an extra 4GB of memory isn't going to radically increase the power needs of a card; that's just poor engineering.

And for the other AMD argument of that their cards don't only come with twice the memory but cost less. well they cost less to purchase but cost you significantly more at the outlet and cooling your apartment or house.

For those who are going to accuse me of just being team green, I have used ATI cards as recently as HD7850 which was a XFX Black edition and another one made by Gigabyte which might be a HD4850.

February 3, 2015 | 02:23 PM - Posted by nevzim (not verified)

There are cases already where more than 4GB is beneficial: http://pclab.pl/zdjecia/artykuly/chaostheory/2015/01/970_bug/charts/mear...

February 24, 2015 | 03:06 AM - Posted by Anonymous (not verified)

It's not poor engineering. It is marketing and an attempt to move an older GPU before new products are available. There is a benefit to 8GB in current games, but only at 8K resolution. Even at 4K the extra RAM is not necessary.

February 3, 2015 | 03:05 PM - Posted by Blake mcBlake (not verified)

Advanced warfare with my dual 970's on my 1440p is a laggy stuttery mess and this is the reason?

February 6, 2015 | 11:49 AM - Posted by Abram730 (not verified)

The tests show that it isn't an issue. There could be driver issues. What CPU are you using as your CPU needs to send 2X the commands and high numbers of drawcalls can kill you. Testing is often done on very high end rigs. So issues on a more normal gamer setup is missed.

I'd run DPC latency checker in windowed mode or run LatencyMon when it is happening. See if you have any latency spikes in windows. This will cause the game to stutter.

February 3, 2015 | 03:28 PM - Posted by Anonymous (not verified)

I think way too much time is being absorbed into this subject. Any new articles outside of the ring being worked on? I'm bored.

February 3, 2015 | 04:28 PM - Posted by Anonymous (not verified)

There is a reason so much time needs to be poured into this, you ignorant fools. There are people out there who bought 2 GTX 970s, including me, and ever since have had issues with games that use a lot of VRAM: 3.5GB or more at 2560x1440 with max settings and some AA.

Shadow of Mordor, Far Cry 4, Lords of the Fallen, Dying Light to name a few!!!

I'm forced to run a single GTX 970 to be able to play them smoothly, FFS.

For a whole month I thought it was the games or SLI profiles that were shit, tweaking the game configs and using tools made for those specific games to make them run smoother, SLI bits tweaks, NOTHING helped. Then this shit came up that the memory was a split partition with a slower portion! Yeah, that explained a ton, and now tests have emerged proving this, not only on this site but on other European sites (Nordic ones, German ones). We in Europe have far better tech sites digging deeper into this, and they already had all this last week for GTX 970 SLI configs and massive frametime variances.

February 3, 2015 | 04:35 PM - Posted by Anonymous (not verified)

Thanks ... That's what I wanted ...

So, THERE is a REAL problem in SLI when the game requests more than 3.5GB!

So ... Nvidia REALLY should start the refund process, because the card IS NOT EVEN ABLE to perform as its stated specs say it should in ... SLI.

Since you can play with a single one, that means that with two, we have a problem.

Sad ... I was looking to grab another one ... Now I won't ... I can't afford a 980 ...

February 11, 2015 | 12:41 PM - Posted by ProMace (not verified)

Nowhere is there any mention of 'not able to perform SLI'. I think you should put things more into perspective. If anything, the issue is much less significant than many are portraying it to be.

February 3, 2015 | 04:59 PM - Posted by Anonymous (not verified)

No disrespect PCPER. I still like what you do on here and like the streams and podcasts. Keep up the good work!

February 3, 2015 | 04:32 PM - Posted by Anonymous (not verified)

I have the same doubt as another user stated.

We all know that the GTX 970 has a second memory segment, but what we DON'T know, and what NO ONE has stated for us till now, is:

How much "performance" do we lose when the card uses this second segment?

That's the point no one is seeing.

We all know that the 512MB memory segment is slower than the first one, but how much "FPS" do we lose because of it???

THAT'S the concern we should worry about (just talking about card performance, not the nVidia lies).

So, DO we or DON'T we have stutter when the card uses more than 3.5GB of VRAM in SLI?

February 3, 2015 | 04:56 PM - Posted by Anonymous (not verified)

In SLI at 3.5GB and over you get a bit of stutter and hitching and it doesn't feel smooth, from my experience with those games mentioned above at 2560x1440 maxed out. Framerates aren't a concern for me, and for most people I believe, but the overall experience of smoothness isn't there!
I saw it immediately going back to one GTX 970: I made sure not to get close to 3.5GB of VRAM usage in those games and the experience is smooth all around.

February 3, 2015 | 05:15 PM - Posted by Blake mcBlake (not verified)

970 SLI @ 3440x1440: Far Cry 4 remains unplayable. Even with fps at 60+ most of the time, the stutter is unbearable.
970 SLI @ 2560x1440 with G-Sync: Advanced Warfare stutters but is playable.

The above mentioned games are installed on an 840 evo lol. Coincidence??

I have no problems with shadow of mordor however and dying light is fine without Nvidia dof on. Both maxed at 3440x1440.

February 6, 2015 | 12:17 PM - Posted by Abram730 (not verified)

Search "840 evo stutter".

February 3, 2015 | 07:00 PM - Posted by Johnny Rook (not verified)

So, the GTX 970 SLi performs worse than GTX 980 SLi and stutters more. Brilliant conclusion, Ryan.

What you failed to do was to give the GTX 970 SLi owners (and to the people that already have one GTX 970 and are considering to get a 2nd card), a comparison between the other card(s) in the same price point and with similar performance.

I already knew a GTX 980 SLi would run my games better than a GTX 970 SLi. However, what I still don't know is if returning my GTX 970s to get a R9 290(X) Crossfire setup will be better for my gaming experience as far as frame-pacing is concerned.
I learnt nothing useful from these tests. I still don't know if the stutter is due to a weaker chip or due to the memory system. I need to see more "weaker" chips tested in the exact same environmental conditions to make a more informed decision.

I have to say I am very disappointed. I really thought you (PCPer) were taking that extra step to inform your readers about their options.

Thanks, for nothing.

February 4, 2015 | 07:35 AM - Posted by Quubik (not verified)

Man, if you have zero analytical skills and can't draw your own conclusion from this data, don't blame Ryan.
How can the stutter be from a weaker chip when the stutter starts right after passing 3.5GB of VRAM usage? If it was the chip it would be way more gradual.
Use your head, man!

February 6, 2015 | 12:31 PM - Posted by Abram730 (not verified)

The stutter is above 4GB and the 980 has it too. This shows that the frame variance is not overly linked to memory use, but rather load.

February 3, 2015 | 08:19 PM - Posted by Mac (not verified)

Nvidia guidelines for testing SLI state that the test should be done in isolation? Looks like it. Considering the y-axis is in 20ms steps, that is some ugly-ass performance. XDMA Xfire would put that to shame and completely own it.

February 3, 2015 | 11:51 PM - Posted by Anonymous (not verified)

So 5 months on, and you finally get around to testing SLI with FCAT. Do you wonder why everybody considers you an NV PR mouthpiece and shill? What took so long? We all know there is no hesitation to get AMD cards' frame latency tested ASAP. What was the motivation for omitting SLI frame latency tests for the latest NV products? What were NV's orders, and why should we consider you an unbiased and neutral computer hardware review site with consumer interest in mind?

February 4, 2015 | 12:59 AM - Posted by ThorAxe

Why do you bother coming here if you hate it so much?

February 4, 2015 | 02:49 AM - Posted by Anonymous (not verified)

To piss you off.

February 12, 2015 | 08:01 PM - Posted by ProMace (not verified)

Now you are what we would call an 'azijnzeiker'. I won't even bother translating that for you.

February 4, 2015 | 05:28 AM - Posted by Johnny Rook (not verified)

Did you even notice there's absolutely NOT a single AMD CrossFire frame-pacing benchmark of the same games (BF4 and CoD: AW), conducted under the same conditions (resolutions and settings) as the nVIDIA cards were tested, presented in this article - or anywhere on PCPer's website?

I wouldn't mind seeing AMD CrossFire frame-pacing results validating the current assumption that the GTX 970's stuttering results more from the partitioned memory system than from having fewer CUDA cores than the GTX 980.

OR do I put on my "conspiracy hat" and conclude Ryan already knows that isn't the case; that Ryan already knows AMD CF stutters as much as GTX 970 SLI does under the same exotic conditions, and is only protecting AMD (and AMD's cards) from being humiliated?

See how I could conclude Ryan's an AMD "PR mouthpiece and shill" just as easily as you did?

I wish I knew what results an AMD R9 290(X) CrossFire setup produces under the same conditions, but I think I'll never know - at least not from PCPer :(

February 6, 2015 | 11:33 AM - Posted by arbiter

Because THIS IS NOT A STORY ABOUT AMD. That is why there are no AMD results in it. There are plenty of tests using 970 SLI vs. 290X CF on PCPer and all over the damn net, if you would learn how to use a damn search engine.

February 6, 2015 | 12:54 PM - Posted by Abram730 (not verified)

AMD has worse frame pacing, and that is why the AMD "fans" who are pushing this don't want that information out. They are demanding slanted and deceptive reporting.
Fair and balanced reporting is offensive to them. They demand that the press reflect their unreality, their UFO tinfoil-hat conspiracy thinking.

The results show them to be wrong over and over. They demand more tests until one shows something that fits their story. Since none do, they claim conspiracy and make up their own false evidence, like the UFO people photoshopping UFOs into pictures.

AMD fans are certifiably crazy, as is AMD PR. They literally put out videos with Nvidia breaking into people's houses and trying to smash their AMD cards. They make false claims against devs who work with Nvidia and point to fake screenshots as evidence for their unfounded claims. They try to force devs out of business unless they join the Gaming Evolved program. Then they try to put malicious code into games to harm Nvidia performance, and claim Nvidia is the one doing that when you don't see bias in the benchmarks.

You do, however, see bias in AMD games. I've preordered games where 7850s were running much faster than GTX 680s.

February 7, 2015 | 02:03 AM - Posted by Anonymous (not verified)

Hey, at least then we would know whether it was time to go pick up an AMD card or not. I'd also like to see GTX 960 data up here, to round things out with other architectures.

February 4, 2015 | 07:08 AM - Posted by Anonymous (not verified)

Those people who have bought two 970s should have one refunded, those who have bought three 970s should have two refunded, and those like me who have bought ONE should keep it. It is a very good card, and the hard-line gamers are never satisfied, never!

February 4, 2015 | 09:49 AM - Posted by Mac (not verified)

We're often treated to FRAPS FPS vs. observed FPS when multi-GPU is tested; I don't see it in this mini review. Maxwell SLI is BORKED and has been since release, dropping frames and producing runts by the bucketload. This is why some people come into the chat screaming "shill" and the like: it looks to me like PCPer is tip-toeing around the issue, yet they went full bore on the AMD frame-pacing issues. I repeat, Maxwell SLI is broken, possibly a hardware issue, and that is why FCAT testing has fallen out of favour.

February 4, 2015 | 03:36 PM - Posted by Anonymous (not verified)

Where is FCAT? This site promoted and rammed FCAT down everyone's throat for the past year or so, when Nvidia was slightly ahead of AMD in frame times. But now that that has changed, PCPer refuses to use FCAT because it is forbidden by Nvidia.

Any site that doesn't show FCAT is clearly under Nvidia PR funding or just biased, and should not be taken seriously.

P.S. If I remember correctly, PCPer and TechReport worked alongside Nvidia to create FCAT to test AMD cards. So why are things different now for Nvidia?

February 5, 2015 | 01:29 AM - Posted by ThorAxe

You are either a troll or not too bright.

They have done the testing you asked for; you are just too stupid to realise it.

February 4, 2015 | 03:46 PM - Posted by Anonymous (not verified)

lol, PCPer just lost the last little bit of credibility it had left by not testing the AMD R9 290X/290/295X, and by refusing to use FCAT on Nvidia as it did for AMD.

February 5, 2015 | 02:22 AM - Posted by Allyn Malventano

Funny that you posted this comment on a review that is full of FCAT data for Nvidia cards.

February 4, 2015 | 09:32 PM - Posted by JohnB (not verified)

PcPer = nVidia damage control squad.

February 5, 2015 | 02:20 AM - Posted by Allyn Malventano

If you guys would get off of your fanboy horse for more than 5 seconds, you might realize that Ryan *agrees* that there is a VRAM-related issue on the 970s.

February 5, 2015 | 02:26 AM - Posted by ThorAxe

Allyn that would require them to use their brains, which they clearly don't have.

February 6, 2015 | 11:35 AM - Posted by arbiter

It would require them to take their AMD blinders off as well.

February 4, 2015 | 11:08 PM - Posted by Anonymous (not verified)

To go along with the pizza analogy: I would like to say that it doesn't matter if 7 slices is enough. The point is, I paid for 8, so give me 8, and whether or not I need the 8th is my business.

February 5, 2015 | 12:03 AM - Posted by Johnny Rook (not verified)

What a bunch of patented trolls!

"FCAT", "frame-pacing" benchmarks, "frametime variance" benchmarks ALL are the same and do the exact same damn thing: identifies dropped frames, runt frames, micro-stuttering, and other problems that reduce the visible smoothness of the action on-screen; they analyse the perceived gaming experience the user gets on-screen; what the player's actually seeing, as far as frametimes are concerned.
Refer to: http://www.geforce.com/hardware/technology/fcat/technology
Refer to: http://www.anandtech.com/show/7195/amd-frame-pacing-explorer-cat138/2
Refer to: http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-...

AND STFU!!!
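
To spell out what those tools actually report, here is a minimal sketch (hypothetical input data; the runt cutoff is a configurable convention of roughly 20 scanlines, not an official constant) of how per-frame scanline counts from an FCAT-style overlay capture get classified, and how frame-to-frame variance gets summarized.

    def classify_frames(scanlines_per_frame, runt_threshold=21):
        # 0 scanlines: dropped (the frame never reached the display).
        # Fewer than the threshold: runt (a sliver too small to add smoothness).
        dropped = sum(1 for s in scanlines_per_frame if s == 0)
        runts = sum(1 for s in scanlines_per_frame if 0 < s < runt_threshold)
        full = len(scanlines_per_frame) - dropped - runts
        return dropped, runts, full

    def frame_to_frame_variance(frametimes_ms):
        # Big swings between consecutive frame times are what reads as stutter.
        return [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]

    # Hypothetical capture data.
    scanlines = [1080, 540, 0, 15, 620, 445]     # scanlines each frame occupied
    frametimes = [16.7, 16.9, 33.5, 17.0, 48.2]  # ms between displayed frames

    print(classify_frames(scanlines))                # (dropped, runts, full) -> (1, 1, 4)
    print(max(frame_to_frame_variance(frametimes)))  # worst consecutive swing, in ms

Whatever name a site gives it, that is the analysis all three of those terms point at.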

February 5, 2015 | 11:12 AM - Posted by JMO (not verified)

Litigate, litigate, litigate:

http://bursor.com/investigations/nvidia/

February 6, 2015 | 11:36 AM - Posted by arbiter

It's a class-action lawsuit; enjoy the $10 you are going to win, if you are lucky, while the lawyers take the rest. Suckers.

February 5, 2015 | 08:04 PM - Posted by Anonymous (not verified)

Nobody cares about the .5... just the basement lemmings, busy jumping on the bandwagon.

99.9% will not run into an issue and will be upgrading in 1-2 years anyway.

Arguing over this topic is a waste of time and makes most of you look like pretentious children looking to argue.

Go play a game! It's not like you're having any issues, if you even have this card in the first place.

February 7, 2015 | 11:38 PM - Posted by Anonymous (not verified)

I'm running two G1 970s on my old SB 2600K rig. I'd intended to upgrade my system to Haswell-E/DDR4 this winter after BIOSes matured and RAM prices fell, but when one of my 570s in SLI died, I replaced it with my first 970 in November, based completely on published reviews, thinking, perhaps, it could be incorporated into my new build. Singly, the 970 bested my 570s together in my usual benchmarks, and it overclocked well. My wife then encouraged me to get another for Christmas, so I did, intending to bench and OC the cards, working out the bugs, then install them in my new system later this year. While I have no particular gripe with their performance in my SB machine with my older games, I have a huge gripe with Nvidia for their screw-up.
I'd planned to build my new rig to last at least 3 years, and put the money into these two GPUs with that in mind (over $800 w/ tax for a pair of G1s at Microcenter). Well, now that investment seems wasted. Had I kept just one 970 in this obsolescent machine to eke out another 6 months till the Broadwell Ks arrived, that would have been an expensive, but tolerable, investment, and would have kept this SB system current enough to give away. Now, however, having invested so much of my allotted cash into the video component, I'll have to delay my new rig to save more money, or build it with GPUs lacking the longevity of the rest of its components. This is very unsatisfying. Instead of a near-peak rig that will take me nearly to the end of the decade, I'll probably install gimped GPUs that will need to be replaced during its lifetime and send my old SB machine to the graveyard...
I don't blame Microcenter, Gigabyte, or the reviewers; I place the blame squarely on Nvidia for putting profit before performance (promising the latter while reaping the former) and on myself for thinking that this performance/price ratio was a little too good to be true.

February 8, 2015 | 07:34 AM - Posted by TD42 (not verified)

Nvidia's success in putting so much performance into such quiet, efficient cards with much less memory bandwidth, etc., is well known.

One question comes to me every time this is mentioned in the latest Maxwell review: what could Nvidia achieve if they took the shackles off, upped the memory to 6 or 8GB, and increased the memory interface from 265 to 384 or 512-bit, using the hardware/firmware advances that made the 970/980 so fast and efficient?

I suspect 2015 will be VERY interesting. I was ready to buy a 980 until I pondered this and the probable 4K single-card performance.

Bring on the GTX 990!
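
For a rough sense of what those bus widths would mean: peak memory bandwidth is just the per-pin data rate times the bus width divided by 8. A back-of-the-envelope sketch, assuming the GTX 980's 7 Gbps effective GDDR5 data rate were kept:

    def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps=7.0):
        # GB/s = data rate per pin (Gb/s) * number of pins / 8 bits per byte
        return data_rate_gbps * bus_width_bits / 8

    for width in (256, 384, 512):
        print("%d-bit bus: %.0f GB/s" % (width, peak_bandwidth_gbs(width)))
    # 256-bit -> 224 GB/s (the GTX 980's rated figure)
    # 384-bit -> 336 GB/s, 512-bit -> 448 GB/s

Of course, a wider GDDR5 bus costs die area and power, which is part of why Maxwell leaned on better memory compression instead.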

February 8, 2015 | 07:35 AM - Posted by TD42 (not verified)

Typo of course, 256 not 265.

February 23, 2015 | 11:35 PM - Posted by Anonymous (not verified)

Performance being the same as it was from the beginning is irrelevant. Performance not suffering except in extremely rare circumstances is also irrelevant. Lying is lying. Nvidia lied. Even worse, they knew the specs were a lie and chose to say or do nothing about it until they were forced to admit it. So, again, lying is lying. And it is wrong. If you excuse or try to diminish that fact, then don't be surprised, and don't complain in the future, when you are being lied to more and more frequently.

March 25, 2015 | 07:26 PM - Posted by Anonymous (not verified)

I changed my pair of GTX 970s used for SLI.

The reason? They STUTTER LIKE HELL. There is NO WAY to play anything that makes use (or tries to) of the slower 512MB segment.

A single GPU does use above 3.5GB of VRAM (in a different way, but it uses it).

SLI CAN'T go past the 3.5GB mark except by using DSR or features like it.

So, this review is CRAP; you just say that GTX 970 SLI performs worse than GTX 980 SLI, but you DON'T mention the stutter. Why? You can't lose your nVidia contracts?

I doubt you tested for hours like you said.

A single run of Shadow of Mordor on Ultra settings will BREAK the GTX 970 SLI.

So the real information is: GTX 970 SLI IS BROKEN ABOVE 3.5GB of VRAM usage.

I upgraded my pair to two GTX 980s, and guess what!?!? NO MORE STUTTER in SLI AT ALL.

Magic?

I was expecting an HONEST review... not a paid one trying to "hide" the real deal with these BROKEN cards in SLI. SAD.

This user did some cool tests (useful ones) reporting the same experience I HAD:

http://www.reddit.com/r/pcmasterrace/comments/2tuqd4/i_benchmarked_gtx_9...

No more crap from nVidia and their "paid" reviewers.

Sorry for my English; it's not my native language.

April 8, 2015 | 02:39 PM - Posted by Anonymous (not verified)

https://forums.geforce.com/default/topic/820753/geforce-900-series/gtx-9...

April 13, 2015 | 01:56 PM - Posted by Ninjawithagun

@ Ryan Shrout - another excellent article. Thank you for the great breakdown of this controversial performance issue. IMHO, the arrival of the GTX 980 Ti will help settle my particular decision about which video card to buy. My bet is to buy GTX 980s once they drop in price after the GTX 980 Ti is released. I can already see all those like-new used GTX 980s on sale in the Amazon marketplace. For me, it's a win-win situation and I don't have to pay the full "drive off the lot" price. I sit here patiently waiting for that day :D

May 31, 2015 | 02:34 PM - Posted by Blair (not verified)

Wow, the 970 and 980 are very close in SLI! I would say forget about SLI for the 980! Regardless of Nvidia being scumbags about the VRAM, the 970 is still a very good performing GPU compared to its bigger brother the 980, and the price still makes it the high-end GPU of choice. I think I might SLI mine now.

May 31, 2015 | 02:38 PM - Posted by BossTek (not verified)

Eh yo yo! Yeah Yo! Like yo and eh I uh ahh izza you know what I am sayin yo? yeha! So like yo I am like a badazz gamer yo! Yeah! I got everything! I already got a classified 980 Ti yo because I like party with like the owner of Nvidia yo! Yeah! And yo I gots like 20 for free yo! Yeah!

November 9, 2015 | 05:46 AM - Posted by Blair (not verified)

So in other words, a single 970 is fine, but it's a waste of money to SLI them. My last Nvidia card (a 650 Ti) had no SLI option, so it kinda pisses me off that I pay out the wazoo for a 970 and still have no sensible SLI option. I should have stayed with my R9 270 until the price of the R9 390s went down!
