
The NVIDIA GeForce GTX 960 2GB Review - GM206 at $199

Author:
Manufacturer: NVIDIA

A new GPU, a familiar problem

Editor's Note: Don't forget to join us today for a live streaming event featuring Ryan Shrout and NVIDIA's Tom Petersen to discuss the new GeForce GTX 960. It will be live at 1pm ET / 10am PT and will include ten (10!) GTX 960 prizes for participants! You can find it all at http://www.pcper.com/live

There are no secrets anymore. Calling today's release of the NVIDIA GeForce GTX 960 a surprise would be like calling another Avengers movie unexpected. If you didn't simply assume it was coming, chances are the dozens of leaked slides and performance figures got your attention. So here it is, today's the day: NVIDIA finally upgrades the mainstream segment that has been served by the GTX 760 for more than a year and a half. But does the brand new, Maxwell-based GTX 960 move the needle?


But as you'll soon see, the GeForce GTX 960 is a bit of an odd duck among new GPU releases. As we have seen several times in the last year or two of a stagnant process technology landscape, new cards aren't going to be wildly faster than the current cards from either NVIDIA or AMD. In fact, there are some interesting comparisons to make that may surprise fans of both parties.

The good news is that Maxwell and the GM206 GPU will price out starting at $199, including overclocked models at that level. But to understand what makes it different from the GM204 part, we first need to dive a bit into the GM206 GPU and how it matches up with NVIDIA's "small" GPU strategy of the past few years.

The GM206 GPU - Generational Complexity

First and foremost, the GTX 960 is based on the exact same Maxwell architecture as the GTX 970 and GTX 980. The power efficiency, the improved memory bus compression and the new features all make their way into this smaller version of Maxwell selling for $199 as of today. If you missed the discussion of those new features, including MFAA, Dynamic Super Resolution and VXGI, you should read that page of our original GTX 980 and GTX 970 story from last September for a bit of context; these are important aspects of Maxwell and the new GM206.


NVIDIA's GM206 is essentially half of the full GM204 GPU found on the GTX 980. That gives it 1024 CUDA cores, 64 texture units and 32 ROPs, a 128-bit memory bus and 2GB of graphics memory. The result is half of the memory bandwidth at 112 GB/s and half of the peak compute capability at 2.30 TFLOPS.
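
Those halved numbers fall straight out of the configuration. Peak FP32 compute is 2 FLOPs per CUDA core per clock (a fused multiply-add counts as two operations), and bandwidth is the effective memory data rate times the bus width. A quick sanity check in Python, using the rated figures from the spec table:

```python
# Back-of-the-envelope check of the GM206 numbers quoted above.
# Peak FP32 compute: 2 FLOPs per CUDA core per clock (FMA).
cores = 1024
rated_clock_ghz = 1.126          # 1126 MHz rated clock
peak_tflops = 2 * cores * rated_clock_ghz / 1000.0
print(f"Peak compute: {peak_tflops:.2f} TFLOPS")   # close to the quoted 2.30

# Memory bandwidth: effective GDDR5 data rate (7 Gbps) times bus width.
bus_width_bits = 128
data_rate_gbps = 7.0             # 7000 MHz effective memory clock
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"Memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 112 GB/s
```

Run the same arithmetic against the GM204 columns of the table and the "half of a GTX 980" framing holds almost exactly.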


Those are significant specification cuts and will result in essentially half the gaming performance of the GTX 980. Some readers and PC enthusiasts will immediately recognize the GTX 960 as a bigger drop from the flagship part than in recent generations of NVIDIA graphics cards. You're not wrong.

  GTX 960 GTX 970 GTX 980 GTX 760 GTX 770 GTX 780 GTX 660 GTX 670 GTX 680
GPU GM206 GM204 GM204 GK104 GK104 GK110 GK106 GK104 GK104
GPU Cores 1024 1664 2048 1152 1536 2304 960 1344 1536
Rated Clock 1126 MHz 1050 MHz 1126 MHz 980 MHz 1046 MHz 863 MHz 980 MHz 915 MHz 1006 MHz
Texture Units 64 104 128 96 128 192 80 112 128
ROP Units 32 64 64 32 32 48 24 32 32
Memory 2GB 4GB 4GB 2GB 2GB 3GB 2GB 2GB 2GB
Memory Clock 7000 MHz 7000 MHz 7000 MHz 6000 MHz 7000 MHz 6000 MHz 6000 MHz 6000 MHz 6000 MHz
Memory Interface 128-bit 256-bit 256-bit 256-bit 256-bit 384-bit 192-bit 256-bit 256-bit
Memory Bandwidth 112 GB/s 224 GB/s 224 GB/s 192 GB/s 224 GB/s 288 GB/s 144 GB/s 192 GB/s 192 GB/s
TDP 120 watts 145 watts 165 watts 170 watts 230 watts 250 watts 140 watts 170 watts 195 watts
Peak Compute 2.30 TFLOPS 3.49 TFLOPS 4.61 TFLOPS 2.25 TFLOPS 3.21 TFLOPS 3.97 TFLOPS 1.81 TFLOPS 2.46 TFLOPS 3.09 TFLOPS
Transistor Count 2.94B 5.2B 5.2B 3.54B 3.54B 7.08B 2.54B 3.54B 3.54B
Process Tech 28nm 28nm 28nm 28nm 28nm 28nm 28nm 28nm 28nm
MSRP $199 $329 $549 $249 $399 $649 $230 $399 $499

This table compares the last three generations of NVIDIA's GeForce x80, x70 and x60 products. Take a look at the GTX 680, based on the GK104 GPU, and the GTX 660, based on GK106; the mainstream card has 62.5% of the CUDA cores and 75% of the memory bus width. The GTX 760 is actually based on the same GK104 GPU as the GTX 680 and GTX 770, and keeps the wider 256-bit memory bus, though it carries half the CUDA cores of the GK110-based GTX 780.

It's complicated (trust me, I know), but NVIDIA definitely wants to get back to smaller GPU dies for its lower-priced parts. Way back in 2012 NVIDIA released the GTX 660 with a 2.54 billion transistor die on the 28nm process, a card that stayed performance-competitive into the 700-series era. The GTX 760 jumped up to a 3.54 billion transistor die and raised the launch price to $250. Today's GTX 960 is back down to 2.94 billion transistors, near that of the GTX 660, but with a lower starting price of $199.

Power use on the GTX 960 is amazingly low, with a rated TDP of 120 watts, and in our testing the GPU almost never approaches even that level. In fact, when playing a game like DOTA 2 with V-Sync enabled (60 FPS cap), the card barely draws more than 35 watts! (More details on that on the power page.)


In the press documentation from NVIDIA, the company makes several attempts to put a better spin on the specifications surrounding the GeForce GTX 960. For the first time, NVIDIA cites an "effective memory clock" rate, justified by Maxwell's improved memory compression over Kepler. While the improvement is real, it has been true between generations for years and is part of the reason analysis of GPUs like ours continues to exist. Creating metrics to selectively improve line items is a bad move, and I expressed as much during our early meetings.
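
The arithmetic behind an "effective" rate is simple: if compression avoids some fraction of memory traffic, the physical bus behaves as if it were proportionally faster. The sketch below uses a hypothetical 25% savings figure purely for illustration; it is not NVIDIA's claimed ratio.

```python
# Sketch of the "effective bandwidth" framing, with an assumed
# compression ratio (NOT a figure from NVIDIA's press material).
physical_gb_s = 112.0          # GTX 960's raw memory bandwidth
compression_savings = 0.25     # hypothetical fraction of traffic avoided
effective_gb_s = physical_gb_s / (1 - compression_savings)
print(f"{effective_gb_s:.1f} GB/s effective")  # 149.3 GB/s effective
```

The problem, of course, is that the same logic applies to every generation that improved its compression, which is why quoting only the "effective" number flatters the spec sheet.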


Separately, NVIDIA continues to emphasize MFAA performance numbers. Remember that multi-frame sampled anti-aliasing (MFAA) launched with the GTX 980 and GTX 970; it uses a post-processing filter to temporally combine multiple frames, each rendered at 2xMSAA quality with shifted sample points. The result is a 4xMSAA look at 2xMSAA performance, at least in theory. When the GTX 980 and GTX 970 launched, game support was incredibly limited, making the feature less than exciting. With this new driver, Maxwell GPUs can enable MFAA in all DirectX 10 and 11 games that support MSAA, excluding only Dead Rising 3, Dragon Age 2 and Max Payne 3. That covers most games in our test suite, including Crysis 3, Battlefield 4 and Skyrim; other games like Metro: Last Light or BioShock Infinite use internal AA methods rather than driver-based MSAA and thus are unable to utilize MFAA.
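
As a rough mental model only - not NVIDIA's actual driver implementation - MFAA amounts to averaging two consecutive frames rendered with 2xMSAA whose sample positions are offset, yielding four effective sample positions per pixel:

```python
# Toy illustration of temporal sample combining, in the spirit of MFAA.
# Each "frame" holds per-pixel coverage samples taken with 2 sample
# points, but the two frames use shifted (A/B) sample patterns.
# Conceptual sketch only; the real filter lives in driver/hardware.

def resolve_2x(samples):
    """Resolve one pixel from its two MSAA coverage samples."""
    return sum(samples) / len(samples)

def mfaa_resolve(frame_a, frame_b):
    """Blend two 2x-sampled frames with offset patterns, giving
    four effective sample positions per pixel (~4x quality)."""
    return [(resolve_2x(a) + resolve_2x(b)) / 2
            for a, b in zip(frame_a, frame_b)]

# One pixel on a triangle edge: pattern A covers 1 of 2 samples,
# pattern B covers 2 of 2, so blended coverage is 0.75 - the same
# answer a true 4-sample pattern covering 3 of 4 points would give.
pixels = mfaa_resolve([[1.0, 0.0]], [[1.0, 1.0]])
print(pixels)  # [0.75]
```

The catch, as with any temporal method, is that the two frames must be similar enough for the blend to look like extra samples rather than ghosting.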

When NVIDIA presents performance numbers using MFAA, it is comparing 2xMSAA plus MFAA (essentially 4xMSAA quality) against 4xMSAA on other cards. To its credit, NVIDIA says it only makes this comparison against previous NVIDIA hardware, not against AMD's competing hardware. My thoughts on this are mixed at this point, as it will no doubt start a race for both parties to fully integrate and showcase custom, proprietary AA methods exclusively going forward. See my page on MFAA performance later in the review for more details.

There are interesting comparisons to be made between the new GTX 960 and the currently shipping competing parts from AMD. Some of the specification differences will be claimed as important advantages for the Radeon lineup. Obviously our performance evaluation will be the final deciding factor, but is there anything to these claims?

  GTX 960 GTX 760 R9 285 R9 280
GPU GM206 GK104 Tonga Tahiti
GPU Cores 1024 1152 1792 1792
Rated Clock 1126 MHz 980 MHz 918 MHz 827 MHz
Texture Units 64 96 112 112
ROP Units 32 32 32 32
Memory 2GB 2GB 2GB 3GB
Memory Clock 7000 MHz 6000 MHz 5500 MHz 5000 MHz
Memory Interface 128-bit 256-bit 256-bit 384-bit
Memory Bandwidth 112 GB/s 192 GB/s 176 GB/s 240 GB/s
TDP 120 watts 170 watts 190 watts 250 watts
Peak Compute 2.30 TFLOPS 2.25 TFLOPS 3.29 TFLOPS 3.34 TFLOPS
Transistor Count 2.94B 3.54B 5.0B 4.3B
Process Tech 28nm 28nm 28nm 28nm
MSRP $199 $249 $249 $249

While comparing GPU core counts across architectures is useless, many of the other data points can be debated. The most prominent difference is the 128-bit memory bus that GM206 employs, compared to the R9 285's 256-bit memory bus or even the R9 280's massive 384-bit memory bus. Raw memory bandwidth is the net result: the GTX 960 sports only 112 GB/s while the R9 280 tosses out 240 GB/s, more than twice the value. The wide bus allows the R9 280 to carry a 3GB frame buffer, but it also puts it at a disadvantage in TDP and transistor count / die size. An additional 1.4 billion transistors and 130 watts of thermal headroom are substantial. The Tonga GPU in the R9 285 has more than 2.0 billion additional transistors compared to the GTX 960 - what they all do, though, is still up for debate.

There is no denying that, from a technological point of view, a wider memory bus and higher memory bandwidth are good things for performance. But they come at a cost - both in design complexity and in AMD's wallet. Can NVIDIA really build a GPU that is substantially smaller yet equally powerful?


January 22, 2015 | 09:41 AM - Posted by Anonymous (not verified)

For the performance of the card at $199, it seems like they have some built-in wiggle room for the inevitable price war. This card looks more like a $180 card, IMO. The 280/285/280X are all just overpriced at the moment.

January 22, 2015 | 10:09 AM - Posted by Friendly Neighbour (not verified)

Huh? What are you yapping about?

You can get an MSI 280 custom-cooled at Newegg for 180 dollars. This card costs 20 dollars more and performs worse in every single benchmark.

It's a laughable card.

January 22, 2015 | 10:52 AM - Posted by dragos (not verified)

You're right. I did a quick calculation based on +100W for the 280 and 600hours of gaming per year ==> 6$/year

In my opinion the 280/285 are better choices as long as they can be found/fitted with a good quiet cooler.

The 960 looks like a good 150$ card, probably competitive with the 275

January 22, 2015 | 11:55 AM - Posted by Cataclysm_ZA

Its not always power usage that's the concern, heat output is also a major factor, especially if you're shoving this into a smaller mATX or ITX chassis that has limited cooling options. Plus, this thing ends up being really, really quiet.

January 22, 2015 | 01:04 PM - Posted by arbiter

Um, I would say it is comparable in every single benchmark while DRAWING only 60% of the power. As for the fool that calculated power $ per year, what $ per kWh is that based on? Some places electricity is MUCH more expensive.

January 23, 2015 | 04:37 PM - Posted by Mike S. (not verified)

He wrote +100W and 600 hours of gaming per year, so that's 60,000 watt-hours, or 60 kwh. For $6.00, that's $0.10 per kwh. Even at $0.30 per kwh, you're only at $18.00 per year - but that's assuming a whopping 600 hours of gaming with your card pegged to the max.
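
Mike S.'s arithmetic checks out: energy in kWh is watts times hours divided by 1000, then multiplied by the electricity rate. A quick sketch using the thread's numbers (the +100W delta and 600 gaming hours per year are the commenters' assumptions, not measured values):

```python
def yearly_power_cost(extra_watts, hours_per_year, dollars_per_kwh):
    """Extra yearly electricity cost of a card drawing `extra_watts` more."""
    kwh = extra_watts * hours_per_year / 1000.0
    return kwh * dollars_per_kwh

# The numbers from this thread: +100 W for 600 gaming hours/year.
print(yearly_power_cost(100, 600, 0.10))  # 6.0  -> $6/year at $0.10/kWh
print(yearly_power_cost(100, 600, 0.30))  # 18.0 -> $18/year at $0.30/kWh
```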

As someone else said, if cooling the card in a tiny case was a factor then I'd pay extra for the 960. Or I suppose if I was going to use CUDA or mine digital currency 24/7 then calculations per kwh would be THE primary consideration of my purchase.

But otherwise, I wouldn't worry too much about the watts.

January 22, 2015 | 02:55 PM - Posted by Friendly Neighbour (not verified)

By the way, my 180 comment is probably too kind to Nvidia.

You can get some 280's off from Newegg at 140 dollars.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814161400

And you can just shell out just 40 dollars more for a custom-cooled R9-290 and get this:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814121842

40-50% more performance and DOUBLE the VRAM.

Anyone who buys this card is a fanboy. Sorry, not sorry.

January 22, 2015 | 02:56 PM - Posted by Friendly Neighbour (not verified)

(40 dollars more than a 960 it should say).

January 22, 2015 | 04:00 PM - Posted by Sparky (not verified)

"Anyone who buys this card is a fanboy. Sorry, not sorry."

You are right. Anybody who buys the $140 HIS 7950 is a fanboy. The card is unacceptably loud. Like a vacuum cleaner.

February 7, 2015 | 08:59 AM - Posted by Powerglove (not verified)

I have to stop you there and call you out on your delusions. Nobody is a fanboy for buying a graphics card; as a matter of fact, what you're saying is just something a fanboy would say.

January 22, 2015 | 10:18 AM - Posted by Mandrake

Very impressive power consumption. Very easy to see that Nvidia has a ton of room to move prices down should they wish to do so. They engineered themselves a terrific GPU that works for them in that regard, while offering a healthy performance benefit over the 760 on the same 28nm process node.

The new PCPer power testing looks amazing Ryan! I can't wait to see this more thoroughly examined in future articles. :)

January 22, 2015 | 10:38 AM - Posted by BBMan (not verified)

Hope so. I remember the day that $200 could have got you a leading edge card.

I'm waiting for AMD to come out with their next generation. If they can't at least match the power consumption, I'm staying nVidia. People simply don't understand in this business that the power bill is a "recurring cost" that can eat up whatever you thought you saved in non-recurring price-performance.

January 22, 2015 | 08:57 PM - Posted by Anonymous (not verified)

The power bill argument falls short of reality for the vast majority of people. You think you're being clever by looking but you're probably getting played.

January 23, 2015 | 10:13 AM - Posted by BBMan (not verified)

I'd get a refund on my math and reality education if I were you. At a basic rate of $0.10/KWH a board that uses 100W more than another will eat up $100.00 in 1000 hours. In this economy of a shrinking middle class with a shrinking income may have better plans for $100.00 than frying minds out with game playing.
Oh- and I did forget compounding cooling costs in summer.

January 23, 2015 | 05:07 PM - Posted by willmore (not verified)

You might check into that refund, friend.

100W for 1000 hours is 100,000 W*h

If power is $0.10 per kW*h, then that comes out to $10

You seem to have misplaced the kilo on that kWh value.

January 22, 2015 | 10:21 AM - Posted by Anonymous (not verified)

Lisa-Su talks about new products coming in Q2 of this year for AMD in the earnings conference call -

http://ir.amd.com/phoenix.zhtml?c=74093&p=irol-eventDetails&EventId=5178684

We all know that new AMD products are coming, but just not when. According to the conference call, it sounds like Q2 is when new gpu's will start showing up

January 22, 2015 | 10:31 AM - Posted by Anonymous (not verified)

Gold award ? lol, how much did nvidia pay for that ?

The 960 is garbage in all metrics but power consumption.

Avoid this golden(LOL) turd.

January 22, 2015 | 11:42 AM - Posted by Anonymous (not verified)

I hate to say it, but I'm getting tired of every site giving Gold awards on both sides for less than what we're used to. It's almost a given at this point. Personally I think it deserves more of a Bronze or Silver award at best (only because power consumption). Performance is way too sketchy for what is already said to be an 18 month product refresh.

I guess if you're looking at it from the same perspective Nvidia did with the 900 Series and compare it with the 600 Series, yeah it looks stellar. I don't care what kind of voodoo magic compression they're doing. Buying a gaming oriented 128 bit card in 2015 is not even in consideration for most people for obvious reasons.

January 22, 2015 | 12:36 PM - Posted by fade2blac

I agree that the model number and pricing under which this is marketed raise expectations above the end result. It ends up feeling like a long overdue price correction as opposed to advancing mainstream performance. I always felt the GTX 760 was overpriced, and yet it never really had to drop its price below $250...and for good reason...people continued to buy them anyway.

But model numbers are just marketing. To borrow a phrase from another tech review site, "There are not bad products, just bad pricing". nVidia is clearly still being stubborn on the price/performance front, but they (and apparently their customers on the whole) can afford it. I'm sure many people would have a higher opinion of the GTX 960 as a product if it were released at $179.

January 22, 2015 | 01:25 PM - Posted by Daniel Masterson (not verified)

HardOCP gave it a silver.

January 22, 2015 | 10:58 AM - Posted by Anonymous (not verified)

Yeah, this is kind of a disappointment.
When I read the rumours that claimed performance in line with a R9-280 I hoped it would be priced at around $170-180 and lead to a price cut for the 750Ti.
To call it 'garbage' is of course ridiculous, but it's really coming from the expectation that it would be a similarly great value proposition to the 970. That card has spoiled us all.

January 22, 2015 | 11:18 AM - Posted by Anonymous (not verified)

Oh and congratulations on the new power testing setup, guys.
That could have used a separate article maybe. Looking forward to exploration of power draw in various scenarios like the MOBA 60fps one.

January 22, 2015 | 11:46 AM - Posted by fade2blac

nVidia will just say this "is" a good value proposition. They are pretty much calling the shots right now as far as product cycles and pricing due to stagnant process technology (all GPU's are currently stuck at 28nm) and limited competition from AMD. They have the luxury of dictating due to their market position (high profit margins, high market share/revenue).

nVidia's marketing perspective might be:

1) It's half a GTX 980 for less than half the price and coincidentally nearly the same GPU/$ of a GTX 970.
2) It has modest performance and feature improvements over a GTX 760, but it is also launching at a $50 lower MSRP.
3) The significant boost in power efficiency leads to substantially lower power requirements, less noise/heat.

However, the rest of us do not live in a vacuum and can actually acknowledge real market prices and competing products. I also see a huge $130 gap between the GTX 960 and 970. Rumors of a GTX 960Ti may be the reason nVidia is leaving so much room in the pricing (and performance) of their Maxwell lineup. A GTX 960Ti for about $249 could slot in quite easily.

January 22, 2015 | 11:14 AM - Posted by Anonymous (not verified)

Well this is a snot green turd. If it had been introduced at around 150 bucks it might be worth considering but without ASync support it fails regardless.

January 22, 2015 | 02:00 PM - Posted by Allyn Malventano

You can't even buy an ASync panel yet. Seriously, why is that even relevant?

January 22, 2015 | 02:19 PM - Posted by Anonymous (not verified)

Ignorance is thick at PCPer

Could you not have bought a G-Sync enabled card prior to G-Sync release?

Wow!!! double standards.

January 22, 2015 | 04:03 PM - Posted by Sparky (not verified)

G-Sync has been available for a year; FreeSync will be released in 3-4 months, and it's apparently worse than G-Sync, as it produces more lag spikes.

January 22, 2015 | 04:15 PM - Posted by Anonymous (not verified)

LOL!!!

Can you please point us to your World Exclusive FreeSync review?

No kidding G-Sync is available. Allyn said, "You can't even buy an ASync panel yet. Seriously, why is that even relevant?"

Ah, but you could have bought a G-Sync capable GPU in 2012. G-Sync wasn't announced until the end of 2013, and we didn't see a retail G-Sync monitor until Q2 2014. So you could buy a G-Sync capable GPU two years before release, yet he's dismissive of someone asking about FreeSync/A-Sync compatibility just two months out from availability.

Maybe his memory is fading.

January 24, 2015 | 01:08 AM - Posted by Allyn Malventano

I hate to disappoint, but not all AMD GPU's are going to support ASync. The reason I don't find the OP's comment relevant is because GSync is not the same thing as ASync, and I say that having seen both in person.

January 24, 2015 | 03:31 AM - Posted by Anonymous (not verified)

You're not disappointing me. Just once again showing your bias.

There was nothing wrong with the OP wanting to know about ASync compatibility prior to monitors being available.

January 22, 2015 | 11:48 AM - Posted by Anonymous (not verified)

Guys, please reread the articles before posting. Quite a few "for" instead of "or" and stuff like that. Are you writing on an ipad?? You are supposed to be professionals.

January 22, 2015 | 01:35 PM - Posted by Jeremy Hellstrom

Constructive criticism is always welcome but it requires actual examples of what is wrong. 

Without examples of the needed corrections it is more accurately classified as bitching and seeing as how I just looked at every instance of the word 'for ' in this article ...
 

January 22, 2015 | 05:28 PM - Posted by Anonymous (not verified)

"While this is definitely true, it's been true between generations for years and is part of the reason analysis of GPUs lie ours continue to exist. "

I noticed this typo right away. At least, I hope it is a typo and not a Freudian slip. Can the employees see the email address when posters use anonymous? I am not the same "anonymous" from above.

Ryan often looks very tired; I assume he wrote this review himself. I was horribly tired for a long time until I found out I actually have sleep apnea. I hadn't had a good night of sleep in years and was close to having a heart attack every night. Since then, when I see people obviously tired, I have to wonder if they have it also.

January 22, 2015 | 11:53 AM - Posted by Anonymous (not verified)

This site's reviews suck. How could this card possibly get a gold award? Obviously some money being transferred here for such a horrible review to be published.

January 22, 2015 | 01:40 PM - Posted by Allyn Malventano

It got Gold because Ryan felt it deserved Gold. We choose our awards here at PCPer, and we are not influenced by, nor do we accept money for awards or favorable reviews.

January 22, 2015 | 11:53 AM - Posted by Randal_46

"The GTX 960 by design only requires a single 6-pin power connection and the ASUS Strix model follows that lead."

So, is the 960 the new 750Ti if someone wants to drop a GPU into an off-the-shelf desktop?

January 22, 2015 | 01:38 PM - Posted by Allyn Malventano

Pretty much, yeah.

January 22, 2015 | 12:18 PM - Posted by Anonymous (not verified)

MFAA is impressive, and the fact that this card encodes/decodes h.264 is really nice. I think $200 is a bit much, but then it seems like it was designed for OC'rs, which generally lends itself to a price premium. I think once it drops into the $150-180 range it'll be the go-to card for HTPC.

January 22, 2015 | 01:45 PM - Posted by Allyn Malventano

It's not far off to think these will come down from their $200 MSRP. As far as HTPC use goes, this card spends a lot of its time with the fans *off*, so it's definitely great for that purpose.

January 22, 2015 | 12:28 PM - Posted by Anonymous (not verified)

That card is a joke, saved by the fact it has nVidia written on it.
The same thing from AMD would have been trashed ... I mean, same perf as a three-year-old product for more $$ ???

Oh wait, AMD did it ... the R9 285 ... and it got trashed.

January 22, 2015 | 04:05 PM - Posted by Sparky (not verified)

AMD just relabeled their GPUs. The 960 is a new chip with new features. The only GPU with x265 encoding and HDCP 2.2 support on the market.

January 22, 2015 | 05:51 PM - Posted by Anonymous (not verified)

That encoder and decoder will still be limited to 8-bit which becomes almost useless when watching 4k content which H.265 is targeted at.

January 22, 2015 | 01:18 PM - Posted by gamesinger

Will there be upgraded 900 series cards coming out when the Display Port 1.4 standard is finalized ?

January 22, 2015 | 01:20 PM - Posted by gamesinger

What is a good size of UHD monitor for the 960 ?

January 22, 2015 | 01:18 PM - Posted by Matt (not verified)

Why is this (official) picture of GTX 960 showing two SLI fingers?
http://international.download.nvidia.com/geforce-com/international/image...

January 22, 2015 | 01:18 PM - Posted by caution (not verified)

Question: Why do I always have to buy AMD GPU's to leverage double precision performance? C'mon nvidia, you can do it too - don't you? :)

January 22, 2015 | 04:06 PM - Posted by Sparky (not verified)

Why? It would make the GPU more expensive and 99% of users never need it.


January 22, 2015 | 01:19 PM - Posted by Kranky

I currently run a 760. What's the value of upgrading from such a recent card that performs pretty well at 1080p, or, what will I miss if I keep with my 760?

January 22, 2015 | 01:19 PM - Posted by Chuck M (not verified)

Why do some 960 models have an 8-pin connector and others a 6-pin connector?

January 22, 2015 | 01:20 PM - Posted by justin stevens (not verified)

How does the 960 compare to the 750 Ti? I know the 960 is faster, but it would be nice to know since the 960 is really close to the 750 Ti in price (compared to the non-sale price at launch of the 2GB model; the 750 Ti is much cheaper now, but it's not that old a card for the budget gamer who might want to upgrade).

January 22, 2015 | 01:21 PM - Posted by Ashphalt (not verified)

Will Nvidia make a new SLI bridge to fit the X99 boards?

January 22, 2015 | 01:21 PM - Posted by tellis006

Here is my question please answer. Will I ever win ANYTHING? 2. I have 2gtx 680 in sli with a 4930K how much better will these 960's do in sli? please do not answer. hint hint

January 22, 2015 | 01:47 PM - Posted by Allyn Malventano

Don't feel bad - I run SLI 680's as well, and I work here! :)

January 22, 2015 | 05:48 PM - Posted by Anonymous (not verified)

You've got a $500 CPU and $1000 in GPUs. You aren't exactly in desperate need of winning free hardware.

January 24, 2015 | 01:04 AM - Posted by Allyn Malventano

680's were $500 a LONG time ago :)

January 22, 2015 | 01:21 PM - Posted by Ben C. (not verified)

How does the 960 use color compression to lower memory bandwidth? Do all games take advantage of this feature?

January 22, 2015 | 01:22 PM - Posted by Sarcasticus (not verified)

4GB memory version? How much would that benefit this card at higher resolutions?

January 22, 2015 | 01:22 PM - Posted by Smithy (not verified)

Question: what is the Level 2 cache of GM206, which MFAA and 3rd-gen delta colour compression rely on?

January 22, 2015 | 02:20 PM - Posted by Smithy (not verified)

Found it, it was 1MB and half that of GM204.

January 22, 2015 | 01:23 PM - Posted by t-user (not verified)

Are there any plans for an "ITX" edition of the GTX960? Cause the thermal envelope seems ideal...

January 22, 2015 | 01:23 PM - Posted by CoffeeKid

At what point does it make sense to upgrade to a GTX 960? Are you targeting everybody that has a 670 or older card?

January 22, 2015 | 01:24 PM - Posted by Anonymous (not verified)

Direct advantages of GTX960 and Windows 10

January 22, 2015 | 01:26 PM - Posted by WHoward (not verified)

Is G-Sync supported on your Linux drivers?

January 22, 2015 | 01:26 PM - Posted by sanadanosa

why 128-bit?

January 22, 2015 | 01:27 PM - Posted by JM446

Will Nvidia ever offer scan line interleaving? Don't you own the patents from the purchase of 3DFX? This seems like the best option for rendering with 2, 3, or 4 graphics cards. Could DirectX support scan line interleaving instead of AFR?

January 22, 2015 | 01:29 PM - Posted by bugbite

At resolutions above 1080 eg 1440 or 4k, will the 128 bit bus be a limitation in sli configs?

January 22, 2015 | 01:29 PM - Posted by Jason Schmidt (not verified)

As a developer, can you explain how the NVIDIA VR Direct can help with creating VR experiences?

January 22, 2015 | 01:33 PM - Posted by Ashphalt (not verified)

Will an arm processor ever be put on the Nvidia gpu in the future?

January 22, 2015 | 01:34 PM - Posted by Nacelle

I'll watch a stream with giveaways that use random drawings to distribute the prizes. I don't have questions, but I'm willing to hear what he has to say. As soon as I heard that submitting a good question is how you're drawing winners, I disconnected. Sorry, Nvidia guy. You lost an ear because of this.

January 22, 2015 | 01:35 PM - Posted by Ray Witts (not verified)

This is a fantastic price to performance card but is this the last 900 series card or will there be a lower end range 9xx card.

January 23, 2015 | 08:24 AM - Posted by Anonymous (not verified)

No, it's really not.

January 22, 2015 | 01:35 PM - Posted by gamesinger

I am currently running a GTX 680, would I be better of buying a GTX 980 rather than a GTX 960 ?

I am planning to buy UHD monitors shortly and I want to confirm that both the GTX 960 and GTX 980 can handle UHD monitors.

January 22, 2015 | 01:38 PM - Posted by untoward

will there be 4 gig variants of these cards?

January 22, 2015 | 01:39 PM - Posted by untoward

Will these cards do hardware decodes of h.265/HEVC? What about hardware encoding?

January 22, 2015 | 01:41 PM - Posted by mauro113

If a monitor does not have support for G-Sync, but has standard DisplayPort, will it ever get Adaptive Sync support with an Nvidia graphics card?

January 22, 2015 | 01:44 PM - Posted by CoffeeKid

How does MFAA compare to FXAA introduced with GTX 660?

January 22, 2015 | 01:45 PM - Posted by AndreaT (not verified)

With shrinking memory buses with maxwell and memory clocks not mattering much for fps when overclocking, why will HBM matter much or at all in upcoming graphics card architectures?

January 22, 2015 | 01:46 PM - Posted by Humanitarian

Will nvidia hardware support adaptive-sync?

January 22, 2015 | 01:47 PM - Posted by dandragos3118

I don't have a question. I just want to say I'm new to building pcs. I just built my first. I've Watched numerous interviews with developers from many companies, this was the easiest to understand! Being a car guy, I especially loved the car analogy!

January 22, 2015 | 02:04 PM - Posted by Bubba376 (not verified)

Question. Would a sli setup of 960's be good for one 1440P monitor setup.

January 22, 2015 | 01:47 PM - Posted by Greg (not verified)

Given the 960 will likely be used in mid range systems, can we expect to see lower priced GSYNC monitors targeted at value for performance customers?

January 22, 2015 | 01:50 PM - Posted by wadec24

What was the reason behind the limited release of reference design 960's? Are there any plans to release more in the future?

January 22, 2015 | 01:51 PM - Posted by kenny1007

Will there be a 4gb variant of the 960?

January 22, 2015 | 02:06 PM - Posted by Allyn Malventano

From Tom - the 960 does not support 4GB.

January 22, 2015 | 01:51 PM - Posted by untoward

At the Microsoft demo yesterday they claimed that DX12 would mean more power-efficient graphics cards. What did they mean by this, and how does it affect Maxwell graphics cards?

http://blogs.msdn.com/b/directx/archive/2014/08/13/directx-12-high-perfo...

January 22, 2015 | 02:05 PM - Posted by Allyn Malventano

The power efficiency related to DX12 has to do with the CPU usage tied to the overhead of getting those DX commands to the GPU. There may be some power savings on the GPU side, but the bulk of it is going to fall on the CPU side, and should apply regardless of which type of GPU is installed.

January 22, 2015 | 01:57 PM - Posted by tomz8679 (not verified)

Why is this (official) picture of GTX 960 showing two SLI fingers?
http://international.download.nvidia.com/geforce-com/international/image...

January 22, 2015 | 02:02 PM - Posted by Allyn Malventano

That's a dead link.

January 22, 2015 | 02:13 PM - Posted by Matt (not verified)

I just asked the same question on 1st page, and the link is working :)

January 22, 2015 | 01:58 PM - Posted by Isabella (not verified)

Pentium g3258 running the 960 card? Performance?

January 22, 2015 | 01:58 PM - Posted by Symion

There are reports of Maxwell cards suddenly clocking down during gameplay. Will these be resolved with driver updates?

January 22, 2015 | 02:07 PM - Posted by Allyn Malventano

Got any links we can research?

January 22, 2015 | 02:42 PM - Posted by Anonymous (not verified)

GeForce Forums

GTX 970 3.5GB Vram Issue
https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-9...

Boost feature causes GTX 970/980 instability in low utilisation situations
https://forums.geforce.com/default/topic/784294/geforce-900-series/boost...

Nvidia! Is there a fix coming or not? - Very disappointed new customer. ( 970 low usage )
https://forums.geforce.com/default/topic/792706/geforce-900-series/nvidi...

January 22, 2015 | 02:00 PM - Posted by Anonymous (not verified)

I'm so happy I dumped my 760 for $160 a few months ago and went ahead, cried, and paid the premium for a 970. Comparing a computer with a 960 to a PS4, and now with Windows 10 cross-play even the lonely XB1, you'd have to be crazy to buy one of these cards.

The only way I'd see it is if it was going to be your first computer, you couldn't afford a console, you really needed to educate yourself with a computer, and you had a friend who could build you one for cheap, like $400 for everything.

Honestly, my 4GHz i7 & 970 is kind of disappointing for gaming now. I still play Battlefield 3 & 4 and Titanfall, but other than those I don't really see a point. Consoles are so much better for gaming right now.

January 22, 2015 | 02:04 PM - Posted by Nacelle

You're preaching to the wrong crowd. This is the PC master race.

January 22, 2015 | 02:08 PM - Posted by Allyn Malventano

Come again?

January 22, 2015 | 02:10 PM - Posted by Nacelle

PC > console

January 22, 2015 | 02:15 PM - Posted by Allyn Malventano

Aah :)

January 22, 2015 | 02:03 PM - Posted by kenny1007

What are NVIDIA's plans in regard to FreeSync?

January 22, 2015 | 02:10 PM - Posted by Allyn Malventano

Nvidia has stated that they do not plan to support FreeSync.

January 22, 2015 | 02:07 PM - Posted by Smithy (not verified)

Tom, where do you buy your glasses and what brand are they?

January 22, 2015 | 02:16 PM - Posted by RyoRapture

Combo Question:
Why does the GTX 960 have only 2GB? (Is that enough for modern games?)

January 22, 2015 | 02:08 PM - Posted by Bubba376 (not verified)

Question: would an SLI setup of 960s be good for a single 1440p monitor?

January 22, 2015 | 02:09 PM - Posted by Allyn Malventano

It should handle 1440p OK, but would have a hard time with 4K.

January 30, 2015 | 10:53 AM - Posted by obababoy

VRAM doesn't stack in SLI, and 1440p adds quite a few pixels. Stick to at least a 4GB card. My 2c.

January 22, 2015 | 02:11 PM - Posted by Tate (not verified)

Without G-Sync, would LoL still be running at around 30W?

January 22, 2015 | 02:18 PM - Posted by Allyn Malventano

I believe it was running at 30W when limited to 60 FPS.

January 22, 2015 | 02:12 PM - Posted by Ben C. (not verified)

When developing a new GPU technology like Maxwell, how does nvidia decide how many CUDA cores to include in each design? Is clock speed determined after that decision is made or earlier in the process?

January 22, 2015 | 02:17 PM - Posted by Gelfling (not verified)

Question: Will Nvidia ever develop an SLI solution that doesn't require a bridge?

January 22, 2015 | 02:18 PM - Posted by Neon

Will G-Sync have additional value add / features in future versions?

January 22, 2015 | 02:19 PM - Posted by tomz8679 (not verified)

What's faster, the GTX 680 or the 960?

January 22, 2015 | 02:21 PM - Posted by SLOWION (not verified)

i r disappoint

January 22, 2015 | 02:23 PM - Posted by revort

edit

January 22, 2015 | 02:25 PM - Posted by Humanitarian

Why is the texture fill rate for the 970 lower than the 980, aren't they using the same memory architecture?

January 22, 2015 | 02:27 PM - Posted by falshamran (not verified)

What are the minimum power supply requirements for 950 SLI?

January 22, 2015 | 02:28 PM - Posted by Dominic (not verified)

Live Q: can any other card be used in SLI with an NVIDIA GeForce GTX 960 ?

January 22, 2015 | 04:53 PM - Posted by Anonymous (not verified)

No, you can only SLI cards from the same family. I've read that some games won't even work with different-brand cards; I find that one hard to believe, but it's worth double-checking. Two cards don't give you 4GB of video RAM. You will still have only 2GB for gaming.
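[Editor: the point above, that SLI mirrors VRAM rather than pooling it, can be sketched with a bit of arithmetic. The function name and figures are illustrative only, not from NVIDIA's documentation.]

```python
# SLI's alternate frame rendering duplicates textures and buffers on every
# card, so usable VRAM equals one card's capacity, not the sum of all cards.
def usable_vram_gb(num_cards: int, vram_per_card_gb: float) -> float:
    """Usable VRAM in an SLI setup; data is mirrored on every card."""
    return vram_per_card_gb  # NOT num_cards * vram_per_card_gb

print(usable_vram_gb(2, 2.0))  # two GTX 960s in SLI still expose 2.0 GB
```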

January 23, 2015 | 04:40 AM - Posted by Parn (not verified)

This is more of a GTX 950 Ti, to be honest. While the power consumption is excellent, the performance is nowhere near the x60 class; the gap between this card and the 970 is simply too big. A cut-down GM204 with a 1280:80:48 configuration, 3GB, and a 192-bit bus would fit the 960 moniker better.

January 23, 2015 | 06:10 AM - Posted by Mac (not verified)

So when is this power efficiency going to be turned into affordable performance? Nothing to see here.

January 23, 2015 | 03:50 PM - Posted by Anonymous (not verified)

This card is about as close as you can get to exactly as powerful as a GTX 760, with the same amount of VRAM and a 128-bit bus. The only place you see true improvements is in the titles and benchmarks where NVIDIA spent time making driver/firmware improvements. Look at older benchmarks like Heaven or 3DMark Vantage and the GTX 960 and 760 are neck and neck. We got a much more power-efficient card, but at the same time a crippled memory bus to save money. Except for the niche HTPC market (which I may or may not belong to), this really isn't that exciting and is pretty much a sidegrade to the existing GTX 760. Even the compute numbers clearly show this.

January 23, 2015 | 06:09 PM - Posted by aparsh335i (not verified)

I think a lot of people here hating on the 960 are missing a key point: SLI. $200 x 2 = $400. That's $400 for great FPS at 2560x1440 with high settings, barely any heat, and barely any noise.

I think it's a great deal, and I think NVIDIA has a TON of room to lower the price if they want or need to. Who cares, though; it's still only $400 for 2x GTX 960, which can basically do everything you'll need. Play more, worry less.

January 30, 2015 | 10:49 AM - Posted by obababoy

What about the people who don't want to deal with crappy SLI profiles for games that are getting lazier and lazier with optimization? I want to spend money now on something that plays everything fairly decently, and in a year or two buy another.

January 24, 2015 | 12:54 PM - Posted by Itbankrock

Will the 128-bit memory interface cause any performance hit in modern games?

January 26, 2015 | 11:27 PM - Posted by Steve Green. (not verified)

In layman terms, how does Nvidia make their performance so smooth compared to the competition? It's the one thing in particular I like about their products. :)

January 28, 2015 | 12:46 AM - Posted by Matt (not verified)

Do you know why the performance differential for Crysis 3 is so much smaller than for the other games? I was considering this card for an upgrade, but if it doesn't play nice with the CryEngine 3 (Star Citizen in particular) then it might not be worth it.

January 30, 2015 | 10:40 AM - Posted by obababoy

I am on the Sapphire Vapor-X R9 290 right now, but I just don't see anyone making a really smart choice by going for this slow a card. I would at least get the 970.

Call optimization what it is, but PC games now are crushing through 2GB of VRAM all day long. Dying Light at 1080p and max settings is hitting 4GB. Far Cry 4 is hitting 3GB. Shadow of Mordor, 3.5GB. All of this at 1080p...

January 30, 2015 | 02:04 PM - Posted by MAXXHEW

It's funny to see reviews of gaming GPUs that emphasize power consumption and decibel levels. My son and I game for hours every day. We play games for entertainment because I have a little bit of disposable income to use for just playing... paying a few cents extra for electricity is not even a concern; we saved more by buying a GPU with a better price.

Decibels... seriously? We aren't playing in a library, and most gamers have headphones that make a PC's decibel level really unnoticeable. Even when we are recording with sensitive mics for gameplay uploads, our decibel level is not a concern.

Maybe it's just me and I just don't understand the new sophisticated testing methods. But as a father who enjoys gaming with my son, price and a GPU's in-game playability are my main concerns. I refuse to pay the premium price some manufacturers charge just for a few less watts and decibels; we can use that saved money to buy new games and snacks.
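[Editor: the "few cents of electricity" claim above is easy to check with back-of-the-envelope arithmetic. The figures here, a 60W draw difference, 4 hours of gaming per day, and $0.12 per kWh, are assumptions for illustration, not numbers from the review or the comment.]

```python
# Yearly cost of a GPU's extra power draw at a given electricity rate.
def yearly_cost_usd(extra_watts: float, hours_per_day: float,
                    usd_per_kwh: float) -> float:
    # Convert watts to kilowatts, scale by yearly gaming hours.
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

print(round(yearly_cost_usd(60, 4, 0.12), 2))  # ~$10.51 per year
```

At those assumed rates the delta is closer to ten dollars a year than a few cents, though still small next to a $50 difference in card price.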

February 4, 2015 | 02:48 PM - Posted by Nick (not verified)

Yes, yes, we all get it: the 960 isn't blowing anyone away with raw performance. At the same time, everyone is giving this card a ton of crap and saying how overpriced it is. Is this coming from the same crowd that was paying $250 for a 760 as recently as a couple of months ago? A TON of people have a 760. The 960 gives you marginally better performance using much less power, giving off much less heat and less noise, AND at $200. If power, heat, and noise don't concern you, then think of it as getting an overclocked 760 at a $50 discount.

June 1, 2015 | 01:45 AM - Posted by Anonymous (not verified)

Finally upgraded my Radeon 5850 after a long and painful-to-tolerate card failure. So happy with this card.
NEVER AMD again!
Fry's price-matched Amazon, with the same $10 rebate.
$189 after rebate ; )

August 29, 2015 | 03:51 PM - Posted by Dee127459 (not verified)

Is it worth it to upgrade from my GTX 560 to the 960? From looking at this review, it's not...
