NVIDIA Officially Announces GeForce GTX 1060 3GB Edition

Subject: Graphics Cards | August 18, 2016 - 02:28 PM |
Tagged: nvidia, gtx 1060 3gb, gtx 1060, graphics card, gpu, geforce, 1152 CUDA Cores

NVIDIA has officially announced the 3GB version of the GTX 1060 graphics card, and it indeed contains fewer CUDA cores than the 6GB version.


The GTX 1060 Founders Edition

The product page on NVIDIA.com now reflects the 3GB model, and board partners have begun announcing their versions. The MSRP of this 3GB version is set at $199, and availability of partner cards is expected in the next couple of weeks. The two versions will be distinguished only by their memory size, and no other capacities of either card are forthcoming.

                     GeForce GTX 1060 3GB   GeForce GTX 1060 6GB
Architecture         Pascal                 Pascal
CUDA Cores           1152                   1280
Base Clock           1506 MHz               1506 MHz
Boost Clock          1708 MHz               1708 MHz
Memory Speed         8 Gbps                 8 Gbps
Memory Configuration 3GB                    6GB
Memory Interface     192-bit                192-bit
Power Connector      6-pin                  6-pin
TDP                  120W                   120W

As you can see from the above table, the only specification that has changed is the CUDA core count; base and boost clocks, memory speed and interface, and TDP are identical. As to performance, NVIDIA says the 6GB version holds a 5% advantage over this lower-cost version, which at $199 is roughly 20% less expensive than the GTX 1060 6GB's $249 MSRP.
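The trade-off NVIDIA is pitching can be sanity-checked with a little arithmetic. The sketch below takes the $199 MSRP announced here, the 6GB card's $249 MSRP, and NVIDIA's own (unverified) 5% performance-delta claim, and works out the price cut and the relative performance-per-dollar; the figures are back-of-the-envelope, not benchmark results.

```python
# Rough price/performance comparison using NVIDIA's stated figures.
# The 5% performance delta is NVIDIA's marketing claim, not a measured benchmark.
MSRP_3GB = 199.0            # announced MSRP of the 3GB card (USD)
MSRP_6GB = 249.0            # MSRP of the 6GB card (USD)
PERF_6GB = 1.00             # normalize the 6GB card's performance to 1.0
PERF_3GB = PERF_6GB / 1.05  # 6GB card is ~5% faster, per NVIDIA

price_cut = 1 - MSRP_3GB / MSRP_6GB  # fraction cheaper than the 6GB card
perf_per_dollar_3gb = PERF_3GB / MSRP_3GB
perf_per_dollar_6gb = PERF_6GB / MSRP_6GB

print(f"price cut vs 6GB MSRP: {price_cut:.1%}")
print(f"3GB perf-per-dollar advantage: "
      f"{perf_per_dollar_3gb / perf_per_dollar_6gb - 1:.1%}")
```

On those numbers the 3GB card gives up ~5% of the performance for ~20% of the price, so its performance-per-dollar comes out ahead as long as 3GB of memory is enough for the workload.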

Source: NVIDIA

August 18, 2016 | 02:54 PM - Posted by Anonymous (not verified)

On paper it looks like a goat, but that will not matter too much as it will probably be as fast as the 970 and outsell the 470 by quite a bit.

These should be $99 parts, sigh.

August 18, 2016 | 03:02 PM - Posted by Kusanagi (not verified)

nVIDIA, please... just call it a 1050 so there won't be any confusion....

August 18, 2016 | 07:01 PM - Posted by aparsh335i (not verified)

I knew people would say this.
Call it a 1050 or call the 1060 6gb a 1060ti to avoid confusion.
However, if they did it, they did it on purpose.
This was the first time they did something like this, right??? NOPE.
Remember the GTX 460?
It was offered in a variety of flavors, including the GTX 460 768MB, 1GB, and the 1GB SE.
http://www.guru3d.com/articles_pages/geforce_gtx_460_se_review,2.html

All cards were GTX 460, but the three varied all across the board in ROPs, stream processors, etc. I'm going to throw out a wild guess: the GTX 460 naming confusion was a success for Nvidia, so they are doing it again. On top of that, they may even have marketing data that says a large portion of those customers five years ago were on five-year video card buying cycles, looking for a replacement this year, which could be the GTX 1060 3GB.

August 18, 2016 | 03:02 PM - Posted by stigbeater (not verified)

should have called the 6GB the 1060 Ti and the 3GB just the 1060.

August 18, 2016 | 03:04 PM - Posted by JohnGR

5% with 10% fewer cores? I think Nvidia is being a little generous to the 3GB version, probably because people are going bananas over seeing fewer cores under the same product name.

On a German site I was looking at, it was selling for 219 to 239 euros. The 6GB version started at 268 euros, so the difference in price is significant. Of course, going with only 3GB when people think even 4GB could soon be a problem in some titles is a risk.
On the same site the RX 480 starts at 245 euros and the RX 470 at 203, so the 3GB GTX 1060 lands in the middle.

August 18, 2016 | 03:08 PM - Posted by Lukasz Markiewicz (not verified)

Nothing wrong with offering a cut-down product for a lower price. If the benchmarks show great performance/$ it might even turn out to be a great card.

But really, someone in Nvidia marketing needs a paddling. It's not a 3GB version of a GTX 1060. It's a cut-down version of the 1060 (even if the cut seems modest). This is rubbish and should not be allowed a free pass. Call it a GTX 1050 or 1059 for all I care. But this is deceptive and indefensible.

I mean... it sounds like an obvious criticism... which is why it baffles me that anyone at the company could think this is OK. Especially if it does turn out that it's a great card for the price. Why tarnish a potentially good product with a whiff of unnecessary marketing bullshit?

August 18, 2016 | 03:19 PM - Posted by JohnGR

Well, Nvidia has been doing this in the low-end market for years, and much worse than this. But because it's the low-end market, people respond with "who cares about that card." Now that there is essentially no low-end market, Nvidia is starting to do stuff like this with more expensive cards.

I keep giving the same example over and over again: the GT 730. A card that comes in three different versions with completely different performance. Two different memory types, two different data buses, two different GPUs, different features and specifications, but the same name. You don't know if the card you bought is a Fermi or a Kepler; whether it comes with enough bandwidth to play games (40 GB/s) or just barely enough to watch movies (12 GB/s); whether it supports DX12 fully (Kepler) or partially (Fermi); whether it has enough CUDA cores to do some extra work (384, Kepler) or just a few that push everything back to the processor (96, Fermi). A total mess.

http://www.geforce.com/hardware/desktop-gpus/geforce-gt-730/specifications

August 18, 2016 | 03:13 PM - Posted by Xanavi

Sounds like a different video card to me...

*Shakes fist at Jen Hsun*

August 18, 2016 | 06:13 PM - Posted by Anonymous (not verified)

They should call it the GTX (Gimped To maXimize) Profits 1060 FE (F__K Everyone) SKU, now with even less compute! It will say GTX 1060 on the box, but the actual specifications are anyone's guess! JHH cordially invites you to his game of three-card monte!

August 18, 2016 | 10:06 PM - Posted by Lance Ripplinger (not verified)

Why don't they just call this damn card the GTX 1060LE and be done with it????

August 18, 2016 | 11:52 PM - Posted by Anonymous (not verified)

Yawn...

August 19, 2016 | 02:03 AM - Posted by Anonymous (not verified)

I don't get why they would gimp the shader core count.

August 19, 2016 | 07:54 AM - Posted by Anonymous (not verified)

gotta sell those defective dies somehow. not that it's a bad thing. Same as the 1070, or the 970, 670, etc...

August 19, 2016 | 05:28 AM - Posted by Anonymous (not verified)

Why the **** have less cores and keep 1060 name other than to mislead people.

August 19, 2016 | 05:34 AM - Posted by Anonymous (not verified)

They should call it the 1060CE

Chump Edition.

August 19, 2016 | 09:56 AM - Posted by Mr.Book (not verified)

A GeForce 1060MX you say?

August 19, 2016 | 10:48 AM - Posted by Anonymously Anonymous (not verified)

Would this card sell better labeled as a 1060 or as a 1050 Ti, even if the specs were kept the same either way?
People want to buy the card with the higher model number (call it prestige, bragging rights, etc.), and Nvidia wanted something that would steal the thunder from the 480, and ta-da, they now have a 1060 at that magical $199 price point.

August 21, 2016 | 03:59 AM - Posted by Anonymous (not verified)

Yeah, but the $199 version isn't that great, if you want a nice design with dual fans, quiet cooling etc they are ~$230, not $199.

And for that you can get a 4GB 480 Nitro+, which can be OC'd to within ~1-2% of the 8GB 480 anyway, has 33% more memory than the 3GB 1060, proper async compute, no G-Sync tax, Xfire, and will be much better at DX12/Vulkan...

If the 'nice' 3GB 1060s were ~$180 then they would have some sort of value. But at ~$230 they just don't make sense. Not when you can buy a really nice custom OC 4GB 480 for the same price.

August 19, 2016 | 05:08 PM - Posted by Anonymous (not verified)

They should call it Titan. That way only very few people would be confused, as not many can afford a $1200 graphics card ;-)
