
The GeForce GTX 1060 6GB Review - GP106 Starting at $249

Author: Ryan Shrout
Manufacturer: NVIDIA

GP106 Specifications

Twelve days ago, NVIDIA announced its competitor to the AMD Radeon RX 480: the GeForce GTX 1060, based on a new Pascal GPU, GP106. Though that story was just a brief preview of the product, and a pictorial of the GTX 1060 Founders Edition card we were initially sent, it set the community ablaze with discussion of which mainstream enthusiast card was going to be the best for gamers this summer.

Today we are allowed to show you our full review: benchmarks of the new GeForce GTX 1060 against the likes of the Radeon RX 480, the GTX 970 and GTX 980, and more. Starting at $249, the GTX 1060 has the potential to be the best bargain on the market today, though much of that will be decided by product availability and by our results on the following pages.

Does NVIDIA’s third consumer product based on Pascal make enough of an impact to dissuade gamers from buying into AMD Polaris?


All signs point to a bloody battle this July and August, and retail cards based on the GTX 1060 are making their way to our offices even sooner than those based on the RX 480. It is those cards, and not the reference/Founders Edition option, that will be the real competition AMD has to go up against.

First, however, it’s important to find our baseline: where does the GeForce GTX 1060 find itself in the wide range of GPUs?


GeForce GTX 1060 Specifications

Let’s start with a dive into the rated / reference specifications of the GTX 1060 provided by NVIDIA.

| | GTX 1060 | RX 480 | R9 390 | R9 380 | GTX 980 | GTX 970 | GTX 960 | R9 Nano | GTX 1070 |
|---|---|---|---|---|---|---|---|---|---|
| GPU | GP106 | Polaris 10 | Grenada | Tonga | GM204 | GM204 | GM206 | Fiji XT | GP104 |
| GPU Cores | 1280 | 2304 | 2560 | 1792 | 2048 | 1664 | 1024 | 4096 | 1920 |
| Rated Clock | 1506 MHz | 1120 MHz | 1000 MHz | 970 MHz | 1126 MHz | 1050 MHz | 1126 MHz | up to 1000 MHz | 1506 MHz |
| Texture Units | 80 | 144 | 160 | 112 | 128 | 104 | 64 | 256 | 120 |
| ROP Units | 48 | 32 | 64 | 32 | 64 | 56 | 32 | 64 | 64 |
| Memory | 6GB | 4GB / 8GB | 8GB | 4GB | 4GB | 4GB | 2GB | 4GB | 8GB |
| Memory Clock | 8000 MHz | 7000 MHz / 8000 MHz | 6000 MHz | 5700 MHz | 7000 MHz | 7000 MHz | 7000 MHz | 500 MHz | 8000 MHz |
| Memory Interface | 192-bit | 256-bit | 512-bit | 256-bit | 256-bit | 256-bit | 128-bit | 4096-bit (HBM) | 256-bit |
| Memory Bandwidth | 192 GB/s | 224 GB/s / 256 GB/s | 384 GB/s | 182.4 GB/s | 224 GB/s | 196 GB/s | 112 GB/s | 512 GB/s | 256 GB/s |
| TDP | 120 watts | 150 watts | 275 watts | 190 watts | 165 watts | 145 watts | 120 watts | 275 watts | 150 watts |
| Peak Compute | 3.85 TFLOPS | 5.1 TFLOPS | 5.1 TFLOPS | 3.48 TFLOPS | 4.61 TFLOPS | 3.4 TFLOPS | 2.3 TFLOPS | 8.19 TFLOPS | 5.7 TFLOPS |
| Transistor Count | 4.4B | 5.7B | 6.2B | 5.0B | 5.2B | 5.2B | 2.94B | 8.9B | 7.2B |
| Process Tech | 16nm | 14nm | 28nm | 28nm | 28nm | 28nm | 28nm | 28nm | 16nm |
| MSRP (current) | $249 | $199 / $239 | $299 | $199 | $379 | $329 | $279 | $499 | $379 |

Much of the commentary on this page is duplicated from our GTX 1060 preview posted last week.

The GeForce GTX 1060 will sport 1280 CUDA cores with a GPU base clock of 1506 MHz and a rated Boost clock of 1708 MHz. The card will (initially) be available only in a 6GB variety, and the reference / Founders Edition ships with 6GB of GDDR5 memory running at 8.0 GHz / 8 Gbps. With 1280 CUDA cores, the GP106 GPU is essentially one half of a GP104 in terms of compute capability. NVIDIA decided not to cut the memory interface in half though, instead going with a 192-bit design compared to GP104's 256-bit option.

The rated GPU clock speeds paint an interesting picture for peak performance of the new card. At the rated Boost clock speed, the GeForce GTX 1070 produces 6.46 TFLOPS of compute; the GTX 1060 by comparison will hit 4.35 TFLOPS, a 48% gap. The GTX 1080 offers nearly the same delta of performance above the GTX 1070; clearly NVIDIA has settled on this scale for Pascal product segmentation.
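Peak FP32 compute here is simple arithmetic: CUDA cores × 2 FLOPs per clock (one fused multiply-add) × clock speed. A minimal sketch of that calculation, using NVIDIA's rated Boost clocks (the 1070 and 1080 clocks, 1683 MHz and 1733 MHz, are NVIDIA's published figures rather than values from the table above; the small difference from the quoted 4.35 TFLOPS comes down to clock rounding):

```python
# Peak FP32 throughput = cores * 2 FLOPs per clock (FMA) * clock rate.
cards = {
    "GTX 1060": (1280, 1708e6),  # CUDA cores, rated Boost clock in Hz
    "GTX 1070": (1920, 1683e6),
    "GTX 1080": (2560, 1733e6),
}

for name, (cores, boost_hz) in cards.items():
    tflops = cores * 2 * boost_hz / 1e12
    print(f"{name}: {tflops:.2f} TFLOPS")

# GTX 1060: 4.37 TFLOPS
# GTX 1070: 6.46 TFLOPS
# GTX 1080: 8.87 TFLOPS
```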


NVIDIA wants us to compare the new GeForce GTX 1060 to the GeForce GTX 980 in gaming performance, but the peak theoretical numbers don't really match up. The GeForce GTX 980 is rated at 4.61 TFLOPS at its BASE clock speed, while the GTX 1060 doesn't hit that number even at its Boost clock. Pascal improves performance with memory compression advancements, but the 192-bit memory bus delivers only 192 GB/s, compared to the 224 GB/s of the GTX 980.
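Those bandwidth figures fall straight out of bus width and effective data rate, as a quick sketch shows:

```python
# Memory bandwidth = (bus width in bits / 8) bytes per transfer * data rate.
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(192, 8.0))  # GTX 1060: 192.0 GB/s
print(bandwidth_gb_s(256, 7.0))  # GTX 980:  224.0 GB/s
```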

The GTX 1060 Founders Edition card has a TDP of just 120 watts and a single 6-pin power connection. With all of the controversy and debate surrounding the Radeon RX 480 and its power delivery system, this is going to be looked at more closely than ever. NVIDIA has set the TDP 30 watts lower than the 150 watts the 6-pin connector and PCI Express slot together are rated to deliver, so this definitely gives them room for overclocking and slight power target adjustment within those boundaries. In recent history NVIDIA has tended to be conservative with its power targets; I expect the GTX 1060 to fall well within the 120 watt level at stock settings.
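For reference, the in-spec power budget works out like this (75 watts from the slot and 75 watts from a 6-pin connector are the PCI Express specification limits):

```python
# In-spec power budget vs. rated TDP for the GTX 1060 Founders Edition.
SLOT_W = 75      # PCIe x16 slot limit
SIX_PIN_W = 75   # 6-pin connector limit
TDP_W = 120

budget = SLOT_W + SIX_PIN_W   # 150 W available without exceeding spec
headroom = budget - TDP_W     # 30 W left for overclocking
print(budget, headroom)       # 150 30
```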

The starting MSRP for GeForce GTX 1060 partner cards will be $249. The Founders Edition card, designed by NVIDIA and the one we were sent for our initial reviews, will cost $299 and will be available ONLY at NVIDIA.com. NVIDIA is listing this one as “limited edition”, so I assume we will not see the Founders Edition throughout the entirety of the GTX 1060's life.

At $249, the GTX 1060 partner cards, available and shipping today, will compete very well with the 8GB variant of the Radeon RX 480, which at its $239 reference price is only $10 less expensive. NVIDIA itself proclaims the GTX 1060 is “on average 15 percent faster and over 75 percent more power efficient than the closest competitive product,” which obviously refers to the aforementioned RX 480.


July 19, 2016 | 09:09 AM - Posted by Anonymous (not verified)

'Starting at $249' ? Where ?

The two I've seen so far are $320 and $350. But they are 6GB cards. So I assume it's only the 3GB one that's going to be at $249 ?

July 19, 2016 | 09:18 AM - Posted by Anonymous (not verified)

That might just be the seller gouging the price. Wait a bit for the cards to get into the market and you'll start seeing MSRP prices.

July 19, 2016 | 10:50 AM - Posted by Anonymous (not verified)

It's this thing called "marketing". It's sort of like lying, but it's legal. This review touts a $249 card, but reviews a $299 card instead. Not cool.

At least you had the decency to properly title your RX480 review:
"The AMD Radeon RX 480 Review - The Polaris Promise" instead of

"The AMD Radeon RX 480 Review - Polaris Starting at $199" and then review the $230 version.

I'm not accusing you guys of anything sinister, just that you should consider a common naming convention for your titles that isn't so misleading.

July 19, 2016 | 11:45 AM - Posted by Anonymous (not verified)

The only difference between the $299 and the $249 card is the cooler. No VRAM or shader difference, or even clock speed, i.e. nothing that will impact the performance of the card, so what's the issue? You want them to revisit the review when they get third party cards @ 249 so they can generate the same numbers twice?

July 19, 2016 | 11:50 AM - Posted by JohnGR

No. The difference is that you will be able to find the $299 card available, but you are NOT going to find the $249 card available.

Nvidia's marketing is doing an excellent job because the tech press is helping it.

As for your question, it's as dumb as it gets.

July 19, 2016 | 12:33 PM - Posted by Anonymous (not verified)

Oh ...look, another worthless AMD fanboy....

Nonetheless, it certainly looks like Nvidia came out to rain on AMD's midrange strategy parade.... 1060 kicks the 480 under a bus.

July 19, 2016 | 12:54 PM - Posted by Anonymous (not verified)

My question is dumb? We've already seen the $249 cards listed for sale. They will exist... and yet people are complaining that they reviewed the $299 card and not the $249 card? This isn't like people saying the 480 is $199 then reviewing the $239 version... those had tangible differences (4GB vs 8GB). The difference here is a cooler... complaining about that is as dumb as it gets.

July 19, 2016 | 01:12 PM - Posted by JohnGR

That's why it is dumb. That's what Nvidia's marketing is doing. It throws you a price that you are not going to find, except if you are extremely lucky, like luckier than 99.9% searching for a cheap GTX 1080 at the same time, and puts you in a waiting mode until the prices finally start dropping.

They WILL exist. Until then people will be buying at prices close to, or even higher than, the Founders Edition price, and the price/performance charts and conclusions from tech sites will just be supporting Nvidia's marketing lie.

Tech sites are RESPONSIBLE for SUPPORTING Nvidia's MSRP/FE marketing when they put the MSRP ON THE TITLE even when reviewing the $299 model.

No one says that the GTX 1060 will not drop eventually to $250. By then the RX 480 reference could be selling for $199 (the 8GB) and custom models for $199-$239.

JMO of course as with every post I do. JMO

July 19, 2016 | 01:35 PM - Posted by Anonymous (not verified)

http://www.evga.com/Products/ProductList.aspx?type=0&family=GeForce+10+S...

Oh look, $250 1060s. Not for sale yet, but they'll be available. You have to ignore the price gouging; there's absolutely no reason to base your price on it.

July 19, 2016 | 02:40 PM - Posted by tatakai

"Not for sale yet"

lol. They will come for sure, but it's going to be a bit. Not long probably, but there is a wait for that MSRP to be true.

July 19, 2016 | 07:53 PM - Posted by Anonymous (not verified)

Newegg showed the Zotac 1060 Mini as $249. Was listed as backordered, but I placed an order and they said they were shipping it today. {shrug}

July 19, 2016 | 09:25 AM - Posted by malurt

$249 is the MSRP for non-Founders, which just means that none of them will ever launch at a price that low :p

July 19, 2016 | 09:30 AM - Posted by Anonymous (not verified)

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125879&utm_medi...

You were saying spanky?

July 19, 2016 | 09:37 AM - Posted by Anonymous (not verified)

try buying it

July 19, 2016 | 09:47 AM - Posted by Anonymous (not verified)

try buying an RX 480

July 19, 2016 | 09:50 AM - Posted by AnonymouS (not verified)

i did

July 19, 2016 | 10:27 AM - Posted by ggreen1955 (not verified)

There were plenty of RX 480s on launch day. This was supposed to launch on July 7th. Where is the card?

July 19, 2016 | 10:31 AM - Posted by ggreen1955 (not verified)

Another successful vaporware launch by NVDA. Where is the reference card on launch day?

July 19, 2016 | 11:47 AM - Posted by Stefem (not verified)

Reference cards are available only at their own shop

July 19, 2016 | 09:28 PM - Posted by Anonymous (not verified)

Where is the card?
It's been in my PC gaming for a few weeks now, and it's awesome.
I was able to walk into a store and pick one up just 2 days after launch.

July 19, 2016 | 09:48 AM - Posted by Anonymous (not verified)

Welp looks like he did, and now it's out of stock!

July 19, 2016 | 09:45 AM - Posted by Stefem (not verified)

Except there's no 3GB variant...

July 19, 2016 | 11:43 AM - Posted by Anonymous (not verified)

It will be stripped of ROPs and rebranded as a 1050 and sold for $199 but be slower than the RX480 and will consume less power. Because using fewer watts is the most important aspect of gaming.

July 19, 2016 | 09:52 AM - Posted by Topinio

Sure, "starting at" $250. But why no mention in the review of thr RX 480 "starting at" $200 ?

July 19, 2016 | 10:33 AM - Posted by ggreen1955 (not verified)

Both RX 480s were available on launch day, and plenty of them. Where is the GTX 1060 for $249? NONE.

July 19, 2016 | 11:19 AM - Posted by Anonymous (not verified)

"plenty of them" like plenty of them were sold out near instantly at launch? I mean, come on... they're not in stock, don't pretend AMD is killing it and Nvidia can't put something on shelves, because AMD can't either. They probably sold out the minute they went on sale because people BOUGHT them.

July 19, 2016 | 02:43 PM - Posted by tatakai

the numbers we've heard from sellers suggest tons of 480s were available. The most notable I think is Overclockers UK saying they sold thousands of 480 cards in 24hrs while having just a few hundred 1060s at launch. They have several thousand more 480s coming too.

July 19, 2016 | 05:53 PM - Posted by Tyler (not verified)

I had to look online for one day; the second day I got a Sapphire RX 480 8GB model with no issues off Newegg for 300 Canadian, which is basically what they were announced at... just use an in-stock website, but Newegg gets them quite often... not sure about the Nvidia cards

August 22, 2016 | 02:47 AM - Posted by Anonymous (not verified)

There's plenty of 480s in stores. This is because no one is buying them. Even on the day of its launch there were more 1070/1080 orders than 480s, and that's your peak sales date right there.

July 19, 2016 | 11:49 AM - Posted by Stefem (not verified)

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125879&utm_medi...

July 19, 2016 | 11:55 AM - Posted by Stefem (not verified)

Are you sure? On the first page the table shows the RX 480 as a $199 part only, without mentioning that the 8GB costs more

July 19, 2016 | 09:59 AM - Posted by Jiří Kocman (not verified)

In my country all major shops have the GTX 1060 in stock for the equivalent of 250 USD

July 19, 2016 | 05:32 PM - Posted by mazi (not verified)

According to your name you are Czech, and I call bullshit on this. The cheapest available card is 9000 CZK, which is 350 USD! Cheaper cards are "sold out" or "preorder".

July 19, 2016 | 10:40 PM - Posted by Anonymous (not verified)

in your face liar

July 19, 2016 | 02:50 PM - Posted by Hyperstrike (not verified)

Newegg currently has three 6GB cards (there are no "3GB cards") listed for $249. Then a couple listed in the $279-289 range.

http://www.newegg.com/VGA/PromotionStore/ID-1170?Tpk=GTX%201060

July 19, 2016 | 04:12 PM - Posted by Anonymous (not verified)

Here you go, there are more than enough $249 cards available on launch day:
*http://www.newegg.com/Product/Product.aspx?Item=N82E16814500402&cm_re=zotac_gtx_1060-_-14-500-402-_-Product
*http://www.newegg.com/Product/Product.aspx?Item=N82E16814487260
*http://www.newegg.com/Product/Product.aspx?Item=N82E16814487261
*http://www.newegg.com/Product/Product.aspx?Item=N82E16814126115&cm_re=gtx_1060-_-14-126-115-_-Product
*http://www.newegg.com/Product/Product.aspx?Item=N82E16814133631&cm_re=gtx_1060-_-14-133-631-_-Product
*http://www.newegg.com/Product/Product.aspx?Item=N82E16814127965&cm_re=gtx_1060-_-14-127-965-_-Product
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125879&utm_medi...

July 20, 2016 | 02:37 PM - Posted by Christopher (not verified)

MicroCenter had the $299 MSI "Gaming X" variant (premium silent cooler) in stock on their shelf today when I was there.

That version was supposed to retail for $289, so, the price is not jacked up too much. If the price goes down in 30 days or less, you can get a price adjustment (refund). Also, I recommend buying the 2 year "Purchase Protection Plan" which means that you can return your card 2 years later and get store credit to apply towards your next card. That might not be how it's really supposed to work, but that _IS_ how they sold me the plan originally when I first bought one on a video card a few years ago.

The PPP costs $29 on a $299 card, so I look at it basically like I'm renting the card for $30 for a year or two until I get a new card and pay the $29 again.

July 19, 2016 | 09:20 AM - Posted by John H (not verified)

As always - appreciate the Video in addition to the text article here as they both provide some unique perspective on hardware launches.

Ryan - in your opinion, are the 'frame times' exactly re-usable for VR? or do the tests really need to be re-run on VR hardware with VR overhead in addition as that may affect the equation? (I understand that VR resolutions are different of course, but I'm thinking game/driver code paths may necessitate other testing to see how the AMD/Nvidia cards really compare in VR).

July 19, 2016 | 10:10 AM - Posted by Ryan Shrout

Performance is performance, for the most part. If your card is fast in "standard" games, it SHOULD be fast in VR as well. That being said, it doesn't take into account technologies like multi-projection, etc.

July 19, 2016 | 06:36 PM - Posted by Anonymous (not verified)

Umm, where are the OC results? Like how does an RX 480 at 1350 MHz perform against a GTX 1060 at 2 GHz?

July 19, 2016 | 09:40 AM - Posted by Anonymous (not verified)

good review overall but that nvidia "bias" shows through.

July 19, 2016 | 09:43 AM - Posted by Kenworth

There you are. What took you so long?

July 19, 2016 | 09:51 AM - Posted by AnonymouS (not verified)

i was busy with ur mum

July 19, 2016 | 11:54 AM - Posted by Kenworth

Ah yes! The quick wit, can't have a troll without it.

July 19, 2016 | 10:44 PM - Posted by Anonymous (not verified)

pcper dictionary: troll - a pcper fan who, not happy about a biased review, takes time out of his day to speak up knowing that all he is doing is increasing traffic and clicks and will most likely be called names.

July 20, 2016 | 06:33 AM - Posted by Spunjji

Nope, he's just describing you, a trollfaced trollpants troll. It's adorable that you think you have anything really to do with site traffic, though. You're like a mini, ineffectual and deliciously nerdy Donald Trump.

July 20, 2016 | 09:16 AM - Posted by Anonymous (not verified)

thanks for proving my point adolf clinton.
so some clicks are worth more than others? yeah

July 20, 2016 | 10:06 AM - Posted by Kenworth

"Takes time out of his day" really takes the cake. Now it is gracing us with your presence to drive traffic with your wonderful insight. This is some next level trolling.

July 19, 2016 | 10:35 AM - Posted by Anonymous (not verified)

his face even went red when he started bashing the 480. he knew he was doing something wrong and he did it anyway. how many teslas does one man need?

July 19, 2016 | 09:43 AM - Posted by AnonymouS (not verified)

1060 faster in dx11 but more expensive and less vram
rx480 faster in dx12 and vulkan

July 19, 2016 | 02:03 PM - Posted by Anthonymous (not verified)

almost all of the games regular people play are directx 11 titles. amd fanboys have been singing the same song since mantle and where did that lead? open your eyes, we game now, so the performance we see now is the performance we will enjoy, guaranteed. and it's not like nvidia cards cannot run dx12/vulkan games, they can too. the rx480 is a good card but it's a step below the 1060. both can coexist! trying to make the 1060 look bad only makes you look idiotic at worst. why defend a brand? for people who can only afford $199/$239 the rx 480 is the best, but for $10+ more one can get a 1060 and be happier... lets all be happy!

July 19, 2016 | 02:44 PM - Posted by tatakai

sure they are. but PC gamers used to care about future proofing. If I pay less for a little less dx11 performance, it's better than paying more for less dx12 performance in near-future games. Which should I sensibly choose?

The best bet is custom 480s that will remove the lower boost of the reference 480 and come at higher clocks to match 1060 dx11 performance. Too bad they are taking a while to arrive.

July 19, 2016 | 08:09 PM - Posted by Anonymous Nvidia User (not verified)

Gonna need a lot higher boost from an AIB 480 to match in an old directx11 Mantle-era AMD Gaming Evolved game: Battlefield 4. The 1060 is a whopping 34% over an RX 480 at 1080p. 94 fps vs 70 fps. LOL

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/11.html

I'm sure I could dig up more games with maybe higher results but not necessary.

Trolls will just say but Vulkan and dx12. Dx12 isn't going to be huge. It's a coding nightmare to get proper performance out of.

Dx11 is here for a long time according to Microsoft themselves. Dx12 is the alternative for those seeking to control all the details of the API.

Vulkan is new so it's not totally optimised yet. Especially for Pascal being a new architecture.

July 20, 2016 | 06:34 AM - Posted by Spunjji

I couldn't give a monkeys about Battlefield 4 though. :|


July 20, 2016 | 07:17 PM - Posted by John H (not verified)

Battlefield One has DX12..

July 19, 2016 | 09:41 PM - Posted by Anonymous (not verified)

Err, last time I checked, Mantle evolved into Vulkan.

July 20, 2016 | 08:27 AM - Posted by Anonymous Nvidia User (not verified)

Indeed the basic components of Mantle are there but Khronos Group is committed to developing an open source API that is supposed to be vendor neutral. That is why they will get Nvidia's version of async running. Because they already have AMD's version running. It's only fair.

Let's face it if you own an Nvidia card would you buy a game that runs like total a** on your card. Hitman and AOTS looking straight at you. These games didn't sell very well as a result. This improper support of Nvidia will stop if we stand up and not buy any game that features a huge performance gap between AMD and Nvidia. Developers will either get with the program or possibly go out of business because of poor sales.

July 19, 2016 | 09:46 AM - Posted by Anonymous (not verified)

Does all the memory run at full speed, or, due to the way it's cut down, do we have a 970 fiasco all over again? Later on it became evident that this problem on the 970 had a real impact on performance.

Testing for this should be mandatory on all cut down models.

July 19, 2016 | 10:31 AM - Posted by Anonymous (not verified)

This appears to be a fully enabled chip. There's no need.

July 19, 2016 | 04:12 PM - Posted by arbiter

Do you think they would make the same mistake a 2nd time?

July 19, 2016 | 09:50 AM - Posted by khanmein

where's doom benchmark?

July 19, 2016 | 10:11 AM - Posted by Ryan Shrout

It's not part of my normal test suite until we can get a Vulkan-capable overlay to work with our Frame Rating performance metric system.

July 19, 2016 | 01:23 PM - Posted by Buyers

Doesn't DOOM have a built-in benchmark mode? Or what about this 'PresentMon' application that HardOCP used? http://www.hardocp.com/article/2016/07/19/nvidia_geforce_gtx_1060_founde...

July 19, 2016 | 02:08 PM - Posted by Jeremy Hellstrom

There are problems with the data that we need to make sure are fully resolved.

https://github.com/GameTechDev/PresentMon/issues/14

 

 

July 19, 2016 | 03:01 PM - Posted by JohnGR

That problem there seems to have a simple solution.

"You should use the -process_name command line flag (e.g. "-process_name DOOMx64vk.exe"). The ones with the 16.67ms are from a different process, probably dwm.exe."

But I guess that was just an example.
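For reference, that suggestion amounts to an invocation along these lines (a sketch only; exact options depend on the PresentMon build you grab):

```
PresentMon.exe -process_name DOOMx64vk.exe
```

That filters the captured present events to the game's process, so frame times from other processes like dwm.exe don't pollute the data.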

July 19, 2016 | 06:17 PM - Posted by Jeremy Hellstrom

It's a little more complicated thanks to new things like DWL, the lack of time to test and verify solutions, and making it work with the overlay we use.

We are working on it ... hell, Ryan and Al will probably put up a very technical post about how it works at some point.

July 19, 2016 | 03:00 PM - Posted by Anonymous (not verified)

Real reason: the 480 would win in Doom... and I like getting a check from Nvidia, thank you. There, fixed that for you, shill.

July 19, 2016 | 09:14 PM - Posted by Coupe

Still think you should have noted it, particularly since there is a distinct performance difference between the two cards and it is the only Vulkan game available.

July 19, 2016 | 10:12 AM - Posted by Anonymousss (not verified)

why would you test doom? its not like its using the latest api and shows the future of gaming. DX11 FTW

July 19, 2016 | 10:34 AM - Posted by ggreen1955 (not verified)

So the 1060 is only for DX11 use?

July 19, 2016 | 10:46 AM - Posted by Legion495

As of this point in time:
1060 = Dx11
480 = Dx12 and Vulkan

Also you may want to consider that AMD gains quite a lot of % performance over several months. So at this point in time I would suggest a 480; I actually thought Nvidia would make the 480 useless as an option.

July 19, 2016 | 12:27 PM - Posted by Ty (not verified)

Because it would have been another example where the amd card was ahead of the nvidia card in the newer api for games that are coming out that push performance. Doom is even worse since it's a shooter and people actually play that as opposed to ashes of the singularity.

It would have been trivially easy to compare the main card it was up against and post up the minimum frames and average frames.

http://www.hardocp.com/article/2016/07/19/nvidia_geforce_gtx_1060_founde...

but like I said, that would not have made the 1060 look good.

July 19, 2016 | 07:04 PM - Posted by leszy (not verified)

LOL. [H]OCP again made the most honest test on the web. Impressive, considering their recent anti-AMD bias.

July 19, 2016 | 09:50 AM - Posted by TREY LONG (not verified)

Hexus.net showed the 1060 with much better frame times than the 480.

July 19, 2016 | 09:52 AM - Posted by terminal addict

VR Testing? I am assuming with Pascal's improvements for VR, the 1060 should significantly outperform the 980.

This is definitely a good showing, and I am VERY excited about GP106 in laptops. Laptops with 960 could be had for under $1000. Decent ones were just barely over $1000. If the rumors are true and the mobile Pascal parts will simply be slightly lower clocked versions of the full desktop chips, it should be possible to have a fully VR-capable laptop at a really great price.

July 19, 2016 | 09:57 AM - Posted by Grrizz (not verified)

Sadly I don't think anything uses Nvidia's VR tech yet (maybe their own Funhouse?) but it should make for some nice improvements in the future on at least some titles :)

July 19, 2016 | 12:36 PM - Posted by Allyn Malventano

Yeah, no VR game we've looked at so far has been updated to take advantage of NV's newer tech, so the rough percent change in performance can just be applied to the last VR test we did (here / here). The 980 was sufficient to run at 90 FPS in our tests, so the 1060 would roughly be in the same boat. Newer titles with more difficult scenes to render should also come with SMP enabled, meaning a 1060 should still be able to keep up even then.

July 19, 2016 | 10:00 AM - Posted by Anonymousss (not verified)

or you are getting a laptop chip in the desktop because amd has been broke and not developing for years.

July 19, 2016 | 09:55 AM - Posted by No one (not verified)

LOL PcPer... adds TimeSpy, conveniently avoids DOOM, the only game that best represents the future of gaming with a low level api and an unbiased async implementation ( not like that TimeSpy fiasco: https://steamcommunity.com/app/223850/discussions/0/366298942110944664/ + https://i.imgur.com/s51q4IX.jpg )

Want to see a factory overclocked AIB 1060 get beaten by 22% by a reference 480 underclocked by 5%, at 1440p?

Head over there:

http://www.purepc.pl/karty_graficzne/geforce_gtx_1060_vs_radeon_rx_480_t...

Cheers! :)

July 19, 2016 | 10:13 AM - Posted by Anonymous (not verified)

What about the TSSAA settings for Time Spy? And how does UWP affect things? That Nvidia benchmarking manual, look there also! Look for whatever is omitted and compare to the leaked benchmarking manual, and you will have your answer!

July 19, 2016 | 10:37 AM - Posted by ggreen1955 (not verified)

GTX 1060 is overpriced and underperforming, that is for sure. So is its availability.

VAPORWARE, VAPORWARE, VAPORWARE.............


July 19, 2016 | 12:37 PM - Posted by Allyn Malventano

Doom can't be frame rated yet.

July 19, 2016 | 06:44 PM - Posted by Anonymous Nvidia User (not verified)

Async unbiased in Doom. What a laugh. Only unbiased because it only works for AMD. Bethesda is working with Nvidia to get their version of async working as well. Doom is biased right now in the fact that it only supports async the AMD way. Once it is patched to include Pascal then we can call it unbiased. If 3dmark got async to work on Nvidia there is no reason other games can't be coded to take advantage of it for Nvidia as well.

July 19, 2016 | 09:56 AM - Posted by Anonymousss (not verified)

I would still pick the RX 480 and here's why:
1) Don't need to pay for G-Sync. That alone saves me $200.
2) DX12/Vulkan/CrossFire support.
3) AMD open source contribution.
4) No expensive "Founders" bullshit.
5) No shady business practice like Ngreedia.
6) The RX 480 is $379 AUD and the GTX 1060 is $500 >_>
-Mike S on youtube

July 19, 2016 | 10:29 AM - Posted by ggreen1955 (not verified)

So true.

July 19, 2016 | 11:22 AM - Posted by Anonymous (not verified)

Here's why I'd pick the 1060:
1) Its faster
2) Because I'm sick and tired of "power to the people" AMD fanboys

July 19, 2016 | 12:30 PM - Posted by Anonymous (not verified)

The dictionary definition of a stupid gaming GIT. Enjoy your segmented product with no SLI. Maybe the new graphics APIs with their multi-adaptor support will still allow you to use more than one GTX 1060, but expect JHH to try and gimp that. Well no more of that hidden geometry monkey business from Nvidia for benchmarking with AMD's primitive discard accelerator working to keep any unneeded geometry culled out of the execution pipelines on Polaris SKUs.

Now that the by the marketing manual benchmarking is done there will be the benchmarking done for real using the latest graphics APIs and the newest tweaks by the games developers, and they will be using all that the Hardware asynchronous-compute has to offer. Enjoy your Founders' overcharges and retailer premiums on top of Nvidia's price gouge. Also enjoy your faster to get the same or less amount of work done because on the newer graphics APIs Nvidia's hardware will not make the most efficient use of the hardware available and need to be overclocked to get the same amount of work done.

It's easy to do things for DX11 for the games developers but the real power will come with the newer graphics APIs and the developers with more than half a brain will get at the New API's close to metal performance in the GPU's hardware, and Polaris has more functioning metal to perform for VR gaming and under the new graphics APIs. Nvidia is master at gaming the benchmarks but this time around on the new graphics APIs the games developers will have more control over the hardware, and the games companies not financed by Nvidia will be using the new graphics API's features for better results. Even M$'s UWP monkey business/lowest common denominator for gaming will not stop the Linux/Vulkan games from showing just what asynchronous compute fully in the GPU's hardware can do on some GPUs.

There needs to be more continuous re-benchmarking done with all the new graphics APIs as games are reworked/optimized for the newest graphics APIs, so there will be more emphasis on revisiting the benchmarking in the final half of 2016 well into 2017, on older and the newest hardware to grade the GPU makers in performance improvements over time. Maxwell does not appear to be ageing well under the newest graphics APIs while AMD's GCN is giving the users of AMD's older GCN hardware something to look forward to in the area of continued improvements over time!

July 19, 2016 | 04:29 PM - Posted by Anonymous (not verified)

So you're basing your graphics card purchasing decision on who you hate more?

Remind me to never take anything you say seriously.

July 19, 2016 | 06:13 PM - Posted by Jeremy Hellstrom

I believe that is how advertising works now; you don't promote your own product's strengths but instead attack the competition's weaknesses.

 

Same applies for voters sadly.

July 19, 2016 | 08:34 PM - Posted by Anonymous Nvidia User (not verified)

I'd like to add number 3 for you if I may. The power efficiency of 1060.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/24.html

Typical gaming draw is 116 watts for 1060 (under TDP of 120) vs 163 watts for Rx 480 (over TDP of 150/110). The rx 480 uses 40.5% more watts in gaming. Even if Rx 480 gets a few more frames in dX12, so what.

It only uses 6 watts to play a bluray at 1080 p with 1060 and 39 watts for Rx 480. This is a whopping 5.5 times more wattage used. Better not use Rx 480 for a media PC. LOL

July 19, 2016 | 07:12 PM - Posted by leszy (not verified)

Just modern architecture. All these Pascals are DX11 cards modded for DX12 support. They will soon be great exhibits in museums, as the crowning achievement of the DirectX11 epoch :)

July 19, 2016 | 10:03 AM - Posted by pacmanfan

I didn't see idle power draw mentioned. I'd love to see that compared across the other cards used in the rest of the tests.

July 19, 2016 | 10:13 AM - Posted by Ryan Shrout

Ah, you know, I did leave that out. Let me try to compile it today. Honestly though, we are talking about wattages in the 5-10 range.

July 19, 2016 | 08:39 PM - Posted by Anonymous Nvidia User (not verified)

It's 6 watts for a 1060 vs 40 watts for a 8 gig Rx 480. Indeed wow. It's almost 5.7 times more power draw for Rx 480. Better not spend much time browsing the internet or on desktop with the AMD Polaris. LOL

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/24.html

August 22, 2016 | 02:56 AM - Posted by Anonymous (not verified)

That's... odd. Why would an IDLE card draw 40 watts? Something smells bad here.

July 19, 2016 | 10:04 AM - Posted by Anonymous (not verified)

WTF kind of review is this?! The game selection sure was horrible.

I'd call it a product showcase, pointless read.

July 19, 2016 | 11:34 AM - Posted by Anonymous (not verified)

then don't

July 19, 2016 | 04:38 PM - Posted by Anonymous (not verified)

Too bad this comment is buried in here. I wasted time reading this review too, with all these old games. Agreed, useless review for today's games.

July 19, 2016 | 10:05 AM - Posted by Anonymous (not verified)

"Hitman (2016) and 3DMark Time Spy are both known to use asynchronous compute shaders under DX12 – one of the key areas that AMD has been promoting and targeting. In both of those benchmarks, and the DX12-based Gears of War: Ultimate Edition, the Radeon RX 480 is able to outperform the GeForce GTX 1060."

But in Time Spy the GTX 1060 is faster than the RX 480!

July 19, 2016 | 10:54 AM - Posted by Legion495

From what I have seen, in the low-level API games the 480 always pulls away; the Fury X was quite a wow effect for me. Can't wait for cards with more ACE units.

I wonder how Nvidia-sponsored games will do in the future tho :)

July 19, 2016 | 07:20 PM - Posted by leszy (not verified)

It will be expensive for NVidia. Games for XBO are developed now with DX12 and already optimized for GCN. NVidia will need to pay for developing a code path for their architecture for every game.

July 19, 2016 | 04:23 PM - Posted by StephanS

The reason is the RX 480's dinky heatsink... not the chip.

It would be like running an i7-6700K CPU with the fan capped at 800rpm.
Benchmarks will SUCK, yet the CPU is super powerful.

I have an RX 480 and for sure it comes with too poor a cooler to showcase the card's capability.

All the thermal throttling of the RX 480 is really messing up reviews (and on top of that, the overvolting and power limit).

ex: I lowered my RX 480 clock by 5%, benchmark scores got better.
So people should wait for the better coolers (the heatsink on the RX 480 is so dinky small), or use WattMan to tweak their cards.

From what I can tell from the review, the GTX 1060 HW is actually slower by about 15%.

We might see another 290X story unfolding...

July 19, 2016 | 10:09 AM - Posted by William Henrickson (not verified)

Was waiting to kick myself for buying a 980 for $260, but I can cancel that. SLI to boot.

July 19, 2016 | 10:19 AM - Posted by Anonymous (not verified)

this card would have been great with a 256bit bus and 8gb vram. but politics man

July 21, 2016 | 04:13 AM - Posted by Anonymous (not verified)

It cannot do SLI so 6GB is plenty for what it is meant for.

If you need 8GB go 480 or add money and buy 1070

July 19, 2016 | 10:28 AM - Posted by Anonymous (not verified)

I was wondering if it would be a compelling upgrade from a 970 and it kinda is for VR purposes. 980 performance at stock; OC'd, it lands right smack in the middle of a 980 and a 980 Ti. Considering I can probably get $175 for the 970, a $75 upgrade is very, very compelling.

July 19, 2016 | 10:48 AM - Posted by Legion495

Or just don't^^ Why upgrade if there is so little to gain? I bet you are good just by overclocking a bit.

July 19, 2016 | 11:09 AM - Posted by Anonymous (not verified)

My 970 is OC'd to the nines. It's simply not enough for VR. I suppose I should have mentioned my plan was to use the 1060 as a stop gap, it's a cheap way to get a little more performance until the 1080ti drops, which I'm buying instantly. The 1080 too expensive and the 1070 just doesn't have the power to run VR the way I want it to.

July 19, 2016 | 10:46 AM - Posted by endlesswargasm (not verified)

What is the policy on driver updates?

On the AMD side 16.7.1 has been out for a couple of weeks now, and it brings a 3-5% performance bump for the 480.

This would change the numbers somewhat.

I understand it would be a much too large time commitment to rerun the entire suite every time a driver drops.

But will custom 480s be tested with up to date drivers once they launch?

July 19, 2016 | 10:49 AM - Posted by Legion495

A retest every 3 or 6 months would be quite nice to see tho.

July 19, 2016 | 11:18 AM - Posted by Anonymous (not verified)

Yep, this is exactly what happened in the case of the 7970GHz and the GTX 680. The 680 was crowned as the champ and later the 7970GHz blew the 680 out of the water by 10-20%. Side note: the VRAM at the time on the 7970 was 3GB, which furthered its longevity. My 7970GHz CrossFire benchmarks just a hair above a 980, which is pretty great for 2012 cards.

Point is, driver re-testing should become mandatory. This is not just to benefit AMD but NVIDIA as well. Right now the 1080 has a big problem with DPC latency which Nvidia promises to fix.

July 19, 2016 | 10:51 AM - Posted by vin1832 (not verified)

i know i sound like a kid, but he said:

"welcome to [PC RESPECTIVE]"

July 19, 2016 | 11:10 AM - Posted by JohnGR

Doom? Where is Doom? Anyway, I did read Ryan's answer. Nvidia is safe for now. Not even a side note for an API like Vulkan: "We tested Doom and we found this and that, but we will not show you any charts because of equipment limitations". Not even that.

Compared to other reviews this looks less biased. But even here that "Starting at $249" marketing lie from Nvidia is in the title.
Yeah, they will probably sell 1000 cards on the whole planet at that price point. The rest can enjoy prices close to the Founders Edition price point.

Anyway, good card. People who were hoping to get an Nvidia option at $300 or a little less, so they could avoid the RX 480 or older Maxwell cards, got their wish.

PS 3DMark Time Spy using async? Well there are many that believe it does NOT.
http://www.overclock.net/t/1605674/computerbase-de-doom-vulkan-benchmark...

http://www.overclock.net/t/1606224/various-futuremarks-time-spy-directx-...

PS2 AMD: Crimson 16.6.2
LOL OK, haven't seen that. Maybe I should reconsider that "less biased" I wrote.

July 19, 2016 | 11:19 AM - Posted by Anonymous (not verified)

Yes, I'm surprised that PCPer hasn't picked up on the scandal surrounding Futuremark's DirectX12 "Timespy", which nonetheless lacks DX12 features like async.

I can't tell if they're investigating more to report on it or if they just don't care altogether.

July 21, 2016 | 04:04 AM - Posted by Anonymous (not verified)

Time Spy does use async. BUT...it uses the simplest and least taxing version of it possible. One which Pascal can cope with.

Futuremark explained it was so it could run on any hardware. Which is fair enough, but the end result is that it barely scratches the surface of GCNs compute capability.

This is why real games such as Doom and Hitman, which *have* been optimized to take advantage of GCNs full async capabilities, show very different results.

Time Spy was never designed to max out GCNs compute capability, so it doesn't. In effect it's a simple yes/no test to see if a card can support the most basic form of async compute.

July 19, 2016 | 11:23 AM - Posted by Anonymous (not verified)

Because every review is biased and everybody is a shill. I love this narrative from the AMD fans. Yes, yes the world gangs up against you.. AMD is the little guy who can do no wrong and the only reason they're getting crushed in the market is a giant conspiracy rather than crappy products. Got it.

July 19, 2016 | 11:25 AM - Posted by Anonymous (not verified)

Nvidia fans and their victim complex is astounding.

July 19, 2016 | 11:33 AM - Posted by Anonymous (not verified)

Nvidia fan? No, I've got AMD cards right now, just sick and tired of AMD's marketing creating fanboys with the real victim complex... who decide that every benchmark is rigged from day 1 not to show AMD's glorious capabilities. Unfortunately for them, they don't actually perform as well in the games people actually play.

July 19, 2016 | 11:55 AM - Posted by JohnGR

Well, the nvidia support army just arrived. Yes, yes, I know. You have an AMD CPU, an AMD mobo, an AMD graphics card, an AMD SSD and also AMD RAM. Two posters in your bedroom, one of Su and one with AMD's logo on it. You convince us all.

July 19, 2016 | 01:04 PM - Posted by Anonymous (not verified)

Intel CPUs in all my machines except a Kabini HTPC, 290X CF in one, a 7850 in another, a GTX 960 in another. Been building AMD machines since Thunderbird and have owned AXPs, 3 A64s, an Opteron, a Thuban X6, Intel 950s, 970s, and all types of shit. Have owned ATI cards and Nvidia cards from almost all generations. I buy what suits my needs. Period. I don't care about some David v. Goliath narrative; I remember when AMD charged $800 for their CPUs, just like Intel did. I paid $500 for an X800 over 10 years ago. The tables turn; the new boss is the same as the old boss. The difference between you and me is that I can be critical of anyone I want and still look myself in the eye in the mirror because I'm NOT a fanboy.

July 19, 2016 | 01:27 PM - Posted by JohnGR

You are trying too hard to convince yourself that you are not an Nvidia fanboy.

July 19, 2016 | 04:46 PM - Posted by Anonymous (not verified)

Says the one with the AMD avatar..

July 19, 2016 | 05:24 PM - Posted by JohnGR

If I had no avatar and was posting as an anonymous would it make any difference?
I pity those who look at an avatar when they have no other arguments.

July 19, 2016 | 06:19 PM - Posted by Anonymous (not verified)

No arguments, just enjoying the irony :)

July 19, 2016 | 11:55 AM - Posted by JohnGR

You know that I am right; that's why you attack my avatar rather than comment on what I am saying.

July 19, 2016 | 12:23 PM - Posted by Anonymous (not verified)

your avatar sucks! bitch

July 19, 2016 | 01:24 PM - Posted by JohnGR

It's overclocked :p

July 19, 2016 | 12:46 PM - Posted by Allyn Malventano

> Doom? Where is Doom?

Can't be frame rated yet.

July 19, 2016 | 01:19 PM - Posted by JohnGR

Other sites do have Doom numbers. Of course many sites avoid it.

What if you are NEVER able to frame rate Vulkan? In case Vulkan becomes the standard in the market, are you going to stop writing reviews because you can't frame rate it?

You could add numbers from Doom and just warn your readers that those numbers are not verified the way you want. It's an AAA title, a known and very popular title, the first game that uses Vulkan; you shouldn't be trying to avoid showing numbers from it.

July 21, 2016 | 04:10 AM - Posted by Anonymous (not verified)

What about the gadget you guys designed to monitor frame times with G-Sync/FreeSync? That seemed to work pretty well? IIRC it was a camera stuck on the screen which could count the exact frame times/rates?

July 19, 2016 | 06:50 PM - Posted by Anonymous Nvidia User (not verified)

They used an older driver for Nvidia but I don't see you mentioning that too. Oh no, now PC Per is AMD biased. See how easily I can turn things. The review was probably done weeks ago when they received their sample. The 1060 AIB board reviews will likely have newer drivers, as will the upcoming RX 480 AIB reviews.

July 19, 2016 | 11:24 AM - Posted by Anonymous (not verified)

Another GTX 10X0 launch where there's absolutely no supply, probably around ~1000 units.
Let the price gouging and two months of paper launch and $100-above-MSRP begin.

July 19, 2016 | 11:58 AM - Posted by Robbie (not verified)

I was just able to get an EVGA 1060SC on Newegg that says it will ship in 2 days. $249 after a $10 rebate.

July 19, 2016 | 02:50 PM - Posted by nobodyspecial (not verified)

ROFL. OCuk said they sold 1000 GTX 1080's at launch and 2000 Radeon 480's (both one-day numbers). I'm fairly sure the 1070 had at least as many available as the far more expensive 1080, so between the two they sold as many or more than a $200-250 card in one day. That's not exactly vapor availability, and more come in daily and sell in minutes. For a VAPOR card, the 1080 (according to retailers, NOT NVIDIA) was the fastest selling high-end card in history. PERIOD. Can't do that with no cards. Their quarterly report will show this in the earnings.

Just google GTX 1080 fastest selling :)

IE:
http://www.fudzilla.com/news/graphics/40947-geforce-gtx-1080-sold-out-se...

July 19, 2016 | 07:39 PM - Posted by leszy (not verified)

Better give us a link to Gibbo's post on OCUK :)))
He said they sold 1k 1080s in the first month, and then 1k RX 480s in the first two days.
One more interesting post:
https://forums.overclockers.co.uk/showpost.php?p=29795238&postcount=516

July 19, 2016 | 12:21 PM - Posted by JohnGR

DOOM Numbers

http://www.hardocp.com/images/articles/1468921254mrv4f5CHZE_4_3.gif

1060 64 fps
480 80 fps
980 71 fps

July 19, 2016 | 12:27 PM - Posted by Anonymous (not verified)

hypocrite. cries about hardocp being nv biased then posts link from their site showing good results for amd. LMFAO get over yourself loser

July 19, 2016 | 12:51 PM - Posted by Anonymous (not verified)

Actually the irony both ways is great, because even the by the Nvidia benchmarking manual folks at hardocp cannot fudge the Doom results. GCN has arrived, and even for the Older GCN SKUs. So Maxwell is showing Max-Less improvements under DX12/Vulkan compared to GCN/Polaris and GCN/older SKUs!

So now Nvidia will try its hardest to do the Pascal two step and throw some gimp-works into the benchmarks, but this time around it's not going to work with the developers getting access to more of the GPU's metal! There are less places in the drivers for Nvidia to Gimpvidia with the newer graphics APIs, as the games developers will be more in charge of the optimizations for their games.

July 19, 2016 | 01:06 PM - Posted by Anonymous (not verified)

I think the true definition of irony is thinking that its ironic that HardOCP can't fake the doom results to help Nvidia rather than just realizing that they were never biased in the first place.

July 19, 2016 | 01:22 PM - Posted by JohnGR

Exactly.

And it might work even this time for Nvidia. Time Spy shows that there is a way to make Pascal cards look better and AMD cards less great.

July 19, 2016 | 03:15 PM - Posted by Cyclops

You're still digging, huh? That tin-foil hat gotta come off sometimes.

July 19, 2016 | 03:26 PM - Posted by JohnGR

Then stop wearing it.

July 19, 2016 | 01:20 PM - Posted by JohnGR

Yes I do think that hardocp is Nvidia biased and that's why I believe that those numbers are NOT AMD biased.

July 19, 2016 | 12:40 PM - Posted by Cyclops

AMD trolls strike again!!

Seriously, keep a level head.

July 19, 2016 | 01:46 PM - Posted by Anonymous (not verified)

The amount of confirmation bias in these comments is astounding. You've got AMD fans calling Nvidia fans biased for calling AMD fans biased for calling Nvidia fans biased. The circle is never ending and I can't imagine how frustrated Ryan and Allyn must be with the state of their comment section.

July 19, 2016 | 03:27 PM - Posted by JohnGR

Believe me, when you write articles there is nothing more frustrating than having none to very few comments.

July 19, 2016 | 04:55 PM - Posted by Anonymous (not verified)

Dude, this whole community is dysfunctional.. I cannot for the life of me understand why people spend so much time and energy crying and whining about "but, but, but, but... launch day prices, availability, drivers will make it faster, uses cheap marketing tricks.." etc..

Why don't you all just take a breath, relax, and WAIT for a month or two for everything to settle.

Then you can make a more informed decision on which card...
is more appropriate...
for...

YOUR PERSONAL REQUIREMENTS.

July 19, 2016 | 07:10 PM - Posted by CNote (not verified)

I don't know who is worse: GPU fanboys fighting or truck fanboys pissing on each other.