RX 480 versus the GTX 1060: who gets your $250?

Subject: Graphics Cards | November 15, 2016 - 02:58 PM |
Tagged: rx 480, nvidia, GTX1060, amd

On one side of the ring is the RX 480, with 2304 Stream Processors, 32 ROPs and 144 Texture Units.  In the opposite corner, with 1280 CUDA Cores, 48 ROPs and 80 Texture Units, is the GTX 1060.  The two cards retail for between $200 and $250, depending on the features present on the card as well as any sales.  [H]ard|OCP tested the two cards head to head, comparing not just raw performance numbers but also the stability of the GPU frequencies, power draw and temperatures.  All games were tested at base clocks and at the highest stable overclock, and the results went back and forth: in some games AMD pulled ahead, while in others NVIDIA was the clear winner.  Keep in mind that these numbers do not include VR results.


"We take GIGABYTE’s Radeon RX 480 G1 GAMING video card and pit it against a MSI GeForce GTX 1060 GAMING X video card in today’s evaluation. We will overclock both video cards as high as possible and compare performance and find out what both video cards have to offer in the upper $200 price range for gaming."


Source: [H]ard|OCP



November 15, 2016 | 03:30 PM - Posted by Anonymous (not verified)

IF YOU CHOOSE THE SHITTY UGLY LOUD HOT AND UNRELIABLE AF, VASTLY INFERIOR IN EVERY WAY, AMD PRODUCT, YOURE A FUCKING REE-REE AND DESERVE ALL THE HASSLES HEADACHES RMAS AND BULLSHIT YOURE GONNA GET. ONLY NOOBS, AND UNEDUCATED WOULD MAKE A STUPID CHOICE, AND GET THE INFERIOR AF GPU. PERIOD.

November 15, 2016 | 03:54 PM - Posted by Anonymous (not verified)

^^ someone voted S3 Graphics.

November 15, 2016 | 04:01 PM - Posted by Anonymous (not verified)

I voted for ARM, too, and all I got was 3.5GB VRAM.

November 15, 2016 | 04:17 PM - Posted by Anonymous (not verified)

Make async compute great again!

November 15, 2016 | 04:31 PM - Posted by Anonymous (not verified)

I ACTUALLY VOTED FOR MCAFEE.

November 15, 2016 | 06:11 PM - Posted by Jeremy Hellstrom

Damn you all!  I am deleting these posts until Sweetcheeks McGlueSniffer gets bored and wanders off back to 4chan, but this one is too funny to remove.

You might notice what I am now doing to the ones that don't end up in the electrical wastebin, though.

November 15, 2016 | 06:34 PM - Posted by pdjblum

I am going to go out on a limb and say you are making the font italic?

November 15, 2016 | 10:41 PM - Posted by Kung-Fu (not verified)

Giving that knucklehead airtime because you think it's funny is only going to encourage him.

Get rid of the anon accounts and this all pretty much goes away.

November 15, 2016 | 07:01 PM - Posted by Anonymous (not verified)

McAfee was unironically the best. He was the only one who stood up to the NSA, knew anything about technology, and actually understood and cared about freedom. I'd have voted for him as well if I remembered how to spell his name.

November 15, 2016 | 06:18 PM - Posted by Anonymous (not verified)

"I voted for Matrox, too, and all I got was 3.5GB VRAM." Best comment on Pcper!

November 15, 2016 | 06:28 PM - Posted by Jeremy Hellstrom

Can't argue that ... but I also have to change it.

What is a ree-ree by the way ... apart from a horror movie sound?

November 15, 2016 | 07:51 PM - Posted by Anonymous (not verified)

Oh, oh... we not allowed the "T" word here?

November 16, 2016 | 02:00 AM - Posted by collie

Those of us who live in the great northern country of CANADA enjoy our wood-burning GPUs just fine; they may not have the best graphics, but they do just fine.

Have a nice day, EH?

November 16, 2016 | 12:37 PM - Posted by Jeremy Hellstrom

This is not a political site; this is a technology site.  If you want to argue politics, there are plenty of other places to go.  Any political mentions will be modified.

November 16, 2016 | 09:28 AM - Posted by Anonymous (not verified)

From the way the chap was talking, I'm guessing you just need to add "tard" on the end and you have your answer.

November 15, 2016 | 04:44 PM - Posted by Anonymous (not verified)

Should have used the same version of both. No reason they couldn't have gotten an MSI Gaming X 480.

November 16, 2016 | 01:58 AM - Posted by Cyric (not verified)

That would be best, since there is a little scandal around the G1 Gaming RX 480 having half the cooling capacity compared to the G1 Gaming GTX 1060. There is a v2 RX 480 G1 Gaming, though, that I haven't looked at.

November 15, 2016 | 05:01 PM - Posted by Anonymously Anonymous (not verified)

sigh, one day PCPer will wake up and get rid of that shared Anon account.

November 15, 2016 | 07:35 PM - Posted by Anonymous (not verified)

Yes they will, and one day there may not be a shared anon account, but not here and not now!

P.S. One day AMD will not be forced to rely on gamers to stay in business, as gamers are ruining the quality of graphics processing with their FPS obsession and their need for far too many ROPs relative to the number of shaders, at the expense of computational power. AMD appears to be making some very nice Radeon Pro WX SKUs for graphics work and accelerator work. The real revenues for AMD are with Zen and the server/HPC/workstation market: Zen plus Radeon Pro WX GPUs for professional graphics, and Zen/Vega for the pros.

November 15, 2016 | 06:19 PM - Posted by Anonymous (not verified)

I didn't know that website was still working.

November 16, 2016 | 01:59 AM - Posted by Anonymous (not verified)

I AM THE SHITTY UGLY LOUD HOT AND UNRELIABLE AF, VASTLY INFERIOR IN EVERY WAY ... PERIOD.

November 16, 2016 | 02:04 AM - Posted by collie

FUCKIN RIGHT ON MAN!!!!!! I AM ANGRY BECAUSE PEOPLE ARE DOING THINGS, NOT NECESSARILY THINGS I DISAGREE WITH BUT THINGS IN GENERAL. PEOPLE SHOULD BE PUNISHED FOR DOING THINGS. WHY IS THAT DOG LOOKING AT ME? FUCK THAT DOG. RAM-BUS SHOULD STILL BE AROUND. WHO DO I NEED TO TALK TO TOO GET SOME FUCKING TANG IN THIS COUNTRY. DEATH TO ALL SQUIRRELS!!!!!!!!!!!! SEA MAMMALS SHOULD GO BACK WHERE THEY CAME FROM!!!!!!

EXCETERA!!!!!!!!!!!!!!!!!!!!!!

November 16, 2016 | 07:49 AM - Posted by CB (not verified)

Haha. Nice job man.

November 16, 2016 | 05:33 AM - Posted by malurt

Nice to see articles highlight the almost non-existent performance difference in the cards that most people will be buying.
Been very happy with my Sapphire Nitro+ RX 480 so far.

November 16, 2016 | 07:18 PM - Posted by Anonymous Nvidia User (not verified)

Yeah, in frame rate they are similar. However, that 480 Nitro eats more electricity than a 1080 to get 1060-level performance. It consumes 82 watts more than a 1060 system, which is around 40% more.

http://www.kitguru.net/components/graphic-cards/zardon/sapphire-rx-480-n...

Here's Tom's take on efficiency, 480 vs 1060.

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1060-pascal,4679-...

Not even close. However you do have the best red team card so far.
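For what it's worth, the ~40% figure back-computes from the 82 W delta; a quick sketch, assuming a GTX 1060 system draw of roughly 205 W (a hypothetical baseline chosen only to be consistent with the quoted numbers):

```python
def pct_increase(base_watts, extra_watts):
    # Percentage increase in system power draw over the baseline system
    return 100.0 * extra_watts / base_watts

# Assumed ~205 W GTX 1060 system draw; an extra 82 W lands near the quoted 40%
print(round(pct_increase(205, 82)))  # ~40
```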

November 17, 2016 | 02:23 PM - Posted by Anonymous (not verified)

The RX 480 has way more SP FP FLOPS compared to the GTX 1060, and the RX 480 runs at lower clocks to get that higher SP FP FLOPS metric. So games alone cannot be used for any accurate GPU workload efficiency numbers. There are other workloads besides gaming that get AMD's GPUs sold, as with any bitcoin mining using new algorithms that are not yet implemented in ASIC form! Bitcoin miners will most likely go for AMD's extra SP FP performance for the dollar until the ASICs can be made.

Gaming usage may not make use (yet) of all that extra compute in AMD's consumer SKUs, but the other use cases for AMD's GPUs will take all that extra compute and use it all.

So it's easy to see why the RX 480 uses the power it does: it's in that extra FP/async hardware that is used by more than games for some workloads! And as more Vulkan and DX12 optimized games are released, those gaming benchmarks will have to be redone.
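The FLOPS gap described here is easy to estimate: peak FP32 throughput is shader count × 2 ops per cycle (one fused multiply-add) × clock. A quick sketch, assuming the cards' reference boost clocks of 1266 MHz (RX 480) and 1708 MHz (GTX 1060):

```python
def peak_fp32_tflops(shaders, clock_mhz):
    # Each shader retires 2 FP32 ops per cycle via fused multiply-add
    return shaders * 2 * clock_mhz / 1_000_000

rx_480   = peak_fp32_tflops(2304, 1266)  # ~5.8 TFLOPS at a lower clock
gtx_1060 = peak_fp32_tflops(1280, 1708)  # ~4.4 TFLOPS at a higher clock
print(round(rx_480, 2), round(gtx_1060, 2))
```

Which is why the RX 480 can trail in games yet still win on raw compute per dollar.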

November 16, 2016 | 07:31 AM - Posted by Daniel Grabovskiy (not verified)

Got a Nitro+ 480 8GB for $225 just now. Can't complain.

Don't get the hate on AMD when they have future-proofed their new cards really well.

November 16, 2016 | 08:44 AM - Posted by Anonymous Nvidia User (not verified)

AMD's definition of future proofing doesn't include VR obviously.

November 16, 2016 | 04:39 PM - Posted by Anonymous (not verified)

I think most of what you are talking about with "doesn't include VR obviously" steams from the common game engine used. I am in no way making any excuses, but when a certain GPU manufacturer has its fingerprints in the engine, you can expect alternate GPU choices to not run as well.

November 16, 2016 | 04:40 PM - Posted by Anonymous (not verified)

Sorry- sreams not steams

November 16, 2016 | 06:45 PM - Posted by Anonymous Nvidia User (not verified)

I assume you're referring to Unreal Engine. Take a look at the VR leaderboard, an average of all VR content rated so far by the site. It might open your eyes a little, or will you still call the source material biased?

http://m.hardocp.com/article/2016/11/11/amd_nvidia_gpu_vr_perf_please_st...

However, even in games designed with AMD's LiquidVR, they still aren't faster than Nvidia.

http://m.hardocp.com/article/2016/10/21/amd_nvidia_gpu_vr_performance_se...

Most of VR isn't even using technology Nvidia built into their cards yet. When companies start supporting it, look for the gap to get wider.

And AMD fingerprints aren't all over Microsoft's directx 12 because of AMD graphics in Microsoft's Xbox consoles. Yeah LMAO.
The future still holds: for as long as there is a dx12, there will be dx11. It's still needed for out-of-the-box multi-gpu support and is easier to code for. I'm sure you don't need a reminder who is better in that API.

November 16, 2016 | 10:00 PM - Posted by Anonymous (not verified)

Please State Your Name - Unreal Engine 4

"However even in games designed with AMD's liquidVR,"

http://m.hardocp.com/article/2016/10/21/amd_nvidia_gpu_vr_performance_se...

IMO the average frame times don't look that bad (no dropped frames). Plus the GTX 980 Ti and GTX 1070 are faster cards than the Fury X to start with. Close, but not apples to apples. Is this game DX11, DX12 or Vulkan?

"And AMD fingerprints aren't all over Microsoft's directx 12 because of AMD graphics in Microsoft's Xbox consoles."

So Microsoft should gimp their console with a NVidia-biased API?
GTX 10XX series cards seem to fare very well with AMD's fingerprints. That said, DX12 merely "helps to" even the overhead discrepancy between AMD and NVidia. It does not gimp NVidia; afaik there is no performance loss in DX12 for NVidia (barring unforeseen problems handling compute). Seems strange for someone to be doing damage control for the leader.

"Most of VR isn't even using (their software) technology." Unreal Engine 4 says hello.

"I'm sure you don't need a reminder who is better in that API." You're right, I do not need a reminder of NVidia's affiliation with Microsoft in DX11.

November 17, 2016 | 04:47 PM - Posted by Anonymous Nvidia User (not verified)

Got your facts a little wrong. Dx11 was also created with AMD in mind, as they had the Xbox 360 console at the time. The only dx that favored Nvidia was dx9, because they had the original Xbox. It's why Nvidia beats AMD so much in Blizzard games, since Blizzard still uses dx9. AMD had dominance in dx11, as Nvidia didn't even have a card with tessellation out at release. AMD enjoyed many months of dominance before Nvidia put out a card that was fully compatible. Don't hate Nvidia because they did dx11 better by designing better and AMD got lazy/cheap.

The AMD VR game is Serious Sam. That other link was to show VR leaderboard of all VR content up to this point.

You haven't been around long or done any research. Dx12 sometimes even hits AMD with negative performance as well. Just not as much as Nvidia. Dx12 is much harder to code for and, well, it often doesn't work as well as one would think.

And yes Pascal cards do well because Nvidia upped the compute greatly because dx12 was going to utilize it more because AMD cards can do compute well.

Directx should be as vendor agnostic as possible, supporting the features of both cards equally. I don't worry, because Nvidia's next generation of cards will do dx12 so well (if history repeats itself yet again). They will have AMD begging Microsoft for dx13.

Seems strange a lot of AMD fanboys are mindless followers posting the same old tired sh*t about Nvidia. I am paid by no one and don't have a leader. I post to inform or to counter misinformation because I want to.

November 17, 2016 | 06:22 PM - Posted by Anonymous (not verified)

Why do you keep moving the goal posts?

"Got your facts a little wrong. Dx11 was also created with AMD in mind" Please correct me if I am wrong, but parts of DX 11 (11.1, 11.2) were created with AMD "in mind", but not made proprietary.
You are arguing two distinctly at-odds lines of thought.

1/ That AMD sucks because it does not do well with biased towards NVidia software/API's/etc.

2/ It's unfair to use software/API's/etc that favor AMD.

" Don't hate NVidia because they did dx11 better by designing better and AMD got lazy/cheap."

1/ My NVidia cards run in a Win 7 environment. My AMD cards run in a Win 10 environment. It is what works best for each. NO hate there. AMD moved on from TeraScale (microarchitecture); I do not think it was lazy. Mistimed, maybe.

You brought up consoles not me.

"The AMD VR game is Serious Sam. That other link was to show VR leaderboard of all VR content up to this point."

Yes, I easily caught that inflection. But the majority of that list favors NVidia-leaning software solutions.

"You haven't been around long or done any research. Dx12 sometimes even hits AMD with negative performance as well. Just not as much as Nvidia. Dx12 is much harder to code for and, well, it often doesn't work as well as one would think."
If six years isn't long, then no. But research and observe, as best I can? Yes, I have. But do you see me questioning your background? No, though our opinions clash.
Note my words: "GTX 10XX series cards seem to fare very well with AMD's fingerprints."
Yes, the learning curve with DX12 is steep.

"And yes Pascal cards do well because Nvidia upped the compute greatly because dx12 was going to utilize it more because AMD cards can do compute well."

Early on I caught lots of flack on compute threads for suggesting NVidia now works well with compute.

"Directx should be as vendor agnostic as possible, supporting the features of both cards equally. I don't worry, because Nvidia's next generation of cards will do dx12 so well (if history repeats itself yet again). They will have AMD begging Microsoft for dx13."

But it is not entirely agnostic. It already has some NVidia-only (perhaps too strong a word) adaptations: DX 12.1.

"Seems strange a lot of AMD fanboys are mindless followers posting the same old tired sh*t about Nvidia. I am paid by no one and don't have a leader. I post to inform or to counter misinformation because I want to."

I responded as I read and came across this last. No fanboy here so it kind of goes back to you.

If you did LMAO (fanboyish), perhaps a good surgeon could reattach the lost part.

November 17, 2016 | 07:14 PM - Posted by Anonymous (not verified)

To continue: I'm walking away from this, as you're dragging me down to your level. I'll live happily with GPUs from both vendors.

Later dude

November 18, 2016 | 08:28 PM - Posted by Anonymous Nvidia User (not verified)

So it's Nvidia's problem that companies are making VR content that favors their cards, and not AMD's, which is too busy pushing dx12? AMD is starting to add directx 12.1 support for rasterization, as they added it in the rx 480. A video card is more than just compute. It's fine if that's all you're looking for.

Are you implying that Nvidia isn't good in Win 10? It still has backwards compatibility with dx11 and 9 as well.

It's good that you have two systems to play whatever games are better for them.

Obviously you seem to favor AMD, with your more modern system being theirs and the tone of your comments. That's OK. If you want to call me a fanboy, fine. IDC. I like and support their products, but I don't lie or post falsehoods as to the capability, or lack thereof, of AMD's. I'm not saying you did either.

November 16, 2016 | 05:04 PM - Posted by Lance Ripplinger (not verified)

A very nice GTX 1060 from Gigabyte got my green paper. :D

November 17, 2016 | 03:17 PM - Posted by Anonymous (not verified)

And good luck using more than one GTX 1060 in SLI! And I'll bet that Nvidia will do some more gimping at the driver level to make the 1060 unable to perform well using any DX12/Vulkan non-CF/SLI multi-GPU adapter in these graphics APIs as well. Nvidia wants you to pay to play, and pay you will for Nvidia's overpriced kit!
