The GeForce GTX 1070 8GB Founders Edition Review

Manufacturer: NVIDIA

GP104 Strikes Again

It’s only been three weeks since NVIDIA unveiled the GeForce GTX 1080 and GTX 1070 graphics cards at a live streaming event in Austin, TX. But it feels like those two GPUs, one of which hadn't even been reviewed until today, have already drastically shifted the landscape of graphics, VR and PC gaming.


Half of the “new GPU” stories are told, with AMD due to follow up soon with Polaris, but it was clear to anyone watching the enthusiast segment with a hint of history that a line was drawn in the sand that day. There is THEN, and there is NOW. Today’s detailed review of the GeForce GTX 1070 completes NVIDIA’s first wave of NOW products, following closely behind the GeForce GTX 1080.

Interestingly, and in a move that is very uncharacteristic of NVIDIA, detailed specifications of the GeForce GTX 1070 were released well before today’s reviews. With information on the CUDA core count, clock speeds, and memory bandwidth, it was possible to get a solid sense of where the GTX 1070 would perform; and I imagine that many of you already did the napkin math to figure that out. There is no more guessing though - reviews and testing are all done, and I think you'll find that the GTX 1070 is as exciting as, if not more so than, the GTX 1080 thanks to the combination of performance and pricing that it provides.

Let’s dive in.

Continue reading our review of the GeForce GTX 1070 8GB Founders Edition!

The GeForce GTX 1070 – An only slightly mutilated GP104

The setup for this review is going to be a lot quicker than was the case with the GTX 1080. We already know about the architecture, the new features and how it ticks. At this point, we have only a couple of specification changes and a memory swap to worry about.

|   | GTX 1080 | GTX 1070 | GTX 980 Ti | GTX 980 | GTX 970 | R9 Fury X | R9 Fury | R9 Nano | R9 390X |
|---|---|---|---|---|---|---|---|---|---|
| GPU | GP104 | GP104 | GM200 | GM204 | GM204 | Fiji XT | Fiji Pro | Fiji XT | Hawaii XT |
| GPU Cores | 2560 | 1920 | 2816 | 2048 | 1664 | 4096 | 3584 | 4096 | 2816 |
| Rated Clock | 1607 MHz | 1506 MHz | 1000 MHz | 1126 MHz | 1050 MHz | 1050 MHz | 1000 MHz | up to 1000 MHz | 1050 MHz |
| Texture Units | 160 | 120 | 176 | 128 | 104 | 256 | 224 | 256 | 176 |
| ROP Units | 64 | 64 | 96 | 64 | 56 | 64 | 64 | 64 | 64 |
| Memory | 8GB | 8GB | 6GB | 4GB | 4GB | 4GB | 4GB | 4GB | 8GB |
| Memory Clock | 10000 MHz | 8000 MHz | 7000 MHz | 7000 MHz | 7000 MHz | 500 MHz | 500 MHz | 500 MHz | 6000 MHz |
| Memory Interface | 256-bit G5X | 256-bit | 384-bit | 256-bit | 256-bit | 4096-bit (HBM) | 4096-bit (HBM) | 4096-bit (HBM) | 512-bit |
| Memory Bandwidth | 320 GB/s | 256 GB/s | 336 GB/s | 224 GB/s | 196 GB/s | 512 GB/s | 512 GB/s | 512 GB/s | 320 GB/s |
| TDP | 180 watts | 150 watts | 250 watts | 165 watts | 145 watts | 275 watts | 275 watts | 175 watts | 275 watts |
| Peak Compute | 8.2 TFLOPS | 5.7 TFLOPS | 5.63 TFLOPS | 4.61 TFLOPS | 3.4 TFLOPS | 8.60 TFLOPS | 7.20 TFLOPS | 8.19 TFLOPS | 5.63 TFLOPS |
| Transistor Count | 7.2B | 7.2B | 8.0B | 5.2B | 5.2B | 8.9B | 8.9B | 8.9B | 6.2B |
| Process Tech | 16nm | 16nm | 28nm | 28nm | 28nm | 28nm | 28nm | 28nm | 28nm |
| MSRP (current) | $599 | $379 | $649 | $499 | $329 | $649 | $549 | $499 | $389 |
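The Peak Compute row is straightforward arithmetic rather than a measured figure: each CUDA core (or stream processor) retires one fused multiply-add, i.e. two FLOPS, per clock. A quick sketch of that calculation using the rated base clocks above (the helper function name is mine, not NVIDIA's):

```python
# Peak single-precision compute: 2 FLOPS (one FMA) per core per clock.
def peak_tflops(cuda_cores, clock_mhz):
    return 2 * cuda_cores * clock_mhz * 1e6 / 1e12

gtx_1080 = peak_tflops(2560, 1607)  # ~8.2 TFLOPS, matching the table
gtx_1070 = peak_tflops(1920, 1506)  # ~5.8 TFLOPS at base clock
```

The GTX 1070 comes out just under 5.8 TFLOPS at its base clock; the table's 5.7 TFLOPS entry reflects slightly more conservative rounding, and boost-clock figures land higher still.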

NVIDIA has reduced the CUDA core / processor count on the GTX 1070 from 2560 to 1920, a drop of 25%. This is actually a bigger drop than the GTX 970 withstood coming from the GTX 980 (~19%) so there is the potential for a larger performance disparity this generation. The 1920 core count is still higher than the GTX 970 (1664), and with a significantly higher clock speed there is no doubt which card is going to be faster.

Texture unit count drops from 160 on the GTX 1080 to 120 on the GTX 1070 thanks to a loss of five SMs. It looks like NVIDIA has disabled one complete GPC rather than disabling SMs piecemeal across the GPU. ROP count remains the same though at 64, and of course the memory bus stays at 256-bit to go along with them.


Speaking of the memory controller, even though it is architecturally identical between the two Pascal products, the GTX 1070 is using 8GB of GDDR5 rather than the newer, faster GDDR5X found on the GTX 1080. Frequency is reduced from 10 Gbps to 8 Gbps and memory bandwidth drops from 320 GB/s to 256 GB/s. That being said, this is the first GPU to ship with 8.0 GHz GDDR5 memory, so that is still an increase over the data rate the GTX 980 produced at stock. Couple that with the improved memory compression on Pascal and there shouldn’t be any concern over the memory design on the GTX 1070.
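Those bandwidth figures fall straight out of bus width times per-pin data rate. A minimal sketch (the function name is mine, for illustration only):

```python
# Bandwidth in GB/s: (bus width in bits / 8) bytes per transfer,
# multiplied by the effective per-pin data rate in Gbps.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

gtx_1070 = bandwidth_gb_s(256, 8)   # 256.0 GB/s (8 Gbps GDDR5)
gtx_1080 = bandwidth_gb_s(256, 10)  # 320.0 GB/s (10 Gbps GDDR5X)
gtx_980  = bandwidth_gb_s(256, 7)   # 224.0 GB/s (7 Gbps GDDR5)
```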

Out of the box clock speeds on the reference / Founders Edition of the GTX 1070 are set at 1506 MHz base and 1683 MHz Boost. Though they are slightly lower than the GTX 1080's, the increase over the GTX 970 is substantial – nearly 50%! Doing the math in your head should already give you a clue to performance: 15% more cores and nearly 50% higher clock rates than the GTX 970 should give the GTX 1070 a big step forward for the price segment compared to Maxwell.
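That napkin math can be made explicit. The sketch below assumes shader throughput scales linearly with cores × clock, which ignores memory bandwidth, ROP counts and architectural changes, so treat it as a rough upper bound rather than a performance prediction:

```python
# Rough relative shader throughput: cores x clock, ignoring memory
# bandwidth, ROPs, and per-architecture efficiency differences.
def relative_throughput(cores_a, clock_a_mhz, cores_b, clock_b_mhz):
    return (cores_a * clock_a_mhz) / (cores_b * clock_b_mhz)

# GTX 1070 vs GTX 970 at base clocks: roughly 1.65x
vs_970 = relative_throughput(1920, 1506, 1664, 1050)
# GTX 1070 vs GTX 1080: roughly 0.70x
vs_1080 = relative_throughput(1920, 1506, 2560, 1607)
```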

As to power consumption, the GTX 1070 uses 30 watts less than the GTX 1080, with a TDP of 150 watts according to NVIDIA specifications. That is 5 watts higher than the GTX 970, though I imagine any gamer would give up 5 watts for the performance delta we are seeing. Power is again supplied by just a single 8-pin connector.

Architectural Changes and Features Recap

For both my sanity and yours, I’m not going to attempt to retell the entire story surrounding Pascal and the GP104 GPU. There are some slight architectural changes, including the addition of a new geometry processing block that enables simultaneous multi-projection, and some intricate path modifications that allow the 16nm process to reach these kinds of crazy clock speeds, but GP104 exhibits exactly the same FLOPS per clock as Maxwell does.


There are several new features supported on Pascal that are worth noting as well, but were covered previously in the review of the GeForce GTX 1080.


If you haven’t caught up on these technologies I would encourage you to take a slight detour to the pages linked above and give them a read. They are significant and noteworthy additions to the GeForce product stack, and I am particularly excited to try out SMP in some working titles.

(PS – if you want, we interviewed NVIDIA’s Tom Petersen on GTX 1080 launch day during a live video stream; alternatively, you can catch up on the technology via this YouTube playlist.)


May 29, 2016 | 06:28 PM - Posted by Hoyty (not verified)

$379 for 60 fps at least at 25x14, count me in. Just wait for pricing / avail to shake out. Glad I saved and only bought 960 last Christmas, now easy upgrade.

May 29, 2016 | 06:42 PM - Posted by Ryan Shrout

Yeah, unless you can get them at the prices listed here, it's probably better to wait until it settles.

May 29, 2016 | 09:16 PM - Posted by Anonymous (not verified)

How long does that generally take? I haven't followed that many video card releases recently. All of the listings seem to be third party sellers listing the 1080 for $900 or more. I don't think I am going to buy anything until I see what AMD has available with Polaris anyway. While comparisons with previous generation parts are interesting, they don't really inform the buying decision much since there is such a big performance gap. It looks like the 1070 may really put price pressure on AMD parts though, once it is actually available for non-founders edition prices.

May 30, 2016 | 04:30 AM - Posted by arbiter

Would expect so; since 1070 and 1080 cards can be nearly identical, 3rd party cards should be quick.

May 29, 2016 | 09:18 PM - Posted by Dennis (not verified)

Could you give us very small hint as to whether the Polaris 10 will be competitive to this card? You have been to that event in China that AMD held.

May 30, 2016 | 07:02 AM - Posted by vin1832 (not verified)

I know it's already a lot of work, but can we have some OpenCL or Blender benchmarks? or even just some Premiere Pro testing

not all people game

and, well, also for the 1080 please! >.<

June 2, 2016 | 07:47 AM - Posted by Michael McAllister

Ryan, do you know why your GTA V screenshots are only showing 6 GB of memory available? This seems very strange considering these are 8 GB cards. Thoughts?

May 30, 2016 | 07:47 AM - Posted by rcald2000

$379 for 60 FPS @ 2560 x 1440 sounds amazing.
25x14 sounds like the abbreviation that a certain someone from Nvidia kept using during a recent live stream. I kept wondering, "What's wrong with just saying 2560 x 1440 or 1440p?". Does anyone know how the 25x14 abbreviation started?

May 31, 2016 | 05:08 PM - Posted by Kenworth

When marketing departments stopped calling televisions their vertical pixel rate of 1080 and going horizontal and calling 3840 "4K". I guess rounding sounds cooler.

May 30, 2016 | 01:04 PM - Posted by AntDX316 (not verified)

plus like, it's not all about the FPS

the time it takes for the scene to show on the screen is INSANE

May 29, 2016 | 06:30 PM - Posted by aargh (not verified)

Witcher with nvidia features off?
Well, ok.

May 29, 2016 | 06:43 PM - Posted by Ryan Shrout

I thought that was the most fair and would prevent people from complaining. :)

May 29, 2016 | 09:09 PM - Posted by Mandrake

I understand leaving Hairworks turned off. I'd think that HBAO+ is a fairly ubiquitous technology now, though. It runs just fine on the Radeon cards. Perhaps you'd consider turning that option 'on' for future reviews. :-)

May 30, 2016 | 04:25 AM - Posted by arbiter

HBAO+ was originally made by Nvidia. If its on and a game does better on an nvidia card they will call that out as the reason why cause its gimped on amd cards. Best to remove every point they can claim makes the test biased from the start.

May 29, 2016 | 08:17 PM - Posted by arbiter

All sites turn off that 3rd party stuff; even AMD graphics features get turned off so people don't complain of bias one way or another.

May 30, 2016 | 10:49 AM - Posted by Silver Sparrow

Would be comparing Civ BE Mantle to DX11.

May 29, 2016 | 07:12 PM - Posted by mav451 (not verified)

Dude just wait for non-reference SKUs to be released :p

I don't see why anyone would even consider the FEs, considering what we already know from the 1080 haha.

May 29, 2016 | 10:17 PM - Posted by johnc (not verified)

I guess because some of us need blower fans and there are so few options?

May 29, 2016 | 10:28 PM - Posted by renz (not verified)

if you want blower type cooler there is also partners offering that but without that expensive NVTTM cooler.

May 30, 2016 | 05:29 AM - Posted by arbiter

EVGA has a blower style cooler coming soon for $610 USD

May 29, 2016 | 06:54 PM - Posted by Anonymous (not verified)

Will you be doing 4K testing with this card?

May 29, 2016 | 08:16 PM - Posted by arbiter

I don't think 1070's really viable 4k, less you are doing SLI with them.

May 29, 2016 | 09:15 PM - Posted by Anonymous (not verified)

you can play 4k games with a single 980ti so a 1070 would be viable. Just with the settings not cranked all the way up, which makes a negligible difference anyway

May 30, 2016 | 04:43 AM - Posted by rcald2000

I support this question. Numerous people will attempt the 4K resolution with this card. How do I know this? Because numerous people attempted the 4K resolution with the GTX 970. One possible reason is that people started out with one GTX 970, with the option of adding a second 970 in SLI in the future. People would save the $400+ but would instead opt to get a 4K display, in lieu of the 2nd 970.

May 30, 2016 | 05:42 AM - Posted by Spunjji

I play a few games in 4K on a 970, so this is relevant.

May 30, 2016 | 01:13 PM - Posted by Anonymous (not verified)

Prob not since it cant beat the fury.

May 29, 2016 | 07:00 PM - Posted by darknas-36

Will I see any differences since I am running an AMD FX 8300 and my current video card is a GTX 960 if I was to upgrade to the GTX 1070?

May 29, 2016 | 08:06 PM - Posted by Kyle (not verified)

Yes, especially considering the 1070 is faster than the 980ti which was almost as powerful as 2 GTX 970's SLI'd.

May 30, 2016 | 01:46 AM - Posted by Ryan Shrout

Honestly, I think you will be fine.

May 29, 2016 | 07:01 PM - Posted by Anonymous (not verified)

Hawai and "Fiji" (not "Tahiti")
Great article!

May 30, 2016 | 04:26 AM - Posted by Topinio

Wrong Fiji though, and I don't understand why -- R9 Fury is faster than the R9 Nano and the same $480 at the mo.

May 29, 2016 | 07:55 PM - Posted by Anonymous (not verified)

Great review, I love the addition of Competitive and Geforce, when you have 80% you are the market.

Good job guys!

May 29, 2016 | 08:26 PM - Posted by djotter

Earthquakes in Kentucky? ;-) or is Ken messing with the camera again?
980 Ti performance for ~half the price, deal of the year right there.

May 29, 2016 | 08:41 PM - Posted by Watcher (not verified)

Just a few points on your video. 1 - PRACTICE your spiel, you're doing a lot of umms and ahs. 2 - Look at the camera, each time you look back to us, for a brief moment you do but then you look just above it (to your tele-prompter??) and 3 - there's noticeable camera wobble everytime you flail your arms out. Make the wobble stop!

Thank you.

May 29, 2016 | 08:59 PM - Posted by Andrew (not verified)

Hi Ryan,

Any chance you can do some ultrawide comparisons? I, like many others these days, am running 3440x1440 and am curious what card I'll need to maintain >60FPS.

May 29, 2016 | 10:18 PM - Posted by Desert_Nocturne (not verified)

I second this. Thanks Andrew for bringing this up. I usually just go by the numbers for the 2560X1440 and hope for the best.

May 30, 2016 | 12:36 AM - Posted by J Nevins (not verified)

Yea 21:9 is big and ugly enough to be benched at 3440x1440 - come on Ryan, ditch 1080p and do 3440x1440 instead

May 30, 2016 | 01:47 AM - Posted by Ryan Shrout

I just got in another 21:9 monitor before leaving for a work trip. I think I will try to do this when I get home...need to validate that I CAN do this with our Frame Rating methods.

May 30, 2016 | 07:05 AM - Posted by Fishbait

Let us know, maybe a video on the challenges if you find some? I just got a 3440x1440 monitor and would love some benchmarks from you guys as PCPer Rocks :D

May 30, 2016 | 10:05 AM - Posted by Anonymous (not verified)

I would LOVE to see these benchmarks too, let us know please! :)

May 29, 2016 | 09:12 PM - Posted by Mandrake

Great review. Thanks Ryan.

May 29, 2016 | 10:32 PM - Posted by renz (not verified)

"The performance of both the Radeon R9 Nano and the R9 390X are better in Hitman, compared to the GTX 1070, than in any other benchmark or game we tested, which is quite interesting. I am told that Hitman uses asynchronous compute for a couple of the visual effects in the game and that may have something to do with the competitiveness of GCN compared to Pascal."

the game simply favors the GCN architecture. even in DX11 (without async compute) there is quite a significant gap between comparable GeForce and Radeon cards in this game. in fact DX12 does not always give more performance for Radeon. in a recent guru3d test of Hitman, several Radeon cards actually lose a bit of FPS in DX12 at 1080p.

May 29, 2016 | 10:56 PM - Posted by Anonymous (not verified)

So many words.
I just want to know if 1070 is 8GB or 6+2GB.

May 29, 2016 | 11:01 PM - Posted by JohnL (not verified)

Indeed......Can PCPers. run the past NAI Benchmark just to be sure?

May 29, 2016 | 11:45 PM - Posted by DaKrawnik

Just like the GTX 970 reviews, IF it were to have separate memory pools, it wouldn't change the observed performance including frame times.

May 30, 2016 | 04:53 AM - Posted by Spunjji

Where's the 6+2 number coming from? Even if it had the same memory subsystem as the 970 it would be 7+1.

May 30, 2016 | 12:23 AM - Posted by Cat's Paws (not verified)

Thanks Ryan. Great work as always.

Go Big Blue!

May 30, 2016 | 12:24 AM - Posted by J Nevins (not verified)

Too many nVidiots buying the hype

May 30, 2016 | 01:14 AM - Posted by Anonymous (not verified)

This is a legitimately good card. Go shove it

May 30, 2016 | 05:13 AM - Posted by Renz (not verified)

Fanbois have it hard. Most people just glad to get more performance for less money regardless red or green.

May 30, 2016 | 01:23 AM - Posted by RS84 (not verified)

buy now or wait for AMD Polaris..

any thought Ryan.?

May 30, 2016 | 01:47 AM - Posted by Ryan Shrout

I think you'll find that Polaris isn't going to scale quite this high, but I honestly don't know for sure yet.

May 30, 2016 | 05:14 AM - Posted by Renz (not verified)

So any chance polaris to be as fast as 1080 or even faster?

May 30, 2016 | 05:43 AM - Posted by Spunjji

Pretty sure at this stage there is 0 chance of that.

May 30, 2016 | 04:27 PM - Posted by Anonymous (not verified)

Zero chance -- zilcho, nada, none -- absolutely no chance. Polaris will not even warrant an after thought.

June 10, 2016 | 10:03 AM - Posted by Anonymous (not verified)

Polaris would be lucky to come close to the 1070, let alone the 1080!

May 30, 2016 | 03:22 AM - Posted by zkid (not verified)

time to dump my 680sli and go for the 1070 and later sli them man... im so excited .. my 680 sli with the 2giga memory is now obsolete

May 30, 2016 | 03:54 AM - Posted by TREY LONG (not verified)

Great review as always Ryan!
Can't wait to see the custom cooled variants. The next 7 or 8 months of Pascal releases should be pretty interesting.

May 30, 2016 | 04:09 AM - Posted by Idiot (not verified)

Luckily, PcPer measures frame times, and as far as I can see from this review and others like guru3d, AMD STILL has problems with frame times even on single GPUs. Rise of Tomb Raider as an example - lots of stutters on AMD. That's ridiculous.

May 30, 2016 | 04:11 AM - Posted by khanmein

Ryan Shrout, i'm not sure "The stub received bad data" this error related to 368.22 WHQL on W10

this error popping out whenever i tried to run any application for the 1st time eg. desktop shortcuts, type cmd/msconfig under search & open task manager etc.

everything happened randomly. any solution? thanks.

June 3, 2016 | 03:14 AM - Posted by Aaron (not verified)

I had the same issue and it was with version 368.22 of the Nvidia driver. I started seeing 'the stub received bad data' messages when trying to open task manager, the management console and a message that the application was missing when trying to run command prompt. The Windows Store wouldn't work either - just sat there spinning after clicking on a store item or trying to go to the updates screen. After even opening the store, the computer would experience even more errors than before. I ended up uninstalling and reinstalling Windows, happily chugging along until I noticed the problem had come back. There were of course a lot of twists and turns, lots of event log reviews, lots of messing about with the registry, but, ultimately, I discovered that uninstalling the latest Nvidia driver and reverting back to the previous corrected the issue. I did attempt to disable any Nvidia services or startup tasks first, but to no avail. If you're using the 1080, I may be wrong here, but I think that's the first driver for it and you're unfortunately stuck with it. If that's the case with you, you may want to just stay away from the Windows Store until the next version comes out and know that you may have to launch some applications twice.

June 3, 2016 | 03:58 AM - Posted by Aaron (not verified)

Oh, and just for posterity - I'm using Windows 10 Pro x64. I have 32GB of GSkill DDR3 RAM, an AMD Vishera 8-Core Black Edition FX-8350, two (2) PNY SSD drives (one for the OS) and an MSI GeForce GTX 970 4GD5T Titanium OC. The card is of course manufacturer overclocked, but I didn't overclock it further. I first installed driver 368.22 on May 25th and noticed the issues a day or two after. I first noticed that I received the "The stub received bad data message" just occasionally when opening Task Manager. In attempting to check the event log, I noticed it would throw the same error at times. I then tested other native software and noticed the issue across most of the applications. I had also noticed that the Windows Store wasn't working properly, but didn't immediately link the two. After reviewing the event log, I began to look further into the update service, as it was throwing the most dcom errors. Just as a matter of good practice, I went ahead and disabled all of my startup services and applications and restart the computer. I noticed after the restart that the error with the update service seemed to have disappeared. I also noticed that the system was more responsive and, although the "stub" errors still appeared, they appeared less frequently. Getting excited, I opened the store and saw that it still wasn't working. I then noticed that the "stub" errors became more frequent and the system started to chug again. I couldn't link any of that to resources, as I had plenty available memory and CPU. After scouring through the event logs and doing some research online, I discovered that one of the errors being received had historically been linked to the size of the service cache in earlier versions of Windows. Too many services would overflow the cache and return the same "stub" message. This did seem to be somewhat in-line with something I was seeing in the event logs. I saw numerous failures for completely disparate services. 
Although I couldn't determine the exact size of the cache in newer versions, I did attempt to manually uninstall services that were no longer needed, hoping that it would resolve the problem. Unfortunately, it didn't. After poking around in the logs some more, I realized that the cascading service failures were popping up after the Windows Store was launched. I tested after numerous restarts and saw that the "stub" message, while a pain, only generated a single error in the logs, not the numerous that I received after opening the store. I did some fiddling around with the AppReadiness folder, tried a wsreset, ran a powershell command known to fix issues, tried an "sfc /scannow" and, well, some other various store-related things, and some more non-store things, namely removing the most recent Windows updates. None of them helped the store, but I had at least discovered that if I didn't launch it, I could at least use the computer somewhat reliably. All of this was fine and dandy, but things were still screwy and I couldn't deal with that. I decided to just reinstall Windows. It wasn't worth the hassle to continue to bang my head up against the wall trying to figure this out. After reinstalling, everything was fine at first, but the issue reappeared. Immediately after noticing it, I went to the event log. I tried looking for any link I could find. I had verified that the Store had been working earlier, but it was now out of commission again. I saw the same types of errors I had seen before in the logs. Scratching my head, I popped over to my programs list to see what I had installed most recently. "No - that's not it, no that's not it - shouldn't be the Nvidia driver, but, eh, let's see." I uninstalled the 368.22 Nvidia driver and, bam - the problems were gone. I was using a crappy basic display driver, but the problems were gone. I could launch command prompt, task manager, resource monitor, the store was working - the whole bit. 
I popped over to Nvidia's site and installed the previous driver to see what would happen. Voila - I was back up and running with no issues. That was on the 30th and I've been humming along perfectly since. I keep checking for mention of this issue elsewhere, but I've only found this and one or two other postings, but people seem to discount it. This is a real issue.

June 13, 2016 | 05:02 AM - Posted by khanmein

@Aaron, there are a few users who have faced the same issue as us. a clean install won't solve it; since the r368 branch, the stub error exists on 368.22 & 368.39

temporary solution is to roll back to 365.19 (clean install) or install 368.39 on top of 365.19 so that the "stub received bad data" error will be gone.

most likely, pcper is still using an older driver like 365.19 for testing; that's why I didn't hear anything from them.

apparently, I submitted the report & log file to the NV driver team to resolve the issue.

FYI, the NVCP settings panel fails to open too, but this happens very rarely, along with some other bugs.

May 30, 2016 | 04:48 AM - Posted by rcald2000

Based on my observation of the GTX 970 and 980 releases, I have a feeling that the GTX 1070 will be the best value. And anyone who buys a GTX 1080 will regret it once the 1080 Ti releases. Personally I may end up getting just one 1080 just to try it out for gaming and folding@home, but I'm really eager to see what Nvidia brings to the table with the Titanium release.

May 30, 2016 | 05:01 AM - Posted by Jann5s

The link on the "Testing Suite and Methodology Update" page in this paragraph:
"For those of you that have never read about our Frame Rating capture-based performance analysis system, the following section is for you. If you have, feel free to jump straight into the benchmark action!!"
jumps to the 1080 review.

I was properly confused for a few seconds when I didn't see any 1070 data on the page.

May 30, 2016 | 05:10 AM - Posted by Jann5s

@Allyn: What would you think about frame time weighted frame time percentile graphs? Like in the SSD reviews?

Just a joke, I don't think it matters that much in this data since the variance is not multiple orders of magnitude here.

May 31, 2016 | 03:04 AM - Posted by Allyn Malventano

Ryan and I actually had this conversation the other day. It could come into play with the percentile plots, but things would need to be presented a bit differently. It would help spread cards with greater variation out of the pack a bit more, but as it stands now, cards that misbehave tend to misbehave badly enough that we don't need to weigh it any differently to make it obvious.

May 30, 2016 | 05:23 AM - Posted by Jann5s

These new power measurements are amazing, thanks pcper for keeping on it, pushing measurement methods and supplying us with sensible data.

(however, I think the particular page mixes Hawaii, Fiji and Tahiti as others have also commented on)

May 30, 2016 | 06:00 AM - Posted by Mr.Prayer

"Testing suite" page:
>> As a result, you'll two sets of data in our benchmark pages

Word missing?

May 30, 2016 | 03:10 PM - Posted by Seiko, M (not verified)

>> As a result, you'll word missing two sets of data in our benchmark pages.


May 30, 2016 | 09:18 AM - Posted by Dark_wizzie

Using Chrome atm. When I click on a picture, the pictures tend to look a bit weird. Like, with the power graph when I click on it, the picture isn't centered on the page. When I click on the bar graph, the picture is super large.

May 30, 2016 | 10:52 AM - Posted by Silver Sparrow

Would anyone be able to say if one could pair this GPU with a 980ti since they are comparable in performance and are pretty much the same architecture?

May 30, 2016 | 02:54 PM - Posted by Spunjji

Unlikely nVidia would let you do it. Might work in something like Ashes of the Singularity but betting other developers will do a similar version of multi-card rendering doesn't seem like a sound plan.

May 30, 2016 | 03:26 PM - Posted by Anonymous (not verified)

Why single out power used by the graphics card alone?
As long as GPUs need a driver executed by the CPU, it does not make sense to me.

May 31, 2016 | 07:22 AM - Posted by Brash16 (not verified)

Great review yet again Ryan. Just a heads up, the link to the benchmarks on page 3 sends one to the 1080 page.

June 1, 2016 | 09:44 AM - Posted by Anonymous (not verified)

Are the other cards used in the comparison overclocked?

June 7, 2016 | 07:30 PM - Posted by BBlair (not verified)

Why does this site still use the stupid tiny lines? Why can't you just put the damn FPS numbers down and be done with it! I hate looking at very tiny lines just to get a idea of performance! This is a huge reason why I stopped coming to this site for reviews!

June 10, 2016 | 10:02 AM - Posted by Anonymous (not verified)

Ryan.. would you agree that nVidia probably made the 970 too good of a deal for what you got? It seems there are more differences between the 1070 and the 1080 this time around.

If nVidia could change history, they probably would have either made the 970 not as fast or more expensive.

September 28, 2016 | 05:22 AM - Posted by khanmein

@Ryan Shrout, can you do another review covering MICRON & SAMSUNG VRAM for the GTX 1070?

there's some fiasco like the previous GTX 970 3.5GB VRAM issue, & guess what, now it's about the memory brand.

obviously, every reviewer cherry picked a SAMSUNG chip; how come there's no MICRON chip for review??? thanks.
