The GeForce GTX 1070 8GB Founders Edition Review

Author: Ryan Shrout
Manufacturer: NVIDIA

Founders Edition and Overclocking

Let the debate of Founders Edition versus reference cards continue! Just like we saw with the GTX 1080 launch, the GeForce GTX 1070 will start its life in the form of the new NVIDIA Founders Edition, a card more or less equivalent to the reference designs of previous generational releases. The difference is that NVIDIA intends to sell this card and design directly, and to partners, for the lifetime of the generation.

Stylistically, the GTX 1070 looks identical to the GTX 1080 – same shroud, same length, same back plate, just a different number on the face of it. There are some changes in board and cooler design we should note, though. The GeForce GTX 1070 Founders Edition uses only a 4-phase power design, compared to the 5-phase design on the GTX 1080.

The cooler is the other big change – this is a heatsink and triple-heatpipe design rather than a vapor chamber. All that really means is that we expect the fan to spin slightly faster (and thus be slightly louder) to keep the GPU at its rated ~83C temperature at stock settings.

NVIDIA wisely kept the dual-partition backplate on the GTX 1070 FE, with the back portion removable should you worry about air flow into your top card in a dual-GPU configuration. The SLI connections on the 1070 support the same SLI HB (high bandwidth) bridges announced with the GTX 1080 and are likewise only recommended for two-card / two-GPU configurations. If you want 3- or 4-Way options, you’ll have to offer up your first child and register for an Enthusiast Key.

The top GeForce GTX logo is still going to glow a soft green for you while the 8-pin power connector remains in the same spot.

Display connectivity is unchanged: one full-size HDMI, three full-size DisplayPort, and one dual-link (DL) DVI connection.

Stock Clock Speeds and Overclocking Testing

The GeForce GTX 1070 comes with a base clock of 1506 MHz and a rated Boost clock of 1683 MHz. But how does that actually work out in extended gaming? I ran a 10+ minute loop of Unigine’s Heaven 4.0 benchmark (very GPU-heavy, obviously) to let temperatures stabilize and see where the clock speed of the GTX 1070 landed.

These results are really positive for the GTX 1070: after starting north of 1850 MHz with a cold GPU at the beginning of the test, we stabilized at around 1775 MHz by the end of our run. Temperatures normalized at 79C or so, though obviously the fan sped up to keep the GPU in that range. Considering that we were promised a Boost clock of 1683 MHz, getting ~100 MHz higher than that in a real-world workload is great news for gamers.
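If you want to replicate this kind of warm-up logging at home, `nvidia-smi` can poll clocks and temperature once a second while Heaven loops. Below is a minimal Python sketch of summarizing such a log; the sample values are illustrative, not our measured data, and the helper names are our own:

```python
# Sketch: summarize a clock/temperature log captured during a warm-up run with
#   nvidia-smi --query-gpu=clocks.gr,temperature.gpu --format=csv,noheader,nounits -l 1
# The sample lines below are illustrative, not measured data.

def parse_samples(lines):
    """Parse 'clock_mhz, temp_c' CSV rows into (int, int) tuples."""
    samples = []
    for line in lines:
        clock, temp = (field.strip() for field in line.split(","))
        samples.append((int(clock), int(temp)))
    return samples

def stabilized(samples, tail=3):
    """Average the last `tail` samples, where clocks have settled."""
    tail_samples = samples[-tail:]
    clock = sum(c for c, _ in tail_samples) / len(tail_samples)
    temp = sum(t for _, t in tail_samples) / len(tail_samples)
    return clock, temp

log = [
    "1860, 52",   # cold GPU, aggressive boost
    "1835, 68",
    "1790, 76",
    "1778, 78",   # settling toward steady state
    "1772, 79",
    "1775, 79",
]
clock, temp = stabilized(parse_samples(log))
print(f"stabilized ~{clock:.0f} MHz at ~{temp:.0f} C")
```

The same approach works for any sustained load, as long as the run is long enough for the cooler to reach steady state before the tail window begins.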

Quick Overclocking Testing

Using the latest version of EVGA’s Precision X software, I went through some quick manual overclocking on the GTX 1070 Founders Edition and found promising early results.

While I was able to run at a +200 MHz offset (with the maximum 112% power target) for a few minutes, I settled on +175 MHz as my stable overclock, running Heaven for well over 30 minutes without corruption or crashing. The result is a clock speed hovering around 1987 MHz, more than 450 MHz higher than the card's rated base clock! Obviously that is going to translate directly into gaming performance in games that are shader limited, as most tend to be.
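The arithmetic is worth spelling out: because the offset shifts the entire frequency/voltage curve rather than capping a fixed boost clock, the observed clock can land above rated boost plus the offset. All figures below come from the paragraphs above:

```python
# Clock arithmetic for the GTX 1070 Founders Edition overclock described above.
base_clock = 1506      # MHz, rated base clock
rated_boost = 1683     # MHz, rated Boost clock
stock_observed = 1775  # MHz, stabilized clock we observed at stock
offset = 175           # MHz, manual overclock offset
oc_observed = 1987     # MHz, stabilized clock we observed while overclocked

print(stock_observed - rated_boost)  # headroom above rated boost at stock (~100 MHz)
print(stock_observed + offset)       # naive expectation: stock boost + offset
print(oc_observed - base_clock)      # actual gain over base clock (>450 MHz)
```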

(Note that I am NOT turning the fan speed up to 100% when doing these overclocking tests – I don’t feel that represents an action an end user would take. A sane end user at least.)

May 29, 2016 | 06:28 PM - Posted by Hoyty (not verified)

$379 for 60 fps at least at 25x14, count me in. Just wait for pricing / avail to shake out. Glad I saved and only bought 960 last Christmas, now easy upgrade.

May 29, 2016 | 06:42 PM - Posted by Ryan Shrout

Yeah, unless you can get them at the prices listed here, it's probably better to wait until it settles.

May 29, 2016 | 09:16 PM - Posted by Anonymous (not verified)

How long does that generally take? I haven't followed that many video card releases recently. All of the listings seem to be third party sellers listing the 1080 for $900 or more. I don't think I am going to buy anything until I see what AMD has available with Polaris anyway. While comparisons with previous generation parts are interesting, they don't really inform the buying decision much since there is such a big performance gap. It looks like the 1070 may really put price pressure on AMD parts though, once it is actually available for non-founders edition prices.

May 30, 2016 | 04:30 AM - Posted by arbiter

Would expect since 1070 and 1080 cards can be so identical, 3rd party cards should be quick.

May 29, 2016 | 09:18 PM - Posted by Dennis (not verified)

Could you give us very small hint as to whether the Polaris 10 will be competitive to this card? You have been to that event in China that AMD held.

May 30, 2016 | 07:02 AM - Posted by vin1832 (not verified)

I know it's already a lot of work, but can we have some OpenCL or Blender benchmarks? Or even just some Premiere Pro testing?

not all people game

and, well, also for the 1080 please! >.<

June 2, 2016 | 07:47 AM - Posted by Michael McAllister

Ryan, do you know why your GTA V screenshots are only showing 6 GB of memory available? This seems very strange considering these are 8 GB cards. Thoughts?

May 30, 2016 | 07:47 AM - Posted by rcald2000

+Hoyty
$379 for 60 FPS @ 2560 x 1440 sounds amazing.
25x14 sounds like the abbreviation that a certain someone from Nvidia kept using during a recent live stream. I kept wondering, "What's wrong with just saying 2560 x 1440 or 1440p?". Does anyone know how the 25x14 abbreviation started?

May 31, 2016 | 05:08 PM - Posted by Kenworth

When marketing departments stopped calling televisions by their vertical pixel count of 1080, went horizontal, and started calling 3840 "4K". I guess rounding sounds cooler.

May 30, 2016 | 01:04 PM - Posted by AntDX316 (not verified)

congrats

May 30, 2016 | 01:04 PM - Posted by AntDX316 (not verified)

plus like, it's not all about the FPS

the time it takes for the scene to show on the screen is INSANE

May 29, 2016 | 06:30 PM - Posted by aargh (not verified)

Witcher with nvidia features off?
Well, ok.

May 29, 2016 | 06:43 PM - Posted by Ryan Shrout

I thought that was the most fair and would prevent people from complaining. :)

May 29, 2016 | 09:09 PM - Posted by Mandrake

I understand leaving Hairworks turned off. I'd think that HBAO+ is a fairly ubiquitous technology now, though. It runs just fine on the Radeon cards. Perhaps you'd consider turning that option 'on' for future reviews. :-)

May 30, 2016 | 04:25 AM - Posted by arbiter

HBAO+ was originally made by Nvidia. If its on and a game does better on an nvidia card they will call that out as the reason why cause its gimped on amd cards. Best to remove every point they can claim makes the test biased from the start.

May 29, 2016 | 08:17 PM - Posted by arbiter

All sites turn off that 3rd party stuff, even AMD graphics features get turned off so people don't complain of bias one way or another.

May 30, 2016 | 10:49 AM - Posted by Silver Sparrow

Would be comparing Civ BE Mantle to DX11.

May 29, 2016 | 07:12 PM - Posted by mav451 (not verified)

Dude just wait for non-reference SKUs to be released :p

I don't see why anyone would even consider the FEs, considering what we already know from the 1080 haha.

May 29, 2016 | 10:17 PM - Posted by johnc (not verified)

I guess because some of us need blower fans and there are so few options?

May 29, 2016 | 10:28 PM - Posted by renz (not verified)

if you want blower type cooler there is also partners offering that but without that expensive NVTTM cooler.

May 30, 2016 | 05:29 AM - Posted by arbiter

eVGA has a blower style cooler coming soon for 610$ usd

May 29, 2016 | 06:54 PM - Posted by Anonymous (not verified)

Will you be doing 4K testing with this card?

May 29, 2016 | 08:16 PM - Posted by arbiter

I don't think 1070's really viable 4k, less you are doing SLI with them.

May 29, 2016 | 09:15 PM - Posted by Anonymous (not verified)

you can play 4k games with a single 980ti so a 1070 would be viable. Just with the settings not cranked all the way up, which makes a negligible difference anyway

May 30, 2016 | 04:43 AM - Posted by rcald2000

I support this question. Numerous people will attempt the 4K resolution with this card. How do I know this? Because numerous people attempted the 4K resolution with the GTX 970. One possible reason is that people started out with one GTX 970, with the option of adding a second 970 in SLI in the future. People would save the $400+ but would instead opt to get a 4K display, in lieu of the 2nd 970.

May 30, 2016 | 05:42 AM - Posted by Spunjji

I play a few games in 4K on a 970, so this is relevant.

May 30, 2016 | 01:13 PM - Posted by Anonymous (not verified)

Prob not since it cant beat the fury.

May 29, 2016 | 07:00 PM - Posted by darknas-36

Will I see any differences since I am running an AMD FX 8300 and my current video card is a GTX 960 if I was to upgrade to the GTX 1070?

May 29, 2016 | 08:06 PM - Posted by Kyle (not verified)

Yes, especially considering the 1070 is faster than the 980ti which was almost as powerful as 2 GTX 970's SLI'd.

May 30, 2016 | 01:46 AM - Posted by Ryan Shrout

Honestly, I think you will be fine.

May 29, 2016 | 07:01 PM - Posted by Anonymous (not verified)

Hawai and "Fiji" (not "Tahiti")
Great article!

May 30, 2016 | 04:26 AM - Posted by Topinio

Wrong Fiji though, and I don't understand why -- R9 Fury is faster than the R9 Nano and the same $480 at the mo.

May 29, 2016 | 07:55 PM - Posted by Anonymous (not verified)

Great review, I love the addition of Competitive and Geforce, when you have 80% you are the market.

Good job guys!

May 29, 2016 | 08:26 PM - Posted by djotter

Earthquakes in Kentucky? ;-) or is Ken messing with the camera again?
980 Ti performance for ~half the price, deal of the year right there.

May 29, 2016 | 08:41 PM - Posted by Watcher (not verified)

Just a few points on your video. 1 - PRACTICE your spiel, you're doing a lot of umms and ahs. 2 - Look at the camera, each time you look back to us, for a brief moment you do but then you look just above it (to your tele-prompter??) and 3 - there's noticeable camera wobble everytime you flail your arms out. Make the wobble stop!

Thank you.

May 29, 2016 | 08:59 PM - Posted by Andrew (not verified)

Hi Ryan,

Any chance you can do some ultrawide comparisons? Like many others these days I'm running 3440x1440 and am curious what card I'll need to maintain >60FPS.

May 29, 2016 | 10:18 PM - Posted by Desert_Nocturne (not verified)

I second this. Thanks Andrew for bringing this up. I usually just go by the numbers for the 2560X1440 and hope for the best.

May 30, 2016 | 12:36 AM - Posted by J Nevins (not verified)

Yea 21:9 is big and ugly enough to be benched at 3440x1440 - come on Ryan, ditch 1080p and do 3440x1440 instead

May 30, 2016 | 01:47 AM - Posted by Ryan Shrout

I just got in another 21:9 monitor before leaving for a work trip. I think I will try to do this when I get home...need to validate that I CAN do this with our Frame Rating methods.

May 30, 2016 | 07:05 AM - Posted by Fishbait

Let us know, maybe a video on the challenges if you find some? I just got a 3440x1440 monitor and would love some benchmarks from you guys as PCPer Rocks :D

May 30, 2016 | 10:05 AM - Posted by Anonymous (not verified)

I would LOVE to see these benchmarks too, let us know please! :)

May 29, 2016 | 09:12 PM - Posted by Mandrake

Great review. Thanks Ryan.

May 29, 2016 | 10:32 PM - Posted by renz (not verified)

"The performance of both the Radeon R9 Nano and the R9 390X are better in Hitman, compared to the GTX 1070, than in any other benchmark or game we tested, which is quite interesting. I am told that Hitman uses asynchronous compute for a couple of the visual effects in the game and that may have something to do with the competitiveness of GCN compared to Pascal."

the game simply favors GCN architecture. even in DX11 (without async compute) there is quite a significant gap between comparable geforce and radeon cards in this game. in fact DX12 does not always give more performance for radeon. in recent guru3d testing of hitman several radeon cards actually lose a bit of FPS in DX12 at 1080p.

May 29, 2016 | 10:56 PM - Posted by Anonymous (not verified)

So many words.
I just want to know if 1070 is 8GB or 6+2GB.

May 29, 2016 | 11:01 PM - Posted by JohnL (not verified)

Indeed......Can PCPers. run the past NAI Benchmark just to be sure?

May 29, 2016 | 11:45 PM - Posted by DaKrawnik

Just like the GTX 970 reviews, IF it were to have separate memory pools, it wouldn't change the observed performance including frame times.

May 30, 2016 | 04:53 AM - Posted by Spunjji

Where's the 6+2 number coming from? Even if it had the same memory subsystem as the 970 it would be 7+1.
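For context on the 7+1 figure: the GTX 970 split its 4 GB at a 7:1 ratio (3.5 GB fast + 0.5 GB slow), so the same subsystem scaled to 8 GB would split 7+1 rather than 6+2. A quick sketch of that arithmetic (the 1070 in fact ships with a full 256-bit bus and a single uniform 8 GB pool):

```python
# Sketch of the commenter's arithmetic: the GTX 970's partial disablement put
# 1/8 of its VRAM behind a slower path (3.5 GB + 0.5 GB). Applying the same
# 7:1 split to a hypothetical segmented 8 GB card gives 7 + 1, not 6 + 2.
def split_pools(total_gb, fast_fraction=7 / 8):
    """Return (fast_pool_gb, slow_pool_gb) for a given fast fraction."""
    fast = total_gb * fast_fraction
    return fast, total_gb - fast

print(split_pools(4))  # GTX 970: (3.5, 0.5)
print(split_pools(8))  # hypothetical segmented 8 GB card: (7.0, 1.0)
```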

May 30, 2016 | 12:23 AM - Posted by Cat's Paws (not verified)

Thanks Ryan. Great work as always.

Go Big Blue!

May 30, 2016 | 12:24 AM - Posted by J Nevins (not verified)

Too many nVidiots buying the hype

May 30, 2016 | 01:14 AM - Posted by Anonymous (not verified)

This is a legitimately good card. Go shove it

May 30, 2016 | 05:13 AM - Posted by Renz (not verified)

Fanbois have it hard. Most people just glad to get more performance for less money regardless red or green.

May 30, 2016 | 01:23 AM - Posted by RS84 (not verified)

buy now or wait for AMD Polaris..

any thought Ryan.?

May 30, 2016 | 01:47 AM - Posted by Ryan Shrout

I don't think you'll find that Polaris is going to scale quite this high, but I honestly don't know for sure yet.

May 30, 2016 | 05:14 AM - Posted by Renz (not verified)

So any chance polaris to be as fast as 1080 or even faster?

May 30, 2016 | 05:43 AM - Posted by Spunjji

Pretty sure at this stage there is 0 chance of that.

May 30, 2016 | 04:27 PM - Posted by Anonymous (not verified)

Zero chance -- zilcho, nada, none -- absolutely no chance. Polaris will not even warrant an after thought.

June 10, 2016 | 10:03 AM - Posted by Anonymous (not verified)

Polaris would be lucky to come close to the 1070, let alone the 1080!

May 30, 2016 | 03:22 AM - Posted by zkid (not verified)

time to dump my 680sli and go for the 1070 and later sli them man... im so excited .. my 680 sli with the 2giga memory is now obsolete

May 30, 2016 | 03:54 AM - Posted by TREY LONG (not verified)

Great review as always Ryan!
Can't wait to see the custom cooled variants. The next 7 or 8 months of Pascal releases should be pretty interesting.

May 30, 2016 | 04:09 AM - Posted by Idiot (not verified)

Luckily, PcPer measures frame times, and as far as I can see from this review and others like guru3d, AMD STILL has problems with frame times even on single GPUs. Rise of Tomb Raider as an example - lots of stutters on AMD. That's ridiculous.

May 30, 2016 | 04:11 AM - Posted by khanmein

Ryan Shrout, i'm not sure "The stub received bad data" this error related to 368.22 WHQL on W10

this error popping out whenever i tried to run any application for the 1st time eg. desktop shortcuts, type cmd/msconfig under search & open task manager etc.

everything happened randomly. any solution? thanks.

June 3, 2016 | 03:14 AM - Posted by Aaron (not verified)

I had the same issue and it was with version 368.22 of the Nvidia driver. I started seeing 'the stub received bad data' messages when trying to open task manager, the management console and a message that the application was missing when trying to run command prompt. The Windows Store wouldn't work either - just sat there spinning after clicking on a store item or trying to go to the updates screen. After even opening the store, the computer would experience even more errors than before. I ended up uninstalling and reinstalling Windows, happily chugging along until I noticed the problem had come back. There were of course a lot of twists and turns, lots of event log reviews, lots of messing about with the registry, but, ultimately, I discovered that uninstalling the latest Nvidia driver and reverting back to the previous corrected the issue. I did attempt to disable any Nvidia services or startup tasks first, but to no avail. If you're using the 1080, I may be wrong here, but I think that's the first driver for it and you're unfortunately stuck with it. If that's the case with you, you may want to just stay away from the Windows Store until the next version comes out and know that you may have to launch some applications twice.

June 3, 2016 | 03:58 AM - Posted by Aaron (not verified)

Oh, and just for posterity - I'm using Windows 10 Pro x64. I have 32GB of GSkill DDR3 RAM, an AMD Vishera 8-Core Black Edition FX-8350, two (2) PNY SSD drives (one for the OS) and an MSI GeForce GTX 970 4GD5T Titanium OC. The card is of course manufacturer overclocked, but I didn't overclock it further.

I first installed driver 368.22 on May 25th and noticed the issues a day or two after. I first noticed that I received the "The stub received bad data" message just occasionally when opening Task Manager. In attempting to check the event log, I noticed it would throw the same error at times. I then tested other native software and noticed the issue across most of the applications. I had also noticed that the Windows Store wasn't working properly, but didn't immediately link the two.

After reviewing the event log, I began to look further into the update service, as it was throwing the most DCOM errors. Just as a matter of good practice, I went ahead and disabled all of my startup services and applications and restarted the computer. I noticed after the restart that the error with the update service seemed to have disappeared. I also noticed that the system was more responsive and, although the "stub" errors still appeared, they appeared less frequently. Getting excited, I opened the store and saw that it still wasn't working. I then noticed that the "stub" errors became more frequent and the system started to chug again. I couldn't link any of that to resources, as I had plenty of available memory and CPU. After scouring through the event logs and doing some research online, I discovered that one of the errors being received had historically been linked to the size of the service cache in earlier versions of Windows. Too many services would overflow the cache and return the same "stub" message. This did seem to be somewhat in line with something I was seeing in the event logs. I saw numerous failures for completely disparate services.
Although I couldn't determine the exact size of the cache in newer versions, I did attempt to manually uninstall services that were no longer needed, hoping that it would resolve the problem. Unfortunately, it didn't. After poking around in the logs some more, I realized that the cascading service failures were popping up after the Windows Store was launched. I tested after numerous restarts and saw that the "stub" message, while a pain, only generated a single error in the logs, not the numerous that I received after opening the store. I did some fiddling around with the AppReadiness folder, tried a wsreset, ran a powershell command known to fix issues, tried an "sfc /scannow" and, well, some other various store-related things, and some more non-store things, namely removing the most recent Windows updates. None of them helped the store, but I had at least discovered that if I didn't launch it, I could at least use the computer somewhat reliably. All of this was fine and dandy, but things were still screwy and I couldn't deal with that. I decided to just reinstall Windows. It wasn't worth the hassle to continue to bang my head up against the wall trying to figure this out. After reinstalling, everything was fine at first, but the issue reappeared. Immediately after noticing it, I went to the event log. I tried looking for any link I could find. I had verified that the Store had been working earlier, but it was now out of commission again. I saw the same types of errors I had seen before in the logs. Scratching my head, I popped over to my programs list to see what I had installed most recently. "No - that's not it, no that's not it - shouldn't be the Nvidia driver, but, eh, let's see." I uninstalled the 368.22 Nvidia driver and, bam - the problems were gone. I was using a crappy basic display driver, but the problems were gone. I could launch command prompt, task manager, resource monitor, the store was working - the whole bit. 
I popped over to Nvidia's site and installed the previous driver to see what would happen. Voila - I was back up and running with no issues. That was on the 30th and I've been humming along perfectly since. I keep checking for mention of this issue elsewhere, but I've only found this and one or two other postings, but people seem to discount it. This is a real issue.

June 13, 2016 | 05:02 AM - Posted by khanmein

@Aaron, there's few users faced the same issue like us. clean install won't solve the issue. since r368 branch the stub error is exist on 368.22 & 368.39

temporary solution is roll back to 365.19 (clean install) or install 368.39 on top of 365.19 so that the stub received bad data will gone.

most likely, pcper still using older driver like 365.19 for testing, that's y i didn't hear anything from them.

apparently, i submitted the report & log file to NV driver team to resolve the issue.

FYI, the NVCP setting unable to open too but this happened very rare & some other bugs too.

May 30, 2016 | 04:48 AM - Posted by rcald2000

Based on my observation of the GTX 970 and 980 releases, I have a feeling that the GTX 1070 will be the best value. And anyone who buys a GTX 1080 will regret it once the 1080 Ti's releases. Personally I may end up getting just one 1080 just to try it out for gaming and folding@home, but I'm really eager to see what Nvidia brings to the table with the Titanium release.

May 30, 2016 | 05:01 AM - Posted by Jann5s

The link on the "Testing Suite and Methodology Update" page in this paragraph:
"For those of you that have never read about our Frame Rating capture-based performance analysis system, the following section is for you. If you have, feel free to jump straight into the benchmark action!!"
jumps to the 1080 review.

I was properly confused for a few seconds when I didn't see any 1070 data on the page.

May 30, 2016 | 05:10 AM - Posted by Jann5s

@Allyn: What would you think about frame time weighted frame time percentile graphs? Like in the SSD reviews?

Just a joke, I don't think it matters that much in this data since the variance is not multiple orders of magnitude here.

May 31, 2016 | 03:04 AM - Posted by Allyn Malventano

Ryan and I actually had this conversation the other day. It could come into play with the percentile plots, but things would need to be presented a bit differently. It would help spread cards with greater variation out of the pack a bit more, but as it stands now, cards that misbehave tend to misbehave badly enough that we don't need to weigh it any differently to make it obvious.
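For readers curious what a frame-time-weighted percentile would look like: each frame is weighted by how long it stayed on screen instead of counting once, which makes long stutters surface in the high percentiles. A minimal sketch (the function names and sample data are ours, not PC Perspective's Frame Rating tooling):

```python
# Sketch: ordinary vs. frame-time-weighted percentile of frame times (ms).
# A long frame occupies more wall-clock time, so time-weighting pushes the
# high percentiles up when stutters are present.

def percentile(frame_times, p):
    """Ordinary nearest-rank percentile: each frame counts once."""
    ordered = sorted(frame_times)
    rank = max(1, round(p / 100 * len(ordered)))
    return ordered[rank - 1]

def time_weighted_percentile(frame_times, p):
    """Each frame weighted by its own duration (share of wall-clock time)."""
    ordered = sorted(frame_times)
    total = sum(ordered)
    cumulative = 0.0
    for t in ordered:
        cumulative += t
        if cumulative >= p / 100 * total:
            return t
    return ordered[-1]

# 99 frames at ~16.7 ms plus one 100 ms stutter
frames = [16.7] * 99 + [100.0]
print(percentile(frames, 99))                # the stutter hides below the 99th
print(time_weighted_percentile(frames, 99))  # the stutter shows up
```

As Allyn notes above, misbehaving cards usually misbehave badly enough that the unweighted plots already make it obvious; weighting mainly helps spread out the borderline cases.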

May 30, 2016 | 05:23 AM - Posted by Jann5s

These new power measurements are amazing, thanks pcper for keeping on it, pushing measurement methods and supplying us with sensible data.

(however, I think the particular page mixes Hawaii, Fiji and Tahiti as others have also commented on)

May 30, 2016 | 06:00 AM - Posted by Mr.Prayer

"Testing suite" page:
>> As a result, you'll two sets of data in our benchmark pages

Word missing?

May 30, 2016 | 03:10 PM - Posted by Seiko, M (not verified)

>> As a result, you'll word missing two sets of data in our benchmark pages.

May 30, 2016 | 09:18 AM - Posted by Dark_wizzie

Using Chrome atm. When I click on a picture, the pictures tend to look a bit weird. Like, with the power graph when I click on it, the picture isn't centered on the page. When I click on the bar graph, the picture is super large.

May 30, 2016 | 10:52 AM - Posted by Silver Sparrow

Would anyone be able to say if one could pair this GPU with a 980ti since they are comparable in performance and are pretty much the same architecture?

May 30, 2016 | 02:54 PM - Posted by Spunjji

Unlikely nVidia would let you do it. Might work in something like Ashes of the Singularity but betting other developers will do a similar version of multi-card rendering doesn't seem like a sound plan.

May 30, 2016 | 03:26 PM - Posted by Anonymous (not verified)

Why single out power used by graphics card alone?
As long as GPUs need a driver executed by the CPU it does not make sense to me.

May 31, 2016 | 07:22 AM - Posted by Brash16 (not verified)

Great review yet again Ryan. Just a heads up, the link to the benchmarks on page 3 sends one to the 1080 page.

June 1, 2016 | 09:44 AM - Posted by Anonymous (not verified)

Are the other cards used in the comparison overclocked?

June 7, 2016 | 07:30 PM - Posted by BBlair (not verified)

Why does this site still use the stupid tiny lines? Why can't you just put the damn FPS numbers down and be done with it! I hate looking at very tiny lines just to get a idea of performance! This is a huge reason why I stopped coming to this site for reviews!

June 10, 2016 | 10:02 AM - Posted by Anonymous (not verified)

Ryan.. would you agree that nVidia probably made the 970 too good of a deal for what you got? As it seems there is more differences between the 1070 vs 1080 this time around.

If nVidia could change history, they probably would have either made the 970 not as fast or more expensive.

September 28, 2016 | 05:22 AM - Posted by khanmein

@Ryan Shrout, can u do another review regarding MICRON & SAMSUNG VRAM for GTX 1070 again?

there's some fiasco like previous GTX 970 3.5GB VRAM & guess what now is bout the brand.

obviously, every reviewers cherry picked with SAMSUNG chip & how come there's no MICRON chip for review??? thanks.
