
The NVIDIA GeForce GTX 1080 Ti Review

Author: Ryan Shrout
Manufacturer: NVIDIA

Sound Testing, Pricing and Closing Thoughts

We ran the reference cooler on the GeForce GTX 1080 Ti through our standard noise testing.


Under full load, the GTX 1080 Ti is slightly quieter than the Titan X launched last year and just 1.0 dBA louder than the GTX 1080 Founders Edition card.

Performance Summary

Though I do encourage you to go through the previous benchmark pages for more detail on each result, we can summarize our experiences with the GeForce GTX 1080 Ti succinctly: it's damn fast! Compared to the Titan X, we saw deltas of 0-8% in favor of the GTX 1080 Ti across both 2560x1440 and 4K resolution testing. For a card with a starting sale price $500 lower than where the Titan X launched last year, that makes the performance all the more impressive.

The more important comparison, though, is with the GTX 1080. At 2560x1440, the GTX 1080 Ti ranged from 2% faster (in Grand Theft Auto V and Gears of War) to 42% faster (in Dirt Rally). The average improvement was 25% INCLUDING the 2% outliers! At 4K that average difference jumps to 36%, with Dirt Rally showing the biggest gain at 50%! These are extraordinary performance gains for a card priced $100 above the launch price of the GTX 1080 and $200 above its current (reduced) price.
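
To put some numbers behind that average, here is a quick sketch of how the 1440p figure is formed. Apart from the 2% (Grand Theft Auto V, Gears of War) and 42% (Dirt Rally) deltas quoted above, the per-game values are placeholders chosen for illustration, not measured results:

```python
# Arithmetic mean of per-game speedups of the GTX 1080 Ti over the GTX 1080
# at 2560x1440. Only the 2% and 42% entries come from the review; the
# remaining four games are hypothetical placeholders.
speedups_1440p = {
    "Grand Theft Auto V": 0.02,  # quoted in the review
    "Gears of War":       0.02,  # quoted in the review
    "Dirt Rally":         0.42,  # quoted in the review
    "Placeholder game D": 0.30,
    "Placeholder game E": 0.30,
    "Placeholder game F": 0.35,
    "Placeholder game G": 0.34,
}

average = sum(speedups_1440p.values()) / len(speedups_1440p)
print(f"Average 1440p improvement: {average:.0%}")  # 25% with these inputs
```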


For AMD fans, it's tough to find anything positive about where the company stands in high-end GPU discussions. The GeForce GTX 1080 Ti is 94% faster (on average) than the Fury X across all seven games and both resolutions we tested, nearly doubling the performance of the most recent "flagship" product bearing the Radeon name. It has been less than two years since the Fury X hit the scene... AMD needs Vega to be great, and it needs it here sooner rather than later.

Pricing and Availability

I mentioned it on the first page: the NVIDIA GeForce GTX 1080 Ti will be selling for $699 starting today from both NVIDIA and its partners. Custom-cooled variants of the GTX 1080 Ti will be coming from EVGA and others in due time, with individual pricing to be determined.

With the new pricing on the GeForce GTX 1080 (and below), the whole market gets a shake-up that has been a long time coming. At $699, the new GeForce GTX 1080 Ti will run you $100 more than the base MSRP of the GTX 1080, but the same as many aftermarket, custom-designed GTX 1080s were selling for just last week. For that upgrade you get 30-40% better performance - an impressive change considering there is no direct pressure on NVIDIA from AMD... yet.
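
As a back-of-the-envelope check on that value claim, the snippet below compares cost per frame at the two MSRPs; the GTX 1080's frame rate is an assumed baseline, not a measured figure:

```python
# Cost-per-FPS comparison at the review's prices. The 60 FPS baseline for
# the GTX 1080 is assumed for illustration; the Ti is modeled at +35%,
# the midpoint of the quoted 30-40% advantage.
cards = {
    "GTX 1080":    {"price": 499.0, "fps": 60.0},
    "GTX 1080 Ti": {"price": 699.0, "fps": 60.0 * 1.35},
}

for name, card in cards.items():
    print(f"{name}: ${card['price'] / card['fps']:.2f} per FPS")
# ~$8.32/FPS for the 1080 vs ~$8.63/FPS for the Ti: close to linear
# price/performance scaling, which is unusual at the high end.
```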

If you are looking for the new flagship GPU to get you through your VR, 4K, or high refresh rate gaming in 2017, you're going to be hard pressed to find a solution that bests the GeForce GTX 1080 Ti.

Closing Thoughts

When I reviewed the NVIDIA Titan X based on Pascal back in August of last year, I had to caveat every statement with something along the lines of "if you are willing to pay the premium." At $1,200, a solid $500-600 more than the next-fastest GeForce product on the market, it was tough to claim that ANYONE should buy that product for gaming purposes. With the GeForce GTX 1080 Ti offering nearly identical (maybe slightly better) performance compared to the Titan X, at a $500 lower MSRP, those caveats no longer apply. Yes, $699 is still incredibly expensive for a gaming PC component, but it is within reach of the current lineup (GTX 1080 at $499-549).

AMD needs a flagship Vega graphics card and it needs it yesterday. We are now years into a world where only NVIDIA is relevant in cards over $300, and though NVIDIA deserves credit for not launching the GTX 1080 Ti at $800-900 (really, it could have), competition only makes the market better for consumers. We continue to hear about AMD's plans for 2017 and the promises of HBM2 memory, a high-bandwidth cache controller, and packed math. Whether or not AMD can deliver on those promises has yet to be proven.


What cannot be doubted is that NVIDIA continues to create new products and iterate on a cadence that is unmatched. Yes, the GP102 GPU and the GTX 1080 Ti could have been released last year in place of the Titan X. But why would NVIDIA do that? Why would any company cut into its profit margins by releasing a card before it is absolutely necessary? In the end, that is why everyone benefits from two healthy competitors.

As we near the end of Q1 2017, NVIDIA without a doubt maintains its dominance at the high end of the graphics market, with the GeForce GTX 1080 Ti leading the performance pack. If you have the wallet, and the need, for the best gaming card in the world, the GeForce GTX 1080 Ti is for you.



March 9, 2017 | 09:11 AM - Posted by Anonymous (not verified)

Small remark on page 11 (Detailed Power Consumption Testing):

Last Light testing at 4K with the Titan X running at a +150 MHz offset (should be 1080 Ti, instead of Titan X).

Other than that - thanks for the review.

March 10, 2017 | 02:39 AM - Posted by djotter

Saw that too, copy pasta!

March 9, 2017 | 09:20 AM - Posted by khanmein

Interesting power draw from the 6-pin. Any slowdown once we hit the peak of the 11GB VRAM?

Weird that there are no DOOM, For Honor, Sniper Elite 4, etc. benchmarks?

March 9, 2017 | 02:36 PM - Posted by Anonymous (not verified)

Doom isn't much of a stress test for anything; I think that's been demonstrated very well by many testing outlets. For Honor and Sniper Elite 4 are pretty damn new, hard to castigate them for not using those.

March 10, 2017 | 04:38 AM - Posted by Anonymous (not verified)

Actually, running with Vulkan is good for testing CPU OC stability.

March 9, 2017 | 09:26 AM - Posted by Anonymous (not verified)

2% faster than the GTX 1080 in Grand Theft Auto V and Gears of War at 1440p.

Woooooww!

March 9, 2017 | 11:53 AM - Posted by slizzo

Those two titles are heavily CPU dependent.

March 9, 2017 | 02:51 PM - Posted by Anonymous (not verified)

Gears of War isn't

March 9, 2017 | 04:54 PM - Posted by Anonymous (not verified)

You know it's not GOW4 right?

March 9, 2017 | 06:35 PM - Posted by Anonymous (not verified)

Unreal Engine isn't CPU dependent.

March 9, 2017 | 09:30 AM - Posted by Anonymous (not verified)

IB4 Ryzen testing requests.

March 9, 2017 | 09:34 AM - Posted by Geoff Peterson (not verified)

Thanks for the 1440p 980 Ti benchmark comparisons for those of us thinking of upgrading from that GPU. Looks like a pretty compelling upgrade.

My one question is about the reference PCB layout. Is it identical to the Titan X (Pascal)? I like to watercool my GPUs to get the most out of them.

March 9, 2017 | 09:59 AM - Posted by Jann5s

Yes, it is the same PCB; see the pictures here.

March 9, 2017 | 09:35 AM - Posted by Anonymous (not verified)

250 amps? I'd like to see that power supply!

March 9, 2017 | 10:51 AM - Posted by Anonymous (not verified)

At 3V, 5V, 12V... to name a few! :)

March 9, 2017 | 12:19 PM - Posted by Anonymousmouse (not verified)

Hopefully, it is at 1V or below :) It was mentioned in the launch live stream too; there must be a mix-up somewhere.
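
For what it's worth, a quick P = V x I check shows how both readings can be true at once; the ~1.0 V core voltage below is an assumption typical of Pascal under load, not a figure from the review:

```python
# Current drawn at 250 W board power, at the PSU's 12 V rail versus the
# GPU core voltage after the VRM. The 1.0 V core voltage is an assumption.
board_power_w = 250.0

for volts, where in ((12.0, "12 V rail from the PSU"),
                     (1.0, "GPU core after the VRM")):
    print(f"{board_power_w / volts:6.1f} A at {volts:>4} V ({where})")
# ~20.8 A at 12 V, but 250 A at 1 V: the VRM trades voltage for current.
```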

March 9, 2017 | 11:18 AM - Posted by Anonymous (not verified)

"the NVIDIA GeForce GTX 1080 Ti is going to be selling for $699 starting today from both NVIDIA and it's partners"

But.. but.. it's not selling today anywhere..

March 9, 2017 | 11:54 AM - Posted by slizzo

Yeah, looks like NDA date was today, actual launch is tomorrow.

March 9, 2017 | 01:01 PM - Posted by Anonymous (not verified)

Any word on the 980 Ti step-up to the 1080 Ti?

March 9, 2017 | 01:04 PM - Posted by Anonymous (not verified)

A bigger chip off the GP102 block, so more performance through more resources. It will be interesting to see what AMD's Vega offers in performance, with all the features that are new to the Vega micro-arch, relative to the Ti's more of the same at slightly higher clocks. That $699 price point will not be too difficult for AMD to beat, and Vega will have the NCU, HBCC, and other all-new features to take into account. So it's the Pascal refresh versus Vega's new features for AMD's latest flagship offering when Vega is released in Q2 2017.

AMD also has more of the same in the mainstream, with the RX 500 series Polaris refresh offerings in April and Vega following. So hopefully by the summer Vega will be here.

I hope that games are being tuned for Vega in advance, unlike some of the games that were/are still not tuned for Ryzen and its extra cores at affordable prices.

March 9, 2017 | 01:54 PM - Posted by Anonymous (not verified)

You realize that by the time Vega is out (may), we will be looking at preliminary Volta information, right?

AMD missed this generation entirely on the high-end.

March 9, 2017 | 03:19 PM - Posted by Anonymous (not verified)

Good, more Volta news to force AMD to get its Navi to market on time! But with Zen/Ryzen and Zen/Naples making some mad revenues for AMD, there will be no more excuses for AMD to be late with any flagship SKUs! So the competition moves from the two Ps in the mainstream market to the two Vs in the flagship market. AMD does not have a flagship P SKU, but some dual RX 480 price deals will be had when the RX 500 series Polaris refresh gets here for AMD's mainstream P SKUs, with that V flagship from AMD shortly after.

That Ps-and-Vs competition will lead to some N-and-whatever-comes-after competition, as always.

March 10, 2017 | 09:04 AM - Posted by Anonymous (not verified)

Vega is coming to desktop in the next few months, while desktop Volta is now slated for the first half of next year. So Volta will be *at least* six months behind Vega, and possibly as much as a year.

Also, AMD will be launching Navi next year. So it won't be Vega that Volta finds itself up against.

Hate to break it to you, but it's AMD who are going to leapfrog 'Paxwell' with Vega this year, and unless Volta is something really special, Navi will keep them ahead.

I'm not really sure how you managed to get things *completely* the wrong way round, but trust me, you have.

March 9, 2017 | 01:59 PM - Posted by Geforcepat (not verified)

March 2017: AMD gives out "Vega" t-shirts; NVIDIA releases a new enthusiast card and drops the price on another. #gofigure

March 9, 2017 | 02:42 PM - Posted by Anonymous (not verified)

You have an error in the summary: "Whether or not they can deliver on those promises has yet to be proven." should say 'deliver on any promises has never been proven'.

Still very upset about Ryzen.

PC Master Race, let's see those Breath of the Wild & Horizon Zero Dawn benches. LOL, reviews two weeks before launch, suckas.


March 9, 2017 | 02:56 PM - Posted by Dark_wizzie

Seems there is a typo in OC page:

'NVIDIA is slightly aggressive with target clock speeds and voltages ath the rated power targets, and that results in the variance that you see here.'

March 9, 2017 | 03:28 PM - Posted by Anonymous (not verified)

I think you should do an article on just how much of a lie TDP figures can be.

250W part? Might spike to 20% higher than that rating.

I think this is something that has real consequences. For supercomputers and other systems that have to last and be run at full load with minimal amounts of failures, this is unacceptable because it shortens the bathtub curve somewhat unpredictably.

Consumer GPUs with boost features are made to squeeze every last bit of performance out of them, rather than to last long and be reliable. Hopefully the board makers compensate, and hopefully the EDA tools, electromigration simulations, and emulation used when designing the chips treat the TDP as a complete lie as well.

Power virus games with shitty ScaleformUI menus could very well have a GPU like this sitting above its TDP long enough to cause some electromigration. It's really no wonder that VRM failures on larger GPUs are so common.

March 9, 2017 | 04:28 PM - Posted by superj (not verified)

I think the key to TDP is that it is THERMAL Design Power, not Total Input Power. As long as the short-term average power stays below the rated TDP, substantial brief spikes above it are OK, since the cooling capacity is buffered by the thermal mass and dissipation capability of the cooler.
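
A minimal first-order thermal model makes that buffering effect concrete; the thermal constants below are assumptions for illustration, not measurements from any actual cooler:

```python
# Euler integration of dT/dt = (P - T/R_th) / C_th: the cooler's temperature
# tracks the short-term average power, so brief spikes above TDP barely
# register. Both thermal constants are assumed values.
TDP_W = 250.0
C_TH = 400.0   # J/K: lumped thermal mass of die + heatsink (assumed)
R_TH = 0.2     # K/W: heatsink-to-ambient thermal resistance (assumed)
DT = 0.01      # s: simulation timestep

temp_rise = 0.0  # kelvin above ambient
for step in range(60_000):  # simulate 10 minutes
    t = step * DT
    # 20% power spike for 10 ms once per second, otherwise steady at TDP
    power = TDP_W * (1.2 if (t % 1.0) < 0.01 else 1.0)
    temp_rise += (power - temp_rise / R_TH) / C_TH * DT

print(f"Rise after 10 min: {temp_rise:.1f} K (vs {TDP_W * R_TH:.0f} K at a flat TDP)")
# The spikes add only ~0.2% to the average power, so the cooler ends up a
# fraction of a kelvin warmer than it would at a constant 250 W.
```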

March 9, 2017 | 04:25 PM - Posted by Angelica (not verified)

Why no Doom test?

March 9, 2017 | 04:45 PM - Posted by Jaye1998

awesome

March 9, 2017 | 04:52 PM - Posted by Anonymous Nvidia User (not verified)

108% faster than the Fury X in AMD-sponsored Hitman at 4K. Wow!!!!!

37% better than the 1080 in GTA V at 4K.

26% faster than the 1080 in Gears of War at 4K.

Performance at 1440p may be the result of both the Ti and the 1080 hitting engine limits, not the Ti only being 2% faster.

This card is designed for 4K; isn't that what AMD users are always bragging about?

Anyways, I'm looking for a great 4K card and this looks like a winner.

Good review PCPer.

March 9, 2017 | 04:52 PM - Posted by retiredmtngryl07

great review.

March 9, 2017 | 05:03 PM - Posted by Stinky3toes (not verified)

WHY, in your testing results, do you not show any results for multiple monitors? With all the connectors on the GPU, why don't you do any testing using them?

March 9, 2017 | 05:27 PM - Posted by Jeremy Hellstrom

FCAT incompatibility unfortunately.  We are working on it with vendors, so hopefully we will be able to do this soon as we do want to provide that info in reviews.

March 9, 2017 | 05:07 PM - Posted by Ugiboy

I am very envious, looks amazing

March 9, 2017 | 05:17 PM - Posted by Michael Bertrand Pratte (not verified)

Seems to be the best card ATM! For quality and price, GTX 1080 Ti FTW!

March 9, 2017 | 07:46 PM - Posted by ChuckyDiBi (not verified)

The power draw is much smarter when it comes to distribution across the connectors than the mess that was the stock RX 480 at launch. Still, I'd wait for a dual 8-pin version from the OEMs!

March 10, 2017 | 02:13 AM - Posted by serpico (not verified)

I'm curious how you measure power through PCI-E. Measuring voltage is easy enough; do you use a precision resistor and op-amp to measure the current?
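
In case it helps frame the question, here is the arithmetic behind the shunt-plus-op-amp approach described above; every component value is an example, not a detail of PC Perspective's actual rig:

```python
# Power on one rail from a current-sense shunt: a small precision resistor
# sits in series with the rail, an amplifier boosts the millivolt-level
# drop across it, and P = V x I. All values are illustrative examples.
R_SHUNT = 0.005   # ohm: precision shunt resistor (assumed)
AMP_GAIN = 50.0   # gain of the current-sense amplifier (assumed)

def rail_power(v_rail, v_amp_out):
    """Power drawn on one rail, given the amplified shunt voltage."""
    v_shunt = v_amp_out / AMP_GAIN   # undo the amplifier gain
    current = v_shunt / R_SHUNT      # Ohm's law: I = V / R
    return v_rail * current          # P = V x I

# Example: 12 V rail with 2.6 V at the amplifier output
# -> 52 mV across the shunt -> 10.4 A -> 124.8 W
print(f"{rail_power(12.0, 2.6):.1f} W")
```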

March 12, 2017 | 06:44 PM - Posted by crackshot91 (not verified)

Hey Ryan, just wondering if you could share the exact settings you used for the unigine heaven bench? It says 1440p Ultra, but what about tessellation (I imagine that's maxed out, of course) or more importantly, antialiasing? Only wondering because compared to your 980 score, mine is a few fps lower with 8xAA, though my card is heavily OC'd and scores significantly higher in 3DMark. Just trying to get a perfect-as-possible comparison :)

March 12, 2017 | 10:42 PM - Posted by Brandito (not verified)

Any word on whether that DisplayPort adapter is active or passive?

March 14, 2017 | 08:02 AM - Posted by Ninjawithagun

Get a new monitor that has DisplayPort 1.2 or higher. Problem solved. Why would you waste money buying an adapter for an older, inferior monitor?

March 16, 2017 | 11:41 AM - Posted by An extremely concerned reader (not verified)

I have 4 displays, I'd like to use all of them and am wondering whether I'd need to buy another adapter or use the one supplied with the card. What kind of pleb only runs one display? Do you even know how to PC Master Race?

March 14, 2017 | 08:01 AM - Posted by Ninjawithagun

Meh, my Titan X performs a bit better as it will hit 2100 MHz on the GPU. The only real advantage here is the price point. For me, it's a 'no thanks'. I'll wait for Volta to be released this fall ;-)

April 24, 2018 | 11:03 PM - Posted by ment1 (not verified)

I'm running an 8700K at 4.9 GHz + 16GB RAM + an ASUS GTX 1080 Ti Turbo (OC) at 2560x1440.

Same settings as the review.

The in-game benchmark always stays at around 90 FPS and no more.

Any idea why?
