
The NVIDIA GeForce GTX 1080 Ti Review

Manufacturer: NVIDIA

Flagship Performance Gets Cheaper

UPDATE! If you missed our launch day live stream, you can find the replay below:

It’s a very interesting time in the world of PC gaming hardware. We just saw the release of AMD’s Ryzen processor platform, which shook up the processor market for the first time in a decade; AMD’s next GPU architecture has officially been given the “Vega” brand name; and anticipation for the first competitive high-end part from AMD since Hawaii continues to grow. AMD was able to take advantage of Intel’s slow pace of innovation on the processor side, and it is hoping to do the same to NVIDIA on the GPU side. NVIDIA’s product line has dominated the mid-range and high-end gaming markets since the 900-series, with the 10-series products further cementing that lead.


The most recent high end graphics card release came in the form of the updated Titan X based on the Pascal architecture. That was WAY back in August of 2016 – a full seven months ago! Since then we have seen very little change at the top end of the product lines and what little change we did see came from board vendors adding in technology and variation on the GTX 10-series.

Today we see the release of the new GeForce GTX 1080 Ti, a card that offers only a handful of noteworthy technological changes but is still able to shake up the market by instigating pricing adjustments: making the performance it offers more appealing while lowering the price of everything else.

The GTX 1080 Ti GP102 GPU

I already wrote about the specifications of the GPU in the GTX 1080 Ti when it was announced last week, so here’s a simple recap.

|                    | GTX 1080 Ti | Titan X (Pascal) | GTX 1080   | GTX 980 Ti | TITAN X  | GTX 980  | R9 Fury X       | R9 Fury         | R9 Nano         |
|--------------------|-------------|------------------|------------|------------|----------|----------|-----------------|-----------------|-----------------|
| GPU                | GP102       | GP102            | GP104      | GM200      | GM200    | GM204    | Fiji XT         | Fiji Pro        | Fiji XT         |
| GPU Cores          | 3584        | 3584             | 2560       | 2816       | 3072     | 2048     | 4096            | 3584            | 4096            |
| Base Clock         | 1480 MHz    | 1417 MHz         | 1607 MHz   | 1000 MHz   | 1000 MHz | 1126 MHz | 1050 MHz        | 1000 MHz        | up to 1000 MHz  |
| Boost Clock        | 1582 MHz    | 1480 MHz         | 1733 MHz   | 1076 MHz   | 1089 MHz | 1216 MHz | -               | -               | -               |
| Texture Units      | 224         | 224              | 160        | 176        | 192      | 128      | 256             | 224             | 256             |
| ROP Units          | 88          | 96               | 64         | 96         | 96       | 64       | 64              | 64              | 64              |
| Memory             | 11GB        | 12GB             | 8GB        | 6GB        | 12GB     | 4GB      | 4GB             | 4GB             | 4GB             |
| Memory Clock       | 11000 MHz   | 10000 MHz        | 10000 MHz  | 7000 MHz   | 7000 MHz | 7000 MHz | 500 MHz         | 500 MHz         | 500 MHz         |
| Memory Interface   | 352-bit G5X | 384-bit G5X      | 256-bit G5X| 384-bit    | 384-bit  | 256-bit  | 4096-bit (HBM)  | 4096-bit (HBM)  | 4096-bit (HBM)  |
| Memory Bandwidth   | 484 GB/s    | 480 GB/s         | 320 GB/s   | 336 GB/s   | 336 GB/s | 224 GB/s | 512 GB/s        | 512 GB/s        | 512 GB/s        |
| TDP                | 250 watts   | 250 watts        | 180 watts  | 250 watts  | 250 watts| 165 watts| 275 watts       | 275 watts       | 175 watts       |
| Peak Compute       | 10.6 TFLOPS | 10.1 TFLOPS      | 8.2 TFLOPS | 5.63 TFLOPS| 6.14 TFLOPS| 4.61 TFLOPS| 8.60 TFLOPS | 7.20 TFLOPS     | 8.19 TFLOPS     |
| Transistor Count   | 12.0B       | 12.0B            | 7.2B       | 8.0B       | 8.0B     | 5.2B     | 8.9B            | 8.9B            | 8.9B            |
| Process Tech       | 16nm        | 16nm             | 16nm       | 28nm       | 28nm     | 28nm     | 28nm            | 28nm            | 28nm            |
| MSRP (current)     | $699        | $1,200           | $599       | $649       | $999     | $499     | $649            | $549            | $499            |

The GTX 1080 Ti looks a whole lot like the TITAN X launched in August of last year. Based on the 12B transistor GP102 chip, the new GTX 1080 Ti has 3,584 CUDA cores with a 1.60 GHz Boost clock. That gives it the same processor count as the Titan X but with slightly higher clock speeds, which should make the GTX 1080 Ti faster by at least a few percentage points; the base clock alone gives it a roughly 4.5% edge in compute capability. The GPU sports 28 SMs, 28 geometry units, and 224 texture units.
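The peak compute numbers in the table above fall out of simple arithmetic: CUDA cores × 2 FLOPs per clock (one fused multiply-add) × clock speed. A quick sketch (the function name is ours, not NVIDIA's):

```python
# Peak FP32 compute = CUDA cores * 2 FLOPs per clock (one FMA per core)
# * clock speed. With the clock in GHz, dividing by 1000 yields TFLOPS.
def peak_tflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz / 1000.0

print(round(peak_tflops(3584, 1.480), 1))  # GTX 1080 Ti at base clock: 10.6
print(round(peak_tflops(3584, 1.417), 1))  # Titan X at base clock: ~10.2
```

Note that these figures use the base clock; in practice GPU Boost keeps the card well above it.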


Interestingly, the memory system on the GTX 1080 Ti gets adjusted: NVIDIA has disabled a single 32-bit memory controller, giving the card a 352-bit wide bus and an odd-sounding 11GB memory capacity. The ROP count also drops to 88 units. Speaking of 11, the G5X memory on the GTX 1080 Ti now runs at 11 Gbps, a boost available to NVIDIA thanks to a chip revision from Micron and improvements to equalization and reverse signal distortion.
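The resulting bandwidth figure in the spec table is just bus width times data rate, which shows how the narrower bus at 11 Gbps still edges out the Titan X's wider bus at 10 Gbps. A minimal sketch (function name is ours):

```python
# Bandwidth (GB/s) = bus width in bits / 8 bits-per-byte * data rate in Gbps.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(352, 11.0))  # GTX 1080 Ti: 484.0 GB/s
print(bandwidth_gb_s(384, 10.0))  # Titan X:     480.0 GB/s
```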

The move from 12GB of memory on the GP102-based Titan X to 11GB on the GTX 1080 Ti is an interesting one, and it evokes memories of the GTX 970 fiasco, where NVIDIA disabled a portion of that GPU's memory controller but left the memory that would have resided on it ON the board. The result, a card with 3.5GB of memory at one speed and 512MB at another, proved to be the wrong move, though marketing the GTX 970 with "3.5GB" of memory would have seemed odd too. NVIDIA is not repeating that mistake, instead building the GTX 1080 Ti with a flat 11GB out of the gate.

Continue reading our review of the NVIDIA GeForce GTX 1080 Ti 11GB graphics card!

NVIDIA spent time at its tech day detailing the tiled caching rendering method it has been using, without public disclosure, since the 900-series launch, in an attempt to demonstrate how well high-speed GDDR5X, combined with aggressive compression, can feed the GPU. Part of the reasoning for finally divulging this information is to counter the idea that HBM2 will have a fundamental advantage over G5/G5X technologies. With Vega coming mid-year and highly touting the inclusion of HBM2 and its HBCC, NVIDIA is getting ahead of the curve on messaging.


The TDP of the new part is 250 watts, identical to the Titan product. The cooler has been improved compared to the GTX 1080, offering quieter fan speeds and lower temperatures when operating at the same power envelope.

Performance estimates from NVIDIA put the GTX 1080 Ti about 35% faster than the GTX 1080, the largest generational jump we have seen from a flagship Ti launch.


Pricing is going to be set at $699, so don't expect to find this in any budget builds; for the top performing GeForce card on the market, that's what we expect. That is no small sum, but the truth is NVIDIA undercut what many in the industry expected this card would cost, in an attempt to prepare for the battle with Vega later in the summer. The question that will need to be answered by AMD is whether its new product line can match or beat the best that NVIDIA puts forward.

The GeForce GTX 1080 Ti Graphics Card


To the surprise of no one, the primary design of the GTX 1080 Ti shares a lot with the GTX 1080 and the Titan X cards launched last year in style, function, and cooling capability. The Founders Edition, as NVIDIA continues to call its reference/early card builds, remains in place but with a very important change: the base MSRP of the card and the Founders Edition price will be the same. That should put to rest the complaints we had with the 10-series launch and the debut of the Founders Edition pricing system.


From a cooling and power design perspective, the GTX 1080 Ti shares more in common with the latest Titan X than with the GTX 1080. This includes a larger vapor chamber for improved cooling performance at the same or lower noise levels, and a 7-phase power design using 2x dual-FET components, capable of delivering 250 amps of current.


The back plate and base plate on the card are functional, aiding cooling as well as providing structural integrity for systems that are moved after the card is installed.


This marks the first time we have seen a consumer GeForce graphics card from NVIDIA released without a DVI connection. NVIDIA decided instead to use that space for additional airflow out of the chassis from the blower design's radial fan. As frequent users of DVI in our internal testing, that's a letdown, but NVIDIA did include a DisplayPort to DVI adapter in the box to help alleviate any issues. (It's a single-link adapter, limited to 1920x1200…)


Power connectivity requires one 8-pin and one 6-pin connector from the power supply, enough to reach and even exceed the 250 watt product TDP.
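That connector arithmetic follows from the PCI Express spec's per-source limits: 75W from the x16 slot, 75W from a 6-pin auxiliary connector, and 150W from an 8-pin. A quick sketch of the available budget (names are ours):

```python
# Power limits per the PCI Express spec: 75W from the x16 slot,
# 75W per 6-pin auxiliary connector, 150W per 8-pin auxiliary connector.
CONNECTOR_WATTS = {"6-pin": 75, "8-pin": 150}
SLOT_WATTS = 75

def board_power_budget(aux_connectors):
    # The slot always contributes; add each auxiliary connector on the card.
    return SLOT_WATTS + sum(CONNECTOR_WATTS[c] for c in aux_connectors)

print(board_power_budget(["8-pin", "6-pin"]))  # 300 -> 50W over the 250W TDP
```

That 50W of headroom is what gives the card room to run past its rated TDP when overclocking.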


March 9, 2017 | 09:11 AM - Posted by Anonymous (not verified)

Small remark on 11 page (Detailed Power Consumption Testing):

Last Light testing at 4K with the Titan X running at a +150 MHz offset (should be 1080 Ti, instead of Titan X).

Other than that - thanks for review.

March 10, 2017 | 02:39 AM - Posted by djotter

Saw that too, copy pasta!

March 9, 2017 | 09:20 AM - Posted by khanmein

interesting power draw from 6-pin & any slow down once we hit peak of 11 GB VRAM?

weird no DOOM, For Honor, Sniper Elite 4 etc benchmark?

March 9, 2017 | 02:36 PM - Posted by Anonymous (not verified)

Doom isn't much of a stress test for anything, I think thats been demonstrated very well by many testing outlets. For Honor and Sniper Elite 4 are pretty damn new, hard to castigate them for not using them

March 10, 2017 | 04:38 AM - Posted by Anonymous (not verified)

Actually running with Vulkan is good for testing CPU OC stability.

March 9, 2017 | 09:26 AM - Posted by Anonymous (not verified)

2% faster than the GTX 1080 in Grand Theft Auto V and Gears of War at 1440p.

Woooooww!

March 9, 2017 | 11:53 AM - Posted by slizzo

Those two titles are heavily CPU dependent.

March 9, 2017 | 02:51 PM - Posted by Anonymous (not verified)

Gears of War isn't

March 9, 2017 | 04:54 PM - Posted by Anonymous (not verified)

You know it's not GOW4 right?

March 9, 2017 | 06:35 PM - Posted by Anonymous (not verified)

Unreal engine isn't cpu dependant

March 9, 2017 | 09:30 AM - Posted by Anonymous (not verified)

IB4 Ryzen testing requests.

March 9, 2017 | 09:34 AM - Posted by Geoff Peterson (not verified)

Thanks for the 1400p 980ti benchmark comparisons for those of us thinking of upgrading from that gpu. Looks like a pretty compelling upgrade.

My one question is the reference PCB layout. Is it identical to Titan X pascal? I like to watercool my GPUs to get the most out of them

March 9, 2017 | 09:59 AM - Posted by Jann5s

yes, it is the same PCB, see the pictures here

March 9, 2017 | 09:35 AM - Posted by Anonymous (not verified)

250 amps? I'd like to see that power supply!

March 9, 2017 | 10:51 AM - Posted by Anonymous (not verified)

At 3V, 5V, 12V... to name a few! :)

March 9, 2017 | 12:19 PM - Posted by Anonymousmouse (not verified)

Hopefully, it is at 1V or below:) It was mentioned in the launch live stream too, there must be a mix up somewhere.

March 9, 2017 | 11:18 AM - Posted by Anonymous (not verified)

"the NVIDIA GeForce GTX 1080 Ti is going to be selling for $699 starting today from both NVIDIA and it's partners"

But.. but.. it's not selling today anywhere..

March 9, 2017 | 11:54 AM - Posted by slizzo

Yeah, looks like NDA date was today, actual launch is tomorrow.

March 9, 2017 | 01:01 PM - Posted by Anonymous (not verified)

Any word on the 980ti step up to 1080ti?

March 9, 2017 | 01:04 PM - Posted by Anonymous (not verified)

A bigger chip off the GP 102 block, so more performance through more resources. It will be intresting to see what AMD's Vega will offer in performance with all of Vega's new features that are new to the Vega micro-arch relative to the Ti's more amounts of the same with some slightly higher clocks. that $699 price point will not be too difficult for AMD to beat and Vega will have that NCU, HBCC, and other all new features to take into account. So it's the Pascal refresh versus Vega's new features for AMD's latest Flagship offering when Vega is released 2Q of 2017.

AMD does have its more of the same in the mainstream also with the RX 500 series Polaris refresh offerings in April but then Vega will be following. So hopefully by the summer Vega will be here.

I hope that the games are being tuned for Vega in advance unlike some of the games that were/are still not tuned for Ryzen or Ryzen's extra cores for the affordable prices.

March 9, 2017 | 01:54 PM - Posted by Anonymous (not verified)

You realize that by the time Vega is out (may), we will be looking at preliminary Volta information, right?

AMD missed this generation entirely on the high-end.

March 9, 2017 | 03:19 PM - Posted by Anonymous (not verified)

Good more Volta news to force AMD to get its Navi to market on time! But with Zen/Ryzen and Zen/Naples making some mad revenues for AMD there will be no more excuses for AMD to be late with any flagship SKUs! So the competition moves from the two Ps in the mainstream market to the two Vs in the Flagship market. AMD does not have a flagship P SKU but some dual RX 480 prices deals will be had by some when the RX 500 series P refresh get here for AMD’s mainstream P SKUs along with that V flagship from AMD shortly after.

That Ps and Vs competition will lead to some N and whatever comes after competition as always.

March 10, 2017 | 09:04 AM - Posted by Anonymous (not verified)

Vega is coming to desktop in the next few months. Desktop Volta is now 1H next year. So at minimum it will be *at least* six months behind Vega, and possibly as much as a year.

Also, AMD will be launching Navi next year. So it won't be Vega that Volta finds itself up against.

Hate to break it to you, but it's AMD who are going to leap frog 'Paxwell' with Vega this year, and unless Volta is something really special, Navi will keep them ahead.

I'm not really sure how you managed to get things *completely* the wrong way round, but trust me, you have.

March 9, 2017 | 01:59 PM - Posted by Geforcepat (not verified)

march 2017. amd gives out "vega" tshirts. nvidia releases a new enthusiast card. and drops the price on another. #gofigure

March 9, 2017 | 02:42 PM - Posted by Anonymous (not verified)

you have an error in the summary, "Whether or not they can deliver on those promises has yet to be proven.", should say 'deliver on any promises has never been proven'.

Still very upset about Rysen.

PC Master Race, lets see those breath of wild & horizon zero dawn benches. LOL reviews 2 week's before launch suckas

March 9, 2017 | 02:56 PM - Posted by Dark_wizzie

Seems there is a typo in OC page:

'NVIDIA is slightly aggressive with target clock speeds and voltages ath the rated power targets, and that results in the variance that you see here.'

March 9, 2017 | 03:28 PM - Posted by Anonymous (not verified)

I think you should do an article on just how much of a lie TDP figures can be.

250W part? Might spike to 20% higher than that rating.

I think this is something that has real consequences. For supercomputers and other systems that have to last and be run at full load with minimal amounts of failures, this is unacceptable because it shortens the bathtub curve somewhat unpredictably.

Consumer GPUs with boost features are made to squeeze every last bit of performance out of them, rather than last long and be reliable. Hopefully the board makers compensate, and hopefully their EDA, electromigration simulations and emulation when designing the chips treat the TDP as a complete lie as well.

Power virus games with shitty ScaleformUI menus could very well have a gpu like this sitting above its TDP long enough to probably cause some electromigration. Its really no wonder that larger GPU VRMs are a common failure.

March 9, 2017 | 04:28 PM - Posted by superj (not verified)

I think the key to TDP is that it is THERMAL DESIGN POWER. Not Total Input Power. As long as the short term average power stays below the rated TDP, then substantial brief spikes above the TDP are ok since the thermal cooling capability is buffered by the thermal mass and dissipation power of the cooler.

March 9, 2017 | 04:25 PM - Posted by Angelica (not verified)

Why not Doom test?

March 9, 2017 | 04:45 PM - Posted by Jaye1998

awesome

March 9, 2017 | 04:52 PM - Posted by Anonymous Nvidia User (not verified)

108% faster than Fury's in AMD sponsored Hitman 4k. Wow!!!!!

37% better than 1080 in GTA5 4k.

26% faster than 1080 Gears of War 4k.

Performance in 1440 may be the result of both ti and 1080 hitting engine limits not that ti is only 2% faster.

This card is designed for 4k isn't this what AMD users always bragging on.

Anyways I'm looking for a great 4k card and this looks like a winner.

Good review PCPer.

March 9, 2017 | 04:52 PM - Posted by retiredmtngryl07

great review.

March 9, 2017 | 05:03 PM - Posted by Stinky3toes (not verified)

WHY in your testing results do you not show any results for multiple monitors. With all the connectors on the GPU, Why don't you do any testing using them ?

March 9, 2017 | 05:27 PM - Posted by Jeremy Hellstrom

FCAT incompatibility unfortunately.  We are working on it with vendors, so hopefully we will be able to do this soon as we do want to provide that info in reviews.

March 9, 2017 | 05:07 PM - Posted by Ugiboy

I am very envious, looks amazing

March 9, 2017 | 05:17 PM - Posted by Michael Bertrand Pratte (not verified)

Seems to be the best card ATM! For quality and price, gtx 1080 ti FTW!

March 9, 2017 | 07:46 PM - Posted by ChuckyDiBi (not verified)

That power draw is much smarter when it come to repartition than the mess that was stock RX480 at launch. Still, I'd wait for a dual 8pin version from OEM!

March 10, 2017 | 02:13 AM - Posted by serpico (not verified)

I'm curious how you measure power through PCI-E. Measuring voltage is easy enough, do you use a precision resistor and op-amp to measure the current?

March 12, 2017 | 06:44 PM - Posted by crackshot91 (not verified)

Hey Ryan, just wondering if you could share the exact settings you used for the unigine heaven bench? It says 1440p Ultra, but what about tessellation (I imagine that's maxed out, of course) or more importantly, antialiasing? Only wondering because compared to your 980 score, mine is a few fps lower with 8xAA, though my card is heavily OC'd and scores significantly higher in 3DMark. Just trying to get a perfect-as-possible comparison :)

March 12, 2017 | 10:42 PM - Posted by Brandito (not verified)

Any word on whether that DisplayPort adapter is active or passive?

March 14, 2017 | 08:02 AM - Posted by Ninjawithagun

Get a new monitor that has DisplayPort 1.2 or higher. Problem solved. Why would you waste money buying an adapter for an older, inferior monitor?

March 16, 2017 | 11:41 AM - Posted by An extremely concerned reader (not verified)

I have 4 displays, I'd like to use all of them and am wondering whether I'd need to buy another adapter or use the one supplied with the card. What kind of pleb only runs one display? Do you even know how to PC Master Race?

March 14, 2017 | 08:01 AM - Posted by Ninjawithagun

Meh, my Titan X performs a bit better as it will hit 2100Mhz on the GPU. The only real advantage here is the price point. For me, it's a 'no thanks'. I'll wait for Volta to be released this fall ;-)