
The NVIDIA GeForce RTX 2080 and RTX 2080 Ti Review

Manufacturer: NVIDIA

New Generation, New Founders Edition

At this point, calling NVIDIA's 20-series GPUs highly anticipated would be a bit of an understatement. After months and months of speculation about what these new GPUs would be called, what architecture they would be based on, and what features they would bring, the NVIDIA GeForce RTX 2080 and RTX 2080 Ti were officially unveiled in August, alongside the Turing architecture.


We've already posted our deep dive into the Turing architecture and the TU102 and TU104 GPUs powering these new graphics cards, but here's the short takeaway: Turing provides efficiency improvements in both memory and shader performance, and it adds specialized hardware to accelerate deep learning (Tensor cores) and enable real-time ray tracing (RT cores).

|  | RTX 2080 Ti | Quadro RTX 6000 | GTX 1080 Ti | RTX 2080 | Quadro RTX 5000 | GTX 1080 | TITAN V | RX Vega 64 (Air) |
|---|---|---|---|---|---|---|---|---|
| GPU | TU102 | TU102 | GP102 | TU104 | TU104 | GP104 | GV100 | Vega 64 |
| GPU Cores | 4352 | 4608 | 3584 | 2944 | 3072 | 2560 | 5120 | 4096 |
| Base Clock | 1350 MHz | 1455 MHz | 1480 MHz | 1515 MHz | 1620 MHz | 1607 MHz | 1200 MHz | 1247 MHz |
| Boost Clock | 1545 MHz / 1635 MHz (FE) | 1770 MHz | 1582 MHz | 1710 MHz / 1800 MHz (FE) | 1820 MHz | 1733 MHz | 1455 MHz | 1546 MHz |
| Texture Units | 272 | 288 | 224 | 184 | 192 | 160 | 320 | 256 |
| ROP Units | 88 | 96 | 88 | 64 | 64 | 64 | 96 | 64 |
| Tensor Cores | 544 | 576 | -- | 368 | 384 | -- | 640 | -- |
| Ray Tracing Speed | 10 GRays/s | 10 GRays/s | -- | 8 GRays/s | 8 GRays/s | -- | -- | -- |
| Memory | 11 GB | 24 GB | 11 GB | 8 GB | 16 GB | 8 GB | 12 GB | 8 GB |
| Memory Clock | 14000 MHz | 14000 MHz | 11000 MHz | 14000 MHz | 14000 MHz | 10000 MHz | 1700 MHz | 1890 MHz |
| Memory Interface | 352-bit G6 | 384-bit G6 | 352-bit G5X | 256-bit G6 | 256-bit G6 | 256-bit G5X | 3072-bit HBM2 | 2048-bit HBM2 |
| Memory Bandwidth | 616 GB/s | 672 GB/s | 484 GB/s | 448 GB/s | 448 GB/s | 320 GB/s | 653 GB/s | 484 GB/s |
| TDP | 250 W / 260 W (FE) | 260 W | 250 W | 215 W / 225 W (FE) | 230 W | 180 W | 250 W | 292 W |
| Peak Compute (FP32) | 13.4 TFLOPS / 14.2 TFLOPS (FE) | 16.3 TFLOPS | 10.6 TFLOPS | 10 TFLOPS / 10.6 TFLOPS (FE) | 11.2 TFLOPS | 8.2 TFLOPS | 14.9 TFLOPS | 13.7 TFLOPS |
| Transistor Count | 18.6 B | 18.6 B | 12.0 B | 13.6 B | 13.6 B | 7.2 B | 21.0 B | 12.5 B |
| Process Tech | 12nm | 12nm | 16nm | 12nm | 12nm | 16nm | 12nm | 14nm |
| MSRP (current) | $1,200 (FE) / $1,000 | $6,300 | $699 | $800 (FE) / $700 | $2,300 | $549 | $2,999 | $499 |

As unusual as it is for the company, NVIDIA has decided to release both the RTX 2080 and RTX 2080 Ti at the same time, as the first products in the Turing family.

The TU102-based RTX 2080 Ti features 4352 CUDA cores, while the TU104-based RTX 2080 features 2944, fewer than the GTX 1080 Ti's 3584. These new RTX GPUs have also moved to GDDR6 from the GDDR5X we found on the GTX 10-series.
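Those core counts tie directly to the peak compute numbers in the table above: peak FP32 throughput is simply two floating-point operations (one fused multiply-add) per CUDA core per clock, multiplied by core count and boost clock. A minimal sanity-check sketch using the table's Founders Edition figures:

```cpp
#include <cstdio>

// Peak FP32 = 2 FLOPs (one fused multiply-add) per CUDA core per clock,
// times core count, times boost clock. Counts and clocks are from the table.
int main() {
    struct Card { const char* name; int cores; double boost_ghz; };
    const Card cards[] = {
        {"RTX 2080 Ti FE", 4352, 1.635},  // table: 14.2 TFLOPS
        {"RTX 2080 FE",    2944, 1.800},  // table: 10.6 TFLOPS
    };
    for (const Card& c : cards)
        std::printf("%s: %.1f TFLOPS\n", c.name,
                    2.0 * c.cores * c.boost_ghz / 1000.0);
    return 0;
}
```

The same arithmetic works for the memory bandwidth row: a 352-bit interface moves 44 bytes per transfer, and 44 bytes at 14 Gbps effective is the RTX 2080 Ti's 616 GB/s.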



One of the most significant departures with the RTX 2080 and RTX 2080 Ti can be found in the newly redesigned NVIDIA Founders Edition products.


Finally moving away from the blower-style cooler, the Founders Edition cards now feature a dual axial-fan design, a vapor chamber running the full length of the card, and a substantially redesigned look.

Utilizing this newfound cooling capacity, the Founders Edition cards will also ship factory overclocked for the first time, with a 90 MHz boost-clock bump on both SKUs.

Since NVIDIA is generally the only vendor that ships graphics cards at reference clock speeds, it's difficult to predict whether we'll see any RTX cards clocked lower than the Founders Editions, but NVIDIA maintains that these cards are in fact "overclocked."


Another new feature on the Founders Edition cards, and on all of the third-party RTX cards we've seen so far, is a USB-C connector. This connector is VirtualLink compliant, a standard developed through coordination among NVIDIA, Oculus, Valve, Microsoft, and AMD to provide a one-cable solution for next-generation VR headsets.

The VirtualLink port provides four lanes of HBR3 DisplayPort, USB 3.1 Gen 2 connectivity, and 27 W of power delivery.
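For a rough sense of what those four lanes amount to: HBR3 signals at 8.1 Gbit/s per lane, and DisplayPort's 8b/10b line coding leaves 80% of that as usable payload. A quick back-of-the-envelope sketch:

```cpp
#include <cstdio>

int main() {
    const double raw_gbps_per_lane = 8.1;  // HBR3 signaling rate per lane
    const double coding_efficiency = 0.8;  // DisplayPort 8b/10b line coding
    const int lanes = 4;
    const double payload_gbps = raw_gbps_per_lane * coding_efficiency * lanes;
    std::printf("VirtualLink video payload: ~%.2f Gbit/s\n", payload_gbps);
    return 0;
}
```

That works out to roughly 25.9 Gbit/s of video payload, the same budget as a full-size DisplayPort 1.4 connection, delivered alongside the USB data and power.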


The RTX 2080 features an 8-pin and a 6-pin power connector, while the RTX 2080 Ti moves to two 8-pin connectors.


The other funky new connector on the RTX cards is NVLink. This communication protocol, previously seen in NVIDIA's Tesla and Quadro products, provides a high-bandwidth replacement for SLI and will enable resolutions up to 8K (single link on the RTX 2080) and 8K Surround (dual link on the RTX 2080 Ti). However, NVLink supports only two GPUs, officially ending the days of 3- and 4-way SLI, which Pascal only supported in select benchmarks anyway.
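For a rough idea of why bridge bandwidth matters at these resolutions, consider 2-way AFR, where the secondary GPU's frames (roughly every other frame) must cross the bridge before display. A quick estimate, assuming 4 bytes per pixel; the per-direction NVLink figure mentioned afterward is our reading of NVIDIA's Turing numbers, not something verified here:

```cpp
#include <cstdio>

// In 2-way AFR, the secondary GPU renders roughly every other frame, and
// each of those finished frames must cross the bridge before display.
// Assumes 4 bytes per pixel (8-bit RGBA); HDR formats roughly double this.
int main() {
    struct Mode { const char* name; double w, h, fps; };
    const Mode modes[] = {
        {"4K60", 3840, 2160, 60},
        {"8K60", 7680, 4320, 60},
    };
    for (const Mode& m : modes)
        std::printf("%s: ~%.1f GB/s across the bridge\n",
                    m.name, m.w * m.h * 4.0 * (m.fps / 2.0) / 1e9);
    return 0;
}
```

Roughly 4 GB/s at 8K60 is already beyond what the old SLI bridges were built for, but only a fraction of a single NVLink link's roughly 25 GB/s per direction, which is presumably why 8K Surround is gated behind the dual-link RTX 2080 Ti.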

Unlike at the launch of the Pascal-based GTX 10-series cards, graphics card designs from third-party manufacturers such as MSI, EVGA, and ASUS will be ready and shipping on the same launch date as the Founders Edition (September 20th).


Here's just a small taste of what's in store for the coming weeks as we take a look at these new third-party designs.

Review Terms and Disclosure
All Information as of the Date of Publication
How product was obtained: The product is on loan from NVIDIA for the purpose of this review.
What happens to the product after review: The product remains the property of NVIDIA but is on extended loan for future testing and product comparisons.
Company involvement: NVIDIA had no control over the content of the review and was not consulted prior to publication.
PC Perspective Compensation: Neither PC Perspective nor any of its staff were paid or compensated in any way by NVIDIA for this review.
Advertising Disclosure: NVIDIA has purchased advertising at PC Perspective during the past twelve months.
Affiliate links: This article contains affiliate links to online retailers. PC Perspective may receive compensation for purchases through those links.
Consulting Disclosure: NVIDIA is not a current client of Shrout Research for products or services related to this review. 

September 19, 2018 | 09:11 AM - Posted by Anonymou5 (not verified)

So basically, skip it and wait for 30-series?

September 19, 2018 | 09:37 AM - Posted by tts (not verified)

If you've got a fairly modern card already (i.e., a 1080 or Vega 64), then yeah, pretty much.

Unless you're loaded and don't care about money at all, then OK, do whatever you want. For anyone else who works for a living, a 2080 or 2080 Ti is a real tough sell.

September 19, 2018 | 10:14 AM - Posted by Anonymou5 (not verified)

Insane pricing aside, it seems like ray-tracing is not worth the performance hit. I'm surprised to see the 2080 actually beaten by the 1080Ti in a few cases. I'm sure that some will chalk it up to drivers, etc., but it doesn't inspire confidence. That 2080Ti price is just absurd though.

September 19, 2018 | 11:47 AM - Posted by Thretosix (not verified)

This card really just teases ray tracing. I believe the card has to run at 1080p just to get average-at-best framerates. I'm personally going to wait for the next round, when these features will start to become more mainstream, more efficient, and run at higher resolutions.

September 19, 2018 | 12:31 PM - Posted by Jim Lahey

You're wrong on that, bud. Check other reviews for 4K ray tracing.

September 19, 2018 | 11:06 AM - Posted by Anonymously Anonymous (not verified)

I'd say if you want ray tracing at 1440p or 4K at playable framerates, then most definitely skip this gen, and possibly even the next gen.

Honestly, if one doesn't care about ray tracing but can get a higher-tier card, then get yourself a VRR monitor and a compatible GPU and call it a day. IMO, VRR makes much more of a difference anyway.

September 19, 2018 | 11:55 AM - Posted by Anonymou5 (not verified)

Does this card support VRR? I know it supports G-Sync, but VRR is a very specific HDMI 2.1 feature.

September 20, 2018 | 07:24 PM - Posted by Dribble (not verified)

I'll wait a lot longer. My last card purchase was a 460, back before the 1080 Ti.

September 19, 2018 | 09:15 AM - Posted by Dark_wizzie

Is DLSS supposed to look similar to TAA?

September 19, 2018 | 09:51 AM - Posted by Ken Addison

DLSS is actually supposed to match the visual quality of 64x SSAA. We haven't had a whole lot of time (or software) to dig into this, but it's something we plan on focusing on soon!

September 20, 2018 | 05:14 AM - Posted by Dark_wizzie

Okay good, that's what I originally thought. If a big game somebody loves gets this AA and it works as advertised, then that alone could be enough to warrant upgrading. But that remains to be seen.

September 19, 2018 | 09:17 AM - Posted by Humanitarian

Have we stopped doing video reviews for launches now? Or do we just get the roundup in the podcast? I kinda liked those.

September 19, 2018 | 10:21 AM - Posted by DrMcDingle (not verified)

I'm really waiting for The Verge's review, they seem to really know what they are talking about.

September 19, 2018 | 10:48 AM - Posted by Anonymously Anonymous (not verified)

Isn't "the Verge" the same media outlet that did that insanely bad pc build guide? And then proceeded to block people for criticizing their so-called guide?

September 19, 2018 | 11:22 AM - Posted by WayneJetSki

Yes

September 19, 2018 | 11:25 AM - Posted by SetiroN

Yep, that's why he sarcastically said that.

It's a terrible tech blog that has always been superficial and often facetious, the epitome of which was that video. Their owner, Vox Media, owns multiple equally superficial and generally terrible blogs, like Polygon.

September 19, 2018 | 10:58 AM - Posted by kaldviking (not verified)

Best comment of the day!

September 19, 2018 | 10:55 AM - Posted by CS (not verified)

Running a 970, and will wait for the 2060 series of cards to be released before finally upgrading out of the 900 series.

September 19, 2018 | 10:57 AM - Posted by Kareha

No 1440p results?

September 19, 2018 | 11:09 AM - Posted by collie

It's interesting to see the reviews for this popping up. I think Linus summed it up with "It seems rushed."

At the end of the day, they are betting the farm on real-time ray tracing, and now I wonder how much seed money they will spread (or already have spread) to every corner of the game world to keep said farm.

Question: this ray tracing, is it proprietary or is it an open standard? I've read plenty on how it works and why it's good, but I'm unaware if it's a possible future industry standard or a possible future monopoly.

September 19, 2018 | 11:35 AM - Posted by Anonymouse (not verified)

"Question, this ray tracing, is it proprietary or is it open standard?" Accessible via DXR (DirectX) or Vulkan RT (Vulkan).

September 19, 2018 | 11:29 AM - Posted by KingKookaluke (not verified)

I'm going to skip this series of cards. Maybe the 3000 series will offer something worth the cost of the upgrade.

September 19, 2018 | 01:37 PM - Posted by Jim Lahey

Have fun with 60% of the speed of the fastest graphics card on the market for the next two years.

September 20, 2018 | 07:43 AM - Posted by Anonymous0 (not verified)

I can't speak for him, but I'm having plenty of fun with my 970. I can wait for the 30-series and Navi (both within the next year) before making my decision.

September 19, 2018 | 11:40 AM - Posted by Anonymous5463563 (not verified)

Does SLI scale better with NVLink than with the old bridge? Are AIBs or the thinner Founders cards less hot in SLI? This is useless to me with no SLI data.

September 19, 2018 | 11:46 AM - Posted by Salick27 (not verified)

GTX 980 Ti MSRP: $649

Card released the next generation with similar performance:

GTX 1070 MSRP: $379, FE card $449

Now:

GTX 1080 Ti MSRP: $699 (only price listed on wiki)

Card released the next generation with similar performance:

RTX 2080 MSRP: $699, FE card $799

That's what I am seeing here. That's quite the early-adopter tax for ray tracing "coming soon".

Prices taken from wiki.
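Expressed as ratios (a quick sketch using only the wiki prices quoted above, not adjusted for inflation or exact performance deltas):

```cpp
#include <cstdio>

// Successor price as a fraction of the outgoing flagship's MSRP,
// using the wiki prices quoted in the comment above.
int main() {
    std::printf("GTX 1070 vs GTX 980 Ti:  %.0f%% (FE: %.0f%%)\n",
                100.0 * 379 / 649, 100.0 * 449 / 649);   // ~58% / ~69%
    std::printf("RTX 2080 vs GTX 1080 Ti: %.0f%% (FE: %.0f%%)\n",
                100.0 * 699 / 699, 100.0 * 799 / 699);   // 100% / ~114%
    return 0;
}
```

By that measure, Pascal delivered the outgoing flagship's performance at roughly 60% of its price, while Turing delivers it at par, or at a ~14% premium for the Founders Edition.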

September 19, 2018 | 12:50 PM - Posted by Anonymous2456 (not verified)

This was like a pin in my hype balloon...

I guess I'll be skipping this gen.

Come on Nvidia, this feels like a pre-order for a product where most of the good features are "Coming Soon to an RTX card near you".

September 19, 2018 | 01:24 PM - Posted by Ytterbium (not verified)

Does the Titan V support DLSS?

September 19, 2018 | 02:01 PM - Posted by Godrilla

The price of the 2080 fell to $749 at Microcenter for the MSI FE,
and $649 for a non-reference 1080 Ti by Gigabyte.
Newegg has a Vega 64 selling for $499.

FYI, don't overpay!

September 19, 2018 | 02:55 PM - Posted by PayingOfBleedingEdgeBleedsTheWallet (not verified)

Nvidia's got some great new technology, just like AMD's Vega has that Explicit Primitive Shader IP, etc. The difference is that Nvidia has the billions of dollars extra to provide direct games/gaming-engine development support, so that Nvidia's new ray tracing/AI tensor core IP will get adopted and used in games.

AMD bit off more than it could financially chew in trying to create that Implicit Primitive Shader legacy-gaming-code conversion process in order to, on the fly, allow legacy games to make use of Vega's in-GPU hardware Explicit Primitive Shader IP.

Nvidia is not even trying to do any sort of implicit conversion of legacy games on the fly so that legacy games can make use of any of Turing's RTX ray tracing/tensor core AI IP, and that's because even Nvidia would not succeed at that complex a software task. So Nvidia is just helping game developers explicitly target Nvidia's new ray tracing and AI/tensor core IP in new games.

AMD focused too much time and resources on that Implicit Primitive Shader coding/conversion task's software engineering for legacy games; AMD should instead have spent the money assisting game developers in explicitly targeting Vega's Explicit Primitive Shader hardware IP, and forgotten about legacy games.

So now nothing Primitive Shader related will be ready until Navi is to market, as far as targeting the Explicit Primitive Shader IP that will also be in Navi's GPU micro-architecture. Hopefully, once AMD gets Explicit Primitive Shader IP support into the graphics APIs and its GPU drivers, that support can be back-ported to Vega. AMD is still going to have to support game/gaming-engine developers directly if it wants its GPU features utilized by games.

The best GPU hardware features are useless, no matter the GPU maker involved, unless that GPU maker actively assists the game/gaming-engine makers and the graphics API makers in adopting the new hardware IP. Nvidia will spend the billions necessary because Nvidia has the billions to get the job done. AMD has great technology inside Vega, but it currently lacks the billions to get Vega's IP fully utilized as quickly as Nvidia can. No one's software/firmware engineers work for free, and a lot of those folks can command six-figure salaries, or they can easily go work someplace that will pay six figures plus.

I can see the console game makers being the first to make use of AMD's Explicit Primitive Shader IP, but that will be done on Microsoft's and Sony's/others' dime more than AMD's. Nvidia's also got some new shader IP in Turing that's kind of similar to what AMD's Primitive Shader IP does. But software engineers don't work for free, so expect those costs to be passed on to the consumer.

And still to this day, Joe Gamer does not know the difference between the implicit kind of support, done via a software/middleware conversion process on the fly and via graphics APIs, and the explicit, directly-in-the-hardware kind of support exposed through the drivers and the graphics APIs.

Joe Gamer whines about Nvidia's ray tracing/AI IP as much as Joe Gamer whines about AMD's Explicit Primitive Shaders. And there is only one solution: the GPU makers making gamers pay more for all the extra costs involved in supporting the game/gaming-engine makers to tweak their games.

Nvidia increasing its GPUs' MSRPs/ASPs will help AMD in the long run also. And there are still plenty of older-generation GPUs available from both Nvidia and AMD. So if gamers just want to increase their FPS metrics and do not care about any new IP, they can get dual GPUs and game that way with both Vega- and Pascal-based GPUs, and get the extra FPS to brag to Vern about!

Really, Nvidia is advancing gaming for everyone with that new RT core/AI core IP, the very same way AMD is with the Explicit Primitive Shaders that will be of use in games. It always takes time, and some folks will always have the funds while others will not. So some will have to wait for the price to come down and game a generation behind, and there really is nothing wrong with that.

September 19, 2018 | 03:35 PM - Posted by Moyenni (not verified)

Is there any info on which USB controller the card is using?

September 19, 2018 | 08:20 PM - Posted by Ryan Shrout

It's internal to the GPU; I think it uses the same USB logic they integrated into Tegra.

September 20, 2018 | 04:41 AM - Posted by Moyenni (not verified)

Thanks!
Curious about the behaviour with VR headsets, since they are known to be picky. Of course, future headsets with this connector will be tested on NV cards, so it's not likely to be a problem.
Also, is it possible to plug standard USB storage in there?

September 19, 2018 | 04:45 PM - Posted by elites2012

How is RT any different from modified T&L? I did not see a difference in the demo.

September 19, 2018 | 08:19 PM - Posted by 60000Cores (not verified)

Summary:

2080 = 1080 Ti with ray tracing.
2080 Ti is incredibly fast, but not priced for mere mortals.
7nm stuff next year will be really fast.
Awesome review, PCPer.

Happy with my 1080 Ti at 2139/5800 under water.
Superposition benchmarks:
1080p Extreme: 6,400
4K Optimized: 10,400

September 19, 2018 | 10:32 PM - Posted by Cellar Door

There is no frickin' way this deserves a Gold Award. This is literally the most forced, lame Gold Award I have seen in the history of this site.

It's a Gold Award because "we probably should"... or repercussions.

September 20, 2018 | 02:36 AM - Posted by Pit Nicker (not verified)

Sure, but does anyone take what award something gets into consideration when making a purchasing decision? I don't.
But the people at Nvidia who dole out repercussions probably only care about the conclusions page, and a shiny badge puts them at ease.

September 20, 2018 | 07:55 AM - Posted by Anonymous0 (not verified)

Definitely a Silver for the 2080Ti, possibly Bronze.

Pro:
2080Ti is the fastest gaming card. Pure power. In a vacuum, that would be worth a platinum award. However...

Cons:
Exorbitant price - laughable even
Louder
Hotter
More power hungry
Limited OC demonstrated
No DLSS support at launch
No Ray-tracing support at launch

The 2080 barely deserves an honorable mention. Sub-$500 1080 Tis are showing up on the used market already. Once the 30-series comes out, there'll be a firesale on any remaining 10-series in the wild, and the 20-series will be like a fart on the wind.

September 20, 2018 | 02:53 PM - Posted by Jeremy Hellstrom

Out of curiosity, which GPU do you feel is the fastest available consumer graphics card? That is the verbatim reason the award was given.

Your personal "Gold Award" may have different criteria; we've specified why it gets ours, just like various versions of the Titan have in the past.

September 20, 2018 | 04:09 PM - Posted by anonymous12859877893 (not verified)

Agreed. Just because it's still technically the "fastest" doesn't mean you automatically have to give the product the highest rating. I think we've gotten so accustomed to Nvidia leading the charts with minimal effort that it's just become an automatic, yawn, slap-a-sticker-on-it affair.

The price-to-performance ratio is really bad for a next-gen card. It could also be a sign that GPUs are finally starting to hit the CPU/transistor brick wall, which was bound to happen sooner or later.

September 20, 2018 | 01:08 AM - Posted by ptxfun (not verified)

@Cellar Door, agreed.

"For being the fastest available consumer graphics card, the NVIDIA GeForce RTX 2080 Ti receives the PC Perspective Gold Award. While this GPU is quite pricey at it's $1200 price tag, the performance benefits over the RTX 2080, and previous GTX 10-series GPUs should provide an excellent PC gaming experience for years to come."

For years to come? With <60 fps at 1080p for RTX, that's only the case for current raster games. Let's not forget DLSS renders at lower resolution, better called a blurfest; quincunx, anyone? DLSS 2x has no perf gains over TAA. It's definitely useful to get into dev hands, though. 7nm Ampere will perform where Turing is constrained (even with the large die sizes). Prices won't fall with 7nm, unfortunately...

September 20, 2018 | 02:24 AM - Posted by Kareha

The £400 I recently paid for my 1080 Ti is looking like really good value right now; for a 1440p/165Hz monitor it's absolutely perfect. I have no need for HDR or ray tracing, so the pricing of the new cards is just over the top for technology that I'd likely never switch on in the first place.

September 20, 2018 | 04:58 PM - Posted by AquaVixen (not verified)

So why did you not show us power consumption figures -AFTER- you overclocked the cards? From what I'm reading and understanding here, your displayed power consumption figures are only at stock, with no overclocking applied? How can we know how much power it uses when overclocked if you won't show us? And we have to rely on you, pcper.com, to show us, because literally -NO ONE ELSE ON THE ENTIRE INTERNET- is showing any power consumption figures at all for the RTX 2080 Ti. So please, do show us how much it uses after being overclocked.

September 20, 2018 | 06:34 PM - Posted by cronson

I see that Gold Award but it sounds like a hard pass to me. Until prices drop, I don't see why anyone would buy these cards.

September 20, 2018 | 11:15 PM - Posted by sshk (not verified)

This is still confusing. If I'm coming from a GTX 770 as a productivity (not gaming) user (architecture student), would this or the 1080 Ti make sense? Like, does anyone know if the ray tracing thing is going to carry over into rendering software, or does that seem like a distant outcome, dependent on software developers? I don't really have the funds to drop on buzzword tech that isn't actually future-proofing anything.

September 22, 2018 | 10:25 PM - Posted by tmanini (not verified)

You can check Puget Systems for some nice charts comparing rendering times, across cards, with a variety of software you likely use. If the 1080 Ti holds to be similar to the 2080 (ray tracing notwithstanding), you might expect "similar" performance... It seems some games do reasonably better on the 2080 so far, while several are improved only 5-7%. We have a lot left to learn through testing, including why some are not necessarily improved.
So your $ might be better spent on the 1080 Ti, at least with initial tests just starting to roll out.

September 22, 2018 | 03:21 AM - Posted by PRH (not verified)

2080 Ti (still waiting)?

I think I should have kept my NVIDIA Titan Xp; it got 17,600 on PassMark, much more without a RAID 0 volume. But CPU and disk always come first for speed, no doubt.

The scores posted on that website look like what you get from 8 PCIe lanes, not 16 PCIe lanes. It is confusing; they always make the new devices look better, but are they?

Just 17,600 vs. 14,800 right now (14,800 on PassMark), and even worse without the RAID 0 volume (18,000 vs. their 14,800).

What are you supposed to believe for the 2080 and 2080 Ti, exactly?

September 23, 2018 | 01:00 PM - Posted by Anonymous78777 (not verified)

I understand that there is software that improves video viewing through an NVIDIA video card:
https://developer.nvidia.com/rtx/ngx

1. It's not exactly explained on the site. How does it work? Is it software for real-time movie viewing, or for film production?
2. Suppose I install the SDK: how do I run the software? Is there a driver that will give me high-quality video? How good is that quality?
3. What video card do I need to get the best video quality?
4. Do I need a special movie player? Does it work with YouTube?
5. "DLSS: What Does It Mean for Game Developers?" Is this software for creating games? It doesn't seem suitable for those who want to improve videos (in real time).
https://news.developer.nvidia.com/dlss-what-does-it-mean-for-game-develo...

None of this software is clearly explained.

September 23, 2018 | 09:52 PM - Posted by mthomas (not verified)

Would someone please test the 2080 Ti in Cinebench? I've seen so many tests, but not one Cinebench test.

What would be really nice is a comparison of 2080 Ti and 1080 Ti performance in Cinebench...

Please?

October 1, 2018 | 05:54 PM - Posted by Leebee (not verified)

Why are the cards not sorted by performance?
Vega is always put at the bottom, regardless of whether it beats the 1080...
Is that supposed to be a psycho trick?
