NVIDIA teases RTX 2080 performance and features

Subject: General Tech, Graphics Cards, Shows and Expos | August 22, 2018 - 02:06 PM |
Tagged: turing, RTX 2080, nvidia, geforce, ansel

NVIDIA has been showing off a slideshow in Germany, offering a glimpse at the new features Turing brings to the desktop as well as in-house performance numbers. As you can see below, their testing shows a significant increase in performance over Pascal; it will be interesting to see how the numbers match up once reviewers get their hands on these cards.


While those performance numbers should be taken with a grain of salt or three, the various features the new generation of chip brings to the table will appear as presented. Fans of Ansel will be able to upscale their screenshots to 8K with Ansel AI UpRes, which offers an impressive implementation of anti-aliasing. NVIDIA also showed off a variety of filters you can use to make your screenshots even more impressive.


The GigaRays of real-time ray tracing capability on Turing look very impressive, but with Ansel your card has a lot more time to process reflections, refractions, and shadows, which means your screenshots will look significantly more impressive than what the game shows while you are playing. In the example below you can see how much more detail a little post-processing can add.


There are a wide variety of released and upcoming games which will support these features; 22 were listed by name at the conference. A few of the titles only support some of the new features, such as NVIDIA Highlights; however, the games below should offer full support, as well as framerates high enough to play at 4K with HDR enabled.


Keep your eyes peeled for more news from NVIDIA and Gamescom.

Source: NVIDIA

August 22, 2018 | 02:16 PM - Posted by Jorge santana (not verified)

The 2080 getting 4K 60 means we can get 4K 144-ish with the 2080 Ti OC'd and details turned down.

August 22, 2018 | 02:55 PM - Posted by Originalblu (not verified)

And we thought bullshots were bad before.

Now with Nvidia RTX no screenshots will be representative of in game experience.

All sarcasm aside, this stuff is pretty cool.

August 22, 2018 | 03:32 PM - Posted by Anonymous2 (not verified)

BS slides. Don't believe the hype.

August 22, 2018 | 03:38 PM - Posted by Cyclops

The heck is a DLSS?

Edit: Apparently it means "Deep Learning Super-Sampling" which sounds really pretentious.

August 22, 2018 | 04:33 PM - Posted by Jeremy Hellstrom

If you see DL anything ... you can bet it will be Deep Learning.   It is more than 50% likely to have (AI)  included somewhere very close as well.

August 22, 2018 | 09:08 PM - Posted by Luthair

DL = down low, that is all.

August 22, 2018 | 09:35 PM - Posted by Jeremy Hellstrom

AKA bottom up AI!  :P

August 23, 2018 | 01:46 AM - Posted by Anony mouse (not verified)

DLSS = Down Loading Shady Sh*t

What if the AI wants to kill you and in the form of a Game Ready driver GeForce Experience say look new file.

SkyNet = Win

August 22, 2018 | 05:04 PM - Posted by PretentiousAndMarketingGoHandInHand (not verified)

No, that's just the Tensor Cores running a trained super-sampling algorithm instead of a trained ray-tracing denoising algorithm. So if any sort of image processing can be done quicker with AI, then that's good for gaming at acceptable frame rates.

Turing is new technology, but that's the same problem for Nvidia that AMD has had with its Primitive Shaders IP, and AMD did not have the funds or the time to get its implicit-to-explicit Primitive Shader on-the-fly conversion working for legacy games. The Primitive Shaders are still there on Vega, and once AMD has the money to give the gaming-engine makers and game developers for their time, Vega's explicit Primitive Shaders will be used for gaming. Nvidia's got deeper pockets for paying for game development/software engineering hours, and they pay software engineers big bucks.

Nvidia's Tensor core IP and Ray Tracing will take time also but Nvidia's got billions more dollars to invest to get that process tweaked for Turing. Nvidia's Ray Tracing on Turing is not going to perform perfectly at product introduction but Nvidia has the funds to fix any issues more quickly than AMD could with AMD not having the funds to throw at their problems.

Most folks want to know the Turing variants' TMU/ROP numbers so they can guesstimate how their legacy games will run on Turing compared to Pascal.

Go watch AdoredTV's new video; what he has to say is very interesting, and he does not let Nvidia's marketing off the hook, but he does think the ray-tracing and Tensor Core IP is a great example of Nvidia innovating.

The marketing profession has always been predicated on fibs, half-truths, and obfuscation, and it traces its roots back to the snake-oil salesmen and the first lie ever told.

But most gamers are looking at how their current gaming libraries will perform on Turing, so Nvidia, make with Turing's full TMU/ROP and other specs. WTF, Nvidia!

Of note: any sort of AI cores on the GPU will mean that all those AI ray-tracing denoising algorithms, AI-based DLSS, AI-based anti-aliasing algorithms, and others can and will undergo constant improvements over time, and AI operations run a whole lot faster than doing things the old way once the AI gets the training part done (that training part is done on supercomputing/AI clusters before the trained AI gets loaded onto Turing's AI processing cores).
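The two-phase pattern described above can be sketched with a toy example. To be clear, this is NOT NVIDIA's actual DLSS or denoiser; it just illustrates the split between expensive one-time offline fitting and cheap per-frame inference, using a single "trained" blend weight standing in for a neural network:

```python
# Toy sketch of the train-offline / infer-at-runtime pattern -- not the
# real DLSS algorithm. The expensive fitting happens once (the "cluster"
# step); runtime only applies the frozen result, cheap enough per frame.

def train_offline(pairs):
    """'Training cluster' step: least-squares fit of one blend weight w
    so that w*a + (1-w)*b approximates the true in-between sample hi."""
    num = den = 0.0
    for (a, b), hi in pairs:
        # hi ~ w*a + (1-w)*b  =>  hi - b ~ w*(a - b)
        num += (hi - b) * (a - b)
        den += (a - b) ** 2
    return num / den if den else 0.5

def upscale(samples, w):
    """'On-GPU' step: double the sample count by inserting one predicted
    value between each pair of real samples, using the trained weight."""
    out = []
    for a, b in zip(samples, samples[1:]):
        out.append(a)
        out.append(w * a + (1 - w) * b)
    out.append(samples[-1])
    return out

# "Train" on (neighbouring low-res samples, true high-res sample) pairs,
# then apply the frozen weight at "runtime".
w = train_offline([((0.0, 2.0), 1.0), ((2.0, 4.0), 3.0)])
print(upscale([0.0, 2.0, 4.0], w))  # prints [0.0, 1.0, 2.0, 3.0, 4.0]
```

The point is the asymmetry: `train_offline` could be arbitrarily expensive without affecting frame rates, because only `upscale` runs on the shipped hardware.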

August 22, 2018 | 04:43 PM - Posted by Joe (not verified)

Isn't this just 1080 Ti-level performance? Is the 2080 a 1080 Ti with ray tracing & DLSS? Is that worth $200 more?

August 22, 2018 | 06:12 PM - Posted by Jorge santana (not verified)

NAVI for CONSOLES is basically 1080 performance for $400 consoles; no way it supports REAL-TIME RAY TRACING, and no matter how much $$$ NVIDIA shells out, that alone could delay RAY TRACING adoption for a while.

August 23, 2018 | 01:35 AM - Posted by Anonymous11 (not verified)

Nvidia is pushing Ray Tracing to sell expensive videocards with no new games right now. Nothing wrong with doing this.

Do gamers even really really need 4k? We need bigger better worlds with better AI that's for sure. PUBG needs to be fixed too.

AMD is probably going to win the console hardware again, and they aren't pushing ray tracing right now, so why should anyone else care what flavor Nvidia is selling during this in-between-generations phase?

If I had extra-deep pockets right now I'd buy the new card and brag about it for sure. Look at how many homeless people have iPhones.

Must you have 1080p ray tracing now? All us master racers run our graphics in low settings to get a competitive edge with our $2K desktops anyways. LOL

Why can't Trump get a 25% tariff on video game bots coming out of china?

August 22, 2018 | 07:16 PM - Posted by TUsRumorsAbound (not verified)

"A picture taken by our spies confirms the full specs of TU102 GPU. The full silicon is said to be equipped with 4608 CUDA cores, so Quadro RTX 6000 and 8000 are both using all available cores. The RTX 2080 Ti is therefore not using the full chip.

The specs we are leaking today, also reveal that Turing TU102 has 576 Tensor cores, 72 RT cores, 32 Geometry Units and 288 TMUs. The chip also features 96 Raster Units." (1)

So the leak says TU102 has 96 ROPs; that's the same as GP102's 96 ROPs. Looks like the TMU counts are higher also for TU102. 72 SMs for Turing? I'm not buying that just yet.
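Those unit counts lend themselves to a back-of-the-envelope fill-rate guesstimate. The GP102 figures below are the GTX 1080 Ti's shipping specs, but the TU102 clock is a placeholder assumption, since the leak covers unit counts only:

```python
# Theoretical peak fill rates from unit counts: fill rate = units x clock.
# GP102 numbers are the GTX 1080 Ti's (224 TMUs, 96 ROPs, 1582 MHz boost);
# the TU102 clock of 1545 MHz is an assumption -- only the leaked unit
# counts (288 TMUs, 96 ROPs) are known.

def gtexels_per_s(tmus, clock_mhz):
    """Peak texture fill rate in GTexels/s."""
    return tmus * clock_mhz / 1000.0

def gpixels_per_s(rops, clock_mhz):
    """Peak pixel fill rate in GPixels/s."""
    return rops * clock_mhz / 1000.0

pascal_texel = gtexels_per_s(224, 1582)  # GTX 1080 Ti: ~354 GTexels/s
turing_texel = gtexels_per_s(288, 1545)  # leaked TMUs, guessed clock: ~445
pascal_pixel = gpixels_per_s(96, 1582)   # ~152 GPixels/s
turing_pixel = gpixels_per_s(96, 1545)   # ~148 -- same ROP count, so
                                         # pixel fill barely moves
```

If the leak holds, the interesting takeaway is that texture throughput rises with the TMU count while pixel fill stays roughly flat at 96 ROPs, which is one reason legacy-game scaling is hard to predict from these numbers alone.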

(1)

"Exclusive: NVIDIA GeForce RTX 2080 (Ti) Editors’ Day leaks"

https://videocardz.com/77696/exclusive-nvidia-geforce-rtx-2080-ti-editor...

August 22, 2018 | 09:03 PM - Posted by 50004000 (not verified)

So the magic sauce is DLSS. Looking forward to the 4K numbers for a card with 2944 CUDA cores that is faster than a card with 3584; should be fun.

August 22, 2018 | 10:29 PM - Posted by Highly Illogical (not verified)

Just to be clear the 2080 is the 1080 Ti replacement in effect and that's what the benchmarks should compare. 2070 should be compared to 1080. Pure rip-off cards loaded with features gamers don't need, especially tensor cores.

August 24, 2018 | 04:32 PM - Posted by OhNozYaDontz (not verified)

Lots of games will make use of the AI-accelerated denoising and AI-accelerated AA, and other AI-accelerated image filtering done via those Tensor Cores. Nvidia's got the big bucks to pay the games/gaming-engine developers to include that on many titles. Folks will also at some point begin to make use of Vega's explicit Primitive Shaders, just like games are making use of Vega's Rapid Packed 16-bit math. Hell, even sound processing can be done on those Tensor Cores (matrix math units). And those TensorFlow libraries can be run on any GPU or CPU and get some things done a little quicker, even if that's not as quick as on Nvidia's in-hardware Tensor Cores!

Don't you piss-poor Gaming Gits complain about having to pay more for your gaming addictions! Both Nvidia and AMD should be charging more, as both have those professional-market Graphics/Compute/AI sales that pay an even greater markup than any little Gaming MOOKs could ever afford to pay.

Nvidia is a little more dependent on consumers in the discrete GPU market, but AMD's still getting Vega 10 die sales in the professional GPU market and will be getting Vega 20 die sales also at those great pro markups. Vega comes with every Raven Ridge APU also.

Gamers only deserve the poop TU102 dies that do not make the grade for pro usage from Nvidia, and likewise for AMD with the Vega 10 poop dies that do not make the grade to be used in pro GPU versions. AMD is going to make more money from Epyc CPU sales than Nvidia makes from its GPU sales, and AMD really needs Nvidia to keep those GPU prices even higher. That way AMD can undercut Nvidia's even higher pricing and still make better revenues in spite of AMD's rather limited consumer discrete GPU market share!

Don't Ya Gaming Gits know that it's all ones and zeros and math based, and even those simulated neural synapses on those Tensor Cores are ones-and-zeros based! Ah ha ha, it's just a magic black box to all you gaming Gits in Ya Mamma's basement. Now get off that fat Azz of Yaz and get a job to afford the Green Team's latest, fat little Eric snuggums!

August 25, 2018 | 10:31 AM - Posted by DatSomezTrollinzRightTherezToms (not verified)

Ha Ha Haz, Ahz ha Ha HA, That Tom's Hardware marketing driven "Just Buy It"(Ordering Turing Cards without any independent reviews) article is better at trolling gamers than Me!

Ah Ha Ha Ha HAzzzzzz!

Can you not see, gamers, how stupid you are for letting those filthy marketing monkeys get to you? Apple's hipster hardware junkies are the same, if not even more egregious, in their madly insane desire for that conspicuous-consumption high.

Ah HA HA HA, at the way the marketing monkeys, and the social scientists and psychologists that these billion- and trillion-dollar+ market-cap companies employ, carry on.

Ah Ha Ha, how Tom's is going down the quality drain, and probably Anandtech will follow, among the others that are so in bed with the makers of the products that they, the "tech journalists", are supposed to be independently and objectively reviewing!

AHz Haz HAz HAZ, look at all the fools say goodbye to their money.

Just like the great psychohistorian Hari Seldon's maths had predicted in that brilliant Asimov dude's Foundation trilogy, things are becoming rather difficult in keeping up with the Joneses (or those Kardashian sycophants). And it's great to watch the Empire crumble over the ensuing centuries, as empires have a tendency to so regularly do!

But I'm just a troll, and I'm just in it for your sweet, sweet angst over those foolish gaming-card ePeen issues that the gaming Goobs suffer, as do their urban Goob equivalents, the hipsters, if they cannot afford their Apple-branded ePeen bling bling!

AH Ha ha ha Ha Ha ah ha ha ha...!

August 23, 2018 | 01:57 AM - Posted by nothingvaluableisacquiredduringaugust (not verified)

I'm curious as to what the margin will be between the 2080 Ti and 1080 Ti SLI. Last generation it was about 5%, IIRC. Also, will this be achieved via raw horsepower or sponsored software implementation? Unfortunately, all signs appear to be pointing to the latter.

August 24, 2018 | 07:36 PM - Posted by TheHowzItGoenGuyIsGoingAtItAgain (not verified)

Go watch AdoredTV's new video. He's in his usual form, getting to the bottom of that nefarious Nvidia marketing slide, but it looks like the RTX 2080 is getting some better shader-core cache sizes and improvements in the form of SM/CUDA-core tweaks that will allow for some better performance over Pascal! Memory bandwidth also helps Turing over Pascal at anything higher than 1080p, where the CPU is less the bottleneck, like 1440p on up.

So go and watch the new AdoredTV video; he goes where the more marketing-oriented "tech" websites fear to go! He does compliment Nvidia's new technology progress, as he does for any maker that innovates, AMD also. But he finds that AMD is far behind on gaming compared to Turing.

So it's all about changing over to ray tracing and AI denoising/AI AA and other features that Nvidia is now leading with. AMD's in need of some Tensor Cores running ray-tracing denoising or AI-based AA/other image-processing IP. AMD's Vega does better than Pascal with HDR and 10-bit color, mainly due to the strain on Nvidia's compression and the memory bandwidth constraints of Pascal (see AdoredTV video, 8/24/2018).
