Owning the 2080 Ti is still a fantasy but this will not be the final leak of benchmarks

Subject: General Tech | September 18, 2018 - 01:07 PM |
Tagged: RTX 2080 Ti, nvidia, leak, benchmark

A new leak has sprung from the green team, with a 2080 Ti purportedly showing up in some Final Fantasy XV benchmarks. The cards are in reviewers' hands, so it is possible someone slipped up on their NDA and these numbers accurately depict performance, though this being the internet, it is also likely someone is trolling. If true, the new card is almost 25% faster than the mighty Titan Xp, at least in the Final Fantasy XV benchmark. Unfortunately, it will also cost more than a Titan Xp when it finally arrives.

Drop by The Inquirer for a peek.


"At least that's according to results that popped up in a leaked database of Final Fantasy XV benchmarks, hat tip to TechRadar, in which the RTX 2080 Ti racked up a score of 5,897 compared to the 4,756 achieved by the Titan Xp."

Here is some more Tech News from around the web:



Source: The Inquirer


September 18, 2018 | 02:12 PM - Posted by TheProfitsAreWithCPUsMoreThanGPUsForAMD (not verified)

So the Inquirer points to a TechRadar article, which points to a Wccftech article, which looks at the Final Fantasy XV benchmark database!

And Page hits all around for that one!

And the Wccftech primate-house daily feces-flinging contest begins again, with gaming folk speculating on AMD's demise!

Really, gamers, AMD will make more money off its Zen/Epyc sales than any of its GPU sales, even if AMD could actually compete at the flagship GPU level with Nvidia. AMD never has to care about flagship gaming, thanks to its x86 64-bit IP (AMD created the x86 64-bit ISA extensions that even Intel uses) and the x86 32-bit cross-license it has from Intel. And Intel cross-licenses the x86 64-bit ISA from AMD.

Nvidia has to win the GPU race or Nvidia will be the one to go out of business, as AMD has its CPU market revenue stream, which has more potential for revenue/profit than any GPU market revenue alone. AMD can continue to humor the gamers and focus on consoles and mainstream GPUs, because its Epyc/server revenue potential makes consumer GPU market sales a non-issue for AMD's continued prosperity.

If AMD's CEO continues to play the wise business manager game, then AMD should let Nvidia drastically increase discrete GPU MSRPs/ASPs over the next few years, giving AMD sufficient time to create some RTX-type GPU competition of its own and then field a line of competitive mainstream discrete GPUs with tensor cores/ray tracing cores of their own.

AMD should let both Microsoft and Sony fund most of AMD's tensor core and ray tracing R&D costs while Nvidia continues to drive up the GPU market's average discrete GPU MSRP/ASP figures. That way AMD can undercut Nvidia's already very high average MSRPs/ASPs for Turing and still make a handsome profit from consumer GPU sales.

Nvidia driving up the ASPs on discrete GPUs can also benefit AMD's GPU revenues if AMD plays the business game properly. AMD has what Nvidia cannot match in its x86-based Epyc CPU SKUs, and no discrete GPU can run on its own without some help from a CPU!

I say this to AMD: continue focusing on Epyc and professional Vega (Vega 20 for the professional GPU AI/compute market) and let Nvidia drive its GPU ASPs so high that Kareem Abdul-Jabbar could not reach them with a skyhook. Then AMD can come in and undercut Nvidia's sky-high ASPs just a little, enough for AMD to also profit from those very same sky-high GPU MSRPs/ASPs.

Look at AMD's share prices lately: they are going up on Epyc news alone. AMD is no longer dependent on the consumer market for its continued prosperity! Watch those Epyc CPU and Pro Radeon WX/Instinct GPU revenue streams grow; consumer/gaming sales do not matter as much for AMD's continued survival anymore!


September 18, 2018 | 09:50 PM - Posted by quest4glory


As always.

September 18, 2018 | 11:39 PM - Posted by Anonymously3 (not verified)

... and someone comes in with a random rant about AMD which has nothing to do with the article :D

September 18, 2018 | 03:59 PM - Posted by Anony mouse (not verified)

Wonder how many tech reviewers will ignore the lack of ray tracing games at launch and the performance increase relative to the price increase.

September 18, 2018 | 04:21 PM - Posted by LegacyGamesImprovements (not verified)

I want to see legacy games tested on Turing so readers can see whether there are any substantial improvements in Turing's shader cores, TMUs, and ROPs over those of the Pascal micro-architecture. It looks like the Turing SKUs have the same number of ROPs as their Pascal-generation counterparts.

So those improved Turing shader cores, cache subsystems, ROPs, and TMUs should also show some performance increases in legacy games, if one goes by the Turing whitepaper! Clock speed improvements and power management will help Turing too, but old legacy games that do not use Turing's RT cores or AI/tensor core IP will be a great way to test Turing's non-RTX/AI IP as well.

Do not discount proper GPU power management features, as they can help sustain higher average clock speeds for gaming! So that will also be interesting to see for Turing versus Pascal!

September 18, 2018 | 04:31 PM - Posted by Scott Michaud

While it's good to compare it dollar-to-dollar, the chip they made this time around is absolutely huge. NVIDIA's own costs skyrocketed with Turing, not just their prices. It's not like they're just pocketing the extra $500.

But yes, I would definitely like to see reviewers include conventional titles, dollar performance metrics, etc. in their reporting. That's the point of a review -- to see all relevant (to the reader) aspects of a product.

September 18, 2018 | 04:50 PM - Posted by LegacyGamesImprovements (not verified)

Scott, how long do you think it will take for Blender 3D and other 3D graphics software to begin taking advantage of Turing's in-hardware ray tracing and in-hardware AI/tensor core IP?

Could you ask Nvidia about Blender 3D and other 3D software support that may be incoming for RTX Turing, and get some time frame from the Blender Foundation and other 3D graphics software makers if possible?

I know that Adobe and others are already doing AI-related processing on older-generation Nvidia and AMD GPUs for things like edge/content detection. I'm also seeing more AI-related image filtering done using AI/TensorFlow libraries running on GPUs or even CPUs.

September 19, 2018 | 04:37 AM - Posted by Anony mouse (not verified)

Forbes - New RTX 2080 Benchmarks: 'Final Fantasy XV' Results Reveal Pricing Problem


September 18, 2018 | 06:59 PM - Posted by Anonymous50000 (not verified)

My 1080 Ti at 2139/5800 does 5717 at standard and 4788 at high. The 2080 just looks like a 1080 Ti with ray tracing, and the Ti is just too expensive to be called a gaming card.

September 18, 2018 | 09:14 PM - Posted by Odin (not verified)

I see the 2080 Ti is said to be a 1080 Ti replacement when it's clearly a consumer version of the Titan with RT and tensor cores. The 2080 should be compared to the 1080 Ti, and when you do, you see much less improvement, showing what a waste of money these cards are. By the time we have a decent number of games supporting ray tracing we'll be on next-gen cards anyway. AMD is in the box seat if they price their 7nm cards right.
