Subject: Graphics Cards | September 19, 2018 - 01:35 PM | Jeremy Hellstrom
Tagged: turing, tu102, RTX 2080 Ti, rtx, ray tracing, nvidia, gtx, geforce, founders edition, DLSS
Today is the day the curtain is pulled back and the performance of NVIDIA's Turing-based consumer cards is revealed. If there was a benchmark, resolution, or game that was somehow missed in our review, you will find it below, but make sure to peek at the last page for a list of the games which will support ray tracing, DLSS, or both!
The Tech Report found that the RTX 2080 Ti is an amazing card to use if you are playing Hellblade: Senua's Sacrifice, as it clearly outperforms cards from previous generations as well as the base RTX 2080. In many cases the RTX 2080 matches the GTX 1080 Ti, though with the extra features it is an attractive card for those with GPUs several generations old. There is one small problem for those looking to adopt one of these cards: we have not seen prices like these outside of the Titan series before now.
"Nvidia's Turing architecture is here on board the GeForce RTX 2080 Ti, and we put it through its paces for 4K HDR gaming with some of today's most cutting-edge titles. We also explore the possibilities of Nvidia's Deep Learning Super-Sampling tech for the future of 4K gaming. Join us as we put Turing to the test."
Here are some more Graphics Card articles from around the web:
- MSI GeForce RTX 2080 Ti Gaming X TRIO @ Guru of 3D
- Nvidia RTX 2080 and 2080 Ti review: A tale of two very expensive graphics cards @ Ars Technica
- GeForce RTX 2080 @ Guru of 3D
- RTX 2080 Ti Founder Edition @ Guru of 3D
- Turing RTX 2080 and RTX 2080 Ti Benchmarked with 36 Games @ BabelTechReviews
- NVIDIA GeForce RTX IS HERE. Introducing the GeForce RTX 2080 & RTX 2080 Ti – 4K 60 FPS or bust! Review @ Bjorn3d
- Nvidia GeForce RTX 2080TI & RTX 2080 @ Modders-Inc
- MSI GeForce RTX 2080 Gaming X Trio 8 GB @ TechPowerUp
- ASUS GeForce RTX 2080 STRIX OC 8 GB @ TechPowerUp
- Palit GeForce RTX 2080 Gaming Pro OC 8 GB @ TechPowerUp
- MSI GeForce RTX 2080 Ti Duke 11 GB @ TechPowerUp
- Nvidia GeForce RTX 2080 & 2080 Ti @ Techspot
- ASUS GeForce RTX 2080 Ti STRIX OC 11 GB @ TechPowerUp
- MSI GeForce RTX 2080 Ti Gaming X Trio 11 GB @ TechPowerUp
- NVIDIA GeForce RTX 2080 Ti & RTX 2080 Founders Edition Reviewed @ OCC
- NVIDIA GeForce RTX 2080 Founders Edition 8 GB @ TechPowerUp
- Nvidia RTX 2080 @ Kitguru
- Nvidia RTX 2080 Ti @ Kitguru
- NVIDIA GeForce RTX 2080 Ti Founders Edition 11 GB @ TechPowerUp
- Nvidia Turing GeForce 2080 (Ti) architecture @ Guru of 3D
- NVIDIA Turing GeForce RTX Technology & Architecture @ TechPowerUp
New Generation, New Founders Edition
At this point, it seems that calling NVIDIA's 20-series GPUs highly anticipated would be a bit of an understatement. After months and months of speculation about what these new GPUs would be called, what architecture they would be based on, and what features they would bring, the NVIDIA GeForce RTX 2080 and RTX 2080 Ti were officially unveiled in August, alongside the Turing architecture.
We've already posted our deep dive into the Turing architecture and the TU102 and TU104 GPUs powering these new graphics cards, but here's the short takeaway: Turing improves both memory and shader efficiency, and adds specialized hardware to accelerate deep learning (Tensor cores) and enable real-time ray tracing (RT cores).
| | RTX 2080 Ti | Quadro RTX 6000 | GTX 1080 Ti | RTX 2080 | Quadro RTX 5000 | GTX 1080 | TITAN V | RX Vega 64 (Air) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Base Clock | 1350 MHz | 1455 MHz | 1408 MHz | 1515 MHz | 1620 MHz | 1607 MHz | 1200 MHz | 1247 MHz |
| Boost Clock | 1545 MHz / 1635 MHz (FE) | 1770 MHz | 1582 MHz | 1710 MHz / 1800 MHz (FE) | 1820 MHz | 1733 MHz | 1455 MHz | 1546 MHz |
| Ray Tracing Speed | 10 GRays/s | 10 GRays/s | -- | 8 GRays/s | 8 GRays/s | -- | -- | -- |
| Memory Clock | 14000 MHz | 14000 MHz | 11000 MHz | 14000 MHz | 14000 MHz | 10000 MHz | 1700 MHz | 1890 MHz |
| Memory Interface | 352-bit G6 | 384-bit G6 | 352-bit G5X | 256-bit G6 | 256-bit G6 | 256-bit G5X | 3072-bit HBM2 | 2048-bit HBM2 |
| Memory Bandwidth | 616 GB/s | 672 GB/s | 484 GB/s | 448 GB/s | 448 GB/s | 320 GB/s | 653 GB/s | 484 GB/s |
| TDP | 250 W / 260 W (FE) | 260 W | 250 W | 215 W | 230 W | 180 W | 250 W | 292 W |
| Peak Compute (FP32) | 13.4 TFLOPS / 14.2 TFLOPS (FE) | 16.3 TFLOPS | 10.6 TFLOPS | 10 TFLOPS / 10.6 TFLOPS (FE) | 11.2 TFLOPS | 8.2 TFLOPS | 14.9 TFLOPS | 13.7 TFLOPS |
| Transistor Count | 18.6 B | 18.6 B | 12.0 B | 13.6 B | 13.6 B | 7.2 B | 21.0 B | 12.5 B |
| MSRP (current) | $1200 (FE) | | | | | | | |
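The bandwidth and compute figures in the table follow directly from the other listed specs: peak memory bandwidth is the effective per-pin data rate times the bus width in bytes, and peak FP32 throughput is 2 FLOPs (one fused multiply-add) per CUDA core per clock. A minimal sketch sanity-checking the RTX 2080 Ti Founders Edition numbers (4352 CUDA cores at the 1635 MHz FE boost clock, 14 Gbps GDDR6 on a 352-bit bus):

```python
# Quick sanity check of the spec-table math above.

def mem_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate x bus width in bytes."""
    return data_rate_gbps * bus_width_bits / 8

def peak_fp32_tflops(cuda_cores: int, boost_mhz: float) -> float:
    """Peak FP32 throughput in TFLOPS: 2 FLOPs (one FMA) per core per clock."""
    return 2 * cuda_cores * boost_mhz * 1e6 / 1e12

print(mem_bandwidth_gbs(14, 352))              # 616.0 GB/s, matching the table
print(round(peak_fp32_tflops(4352, 1635), 1))  # 14.2 TFLOPS at the FE boost clock
```

The same formulas reproduce the rest of the row: 11 Gbps on the GTX 1080 Ti's 352-bit bus gives its 484 GB/s, and 14 Gbps on the RTX 2080's 256-bit bus gives 448 GB/s.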
As unusual as it is for them, NVIDIA has decided to release both the RTX 2080 and RTX 2080 Ti at the same time, as the first products in the Turing family.
The TU102-based RTX 2080 Ti features 4352 CUDA cores, while the TU104-based RTX 2080 features 2944, fewer than the GTX 1080 Ti. These new RTX GPUs have also moved to GDDR6 from the GDDR5X we found on the GTX 10-series.
Subject: Graphics Cards | September 16, 2018 - 11:18 AM | Scott Michaud
Tagged: nvidia, rtx, RTX 2080 Ti, RTX 2080
There are two changes to the launch of NVIDIA’s GeForce RTX 20-series of cards. The first change is that the general availability, as in the first possible moment to purchase a GeForce RTX 2080 Ti without a pre-order, has slipped a week, from September 20th to September 27th. The second is that pre-orders of the GeForce RTX 2080 Ti have also slipped. They will ship between September 20th and September 27th, rather than all of them shipping on September 20th.
The GeForce RTX 2080 (without the Ti) will still launch on September 20th.
This was all announced on the NVIDIA forums. The brief, ~six-sentence post did not clarify whether this applied to the OEMs, such as ASUS, EVGA, MSI, PNY, ZOTAC, and Gigabyte. It's entirely possible that they are just referring to the Founders Edition. NVIDIA also did not mention why the delay occurred. Given the relatively short duration, it could be anything from one of the recent natural disasters to accidentally forgetting to add an automatic stop threshold to the pre-order page. Who knows?
The NVIDIA website has been updated to show “Notify Me” instead of “Pre-Order” for the GeForce RTX 2080 Ti, so pre-orders have officially shut down for that product. The regular RTX 2080 is still available for pre-order on NVIDIA’s website, though, so you still have a little time to pre-order those.
You can also, of course, wait for the reviews to make a more informed decision later.
A Look Back and Forward
Although NVIDIA's new GPU architecture, revealed previously as Turing, has been speculated about for what seems like an eternity at this point, we finally have our first look at exactly what NVIDIA is positioning as the future of gaming.
Unfortunately, we can't talk about this card just yet, but we can talk about what powers it.
First though, let's take a look at the journey to get here over the past 30 months or so.
Unveiled in early 2016, Pascal, marked by the launch of the GTX 1070 and 1080, was NVIDIA's long-awaited 16nm successor to Maxwell. Constrained by the oft-delayed 16nm process node, Pascal refined the shader unit design originally found in Maxwell while lowering power consumption and increasing performance.
Next, in May 2017 came Volta, the next (and last) GPU architecture outlined in NVIDIA's public roadmaps since 2013. However, instead of the traditional launch with a new GeForce gaming card, Volta saw a different approach.
Retesting the 2990WX
Earlier today, NVIDIA released version 399.24 of their GeForce drivers for Windows, citing Game Ready support for some newly released games including Shadow of the Tomb Raider, the Call of Duty: Black Ops 4 Blackout beta, and Assetto Corsa Competizione early access.
While this in and of itself is a normal event, we shortly started to get some tips from readers about an interesting bug fix found in NVIDIA's release notes for this specific driver revision.
Specifically addressing performance differences between 16-core/32-thread and 32-core/64-thread processors, this patched issue immediately rang true with our experiences benchmarking the AMD Ryzen Threadripper 2990WX back in August, where we saw some games running at frame rates around 50% lower than on the 16-core Threadripper 2950X.
This particular patch note led us to update our Ryzen Threadripper 2990WX test platform to this latest NVIDIA driver release and see if there were any noticeable changes in performance.
The full testbed configuration is listed below:
| Test System Setup | |
| --- | --- |
| Processor | AMD Ryzen Threadripper 2990WX |
| Motherboard | ASUS ROG Zenith Extreme - BIOS 1304 |
| Memory | 16GB Corsair Vengeance DDR4-3200 (operating at DDR4-2933) |
| Storage | Corsair Neutron XTi 480 SSD |
| Graphics Card | NVIDIA GeForce GTX 1080 Ti 11GB |
| Graphics Drivers | NVIDIA 398.26 and 399.24 |
| Power Supply | Corsair RM1000x |
| Operating System | Windows 10 Pro x64 RS4 (17134.165) |
Included at the end of this article are the full results from our entire suite of game benchmarks from our CPU testbed, but first, let's take a look at some of the games that provided particularly bad issues with the 2990WX previously.
The interesting data points for this testing are the 2990WX scores on the driver revision we tested across every CPU (398.26), the results from the 1/4-core compatibility mode, and the Ryzen Threadripper 2950X. From the wording of the patch notes, we would expect gaming performance between the 16-core 2950X and the 32-core 2990WX to be very similar.
Grand Theft Auto V
GTA V was previously one of the worst offenders in our original 2990WX testing, with the frame rate almost halving compared to the 2950X.
However, with the newest GeForce driver update, we see this gap shrinking to around a 20% difference.
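The gap percentages here are simple relative differences in average frame rate against the 2950X baseline. A minimal sketch of that arithmetic, using hypothetical FPS values for illustration rather than our measured data:

```python
def percent_slower(baseline_fps: float, slower_fps: float) -> float:
    """How much slower (in %) a result is relative to a baseline frame rate."""
    return (baseline_fps - slower_fps) / baseline_fps * 100

# Hypothetical illustration: 2950X at 100 FPS vs. 2990WX at 50 FPS (old driver)
print(percent_slower(100, 50))  # 50.0 -> the frame rate "almost halving"
# vs. the 2990WX reaching 80 FPS on the new driver
print(percent_slower(100, 80))  # 20.0 -> "around a 20% difference"
```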
Subject: General Tech, Graphics Cards | September 7, 2018 - 01:36 PM | Jeremy Hellstrom
Tagged: jon peddie, gpu market share, amd, nvidia
Last week we had a peek at the overall GPU market, including APUs, and the news was not great. This week Jon Peddie released details on the discrete GPU market, which also saw contractions. Looking at this quarter versus last quarter, sales dropped by 28% and are down 5.7% from this time last year, similar to the trend we saw with the total market. If you look back over time, Q2 tends to be a bad quarter for GPU sales, and the current market is actually larger in total volume than it was two years ago, before the mining craze was fully underway.
You can see the details of AMD and NVIDIA's quarter below.
"The market shares for the desktop discrete GPU suppliers shifted in the quarter. Nvidia increased market share from last quarter, while AMD enjoyed an increase in share year-to-year."
Here is some more Tech News from around the web:
- Valve Explains How It Decides Who's a 'Straight Up Troll' Publishing Video Games On Steam @ Slashdot
- Memory production value growth to slow in 2019, says Digitimes Research @ DigiTimes
- British Airways breach sees hackers take-off with customers' payment details @ The Inquirer
- Do you really think crims would do that? Just go on the 'net and exploit a Windows zero-day? @ The Register
- iPhone XS release date, price and specs: Apple's 2018 iPhones look set to be most expensive yet @ The Inquirer
- Voyager 1 left the planet 41 years ago. SpaceX hopes to land on it on Saturday @ The Register
- Tech ARP 20th Anniversary Giveaway Week 2 by Dell!
- CORSAIR T2 ROAD WARRIOR Gaming Chair @ [H]ard|OCP
Subject: Graphics Cards | September 5, 2018 - 05:50 PM | Jeremy Hellstrom
Tagged: amd, GCN, R9 290X, r9 390x, R9 Fury X, RX VEGA 64
[H]ard|OCP have been examining the generational performance differences between GPUs, starting with NVIDIA and moving on to AMD. In this review they compare Hawaii GCN 1.1, Fiji GCN 1.3, and Vega 10 GCN 1.5 across a wide variety of games. AMD is a more interesting case, as they have made more frequent changes to their architecture while tending towards mid-range performance, as opposed to aiming for the high end of performance and pricing. This has led to interesting results, with certain GCN versions offering more compelling upgrade paths than others. Take a close look to see how AMD's GPUs have changed over the past five years.
"Wonder how much performance you are truly getting from GPU to GPU upgrade in games? We take GPUs from AMD and compare performance gained from 2013 to 2018. This is our AMD GPU Generational Performance Part 1 article focusing on the Radeon R9 290X, Radeon R9 390X, Radeon R9 Fury X, and Radeon RX Vega 64 in 14 games."
Here are some more Graphics Card articles from around the web:
- The New 3GB GeForce GTX 1050: Good Product or Misleading Product? @ TechSpot
- Razer Core X @ Kitguru
- Blackmagic external GPU review: A very Apple graphics solution @ Ars Technica
Subject: Graphics Cards | August 28, 2018 - 01:46 PM | Jeremy Hellstrom
Tagged: Radeon Software Adrenalin Edition, radeon, amd, 18.8.2
Hot on the heels of the NVIDIA update, AMD has released a new driver for your Radeon and Vega cards or your APU, with optimizations for Strange Brigade and F1 2018 with a focus on high resolution performance.
In addition to the new games, there are fixes for Far Cry 5 and solutions to problems some users encountered with FRTC and Instant Replay enabled. You can grab them right here.
- Strange Brigade
- Up to 5% faster performance in Strange Brigade™ using Radeon Software Adrenalin Edition 18.8.2 on the Radeon™ RX Vega 64 (8GB) graphics card than with Radeon™ Software Adrenalin Edition 18.8.1 at 3840x2160 (4K).
- Up to 3% faster performance in Strange Brigade™ using Radeon Software Adrenalin Edition 18.8.2 on the Radeon™ RX 580 (8GB) graphics card than with Radeon™ Software Adrenalin Edition 18.8.1 at 2560x1440 (1440p).
- F1 2018
Fixed Issues:
- Some games may experience instability or stutter when playing with FRTC and Instant Replay enabled.
- Upgrade Advisor may not appear in Radeon Settings game manager.
- Far Cry 5 may experience dimmed or grey images with HDR10 enabled on some system configurations.
- Far Cry 5 may experience an application hang when changing video settings on some system configurations.
- Radeon Chill min and max values may not sync on multi GPU system configurations.
- Radeon FreeSync may fail to enable when playing Call of Duty®: Black Ops 4.
Your Mileage May Vary
One of the most interesting things going around the computer hardware communities this past weekend was the revelation from a Reddit user named bryf50 that he had somehow gotten his FreeSync display working with his NVIDIA GeForce GPU.
For those of you that might not be familiar with the particular ins-and-outs of these variable refresh technologies, getting FreeSync displays to work on NVIDIA GPUs is potentially a very big deal.
While NVIDIA GPUs support the NVIDIA G-SYNC variable refresh rate standard, they are not compatible with Adaptive Sync (the technology on which FreeSync is based) displays. Despite Adaptive Sync being an open standard, and an optional extension to the DisplayPort specification, NVIDIA so far has chosen not to support these displays.
However, this creates some major downsides for consumers looking to purchase displays and graphics cards. Due to the lack of interoperability, consumers can get locked into a GPU vendor if they want to continue to use the variable refresh functionality of their display. Plus, Adaptive Sync/FreeSync monitors in general seem to be significantly less expensive for similar specifications.
Subject: General Tech, Graphics Cards, Shows and Expos | August 22, 2018 - 02:06 PM | Jeremy Hellstrom
Tagged: turing, RTX 2080, nvidia, geforce, ansel
NVIDIA has been showing off a slideshow in Germany, offering a glimpse at the new features Turing brings to the desktop as well as in-house performance numbers. As you can see below, their testing shows a significant increase in performance over Pascal; it will be interesting to see how the numbers match up once reviewers get their hands on these cards.
While those performance numbers should be taken with a grain of salt or three, the various features which the new generation of chip brings to the table will appear as presented. For fans of Ansel, you will be able to upscale your screenshots to 8K with Ansel AI UpRes, which offers an impressive implementation of anti-aliasing. They also showed off a variety of filters you can utilize to make your screenshots even more impressive.
The GigaRays of real-time ray tracing capability on Turing look very impressive, but with Ansel your card has a lot more time to process reflections, refractions, and shadows, which means your screenshots will look significantly more impressive than what the game shows while you are playing. In the example below you can see how much more detail a little post-processing can add.
There is a wide variety of released and upcoming games which will support these features; 22 were listed by name at the conference. A few of the titles only support some of the new features, such as NVIDIA Highlights; however, the games below should offer full support, as well as framerates high enough to play at 4K with HDR enabled.
Keep your eyes peeled for more news from NVIDIA and GamesCom.