Subject: Graphics Cards | July 22, 2018 - 03:10 PM | Scott Michaud
Tagged: nvidia, gtx 1170, geforce
Take these numbers with a grain of salt, but WCCFTech has published what they claim are leaked GeForce GTX 1170 benchmark results, found “on Polish hardware forums”. If true, the results show that the graphics card, which would sit below the GTX 1180 in performance, is still above the enthusiast-tier GTX 1080 Ti (at least in 3DMark FireStrike). They also suggest that both the GPU core and the 16GB of memory are running at ~2.5 GHz.
Image Credit: “Polish Hardware Forums” via WCCFTech
So not only would the GTX 1180 be above the GTX 1080 Ti… but the GTX 1170 apparently is too? Also… 16GB on the second-tier card? Yikes.
Beyond the raw performance, new architectures also give NVIDIA the chance to add new features directly to the silicon. That said, FireStrike is an old enough benchmark that it won’t take advantage of tweaks for new features, like NVIDIA RTX, so any gains from those should come above and beyond the increase seen in this score.
Don’t trust every screenshot you see…
Again, if this is true. The source is a picture of a computer monitor, which raises the question, “Why didn’t they just screenshot it?” Beyond that, it’s easy to make a website say whatever you want with the F12 developer tools of any mainstream web browser these days… as I’ve demonstrated in the image above.
Subject: Graphics Cards | June 26, 2018 - 10:01 PM | Scott Michaud
Tagged: nvidia, graphics drivers, geforce
NVIDIA aligns their graphics driver releases with game launches, and today’s 398.36 is for Ubisoft’s The Crew 2. The game comes out on Friday, but the graphics vendors like to give a little room if possible (and a Friday makes that much easier than a Tuesday). NVIDIA is also running a bundle deal – you get The Crew 2 Standard Edition free when you purchase a qualifying GTX 1080, GTX 1080 Ti, GeForce gaming desktop, or GeForce gaming laptop. Personally, I would wait for new graphics cards to launch, but if you need one now then – hey – free game!
Now onto the driver itself.
GeForce 398.36 is actually from the 396.xx branch, which means that it’s functionally similar to the previous drivers. NVIDIA seems to release big changes with the start of an even-numbered branch, such as new API support, and then spend the rest of the release, and its odd-numbered successor, fixing bugs and adding game-specific optimizations. While this means that there shouldn’t be anything surprising, it also means that it should be stable and polished.
This brings us to the bug fixes.
If you were waiting for the blue-screen issue with Gears of War 4 to be fixed on Pascal GPUs, then grab your chainsaws: it should be good to go. Likewise, if you had issues with G-SYNC causing stutter outside of G-SYNC games, such as on the desktop, then that has apparently been fixed, too.
When you get around to it, the new driver is available on GeForce Experience and NVIDIA’s site.
Subject: Systems | January 8, 2018 - 09:00 AM | Scott Michaud
Tagged: CES, ROG, nvidia, Intel, GTX 1080, geforce, coffee lake, asus, CES 2018
ASUS has just announced a high-end gaming desktop: the ROG Strix GL12. It looks like it will be a standard mid-tower form factor with a highly stylized design and, of course, RGB lights. They will pair with Aura Sync, so you can make your case match your keyboard and pretty much whatever else you have from ASUS with RGB lights in it.
The main selling feature of the system, however, is the factory-overclocked Coffee Lake CPU – up to six cores at 4.8 GHz. You can also pair this with an NVIDIA GTX 1080. At first, I found it odd that they didn’t go up to the GTX 1080 Ti given the rest of the system, although I guess they would need to produce stock ahead of time, and it would be risky to have too many enthusiast parts sitting in a warehouse. They don’t state the maximum configurable RAM, but Coffee Lake maxes out at 64 GB so we know that it won’t be more than that. It all depends on whether ASUS wants to make a 32 GB or a 64 GB SKU.
The ASUS ROG Strix GL12 gaming desktop will launch in April. Pricing TBA.
Subject: General Tech, Graphics Cards | January 5, 2018 - 02:59 PM | Jeremy Hellstrom
Tagged: meltdown, spectre, geforce, quadro, NVS, nvidia, tesla, security
If you were wondering whether NVIDIA products are vulnerable to some of the latest security threats, the answer is yes. Your Shield device or GPU is not vulnerable to CVE-2017-5754, aka Meltdown; however, the two variants of Spectre could theoretically be exploited against you.
Variant 1 (CVE-2017-5753): Mitigations are provided with the security update included in this bulletin. NVIDIA expects to work together with its ecosystem partners on future updates to further strengthen mitigations.
Variant 2 (CVE-2017-5715): Mitigations are provided with the security update included in this bulletin. NVIDIA expects to work together with its ecosystem partners on future updates to further strengthen mitigations.
Variant 3 (CVE-2017-5754): At this time, NVIDIA has no reason to believe that Shield TV/tablet is vulnerable to this variant.
The Android-based Shield tablet should be updated to Shield Experience 5.4, which should arrive before the end of the month. Your Shield TV, should you actually still have a working one, will receive Shield Experience 6.3 in the same time frame.
The GPU is a little more complex as there are several product lines and OSes which need to be dealt with. There should be a new GeForce driver appearing early next week for gaming GPUs, with HPC cards receiving updates on the dates you can see below.
There is no reason to expect Radeon and Vega GPUs to suffer from these issues at this time. Intel could learn a bit from NVIDIA's response, which has been very quick and includes their older hardware.
Subject: Graphics Cards | November 17, 2017 - 02:08 PM | Ryan Shrout
Tagged: titan xp, Star Wars, nvidia, jedi order, jedi, geforce, galactic empire, empire
NVIDIA has a coup on its hands this holiday. With the release of Battlefront II today and The Last Jedi next month, a new series of Titan Xp cards is available that will make Star Wars fans giggle with excitement! This is the same Titan Xp performance we expect but with a completely new external design and style, available in both a red-themed Galactic Empire version and a green-themed Jedi Order option.
Check out the video above for the unboxing and my thoughts as I swoon over them...
If you want some more pictures of the goods, I have them here as well.
Do note - though it's hard to recommend a $1200 graphics card to many people, these cards almost seem like a steal considering they are priced at the same cost as the standard Titan Xp models. I know that the price for these custom shrouds in short runs was not cheap, so it's almost like NVIDIA is giving Star Wars fans who double as PC enthusiasts a little gift for the holidays.
Okay, that might be a stretch... But come on, look how awesome these graphics cards look!!
We are working up a full system build (time for my personal upgrade!) with these two GPUs and will have a build log of that up before Christmas. Don't worry, we plan on properly presenting this hardware through an all-glass chassis!
Subject: Graphics Cards | November 7, 2017 - 03:21 PM | Jeremy Hellstrom
Tagged: pascal, nvidia, gtx 1070 ti, geforce, msi
NVIDIA chose to limit the release of their GTX 1070 Ti to reference clocks, with every card sporting the same speeds regardless of the model. That does not mean that the manufacturers skimped on the features which help you overclock successfully. As a perfect example, the MSI GTX 1070 Ti GAMING TITANIUM was built with Hi-C CAPs, Super Ferrite Chokes, Japanese solid capacitors, and a 10-phase PWM. This resulted in an impressive overclock of 2050 MHz on the GPU and a memory frequency of 9 GHz once [H]ard|OCP boosted the power delivered to the card. That boost is enough to meet or even exceed the performance of a stock GTX 1080 or Vega 64 in most of the games they tested.
"NVIDIA is launching the GeForce GTX 1070 Ti today, and we’ve got a custom retail MSI GeForce GTX 1070 Ti GAMING TITANIUM video card to test and overclock, yes overclock, to the max. We’ll make comparisons against GTX 1080/1070, AMD Radeon RX Vega 64 and 56 for a complete review."
Here are some more Graphics Card articles from around the web:
- Nvidia's GeForce GTX 1070 Ti @ The Tech Report
- NVIDIA GeForce GTX 1070 Ti, Takes On The Radeon RX Vega 64 Under Linux @ Phoronix
- MSI GeForce GTX 1070 Ti Titanium 8G @ Guru of 3D
- NVIDIA GeForce GTX 1070 Ti Founders Edition Review @ OCC
- ASUS GTX 1070 Ti STRIX 8 GB @ TechPowerUp
- Colorful iGame GTX 1070 Ti Vulcan X TOP 8 GB @ TechPowerUp
- 34-Way Graphics Card Comparison On Ubuntu 17.10 @ Phoronix
Subject: Graphics Cards | November 2, 2017 - 03:03 PM | Jeremy Hellstrom
Tagged: pascal, nvidia, gtx 1070 ti, geforce
It should come as no surprise to anyone how the GTX 1070 Ti performs: better than a GTX 1070 but not quite as fast as a GTX 1080 ... unless you overclock. With the push of two buttons, Ryan was able to hit 1987 MHz, which surpasses your average GTX 1080 by a fair margin. Hardware Canucks saw 2088 MHz when they overclocked, as well as a memory speed of 8.9 Gbps, which pushed performance past the reference GTX 1080 in many games. Their benchmark suite encompasses a few different games, so you should check to see if your favourites are there.
The real hope for this launch was that prices would change, not so much the actual prices you pay but the MSRPs of cards from both AMD and NVIDIA. For now that has not happened, but perhaps soon it will, though Bitcoin hitting $7,000 does not help.
"NVIDIA’s launch of their new GTX 1070 Ti is both senseless and completely sensible depending on which way you tend to look at things. The emotional among you are going to wonder why NVIDIA is even bothering to introduce a new product into a lineup that’s more than a year old."
Here are some more Graphics Card articles from around the web:
- GeForce GTX 1070 Ti Founder Edition @ Guru of 3D
- GeForce GTX 1070 Ti 2-way FCAT SLI @ Guru of 3D
- MSI GeForce GTX 1070 Ti Gaming @ Guru of 3D
- Palit GeForce GTX 1070 Ti Super Jetstream @ Guru of 3D
- Nvidia GTX 1070 Ti review: A fine graphics card—but price remains high @ Ars Technica
- GTX 1070 Ti Review- 35 Games benchmarked @ BabelTechReviews
- MSI GTX 1070 Ti Gaming 8 GB @ TechPowerUp
- NVIDIA GeForce GTX 1070 Ti Founders Edition 8 GB @ TechPowerUp
- A Quick Look At NVIDIA’s GeForce GTX 1070 Ti @ Techgage
- MSI GTX 1070 Ti Gaming 8G @ Kitguru
- Palit GTX 1070 Ti Super JetStream 8 GB @ TechPowerUp
- Palit GTX 1070 Ti Super JetStream @ Kitguru
- The NVIDIA GeForce GTX 1080 Ti Founders Edition @ TechARP
- MSI GeForce GTX 1080 Ti GAMING X TRIO @ [H]ard|OCP
- Sapphire RX VEGA 64 Limited Edition @ Modders-Inc
- The AMD Radeon RX Vega 64 @ TechARP
Here comes a new challenger
The release of the GeForce GTX 1070 Ti has been an odd adventure. Launched into a narrow window in the product stack between the GTX 1070 and the GTX 1080, the GTX 1070 Ti is a result of the competition from the AMD RX Vega product line. Sure, NVIDIA might have specced out and prepared an in-between product for some time, but it was the release of competitive high-end graphics cards from AMD (for the first time in what seems like forever) that pushed NVIDIA to launch what you see before you today.
With MSRPs of $399 and $499 for the GTX 1070 and GTX 1080 respectively, a new product that fits between them performance-wise has very little room to stretch its legs. Because of that, there are some interesting peculiarities involved with the release cycle surrounding overclocks, partner cards, and more.
But before we get into that concoction, let’s first look at the specifications of this new GPU option from NVIDIA as well as the reference Founders Edition and EVGA SC Black Edition cards that made it to our offices!
GeForce GTX 1070 Ti Specifications
We start with our classic table of details.
| | RX Vega 64 Liquid | RX Vega 64 Air | RX Vega 56 | Vega Frontier Edition | GTX 1080 Ti | GTX 1080 | GTX 1070 Ti | GTX 1070 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Base Clock | 1406 MHz | 1247 MHz | 1156 MHz | 1382 MHz | 1480 MHz | 1607 MHz | 1607 MHz | 1506 MHz |
| Boost Clock | 1677 MHz | 1546 MHz | 1471 MHz | 1600 MHz | 1582 MHz | 1733 MHz | 1683 MHz | 1683 MHz |
| Memory Clock | 1890 MHz | 1890 MHz | 1600 MHz | 1890 MHz | 11000 MHz | 10000 MHz | 8000 MHz | 8000 MHz |
| Memory Interface | 2048-bit HBM2 | 2048-bit HBM2 | 2048-bit HBM2 | 2048-bit HBM2 | 352-bit G5X | 256-bit G5X | 256-bit | 256-bit |
| Memory Bandwidth | 484 GB/s | 484 GB/s | 410 GB/s | 484 GB/s | 484 GB/s | 320 GB/s | 256 GB/s | 256 GB/s |
| TDP | 345 watts | 295 watts | 210 watts | 300 watts | 250 watts | 180 watts | 180 watts | 150 watts |
| Peak Compute | 13.7 TFLOPS | 12.6 TFLOPS | 10.5 TFLOPS | 13.1 TFLOPS | 11.3 TFLOPS | 8.2 TFLOPS | 7.8 TFLOPS | 5.7 TFLOPS |
If you have followed the leaks and stories over the last month or so, the information here isn’t going to be a surprise. The CUDA core count of the GTX 1070 Ti is 2432, only one SM unit less than the GTX 1080. Base and boost clock speeds are the same as the GTX 1080. The memory system includes 8GB of GDDR5 running at 8 GHz, matching the performance of the GTX 1070 in this case. The TDP gets a bump up to 180 watts, in line with the GTX 1080 and slightly higher than the GTX 1070.
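The headline figures in the table can be sanity-checked with two rules of thumb: peak FP32 compute is CUDA cores × 2 FLOPs per clock (one fused multiply-add) × clock speed, and memory bandwidth is bus width in bytes × effective data rate. A minimal sketch using the table's figures (note that vendors sometimes quote TFLOPS at boost rather than base clock, so treat this as illustrative arithmetic):

```python
# Rule-of-thumb peak numbers for the GTX 1070 Ti, using figures
# from the specification table above.

cuda_cores = 2432        # one SM fewer than the GTX 1080's 2560
base_clock_ghz = 1.607   # base clock from the table
bus_width_bits = 256     # GDDR5 interface width
data_rate_gbps = 8.0     # 8 GHz effective GDDR5 transfer rate

# Peak FP32 compute: each CUDA core retires 2 FLOPs (one FMA) per clock.
peak_tflops = cuda_cores * 2 * base_clock_ghz / 1000
print(f"{peak_tflops:.1f} TFLOPS")  # matches the table's 7.8 TFLOPS

# Memory bandwidth: bus width in bytes times effective data rate.
bandwidth_gbs = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gbs:.0f} GB/s")  # matches the table's 256 GB/s
```

The same arithmetic reproduces the GTX 1080's 8.2 TFLOPS (2560 cores at 1607 MHz), which suggests the table's compute figures are quoted at base clock.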
Forza Motorsport 7 Performance
The first full Forza Motorsport title available for the PC, Forza Motorsport 7 on Windows 10 launched simultaneously with the Xbox version earlier this month. With native 4K assets, HDR support, and new visual features like fully dynamic weather, this title is an excellent showcase of what modern PC hardware can do.
Now that both AMD and NVIDIA have released drivers optimized for Forza 7, we've taken an opportunity to measure performance across an array of different GPUs. After some significant performance mishaps with last year's Forza Horizon 3 at launch on PC, we are excited to see if Forza Motorsport 7 brings any much-needed improvements.
For this testing, we used our standard GPU testbed, including an 8-core Haswell-E processor and plenty of memory and storage.
| PC Perspective GPU Testbed | |
| --- | --- |
| Processor | Intel Core i7-5960X Haswell-E |
| Motherboard | ASUS Rampage V Extreme X99 |
| Memory | G.Skill Ripjaws 16GB DDR4-3200 |
| Storage | OCZ Agility 4 256GB (OS), Adata SP610 500GB (games) |
| Power Supply | Corsair AX1500i 1500 watt |
| OS | Windows 10 x64 |
| Drivers | AMD: 17.10.1 (Beta) |
As with a lot of modern console-first titles, Forza 7 defaults to "Dynamic" image quality settings. This means that the game engine is supposed to find the best image settings for your hardware automatically, and dynamically adjust them so that you hit a target frame rate (adjustable between 30 and 60fps) no matter what is going on in the current scene that is being rendered.
While this is a good strategy for consoles, and even for casual PC gamers, it poses a problem for us trying to measure equivalent performance across GPUs. Luckily the developers of Forza Motorsport 7, Turn 10 Studios, still let you disable the dynamic control and configure the image quality settings as you desire.
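The "Dynamic" mode described above is essentially a feedback loop: measure recent frame times, drop quality when the target is missed, and raise it back when there is headroom. A toy sketch of that idea follows; the thresholds, quality levels, and step logic are invented for illustration, as Turn 10's actual heuristics are not public:

```python
# Toy dynamic-quality controller: nudge a discrete quality level so that
# average recent frame times stay near a target frame rate.
# All thresholds here are illustrative, not Turn 10's real logic.

def adjust_quality(quality, recent_frame_times_ms, target_fps=60,
                   min_quality=0, max_quality=4, headroom=0.85):
    target_ms = 1000.0 / target_fps
    avg_ms = sum(recent_frame_times_ms) / len(recent_frame_times_ms)
    if avg_ms > target_ms and quality > min_quality:
        return quality - 1  # missing the frame budget: drop settings
    if avg_ms < target_ms * headroom and quality < max_quality:
        return quality + 1  # comfortable headroom: raise settings
    return quality          # near the target: hold steady

# Example: averaging ~20 ms against a 60 fps (16.7 ms) budget drops quality.
print(adjust_quality(3, [19.8, 20.1, 20.4]))  # -> 2
```

The hysteresis band (only raising quality below 85% of the budget) is what keeps such a controller from oscillating between two levels every frame.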
One quirk, however, is that in order for V-Sync to be disabled, the rendering resolution within the game must match the native resolution of your monitor. This means that if you want to run at 2560x1440 on your 4K monitor, you must first set the resolution within Windows to 2560x1440 in order to run the game in V-Sync off mode.
We did our testing with an array of three different resolutions (1080p, 1440p, and 4K) at maximum image quality settings. We tested both AMD and NVIDIA graphics cards in similar price and performance segments. The built-in benchmark mode for this game was used, which does feature some variance due to dynamic weather patterns. However, our testing within the full game matched the results of the benchmark mode closely, so we used it for our final results.
Right off the bat, I have been impressed at how well optimized Forza Motorsport 7 seems to be on the PC. Compared to the unoptimized disaster that was Forza Horizon 3 when it launched on PC last year, it's clear that Turn 10 Studios and Microsoft have come a long way.
Even gamers looking to play on a 4K display at 60Hz can seemingly get away with the cheaper, and more mainstream GPUs such as the RX 580 or the GTX 1060 with acceptable performance in most scenarios.
Gamers on high-refresh-rate displays don't have the same luxury. If you want to game at a resolution such as 2560x1440 at a full 144Hz, neither the RX Vega 64 nor the GTX 1080 will manage it with maximum image quality settings, although these GPUs appear to be close enough that you could turn down a few settings to achieve your full refresh rate.
For some reason, the RX Vega cards didn't seem to show any scaling in performance when moving from 2560x1440 to 1920x1080, unlike the Polaris-based RX 580 and the NVIDIA options. We aren't quite sure of the cause of this and have reached out to AMD for clarification.
As far as frame times are concerned, we also gathered some data with our Frame Rating capture analysis system.
Taking a look at the first chart, we can see that while the GTX 1080 frame times are extremely consistent, the RX Vega 64 shows some additional variance.
However, the frame time variance chart shows that over 95% of the frame times of the RX Vega 64 come in at under 2ms of variance, which will still provide a smooth gameplay experience in most scenarios. This matches with our experience while playing on both AMD and NVIDIA hardware where we saw no major issues with gameplay smoothness.
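The "frame time variance" metric here is the frame-to-frame change in render time; the claim is that over 95% of the Vega 64's frames differ from their predecessor by less than 2 ms. The core arithmetic looks roughly like this (the capture methodology is PC Perspective's Frame Rating pipeline; the frame times below are made-up sample data):

```python
# Frame-to-frame variance: absolute difference between consecutive frame
# times, then the share of those deltas under a threshold (2 ms here).
# The sample frame times are hypothetical, not captured data.

def share_under_threshold(frame_times_ms, threshold_ms=2.0):
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    under = sum(1 for d in deltas if d < threshold_ms)
    return under / len(deltas)

frame_times = [16.7, 17.1, 16.5, 19.0, 16.8, 16.6, 17.0, 16.9]
share = share_under_threshold(frame_times)
print(f"{share:.0%} of frame-to-frame deltas under 2 ms")
```

A card can have a higher average frame time yet still feel smooth if this delta distribution stays tight, which is why we report variance alongside raw frame rates.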
Forza Motorsport 7 seems to be a great addition to the PC gaming world (if you don't mind using the Microsoft Store exclusively) and will run great on a wide array of hardware. Whether you have an NVIDIA or AMD GPU, you should be able to enjoy this fantastic racing simulator.
Can you hear me now?
One of the more significant downsides to modern gaming notebooks is noise. These devices normally have small fans that have to spin quickly to cool the high-performance components found inside. While the answer for loud gaming desktops might be a nice set of headphones, for notebooks that may be used in more public spaces, that's not necessarily a good solution for friends or loved ones.
Attempting to address the problem of loud gaming notebooks, NVIDIA released a technology called WhisperMode. WhisperMode launched alongside NVIDIA's Max-Q design notebooks earlier this year, but it will work with any notebook enabled with an NVIDIA GTX 1060 or higher. This software solution aims to limit noise and power consumption of notebooks by restricting the frame rate of your game to a reasonable compromise of performance, noise, and power levels. NVIDIA has profiled over 400 games to find this sweet spot and added profiles for those games to WhisperMode technology.
WhisperMode is enabled through the NVIDIA GeForce Experience application.
From GFE, you can also choose to "Optimize games for WhisperMode." This will automatically adjust settings (in-game) to complement the frame rate target control of WhisperMode.
If you want to adjust the frame rate target, that must be done in the traditional NVIDIA Control Panel, and it is set on a per-app basis. The target can be set in intervals of 5 FPS, from 30 up to the maximum refresh rate of your display. Having to move between two pieces of software to tweak these settings seems overly complex; hopefully an upcoming revamp of the NVIDIA software stack will address this user-interface flaw.
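A frame rate target like WhisperMode's is, at its core, a frame limiter: after each frame is rendered, the remainder of the per-frame time budget is spent idle so the GPU never runs flat out, which cuts power draw and fan noise. A generic sketch of that pacing loop (this illustrates the concept only; WhisperMode's driver-level implementation is not public):

```python
import time

# Generic frame limiter: cap a render loop at a target frame rate by
# sleeping off whatever remains of each frame's time budget.
# Illustrative only; not NVIDIA's WhisperMode implementation.

def run_capped(render_frame, target_fps=40, frames=5):
    budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                    # do the actual rendering work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle out the leftover budget

# Example: a trivial ~5 ms "frame" still paces out to 25 ms per frame.
start = time.perf_counter()
run_capped(lambda: time.sleep(0.005))
total = time.perf_counter() - start
print(f"5 frames in {total:.2f}s")
```

Since the GPU finishes each capped frame early and then idles, it can drop to lower clock and voltage states between frames, which is where the noise and power savings come from.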
To put WhisperMode through its paces, we tried it on two notebooks - one with a GTX 1070 Max-Q (the MSI GS63VR) and one with a GTX 1080 Max-Q (the ASUS ROG Zephyrus). Our testing consisted of two games, Metro: Last Light and Hitman. Both of these games were run for 15 minutes to get the system up to temperature and achieve sound measurements that are more realistic to extended gameplay sessions. Sound levels were measured with our Extech 407739 Sound Level Meter placed at a distance of 6 inches from the given notebooks, above the keyboard and offset to the right.