Subject: Graphics Cards | November 17, 2017 - 02:08 PM | Ryan Shrout
Tagged: titan xp, Star Wars, nvidia, jedi order, jedi, geforce, galactic empire, empire
NVIDIA has a coup on its hands this holiday. With the release of Battlefront II today and The Last Jedi next month, a new series of Titan Xp cards is available that will make Star Wars fans giggle with excitement! This is the same Titan Xp performance we expect but with a completely new external design and style, available in both a red-themed Galactic Empire version and a green-themed Jedi Order option.
Check out the video above for the unboxing and my thoughts as I swoon over them...
If you want some more pictures of the goods, I have them here as well.
Do note - though it's hard to recommend a $1200 graphics card to many people, these cards almost seem like a steal considering they are priced the same as the standard Titan Xp models. I know that the price for these custom shrouds in short runs was not cheap, so it's almost like NVIDIA is giving Star Wars fans that double as PC enthusiasts a little gift for the holidays.
Okay, that might be a stretch... But come on, look how awesome these graphics cards look!!
We are working up a full system build (time for my personal upgrade!) with these two GPUs and will have a build log of that up before Christmas. Don't worry, we plan on properly presenting this hardware through an all-glass chassis!
Subject: Graphics Cards | November 7, 2017 - 03:21 PM | Jeremy Hellstrom
Tagged: pascal, nvidia, gtx 1070 ti, geforce, msi
NVIDIA chose to limit the release of their GTX 1070 Ti to reference cards, all sporting the same clocks regardless of the model. That does not mean that the manufacturers skimped on the features which help you overclock successfully. As a perfect example, the MSI GTX 1070 Ti GAMING TITANIUM was built with Hi-C CAPs, Super Ferrite Chokes, Japanese Solid Caps, and a 10-phase PWM. This resulted in an impressive overclock of 2050 MHz on the GPU and a memory frequency of 9 GHz once [H]ard|OCP boosted the power delivered to the card. That boost is enough to meet or even exceed the performance of a stock GTX 1080 or Vega 64 in most of the games they tested.
"NVIDIA is launching the GeForce GTX 1070 Ti today, and we’ve got a custom retail MSI GeForce GTX 1070 Ti GAMING TITANIUM video card to test and overclock, yes overclock, to the max. We’ll make comparisons against GTX 1080/1070, AMD Radeon RX Vega 64 and 56 for a complete review."
Here are some more Graphics Card articles from around the web:
- Nvidia's GeForce GTX 1070 Ti @ The Tech Report
- NVIDIA GeForce GTX 1070 Ti, Takes On The Radeon RX Vega 64 Under Linux @ Phoronix
- MSI GeForce GTX 1070 Ti Titanium 8G @ Guru of 3D
- NVIDIA GeForce GTX 1070 Ti Founders Edition Review @ OCC
- ASUS GTX 1070 Ti STRIX 8 GB @ TechPowerUp
- Colorful iGame GTX 1070 Ti Vulcan X TOP 8 GB @ TechPowerUp
- 34-Way Graphics Card Comparison On Ubuntu 17.10 @ Phoronix
Subject: Graphics Cards | November 2, 2017 - 03:03 PM | Jeremy Hellstrom
Tagged: pascal, nvidia, gtx 1070 ti, geforce
It should come as no surprise to anyone how the GTX 1070 Ti performs: better than a GTX 1070 but not quite as fast as a GTX 1080 ... unless you overclock. With the push of two buttons Ryan was able to hit 1987 MHz, which surpasses your average GTX 1080 by a fair margin. Hardware Canucks saw 2088 MHz when they overclocked, as well as a memory speed of 8.9 Gbps, which pushed performance past the reference GTX 1080 in many games. Their benchmark suite encompasses a fair number of games, so you should check to see if your favourites are there.
The real hope for this launch was that prices would change: not so much the actual prices you pay, but the MSRPs of cards from both AMD and NVIDIA. For now that has not happened, but perhaps soon it will, though Bitcoin hitting $7,000 does not help.
"NVIDIA’s launch of their new GTX 1070 Ti is both senseless and completely sensible depending on which way you tend to look at things. The emotional among you are going to wonder why NVIDIA is even bothering to introduce a new product into a lineup that’s more than a year old."
Here are some more Graphics Card articles from around the web:
- GeForce GTX 1070 Ti Founder Edition @ Guru of 3D
- GeForce GTX 1070 Ti 2-way FCAT SLI @ Guru of 3D
- MSI GeForce GTX 1070 Ti Gaming @ Guru of 3D
- Palit GeForce GTX 1070 Ti Super Jetstream @ Guru of 3D
- Nvidia GTX 1070 Ti review: A fine graphics card—but price remains high @ Ars Technica
- GTX 1070 Ti Review- 35 Games benchmarked @ BabelTechReviews
- MSI GTX 1070 Ti Gaming 8 GB @ TechPowerUp
- NVIDIA GeForce GTX 1070 Ti Founders Edition 8 GB @ TechPowerUp
- A Quick Look At NVIDIA’s GeForce GTX 1070 Ti @ Techgage
- MSI GTX 1070 Ti Gaming 8G @ Kitguru
- Palit GTX 1070 Ti Super JetStream 8 GB @ TechPowerUp
- Palit GTX 1070 Ti Super JetStream @ Kitguru
- The NVIDIA GeForce GTX 1080 Ti Founders Edition @ TechARP
- MSI GeForce GTX 1080 Ti GAMING X TRIO @ [H]ard|OCP
- Sapphire RX VEGA 64 Limited Edition @ Modders-Inc
- The AMD Radeon RX Vega 64 @ TechARP
Here comes a new challenger
The release of the GeForce GTX 1070 Ti has been an odd adventure. Launched into a narrow window of a product stack between the GTX 1070 and the GTX 1080, the GTX 1070 Ti is a result of the competition from the AMD RX Vega product line. Sure, NVIDIA might have specced out and prepared an in-between product for some time, but it was the release of competitive high-end graphics cards from AMD (for the first time in forever, it seems) that pushed NVIDIA to launch what you see before you today.
With MSRPs of $399 and $499 for the GTX 1070 and GTX 1080 respectively, a new product that fits between them performance-wise has very little room to stretch its legs. Because of that, there are some interesting peculiarities involved with the release cycle surrounding overclocks, partner cards, and more.
But before we get into that concoction, let’s first look at the specifications of this new GPU option from NVIDIA as well as the reference Founders Edition and EVGA SC Black Edition cards that made it to our offices!
GeForce GTX 1070 Ti Specifications
We start with our classic table of details.
|RX Vega 64 Liquid||RX Vega 64 Air||RX Vega 56||Vega Frontier Edition||GTX 1080 Ti||GTX 1080||GTX 1070 Ti||GTX 1070|
|Base Clock||1406 MHz||1247 MHz||1156 MHz||1382 MHz||1480 MHz||1607 MHz||1607 MHz||1506 MHz|
|Boost Clock||1677 MHz||1546 MHz||1471 MHz||1600 MHz||1582 MHz||1733 MHz||1683 MHz||1683 MHz|
|Memory Clock||1890 MHz||1890 MHz||1600 MHz||1890 MHz||11000 MHz||10000 MHz||8000 MHz||8000 MHz|
|Memory Interface||2048-bit HBM2||2048-bit HBM2||2048-bit HBM2||2048-bit HBM2||352-bit G5X||256-bit G5X||256-bit||256-bit|
|Memory Bandwidth||484 GB/s||484 GB/s||410 GB/s||484 GB/s||484 GB/s||320 GB/s||256 GB/s||256 GB/s|
|TDP||345 watts||295 watts||210 watts||300 watts||250 watts||180 watts||180 watts||150 watts|
|Peak Compute||13.7 TFLOPS||12.6 TFLOPS||10.5 TFLOPS||13.1 TFLOPS||11.3 TFLOPS||8.2 TFLOPS||7.8 TFLOPS||5.7 TFLOPS|
If you have followed the leaks and stories over the last month or so, the information here isn’t going to be a surprise. The CUDA core count of the GTX 1070 Ti is 2432, only one SM unit less than the GTX 1080. Base and boost clock speeds are the same as the GTX 1080. The memory system includes 8GB of GDDR5 running at 8 GHz, matching the performance of the GTX 1070 in this case. The TDP gets a bump up to 180 watts, in line with the GTX 1080 and slightly higher than the GTX 1070.
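The headline bandwidth and compute numbers in the table above follow directly from the raw specs. A quick sketch (a back-of-the-envelope check, not anything from NVIDIA's tools) shows the arithmetic for the GTX 1070 Ti:

```python
# Sanity-check the GTX 1070 Ti's table entries from its raw specifications.

def memory_bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    """Peak bandwidth in GB/s: data rate (MT/s) x bus width in bytes."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

def peak_fp32_tflops(cuda_cores, clock_mhz):
    """Peak single-precision compute: 2 FLOPs (one FMA) per core per cycle."""
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

# GTX 1070 Ti: 8 GHz effective GDDR5 on a 256-bit bus, 2432 cores at 1607 MHz base
print(memory_bandwidth_gbs(8000, 256))   # 256.0 GB/s, matching the table
print(peak_fp32_tflops(2432, 1607))      # ~7.8 TFLOPS, matching the table
```

The same two formulas reproduce every bandwidth and peak-compute figure in the table from its clock and bus-width rows.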
Forza Motorsport 7 Performance
The first full Forza Motorsport title available for the PC, Forza Motorsport 7 on Windows 10 launched simultaneously with the Xbox version earlier this month. With native 4K assets, HDR support, and new visual features like fully dynamic weather, this title is an excellent showcase of what modern PC hardware can do.
Now that both AMD and NVIDIA have released drivers optimized for Forza 7, we've taken an opportunity to measure performance across an array of different GPUs. After some significant performance mishaps with last year's Forza Horizon 3 at launch on PC, we are excited to see if Forza Motorsport 7 brings any much-needed improvements.
For this testing, we used our standard GPU testbed, including an 8-core Haswell-E processor and plenty of memory and storage.
|PC Perspective GPU Testbed|
|Processor||Intel Core i7-5960X Haswell-E|
|Motherboard||ASUS Rampage V Extreme X99|
|Memory||G.Skill Ripjaws 16GB DDR4-3200|
|Storage||OCZ Agility 4 256GB (OS), Adata SP610 500GB (games)|
|Power Supply||Corsair AX1500i 1500 watt|
|OS||Windows 10 x64|
|Drivers||AMD: 17.10.1 (Beta)
As with a lot of modern console-first titles, Forza 7 defaults to "Dynamic" image quality settings. This means that the game engine is supposed to find the best image settings for your hardware automatically, and dynamically adjust them so that you hit a target frame rate (adjustable between 30 and 60fps) no matter what is going on in the current scene that is being rendered.
While this is a good strategy for consoles, and even for casual PC gamers, it poses a problem for us trying to measure equivalent performance across GPUs. Luckily the developers of Forza Motorsport 7, Turn 10 Studios, still let you disable the dynamic control and configure the image quality settings as you desire.
One quirk, however, is that in order for V-Sync to be disabled, the rendering resolution within the game must match the native resolution of your monitor. This means that if you are running 2560x1440 on your 4K monitor, you must first set the resolution within Windows to 2560x1440 in order to run the game in V-Sync off mode.
We did our testing with an array of three different resolutions (1080p, 1440p, and 4K) at maximum image quality settings. We tested both AMD and NVIDIA graphics cards in similar price and performance segments. The built-in benchmark mode for this game was used, which does feature some variance due to dynamic weather patterns. However, our testing within the full game matched the results of the benchmark mode closely, so we used it for our final results.
Right off the bat, I have been impressed at how well optimized Forza Motorsport 7 seems to be on the PC. Compared to the unoptimized disaster that was Forza Horizon 3 when it launched on PC last year, it's clear that Turn 10 Studios and Microsoft have come a long way.
Even gamers looking to play on a 4K display at 60Hz can seemingly get away with the cheaper, and more mainstream GPUs such as the RX 580 or the GTX 1060 with acceptable performance in most scenarios.
Gamers on high-refresh-rate displays don't appear to have the same luxury. If you want to game at a resolution such as 2560x1440 at a full 144 Hz, neither the RX Vega 64 nor the GTX 1080 will manage it with maximum image quality settings, although these GPUs appear to be close enough that you could turn down a few settings to achieve your full refresh rate.
For some reason, the RX Vega cards didn't seem to show any scaling in performance when moving from 2560x1440 to 1920x1080, unlike the Polaris-based RX 580 and the NVIDIA options. We aren't quite sure of the cause of this and have reached out to AMD for clarification.
As far as frame times are concerned, we also gathered some data with our Frame Rating capture analysis system.
Taking a look at the first chart, we can see while the GTX 1080 frame times are extremely consistent, the RX Vega 64 shows some additional variance.
However, the frame time variance chart shows that over 95% of the frame times of the RX Vega 64 come in at under 2ms of variance, which will still provide a smooth gameplay experience in most scenarios. This matches with our experience while playing on both AMD and NVIDIA hardware where we saw no major issues with gameplay smoothness.
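The variance metric above can be sketched in a few lines: for each frame, take the absolute change from the previous frame time, then ask what share of those swings stay under a threshold. (This is a minimal illustration with made-up numbers; the Frame Rating system derives its frame times from captured video output, and the helper names here are our own.)

```python
# Minimal sketch of a frame-time variance check, hypothetical data.

def frametime_variance(frametimes_ms):
    """Absolute change between successive frame times, in milliseconds."""
    return [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]

def pct_under(values, threshold_ms):
    """Share of samples whose variance falls below the threshold."""
    return sum(v < threshold_ms for v in values) / len(values)

times = [16.7, 16.9, 16.5, 18.2, 16.6, 16.8]   # made-up ~60fps frame times
swings = frametime_variance(times)
print(pct_under(swings, 2.0))  # fraction of frames with < 2 ms of variance
```

A card whose result stays near 1.0 at the 2 ms threshold, like the RX Vega 64 here, will still feel smooth even if its raw frame-time plot looks noisier.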
Forza Motorsport 7 seems to be a great addition to the PC gaming world (if you don't mind using the Microsoft Store exclusively) and will run great on a wide array of hardware. Whether you have an NVIDIA or AMD GPU, you should be able to enjoy this fantastic racing simulator.
Can you hear me now?
One of the more significant downsides to modern gaming notebooks is noise. These devices normally have small fans that have to spin quickly to cool the high-performance components found inside. While the answer for loud gaming desktops might be a nice set of headphones, for notebooks that may be used in more public spaces, that's not necessarily a good solution for friends or loved ones.
Attempting to address the problem of loud gaming notebooks, NVIDIA released a technology called WhisperMode. WhisperMode launched alongside NVIDIA's Max-Q design notebooks earlier this year, but it will work with any notebook enabled with an NVIDIA GTX 1060 or higher. This software solution aims to limit noise and power consumption of notebooks by restricting the frame rate of your game to a reasonable compromise of performance, noise, and power levels. NVIDIA has profiled over 400 games to find this sweet spot and added profiles for those games to WhisperMode technology.
WhisperMode is enabled through the NVIDIA GeForce Experience application.
From GFE, you can also choose to "Optimize games for WhisperMode." This will automatically adjust settings (in-game) to complement the frame rate target control of WhisperMode.
If you want to adjust the Frame Rate Target, that must be done in the traditional NVIDIA Control Panel, and it is done on a per-app basis. The target can be set at intervals of 5 FPS from 30 up to the maximum refresh rate of your display. Having to go between two pieces of software to tweak these settings seems overly complex, and hopefully some upcoming revamp of the NVIDIA software stack will address this user interface flaw.
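The core idea of a frame rate target is simple: each frame gets a fixed time budget, and any time left over after rendering is spent idle rather than burning power on extra frames. Here is an illustrative sketch of that mechanism (our own toy loop, not NVIDIA's implementation, which caps frames at the driver level):

```python
# Toy frame-rate cap: sleep out the remainder of each frame's time budget.
import time

def run_capped(render_frame, target_fps=60, frames=3):
    """Call render_frame() repeatedly, never faster than target_fps."""
    budget = 1.0 / target_fps          # seconds allotted to each frame
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                 # the actual (GPU) work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle instead of rendering more
```

The power and noise savings come from that idle slice: a GPU that can render at 150 FPS but is capped at 60 spends most of each frame budget doing nothing, so it draws less power and its fans spin slower.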
To put WhisperMode through its paces, we tried it on two notebooks - one with a GTX 1070 Max-Q (the MSI GS63VR) and one with a GTX 1080 Max-Q (the ASUS ROG Zephyrus). Our testing consisted of two games, Metro: Last Light and Hitman. Both of these games were run for 15 minutes to get the system up to temperature and achieve sound measurements that are more realistic to extended gameplay sessions. Sound levels were measured with our Extech 407739 Sound Level Meter placed at a distance of 6 inches from the given notebooks, above the keyboard and offset to the right.
A few months ago at Computex, NVIDIA announced their "GeForce GTX with Max-Q Design" initiative. Essentially, the heart of this program is the use of specifically binned GTX 1080, 1070 and 1060 GPUs. These GPUs have been tested and selected during the manufacturing process to ensure lower power draw at the same performance levels when compared to the GPUs used in more traditional form factors like desktop graphics cards.
In order to gain access to these "Max-Q" binned GPUs, notebook manufacturers have to meet specific NVIDIA guidelines on noise levels at thermal load (sub-40 dBA). To be clear, NVIDIA doesn't seem to be offering reference notebook designs to partners (as demonstrated by the variability in design across the Max-Q notebooks), but rather ideas on how they can accomplish the given goals.
At the show, NVIDIA and some of their partners showed off several Max-Q notebooks. We hope to take a look at all of these machines in the coming weeks, but today we're focusing on one of the first, the ASUS ROG Zephyrus.
|ASUS ROG Zephyrus (configuration as reviewed)|
|Processor||Intel Core i7-7700HQ (Kaby Lake)|
|Graphics||NVIDIA GeForce GTX 1080 with Max-Q Design (8GB)|
|Memory||24GB DDR4 (8GB Soldered + 8GBx2 DIMM)|
|Screen||15.6-in 1920x1080 120Hz G-SYNC|
|Storage||512GB Samsung SM961 NVMe|
|Ports||4 x USB 3.0, Audio combo jack|
|Power||50 Wh Battery, 230W AC Adapter|
|Dimensions||378.9mm x 261.9mm x 17.01-17.78mm (14.92" x 10.31" x 0.67"-0.70")|
|Weight||4.94 lbs (2,240 g)|
|OS||Windows 10 Home|
|Price||$2700 - Amazon.com|
As you can see, the ASUS ROG Zephyrus has the specifications of a high-end gaming desktop, let alone a gaming notebook. In some gaming notebook designs, the bottleneck comes down to CPU horsepower more than GPU horsepower. That doesn't seem to be the case here. The powerful GTX 1080 GPU is paired with a quad-core Hyper-Threaded Intel processor capable of boosting up to 3.8 GHz.
Subject: Graphics Cards | June 28, 2017 - 11:00 PM | Scott Michaud
Tagged: epic games, ue4, nvidia, geforce, giveaway
If you are an indie game developer, and you could use a little more GPU performance, NVIDIA is hosting a hardware giveaway. Starting at the end of July, and ongoing until Summer 2018, NVIDIA and Epic Games will be giving away GeForce GTX 1080 and GeForce GTX 1080 Ti cards to batches of Unreal Engine 4 projects.
To enter, you need to share screenshots and videos of your game on Twitter, Facebook, and Instagram, tagging both UnrealEngine and NVIDIA. (The specific accounts are listed on the Unreal Engine blog post that announces this initiative.) They will also feature these projects on both the Unreal Engine and the NVIDIA blog, which is just as valuable for indie projects.
So... hey! Several chances at free hardware!
Subject: Graphics Cards | June 26, 2017 - 12:21 PM | Ryan Shrout
Tagged: radeon, nvidia, mining, geforce, cryptocurrency, amd
It appears that the prediction of mining-specific graphics cards was spot on, and we are beginning to see them released from various AMD and NVIDIA board partners. ASUS has launched both a GP106-based solution and an RX 470 offering, labeled as being built exclusively for mining. And Sapphire has tossed its hat into the ring with RX 470 options as well.
The most interesting release is the ASUS MINING-P106-6G, a card that takes no official NVIDIA or GeForce branding, but is clearly based on the GP106 GPU that powers the GeForce GTX 1060. It has no display outputs, so you won't be able to use this as a primary graphics card down the road. It is very likely that these GPUs have bad display controllers on the chip, allowing NVIDIA to make use of an otherwise unusable product.
The specifications on the ASUS page list this product as having 1280 CUDA cores, a base clock of 1506 MHz, a Boost clock of 1708 MHz, and 6GB of GDDR5 running at 8.0 GHz. Those are identical specs to the reference GeForce GTX 1060 product.
The ASUS MINING-RX470-4G is a similar build but using the somewhat older, but very efficient for mining, Radeon RX 470 GPU.
Interestingly, the ASUS RX 470 mining card has openings for a DisplayPort and HDMI connection, but they are both empty, leaving the single DVI connection as the only display option.
The Mining RX 470 has 4GB of GDDR5, 2048 stream processors, a base clock of 926 MHz and a boost clock of 1206 MHz, again, the same as the reference RX 470 product.
We have also seen Sapphire versions of the RX 470 for mining show up on Overclockers UK with no display outputs and very similar specifications.
In fact, based on the listings at Overclockers UK, Sapphire has four total SKUs, half with 4GB and half with 8GB, binned by clocks and by listing the expected MH/s (megahash per second) performance for Ethereum mining.
These releases show both NVIDIA's and AMD's (and their partners') desire to continue cashing in on the rising coin mining and cryptocurrency craze. For AMD, this provides an outlet for RX 470 GPUs that might have otherwise sat in inventory with the upgraded RX 500-series out on the market. For NVIDIA, using GPUs that have faulty display controllers for mining-specific purposes allows it to better utilize production and gain some additional profit with very little effort.
Those of you still looking to buy GPUs at reasonable prices for GAMING...you remember, what these products were built for...are still going to have trouble finding stock on virtual or physical shelves. Though the value of compute power has been dropping over the past week or so (an expected result of increased interest in the process), I feel we are still on the rising side of this current cryptocurrency trend.
Subject: General Tech | June 23, 2017 - 05:13 PM | Ryan Shrout
Tagged: nvidia, gtx, geforce gtx usb drive, geforce
What started as merely an April Fool's prank by NVIDIA has now turned into one of the cutest little promotions I've ever seen. Originally "launched" as part of the GeForce G-ASSIST technology that purported to offer AI-enabled gaming if you were away from your keyboard, NVIDIA actually built the tiny, adorable, GeForce GTX USB Key.
This drive was made to look like the GeForce GTX 1080 Founders Edition graphics card and was only produced in a quantity of 1080. I happened to find a 64GB option in a FedEx box this morning when I came into the office.
Performance on this USB 3.0 based drive is pretty solid, peaking at 111 MB/s on reads and 43 MB/s on writes.
If you want one of these for yourself, you need to be signed up through GeForce Experience and opted in to the GeForce newsletter. Do that, and you're entered.
We have some more pictures of the USB drive below (including the surprising interior shot!), so click this link to see them.