Subject: Graphics Cards | December 9, 2018 - 05:40 PM | Tim Verry
Tagged: pascal, msi, GP104, GeForce GTX 1060, armor
MSI is launching a refreshed GTX 1060 graphics card that uses GDDR5X rather than GDDR5 for its 6GB of video memory. The aptly named GTX 1060 Armor 6GD5X OC shares many features with the existing Armor 6G OC (and OCV1) it refreshes, including the dual TORX fan Armor 2X cooler and five display outputs (three DisplayPort 1.4, one HDMI 2.0b, and one DVI-D), of which up to four can be driven simultaneously.
The Pascal-based GPU in the upcoming card is reportedly a cut-down variant of NVIDIA's larger GP104 chip rather than the GP106-400 used in previous GTX 1060s, but the core count and other compute resources remain the same: 1,280 CUDA cores, 80 TMUs, 48 ROPs, and a 192-bit memory bus. Clock speeds are slightly higher than reference, however, at 1544 MHz base and up to 1759 MHz boost. The GPU is paired with 6 GB of GDDR5X that is curiously clocked at 8 GHz. The memory more than likely has quite a bit of overclocking headroom versus GTX 1060 6GB cards using GDDR5, but it appears MSI is leaving those pursuits for enthusiasts to explore on their own.
MSI is equipping the GTX 1060 Armor 6GD5X OC with an 8+6-pin PCI-E power connection, which should help overclockers push the card as far as it can go (previous GTX 1060 Armor OC cards had only a single 8-pin). According to the specification page, the new card is slightly shorter than the GDDR5-based card but has a thicker cooler, measuring 276 x 140 x 41 mm. As part of the Armor series, the card has a white and black design like its predecessors.
MSI has not yet released pricing or availability information, but with the GDDR5-based cards priced at around $275, I would expect the MSI GTX 1060 Armor 6GD5X OC to sit around $290 at launch.
I am curious how well new GTX 1060 graphics cards will perform when paired with faster GDDR5X memory and how the refreshed cards stack up against AMD's refreshed Polaris 30 based RX 590 graphics cards.
- The GeForce GTX 1060 6GB Review - GP106 Starting at $249
- NVIDIA GeForce GTX 1060 Preview: Pascal with GP106
Subject: Graphics Cards | December 7, 2017 - 01:31 AM | Tim Verry
Tagged: pascal, GTX 1070Ti, GP104, gigabyte, aorus
Gigabyte is jumping into the custom GTX 1070 Ti fray with the Aorus branded GeForce GTX 1070 Ti Aorus. The new custom graphics card measures 280 x 111 x 38mm and features a WindForce 3 cooler with backplate and a custom 6+2 power phase.
Backlit by RGB Fusion LEDs, the Aorus logo sits on the side of the card and can be configured to the color of your choice. The shroud is black with orange accents and has sharp stealth angles that are minimal compared with other cards. Gigabyte is using a fairly beefy cooler: three 80mm fans push air over a heatsink that consists of three fin stacks connected by four composite heatpipes. The heatpipes make direct contact with the GPU, a metal plate with thermal pads covers the GDDR5 memory chips, and the rightmost fin stack cools the MOSFETs. Additionally, the full-cover backplate adds rigidity to the card and has a copper plate to draw excess heat from the underside of the GPU.
The Aorus graphics card is powered by a single 8-pin PCI-E power connector that feeds a 6+2 power phase. External video outputs include one DVI, one HDMI 2.0b, and three DisplayPort 1.4 ports.
The Pascal-based GP104-300 GPU (2,432 CUDA cores, 152 TMUs, and 64 ROPs) is clocked at 1607 MHz base and 1683 MHz boost, which is the maximum vendors can ship the cards at out of the box. Gigabyte offers one-click overclocking through its Aorus Graphics Engine software and guarantees overclocks of at least 88 MHz, to 1683 MHz base and 1771 MHz boost. Users can, of course, use other software like MSI Afterburner or EVGA Precision X if they wish, but will need the Gigabyte tool for the single-click automatic overclock. The 8GB of GDDR5 memory is stock clocked at 8008 MHz and sits on a 256-bit bus.
Looking online, the GTX 1070 Ti Aorus doesn’t appear to be available for sale quite yet, but it should be coming soon. With the Gigabyte GTX 1070 Ti Gaming card coming in at $469, I’m betting the Aorus card with its guaranteed overclock will have an MSRP around $500.
Subject: Graphics Cards | September 27, 2016 - 10:04 PM | Tim Verry
Tagged: water cooling, pascal, hybrid cooler, gtx 1070, GP104, evga
EVGA is preparing to launch the GTX 1070 FTW Hybrid which is a water cooled card that pairs NVIDIA's GTX 1070 GPU with EVGA's Hybrid cooler and custom FTW PCB. The factory overclocked graphics card is currently up for pre-order for $500 on EVGA's website.
The GTX 1070 FTW Hybrid uses EVGA's custom PCB that features two 8-pin power connectors that drive a 10+2 power phase and dual BIOS chips. The Hybrid cooler includes a shrouded 100mm axial fan and a water block that directly touches both the GPU and the memory chips. The water block connects to an external 120mm radiator and a single fan that can be swapped out and/or powered by a motherboard using a standard four pin connector. Additionally, the cooler has a metal back plate and RGB LED back-lit EVGA logos on the side and windows on the front. Display outputs include one DVI, one HDMI, and three DisplayPort connectors.
As far as specifications go, EVGA did not get too crazy with the factory overclock, but users should be able to push it quite far on their own, assuming they get a decent chip in the silicon lottery. The GP104 GPU has 1920 CUDA cores clocked at 1607 MHz base and 1797 MHz boost, while the 8 GB of memory is left at the stock 8,000 MHz. For comparison, reference clock speeds are 1506 MHz base and 1683 MHz boost.
Interestingly, EVGA rates the GTX 1070 FTW Hybrid at 215 watts versus the reference card's 150 watts. It is also the same TDP rating as the GTX 1080 FTW Hybrid card.
The table below outlines the specifications of EVGA's water cooled card compared to the GTX 1070 reference GPU and the GTX 1080 FTW Hybrid.
| | GTX 1070 | GTX 1070 FTW Hybrid | GTX 1080 FTW Hybrid |
| --- | --- | --- | --- |
| Rated Clock | 1506 MHz | 1607 MHz | 1721 MHz |
| Boost Clock | 1683 MHz | 1797 MHz | 1860 MHz |
| Memory Clock | 8000 MHz | 8000 MHz | 10000 MHz |
| TDP | 150 watts | 215 watts | 215 watts |
| MSRP (current) | $379 ($449 FE) | $500 | $730 |
According to EVGA, the Hybrid cooler brings GPU and memory temperatures down to 45°C and 57°C respectively, compared to reference temperatures of 80°C and 85°C. Keeping in mind that these are EVGA's own numbers (you can see our Founders Edition temperature results here), the Hybrid cooler seems well suited to keeping Pascal GPUs in check even when overclocked. In reviews of the GTX 1080 FTW Hybrid, reviewers found that the cooler allowed stable 2 GHz+ GPU clock speeds, letting the card hit its maximum boost clock and stay there under load. Hopefully the GTX 1070 version will have similar results. Since the memory chips are cooled by the water loop, I am interested to see whether they will be capable of hitting at least the 10 GHz of the 1080 cards, if not more.
You can find more information on the factory overclocked water cooled graphics card on EVGA's website. The card is available for pre-order at $500 with a 3 year warranty.
Pricing does seem a bit high at first glance, but looking around at other custom GTX 1070 cards, it is only at about a $50 premium which is not too bad in my opinion. I will wait to see actual reviews before I believe it, but if I had to guess the upcoming card should have a lot of headroom for overclocking and I'm interested to see how far people are able to push it!
Is Enterprise Ascending Outside of Consumer Viability?
So a couple of weeks have gone by since the Quadro P6000 (update: was announced) and the new Titan X launched. With them, we received a new chip: GP102. Since Fermi, NVIDIA has labeled their GPU designs with a G, followed by a single letter for the architecture (F, K, M, or P for Fermi, Kepler, Maxwell, and Pascal, respectively), which is then followed by a three digit number. The last digit is the most relevant one, however, as it separates designs by their intended size.
Typically, 0 corresponds to a ~550-600mm2 design, which is about as large a design as fabs can create without error-prone techniques, like multiple exposures (update for clarity: trying to precisely overlap multiple designs to form a larger integrated circuit). 4 corresponds to ~300mm2, although GM204 was pretty large at 398mm2, likely to increase the core count while remaining on the 28nm process. Higher numbers, like 6 or 7, fill out the lower-end SKUs until NVIDIA essentially stops caring for that generation. So when we moved to Pascal, jumping two whole process nodes, NVIDIA looked at their wristwatches and said “about time to make another 300mm2 part, I guess?”
The GTX 1080 and the GTX 1070 (GP104, 314mm2) were born.
NVIDIA already announced a 600mm2 part, though. The GP100 had 3840 CUDA cores, HBM2 memory, and an ideal ratio of 1:2:4 between FP64:FP32:FP16 performance. (A 64-bit chunk of memory can store one 64-bit value, two 32-bit values, or four 16-bit values, unless the register is attached to logic circuits that, while smaller, don't know how to operate on the data.) This increased ratio, even over Kepler's 1:3 FP64:FP32, is great for GPU compute, but wasted die area for today's (and tomorrow's) games. I'm predicting that it takes the wind out of Intel's sails, as Xeon Phi's 1:2 FP64:FP32 performance ratio is one of its major selling points, leading to its inclusion in many supercomputers.
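The storage arithmetic in the parenthetical above can be sketched with Python's struct module; this is just an illustration of the packing, not of how GPU registers are actually addressed:

```python
import struct

# One 64-bit (8-byte) chunk can hold one FP64, two FP32, or four FP16 values.
one_double = struct.pack("<d", 3.141592653589793)       # one 64-bit value
two_floats = struct.pack("<2f", 1.5, -2.25)             # two 32-bit values
four_halves = struct.pack("<4e", 1.0, 2.0, 3.0, 4.0)    # four 16-bit values

# All three interpretations occupy exactly the same amount of storage.
assert len(one_double) == len(two_floats) == len(four_halves) == 8
```

Whether the hardware can actually operate on the narrower values at higher rates depends on the surrounding logic, which is exactly the die-area trade-off discussed above.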
Despite the HBM2 memory controller supposedly being smaller than a GDDR5(X) one, NVIDIA could still save die space while still providing 3840 CUDA cores (a few of which are disabled on Titan X). The trade-off is that FP64 and FP16 performance had to decrease dramatically, from 1:2 and 2:1 relative to FP32, all the way down to 1:32 and 1:64. This new design comes in at 471mm2, although it's $200 more expensive than what the 600mm2 products, GK110 and GM200, launched at. Smaller dies provide more products per wafer and, better, the number of defective dies per wafer should stay relatively constant, so a larger fraction of each wafer is sellable.
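As a rough sketch of why the smaller die helps, here is the classic gross-die estimate combined with a Poisson yield model. The 300 mm wafer size is standard, but the defect density of 0.1 defects/cm2 is an arbitrary illustrative assumption; the real figures are TSMC's secret:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic gross-die estimate: wafer area over die area, minus edge loss."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
    """Poisson model: probability that a die lands with zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for area_mm2 in (471, 600):  # GP102-class vs. GK110/GM200-class die sizes
    gross = dies_per_wafer(area_mm2)
    good = gross * poisson_yield(area_mm2)
    print(f"{area_mm2} mm2: ~{gross} gross dies, ~{good:.0f} expected good dies")
```

Under these assumptions the 471mm2 die gets both more candidates per wafer and a higher per-die yield, which compounds into noticeably more sellable chips.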
Anyway, that aside, it puts NVIDIA in an interesting position. Splitting the xx0-class chip into xx0 and xx2 designs allows NVIDIA to lower the cost of their high-end gaming parts, although it cuts out hobbyists who buy a Titan for double-precision compute. More interestingly, it leaves around 150mm2 for AMD to sneak in a design that's FP32-centric, leaving them a potential performance crown.
Image Credit: ExtremeTech
On the other hand, as fabrication node changes are becoming less frequent, it's possible that NVIDIA could be leaving itself room for Volta, too. Last month, it was rumored that NVIDIA would release two architectures at 16nm, in the same way that Maxwell shared 28nm with Kepler. In this case, Volta, on top of whatever other architectural advancements NVIDIA rolls into that design, can also grow a little in size. At that time, TSMC would have better yields, making a 600mm2 design less costly in terms of waste and recovery.
If this is the case, we could see the GPGPU folks receiving a new architecture once for every second gaming (and professional graphics) architecture. That is, unless you are a hobbyist. If you are? I would need to be wrong, or NVIDIA would need to somehow bring their enterprise SKU down to an affordable price point. The xx0 class seems to have been pushed up and out of viability for consumers.
Or, again, I could just be wrong.
Take your Pascal on the go
Easily the strongest growth segment in PC hardware today is in the adoption of gaming notebooks. Ask companies like MSI and ASUS, even Gigabyte, as they now make more models and sell more units of notebooks with a dedicated GPU than ever before. Both AMD and NVIDIA agree on this point and it’s something that AMD was adamant in discussing during the launch of the Polaris architecture.
Both AMD and NVIDIA predict massive annual growth in this market – somewhere on the order of 25-30%. For an overall culture that continues to believe the PC is dying, seeing projected growth this strong in any segment is not only amazing, but welcome to those of us that depend on it. AMD and NVIDIA have different goals here: GeForce products already have 90-95% market share in discrete gaming notebooks. In order for NVIDIA to see growth in sales, the total market needs to grow. For AMD, simply taking back a portion of those users and design wins would help its bottom line.
But despite AMD’s early talk about getting Polaris 10 and 11 in mobile platforms, it’s NVIDIA again striking first. Gaming notebooks with Pascal GPUs in them will be available today, from nearly every system vendor you would consider buying from: ASUS, MSI, Gigabyte, Alienware, Razer, etc. NVIDIA claims to have quicker adoption of this product family in notebooks than in any previous generation. That’s great news for NVIDIA, but might leave AMD looking in from the outside yet again.
Technologically speaking though, this makes sense. Despite the improvement that Polaris made on the GCN architecture, Pascal is still more powerful and more power efficient than anything AMD has been able to produce. Looking solely at performance per watt, which is really the defining trait of mobile designs, Pascal is as dominant over Polaris as Maxwell was over Fiji. And this time around NVIDIA isn’t messing with cut-back parts under rebranded names – GeForce is diving directly into gaming notebooks in a way we have only seen with one release.
The ASUS G752VS OC Edition with GTX 1070
Do you remember our initial look at the mobile variant of the GeForce GTX 980? Not the GTX 980M mind you, the full GM204 operating in notebooks. That was basically a dry run for what we see today: NVIDIA will be releasing the GeForce GTX 1080, GTX 1070 and GTX 1060 to notebooks.
Subject: Graphics Cards | July 6, 2016 - 07:15 AM | Scott Michaud
Tagged: pascal, nvidia, htc vive, GTX 1080, gtx 1070, GP104
NVIDIA is working on a fix to allow the HTC Vive to be connected to the GeForce GTX 1070 and GTX 1080 over DisplayPort. The HTC Vive apparently has the choice between HDMI and Mini DisplayPort, but the headset will not be identified when connected over the latter. Currently, the two workarounds are to connect the HTC Vive over HDMI, or to use a DisplayPort to HDMI adapter if your card's HDMI output is already occupied.
It has apparently been an open issue for over a month now. That said, NVIDIA's Manuel Guzman has acknowledged the issue. Other threads claim that other displays have a similar issue, and, within the last 24 hours, some users have had luck modifying their motherboard's settings. I'd expect it's something NVIDIA can fix in an upcoming driver, though. For now, I guess plan your monitor outputs accordingly if you were planning on getting the HTC Vive.
Subject: Graphics Cards | July 5, 2016 - 07:01 AM | Scott Michaud
Tagged: msi, GTX 1080, gtx 1070, GP104, duke
Getting a custom-cooled GTX 1080 (for around its MSRP) basically involves monitoring Newegg several times per day for a good business week or two, pouncing on whatever isn't marked up. Whether it's low supply or high demand, add-in board vendors haven't stopped announcing new models.
Image Credit: EXPReview
The MSI GTX 1080 8G DUKE is a three-fan (“TriFrozr”) design with an 8-pin and a 6-pin PCIe power connector, which provides 75W more headroom than the Founders Edition. EXPReview claims that it slides in between the AERO and GAMING lines. Although they don't say how it matches up to ARMOR, which also sits between AERO and GAMING, it looks like it lands slightly above it, with its RGB LEDs. The GTX 1080 version is factory overclocked to 1708 MHz with a 1847 MHz boost, and the GTX 1070 version to 1607 MHz with a 1797 MHz boost.
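The 75W figure checks out against the nominal PCI Express power limits (75W from the x16 slot, 75W from a 6-pin connector, 150W from an 8-pin); note that a card's actual power limit is set in firmware and can differ from this connector budget:

```python
# Nominal PCI Express power budgets (per the PCIe CEM spec).
SLOT_W = 75        # x16 slot
SIX_PIN_W = 75     # 6-pin auxiliary connector
EIGHT_PIN_W = 150  # 8-pin auxiliary connector

founders_edition = SLOT_W + EIGHT_PIN_W       # FE: single 8-pin
duke = SLOT_W + EIGHT_PIN_W + SIX_PIN_W       # DUKE: 8-pin + 6-pin

print(f"FE budget: {founders_edition} W, DUKE budget: {duke} W, "
      f"extra headroom: {duke - founders_edition} W")
```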
Launch regions are not listed for the cards, but the launch price is supposedly 5399 Chinese Yuan (which converts to $810 USD) and 3499 Chinese Yuan ($524.70 USD) for the GTX 1070. This is quite a bit higher than we would expect, but I'm not sure how regional pricing on electronics works between the USA and China.
Subject: Graphics Cards | June 27, 2016 - 04:55 PM | Jeremy Hellstrom
Tagged: pascal, nvidia, GTX 1080, gtx, GP104, geforce, founders edition
You have already seen our delve into the frame times provided by the GTX 1080, but perhaps you would like another opinion. The Tech Report also uses the FCAT process we depend upon to bring you frame time data; however, they present it in a slightly different way, which might help you comprehend the results. They also included Crysis 3 to ensure that the card can indeed play it. Check out their full review here.
"Nvidia's GeForce GTX 1080 is the company's first consumer graphics card to feature its new Pascal architecture, fabricated on a next-generation 16-nm process. We dig deep into the GTX 1080 to see what the confluence of these advances means for the high-end graphics market."
Here are some more Graphics Card articles from around the web:
- Gigabyte GeForce GTX 1080 G1 Gaming Review @HiTech Legion
- NVIDIA GeForce GTX 1080 SLI @ techPowerUp
- NVIDIA GeForce GTX 1070 Review: A Look At 1440p, 4K & Ultrawide Gaming @ Techgage
- Asus Republic Of Gamers Strix GTX 1070 Aura RGB OC @ Kitguru
- MSI Gaming 3 and 4-way SLI Bridge Connector Review @ OCC
- Radeon R9 380 vs. GeForce GTX 960 @ Hardware Secrets
Subject: Graphics Cards | June 21, 2016 - 05:22 PM | Scott Michaud
Tagged: nvidia, fermi, kepler, maxwell, pascal, gf100, gf110, GK104, gk110, GM204, gm200, GP104
Techspot published an article that compared eight GPUs across six high-end dies from NVIDIA's last four architectures, Fermi through Pascal. Average frame rates were listed across nine games, each measured at three resolutions: 1366x768 (~720p HD), 1920x1080 (1080p FHD), and 2560x1600 (WQXGA, close to 1440p QHD).
The results are interesting. Comparing GP104 to GF100, mainstream Pascal is typically on the order of four times faster than big Fermi. Over that time, we've had three full generational leaps in fabrication technology, leading to over twice the number of transistors packed into a die that is almost half the size. It does, however, show that prices have remained relatively constant, except that the GTX 1080 is sort of priced in the x80 Ti category despite its die size placing it in the non-Ti class. (They list the 1080 at $600, but you can't really find anything outside the $650-700 USD range.)
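The density claim works out if you plug in the commonly published die stats (roughly 3.0 billion transistors at 529mm2 for GF100, and 7.2 billion at 314mm2 for GP104); a quick back-of-the-envelope check:

```python
# Commonly published die stats (approximate): big Fermi vs. mainstream Pascal.
gf100_transistors_b, gf100_area_mm2 = 3.0, 529   # GF100, 40 nm
gp104_transistors_b, gp104_area_mm2 = 7.2, 314   # GP104, 16 nm

transistor_ratio = gp104_transistors_b / gf100_transistors_b
area_ratio = gp104_area_mm2 / gf100_area_mm2
density_ratio = transistor_ratio / area_ratio

print(f"{transistor_ratio:.1f}x the transistors in {area_ratio:.2f}x the area "
      f"= {density_ratio:.1f}x the transistor density")
```

Over twice the transistors in roughly 60% of the area comes out to about a fourfold jump in density across those three node transitions.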
It would be interesting to see this data set compared against AMD. It's informative for an NVIDIA-only article, though.
Subject: Graphics Cards | June 8, 2016 - 08:44 PM | Ryan Shrout
Tagged: sli, pascal, nvidia, GTX 1080, GP104, geforce, 4-way sli, 3-way sli
IMPORTANT UPDATE: After writing this story, but before publication, we went to NVIDIA for comment. As we were getting ready to publish, the company updated me with a shift in its stance on multi-GPU configurations. NVIDIA will no longer require an "enthusiast key" to enable SLI on more than two GPUs. However, NVIDIA will also only be enabling 3-Way and 4-Way SLI for a select few applications. More details are at the bottom of the story!
You'll likely recall that during our initial review of the GeForce GTX 1080 Founders Edition graphics card, we mentioned that NVIDIA was going to be moving people towards the idea that "only 2-Way SLI will be supported" and promoted. There would still be a path for users that wanted 3 and 4 GPU configurations anyway, and it would be called the Enthusiast Key.
As it turns out, after returning from an AMD event focused on its upcoming Polaris GPUs, I happen to have amassed a total of four GeForce GTX 1080 cards.
Courtesy of some friends at EVGA and two readers that were awesome enough to let me open up their brand new hardware for a day or so, I was able to go through the 3-Way and 4-Way SLI configuration process. Once all four were installed, and I must point out how great it is that each card only required a single 8-pin power connector, I installed the latest NVIDIA driver I had on hand, 368.19.
Knowing about the need for the Enthusiast Key, and knowing that I did not yet have one (the website that was supposed to provide them is still not live), I thought I might have stumbled upon some magic: the driver appeared to let me enable SLI anyway.
Enthusiasts will note however that the green marker under the four GPUs with the "SLI" text is clearly only pointing at two of the GTX 1080s, leaving the remaining two...unused. Crap.
At this point, anyone who has purchased more than two GeForce GTX 1080 cards is simply out of luck, waiting on NVIDIA to make good on its promise to allow 3-Way and 4-Way configurations via the Enthusiast Key, or some other way. It's way too late now to simply say "we aren't supporting it at all."
While I wait...what is there for a gamer with four GeForce GTX 1080 cards to do? Well, you could run Ashes of the Singularity. Its multi-GPU mode uses MDA mode, which means the game engine itself accesses each GPU on its own, without the driver handling any GPU load balancing. Unfortunately, Ashes only supports two GPUs today.
Well...you could run an OpenCL-based benchmark like LuxMark that accesses all the GPUs independently as well.
I did so, and the result is an impressive score of 17,127!!
How does that compare to some other products?
The four GTX 1080 cards produce a score that is 2.57x the result provided by the AMD Radeon Pro Duo and 2.29x the score of SLI GeForce GTX 980 Ti cards. Nice!
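Working backward from the quoted ratios gives rough scores for the comparison systems (back-of-the-envelope only, since the published ratios are rounded):

```python
quad_1080_luxmark = 17127  # score from the four-card run above

# Ratios quoted in the text imply these scores for the comparison setups.
implied_pro_duo = quad_1080_luxmark / 2.57    # AMD Radeon Pro Duo
implied_980ti_sli = quad_1080_luxmark / 2.29  # GeForce GTX 980 Ti SLI

print(f"Implied Radeon Pro Duo score: ~{implied_pro_duo:.0f}")
print(f"Implied GTX 980 Ti SLI score: ~{implied_980ti_sli:.0f}")
```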
So there you go! We are just as eager to get our hands on the ability to test 3-Way and 4-Way SLI with new Pascal GPUs as some of the most extreme and dedicated enthusiasts out there are. With any luck, NVIDIA will finally figure out a way to allow it - no matter how it finally takes place.
IMPORTANT UPDATE: Before going to press with this story I asked NVIDIA for comment directly: when was the community finally going to get the Enthusiast Key website to unlock 3-Way and 4-Way SLI for those people crazy enough to have purchased that many GTX 1080s? The answer was quite surprising: NVIDIA is backing away from the idea of an "Enthusiast Key" and will no longer require it for enabling 3-Way and 4-Way SLI.
Here is the official NVIDIA statement given to PC Perspective on the subject:
With the GeForce 10-series we’re investing heavily in 2-way SLI with our new High Bandwidth bridge (which doubles the SLI bandwidth for faster, smoother gaming at ultra-high resolutions and refresh rates) and NVIDIA Game Ready Driver SLI profiles. To ensure the best possible gaming experience on our GeForce 10-series GPUs, we’re focusing our efforts on 2-way SLI only and will continue to include 2-way SLI profiles in our Game Ready Drivers.
DX12 and NVIDIA VR Works SLI technology also allows developers to directly implement and control multi-GPU support within their games. If a developer chooses to use these technologies then their game will not need SLI profiles. Some developers may also decide to support more than 2 GPUs in their games. We continue to work with all developers creating games and VR applications that take advantage of 2 or more GPUs to make sure they’ll work great on GeForce 10-series GPUs.
For our overclocking community, our Game Ready Drivers will also include SLI profiles for 3- and 4-way configurations for specific OC applications only, including Fire Strike, Unigine and Catzilla.
NVIDIA clearly wants to reiterate that only 2-Way SLI will get the attention we have come to expect from the GeForce driver dev team. As the DX12 and Vulkan next-generation APIs become more prolific, game developers will still have the ability to directly access more than two GeForce GTX 10-series GPUs, though I expect that to be a very narrow window of games simply due to development costs and time.
NVIDIA will enable support for three and four card configurations in future drivers (without a key) for specific overclocking/benchmarking tools only, as a way to make sure the GeForce brand doesn't fall off the 3DMark charts. Only those specific applications will be able to operate in the 3-Way and 4-Way SLI configurations that you have come to know. There are no profiles to change manually, and even the rare games that might have "just worked" with three or four GPUs will not take advantage of more than two GTX 10-series cards. It's fair to say at this point that, except for the benchmarking crowd, NVIDIA 3-Way and 4-Way SLI is over.
We expect the "benchmark only" mode of 3-Way and 4-Way SLI to be ready for consumers with the next "Game Ready" driver release. If you happened to get your hands on more than two GTX 1080s but aren't into benchmarking, then find those receipts and send a couple back.
So there you have it. Honestly, this is what I was expecting from NVIDIA with the initial launch of Pascal and the GeForce GTX 1080/1070, and I was surprised when I first heard about the idea of the "enthusiast key." It took a bit longer than expected, and NVIDIA will get some flak for the drawn-out dismissal of this very niche, but still pretty cool, technology. In the end, this won't have much impact on the company's bottom line, as the number of users buying 3+ GTX GPUs for a single system was understandably small.