The new EVGA GTX 1080 FTW2 with iCX Technology
Back in November of 2016, EVGA had a problem on its hands. A batch of GTX 10-series graphics cards using the company's new ACX 3.0 cooler left the warehouse missing the thermal pads required to keep the power management hardware within reasonable temperature margins. To its credit, the company took the oversight seriously and offered consumers a set of remedies to choose from: an RMA, a new VBIOS that increases fan speeds, or thermal pads to install manually. Still, as with any product quality lapse of that kind, there were (and are) lingering questions about EVGA's ability to maintain a reliable product, one whose features and new options don't compromise the basics.
Internally, the drive to correct these lapses was…strong. From the very top of the food chain on down, it was hammered home that something like this simply couldn't happen again; more than that, EVGA was to develop and showcase a new feature set and product lineup demonstrating its ability to innovate. Thus was born, and accelerated, the EVGA iCX Technology infrastructure. While iCX had been in the pipeline for some time, it was moved up to counter any negative bias that might have formed against EVGA's graphics cards over the last several months. The goal was simple: prove that EVGA is the leader in graphics card design and that it has learned from previous mistakes.
EVGA iCX Technology
Previous issues aside, the creation of iCX Technology is built around one simple question: is a single GPU temperature sensor enough? For nearly all of today's graphics cards, cooling is based on the temperature of the GPU silicon itself, as reported by the sensor NVIDIA integrates into the GPU (the case for all of EVGA's cards). This is how fan curves are built, how GPU clock speeds are handled with GPU Boost, how noise profiles are created, and more. But as process technology has improved and GPU designs have shifted toward power efficiency, the GPU itself is often no longer the thermally limiting factor.
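As a concrete, purely illustrative example of that single-sensor model, a conventional fan curve is just a piecewise-linear map from one GPU temperature reading to a fan duty cycle. The breakpoints below are invented for the sketch and are not EVGA's or NVIDIA's actual values:

```python
# Illustrative sketch of a conventional single-sensor fan curve.
# Breakpoints are invented for the example, not EVGA/NVIDIA values.

FAN_CURVE = [(30, 20), (50, 30), (70, 55), (85, 80), (95, 100)]  # (temp °C, duty %)

def fan_duty_from_gpu_temp(temp_c: float) -> float:
    """Linearly interpolate fan duty (%) from the GPU temperature alone."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]

print(fan_duty_from_gpu_temp(60.0))  # 42.5 — memory and VRM temps never enter the picture
```

The point is what the function never sees: memory and VRM temperatures play no part in the decision, which is exactly the gap described below.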
As it turns out, converting 12V (from the power supply) to the ~1V the GPU needs is a simple process that creates a lot of excess heat. The thermal images above clearly demonstrate that, and EVGA isn't the only card vendor to take notice. EVGA's product issue from last year was related to this very point – the fans were only spinning fast enough to keep the GPU cool and did not take the temperature of the memory or power delivery into account.
The fix from EVGA is to ratchet up the number of sensors on the card's PCB and wrap them with intelligence in the form of MCUs, updated Precision XOC software, and user-viewable LEDs on the card itself.
EVGA graphics cards with iCX Technology will include 9 thermal sensors on the board, independent of the GPU temperature sensor directly integrated by NVIDIA: three for memory, five for power delivery, and an additional sensor for the GPU. Some, including the secondary GPU sensor, are located on the back of the PCB to avoid any conflicts with trace routing between critical components.
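EVGA hasn't published the actual iCX control logic, but one plausible way to use the extra readings (a hedged sketch with invented limits and sensor groupings) is to normalize each sensor against its group's thermal limit and let the worst-case reading drive the fans, rather than the GPU alone:

```python
# Hedged sketch only: EVGA has not documented iCX's real control algorithm.
# Group limits below are invented; the idea is simply "worst-case sensor wins."

GROUP_LIMITS_C = {"gpu": 90, "memory": 95, "vrm": 105}  # illustrative limits per group

def worst_case_fraction(readings: dict) -> float:
    """Return the highest fraction-of-limit across all sensor groups (0.0 to 1.0+)."""
    return max(
        max(temps) / GROUP_LIMITS_C[group]
        for group, temps in readings.items()
    )

# Example: a cool GPU no longer keeps the fans quiet when the VRM is running hot.
readings = {
    "gpu":    [55.0],
    "memory": [62.0, 60.0, 58.0],
    "vrm":    [88.0, 85.0, 83.0, 80.0, 79.0],
}
print(worst_case_fraction(readings))  # ~0.84, driven by the VRM, not the 55 °C GPU
```

That fraction could then be fed into a fan curve like the one sketched earlier, so the fans respond to whichever part of the board is closest to its limit.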
Subject: General Tech | January 31, 2017 - 12:57 PM | Jeremy Hellstrom
Tagged: game, nvidia, GTX 1080, gtx 1070, For Honor, tom clancy, Ghost Recon Wildlands
Today NVIDIA offers a new free Ubisoft game for those picking up a GTX 1070, GTX 1080, or a system containing one or more of those cards. You can choose either For Honor, an arena-style game pitting Knights, Samurai, and Vikings against one another in hand-to-hand combat, or Tom Clancy's Ghost Recon Wildlands, which lands somewhere between Arma and Just Cause. Neither game has been released yet; For Honor arrives February 14th, while Ghost Recon Wildlands doesn't launch until March 7th, but you can get an early look now.
NVIDIA has also made the process of collecting your game somewhat easier: as long as your GeForce and Ubisoft accounts are linked, you can simply enter the code to choose your free game. If you are one to avoid Uplay at all costs, you could always give your code away as a gift.
"We are also debuting a new easier way to redeem codes through GeForce Experience, it means customers no longer have to tolerate long sign up webpages but can simply enter their code within GeForce Experience itself and have their choice of game automatically added to their Uplay account."
Here is some more Tech News from around the web:
- Magnetic skyrmion 'brain' connections save energy @ Nanotechweb
- Flashy Intel sees the XPoint of solid state @ The Register
- Microsoft rumoured to be remixing Windows RT as Windows Cloud @ The Inquirer
- Android 7.1.2 beta release signals end of life for the Nexus 6 and Nexus 9 @ The Inquirer
- IPv6 for Server Admins and Client Developers @ Linux.com
- 'It's Tricky': Apple Misses the Deadline To Pay $13.9 Bn To Ireland in Illegal Tax Benefit @ Slashdot
Subject: Graphics Cards | January 10, 2017 - 10:11 PM | Tim Verry
Tagged: CES, CES 2017, aorus, gigabyte, xtreme gaming, GTX 1080, pascal
One interesting development from Gigabyte at this year’s CES was the expansion of its Aorus branding and the transition from Xtreme Gaming. Initially used on its RGB LED-equipped motherboards, the company is rolling out the brand to its other higher end products including laptops and graphics cards. While it appears that Xtreme Gaming is not going away entirely, Aorus is taking the spotlight with the introduction of the first Aorus branded graphics card: the GTX 1080.
Paul's Hardware got hands on with the new card (video) at the Gigabyte CES booth.
Featuring a triple 100mm fan cooler similar to that of the GTX 1080 Xtreme Gaming 8G, the Aorus GTX 1080 comes with X-patterned LED lighting as well as a backlit Aorus logo on the side and a backlit eagle on the backplate. The cooler comprises three 100mm double-stacked fans (the center fan is recessed and spins in the opposite direction of the side fans) over a shrouded, angled aluminum fin stack that connects to the GPU via five large copper heatpipes.
The graphics card is powered by two 8-pin PCI-E power connectors.
In an interesting twist, the card has two HDMI ports on the back that are intended to be used to hook up front panel HDMI outputs for things like VR headsets. Another differentiator between the upcoming card and the Xtreme Gaming 8G is the backplate, which has a large copper plate secured over the underside of the GPU. Several sites are reporting that this area can be used for watercooling, but I am skeptical: if you are going to go out and buy a waterblock for your graphics card, you might as well buy a block that sits on top of the GPU rather than on the area of the PCB opposite it. As is, the copper plate on the backplate certainly won’t hurt cooling, and it looks cool, but that’s all I suspect it is.
Think Computers also checked out the Aorus graphics card. (video above)
Naturally, Gigabyte is not talking clock speeds on this new card, but I expect it to hit at least the same clocks as its Xtreme Gaming 8G predecessor, which was clocked at 1759 MHz base / 1848 MHz boost out of the box and 1784 MHz base / 1936 MHz boost in OC Mode. Gigabyte also overclocked the memory on that card up to 10400 MHz in OC Mode.
Gigabyte also had new SLI HB bridges on display bearing the Aorus logo to match the Aorus GPU. The company also had Xtreme Gaming SLI HB bridges though which further suggests that they are not completely retiring that branding (at least not yet).
Pricing has not been announced, but the card will be available in February.
Gigabyte has yet to release official photos of the card or a product page, but it should show up on its website shortly. In the meantime, Paul's Hardware and Think Computers shot some video of the card on the show floor, which I have linked above if you are interested. Looking on Amazon, the Xtreme Gaming 1080 8GB is approximately $690 before rebate, so I would guess that the Aorus card will come out at a slight premium over that, if only because it is a newer release, has a more expensive backplate, and adds RGB LED backlighting.
What are your thoughts on the move to everything-Aorus?
Follow all of our coverage of the show at https://pcper.com/ces!
Subject: General Tech | December 7, 2016 - 03:04 PM | Jeremy Hellstrom
Tagged: nvidia, amd, gaming, watch dogs 2, GTX 1080, gtx 1070, gtx 1060, rx 480x, rx 470
[H]ard|OCP have spent a lot of time with Watch Dogs 2 recently, enough to create three articles covering the game, two of which are now published. The first article focuses on performance at ultra settings and on finding the highest playable settings the tested GPUs were capable of, without installing the high resolution texture pack. As it turns out, the game is far more graphically demanding than many other recent releases, so much so that only the Titan X and GTX 1080 were able to perform at 4K resolution; the GTX 1070 and 1060, as well as the RX 480 and 470, are only viable at lower resolutions.
The second article looks at performance with the texture pack installed, which did not have much effect on overall performance but significantly increased VRAM usage. Even the mighty Titan X struggled with this game; we will need a new generation of GPUs to make use of all the graphics features it offers. The last review will be up soon and will focus on the effect each of the graphical settings has on the visual appearance of the game.
"Watch Dogs 2 has been released on the PC. We will have a three part evaluation of performance and image quality starting today with performance comparisons. We will also find the highest playable settings for each graphics card and the gameplay experience delivered. Finally, a graphically demanding game."
Here is some more Tech News from around the web:
- Ubisoft Giving Away Yet Another Free Game @ [H]ard|OCP
- Dishonored 2 update 1.3 brings performance boosts @ Rock, Paper, SHOTGUN
- Tobii Tech 4C eye tracker for gaming @ Kitguru
- Wot I Think: Tyranny @ Rock, Paper, SHOTGUN
- Gears of War 4 DirectX 12 Graphics Performance @ eTeknix
- Sniper Ghost Warrior 3 is an off-brand Far Cry game @ Rock, Paper, SHOTGUN
- The Last Guardian Is Finally Here—and Yes, It Was Worth the Wait @ Wired
- Dead Rising 4 shambles onto Windows 10 @ Rock, Paper, SHOTGUN
- Nvidia launches GeForce GTX 1050 and 1060 Indie Bundle @ HEXUS
- Deus Ex: Mankind Divided Graphics Performance Analysis @ eTeknix
- Mugs and mayhem: eight minutes of Prey @ Rock, Paper, SHOTGUN
- Tenebra is a free horror game inspired by silent films @ Rock, Paper, SHOTGUN
Subject: General Tech | November 3, 2016 - 10:35 AM | Ryan Shrout
Tagged: vrm, video, skyrim, qualcomm, prodigy, powercolor, podcast, nxp, multi-gpu, msi, micron, logitech, GTX 1080, gtx 1070, g231, evga, dx12, devil box, deus ex: mankind divided, amd, Alienware 13
PC Perspective Podcast #423 - 11/03/16
Join us this week as we discuss the Logitech Prodigy G231, multi-GPU scaling with DX12, Qualcomm buying NXP, issues with GTX 1070 and 1080 cards and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Allyn Malventano, Josh Walrath, Jeremy Hellstrom
Program length: 1:10:25
Fragging Frogs VLAN 14 (summary)
Week in Review:
Today’s episode is brought to you by Harry’s! Use code PCPER at checkout!
News items of interest:
0:28:45 Qualcomm is going for a drive
Hardware/Software Picks of the Week
Jeremy: Need big long term storage
Subject: Graphics Cards | November 2, 2016 - 07:10 PM | Jeremy Hellstrom
Tagged: pascal, nvidia, GTX1070, GTX1060, GTX 1080, fail, evga, ACX 3.0
Checklist time, readers. Do you have the following:
- A GTX 1060/1070/1080
- Which is from EVGA
- With an ACX 3.0 cooler
- With one of the model numbers above
If not, make like Bobby McFerrin.
If so, you have a reason to be concerned, and EVGA offers its apologies and, more importantly, a fix. EVGA's tests, which emulate the ones performed at Tom's, show that the temperature of the PWM and memory was only marginally within spec. That is a fancy way of saying that in certain circumstances the PWM was running just short of causing a critical thermal incident, also known as catching on fire and letting out the magic smoke. EVGA claims this happened because its testing focused on GPU temperature and the lowest acoustic levels possible and did not involve measuring the heat produced on the memory or the VRM, which is, as they say, a problem.
You have several choices of remedy from EVGA; please remember that you should reach out directly to EVGA support, not NVIDIA's. You can try requesting a refund from the store you purchased the card at, but your best bet is EVGA.
The first option is a cross-ship RMA. Contact EVGA as a guest or with your account to set up an RMA and they will ship you a replacement card with a new VBIOS which will not have this issue and you won't need to send yours back until the replacement arrives.
You can flash the new VBIOS, which adjusts the fan-speed curve to ensure that your fans run above 30% and provide sufficient cooling to the other parts of the card. Your card will be louder, but it will also be less likely to commit suicide in dramatic fashion.
Lastly, you can request a thermal pad kit, which EVGA suggests is unnecessary but which certainly sounds like a good idea, especially as it is free, although it requires you to sign up for an EVGA account. Hopefully, in the spare seconds currently available to the team, we can get our hands on an ACX 3.0-cooled Pascal card with the VBIOS update and thermal pads so we can verify this for you.
This issue should not have happened and does reflect badly on certain aspects of EVGA's testing. On the other hand, the response has been very appropriate: if you are affected, you can get a replacement card with no issues or fix the problem yourself. Any cards shipped, though not necessarily purchased, after Nov. 1st will have the new VBIOS, so check carefully if you are picking up a new EVGA Pascal card.
Subject: Graphics Cards | October 28, 2016 - 12:01 AM | Tim Verry
Tagged: water cooling, GTX 1080, gtx 1070, gpu cooler, Alphacool, AIO
Alphacool recently launched an interesting liquid GPU cooling product under its Eiswolf branding. Coming in an AIO kit or as a standalone GPU cooler, the Eiswolf GPX Pro is currently compatible with the GTX 1070 and GTX 1080 graphics cards.
The Eiswolf GPX is a GPU water block that pairs a removable copper water block with a large aluminum fin stack that passively cools the memory chips and VRM hardware while also feeding some of the heat into the copper block (and then the water loop). Alphacool has custom milled the aluminum to exactly fit the GTX 1070 or GTX 1080, such that users do not need thermal pads for the memory (just a small amount of thermal paste) and only tiny, thin thermal pads for the VRM chips. The GPU block is all copper and houses the pump. A backplate is included, and when installed the block hides the card’s PCB behind the aluminum plate bearing the Alphacool logo. When it comes time to upgrade the graphics card, you can keep the copper block and replace only the aluminum portion that is custom to a specific card, which is nice to see.
The Eiswolf GPX AIO is the kit version and gives users a fully functioning loop. In addition to the Eiswolf GPX GPU cooler, the AIO kit includes a 120mm radiator with two fans in push-pull configuration and tubing with quick disconnects on both tubes. The fan cables are sleeved and the 11/8mm tubing is resistant to kinking. The loop is all copper save for brass fittings. The quick disconnects make it easy to remove the GPU from the system or to expand the loop. Users can add a second GPU (which also gets them a second pump) and/or connect it to the company’s AIO CPU coolers. Of course, it would also be possible to connect it to your custom loop if you wanted.
Reportedly, when running two GPX coolers in a SLI (dual GPU) setup, it is possible to undervolt both pumps to reduce pump noise such that they are near silent.
The ability to expand the AIO loop and to upgrade to newer graphics cards easily makes this an interesting product, though I would have liked to see a larger radiator option, especially for those wanting to go the dual GPU / dual pump route!
The Alphacool GPX Pro 120 AIO kit is available for 150 Euros (~$164 USD) and the GPX Pro (the cooler itself) is available for 120 Euros (~$131 USD). Pricing is a bit high, but the cooler has the potential for a much longer usable life than other GPU AIOs. I am looking forward to reviews of this new cooler, though I would like to see support for other graphics cards.
If you are interested in this cooler, Alphacool has a video on YouTube with more information.
Subject: Graphics Cards | October 6, 2016 - 03:17 PM | Tim Verry
Tagged: windforce, pascal, nvidia, GTX 1080, gigabyte
Gigabyte is launching a new graphics card with a blower-style cooler that it is calling the GTX 1080 TT. The card, which is likely based on the NVIDIA reference PCB, uses a single lateral blower-style “WindForce Turbo Fan.” The orange and black shrouded fan takes design cues from the company’s higher end Xtreme Gaming cards and has a very Mass Effect / Halo Forerunners vibe to it.
The GV-N1080TTOC-8GD is powered by a single 8-pin PCI-E power connector and has a 180W TDP. Despite using only one external power connector, the card still has a bit of overclocking headroom (the PCI-E spec allows a total of 225W from the slot plus an 8-pin connector, and overdrawing on the 8-pin has been done before when a card's BIOS doesn't prevent it). External video outputs include one DVI, one HDMI, and three DisplayPorts. I wish the DVI port had been cut so that the blower cooler could have a much larger vent to exhaust air out of the case, but it is what it is.
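For context, the 225W figure is simply the standard PCI-E budget for this connector layout: 75W from the x16 slot plus 150W from a single 8-pin, which leaves roughly 45W of in-spec headroom over the 180W TDP. A quick back-of-the-envelope:

```python
# Back-of-the-envelope in-spec power budget for a single-8-pin card.
SLOT_W = 75        # PCI-E x16 slot limit per spec
EIGHT_PIN_W = 150  # single 8-pin PCI-E connector limit per spec
TDP_W = 180        # Gigabyte's rated TDP for the GTX 1080 TT

budget_w = SLOT_W + EIGHT_PIN_W   # 225 W total in-spec budget
headroom_w = budget_w - TDP_W     # 45 W left for overclocking without exceeding spec
print(budget_w, headroom_w)
```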
Out of the box the Gigabyte GTX 1080 TT runs the Pascal-based 2560 CUDA core GPU at 1632 MHz base and 1772 MHz boost. In OC Mode the GPU runs at 1657 MHz base and 1797 MHz boost. The 8 GB of GDDR5X memory is left untouched at the stock 10 GHz in either case. For comparison, reference clock speeds are 1607 MHz base and 1733 MHz boost. As far as factory overclocks go, these are not bad (they are usually at least this conservative).
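Expressed as percentages over the reference clocks quoted above (the mode labels are mine), the factory bump works out to roughly 1.6–2.3% out of the box and 3.1–3.7% in OC Mode:

```python
# Factory overclock as a percentage over NVIDIA reference clocks (MHz).
REF_BASE, REF_BOOST = 1607, 1733

for mode, base, boost in [("Out of box", 1632, 1772), ("OC Mode", 1657, 1797)]:
    print(f"{mode}: base +{(base / REF_BASE - 1) * 100:.1f}%, "
          f"boost +{(boost / REF_BOOST - 1) * 100:.1f}%")
# Out of box: base +1.6%, boost +2.3%
# OC Mode: base +3.1%, boost +3.7%
```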
The heatsink uses three direct-contact 6mm copper heat pipes for the GPU and aluminum plates on the VRM and memory chips that transfer heat into an aluminum fin array with channels that the blower fan at the back of the card pushes case air through and out of the case. It may be possible to push the card beyond the OC Mode clocks, though it is not clear how stable boost clocks will be under load (or how loud the fan will be); we will have to wait for reviews on that. If you have a cramped case, this may be a decent GTX 1080 option that is cheaper than the Founders Edition design.
There is no word on pricing or an exact release date yet, but I would estimate it at around $640 at launch.
Subject: Graphics Cards | September 20, 2016 - 03:58 PM | Scott Michaud
Tagged: microsoft, xbox, xbox one, pc gaming, nvidia, GTX 1080, gtx 1070
NVIDIA has just announced that specially marked, 10-series GPUs will be eligible for a Gears of War 4 download code. This bundle applies to GeForce GTX 1080 and GeForce GTX 1070 desktop GPUs, as well as laptops which integrate either of those two GPUs. As always, if you plan on purchasing a GPU due to this bundle, make sure that the product page for your retailer mentions the bundle.
Also, through the Xbox Play Anywhere initiative, NVIDIA claims that this code can be used to play the game on Xbox One as well. Xbox Play Anywhere allows users to purchase a game on either of Microsoft's software stores, Xbox Store or Windows Store, and it will automatically count as a purchase for the cross-platform equivalent. It also has implications for cloud saves, but that's a story for another day.
The bundle begins today, September 20th. Gears of War 4 launches on October 11th.
Subject: Graphics Cards | September 20, 2016 - 03:35 PM | Jeremy Hellstrom
Tagged: gigabyte, GTX 1080, GTX 1080 Xtreme Gaming Premium, factory overclocked, GIGABYTE Xtreme Engine, vr link
Gigabyte's GeForce GTX 1080 Xtreme Gaming comes with a nice overclock right out of the box: 1759MHz base, 1898MHz boost, and a small bump to the VRAM frequency to 10.2GHz. At the push of a button you can add an extra 25MHz to the GPU's clocks, assuming you install the bundled GIGABYTE Xtreme Engine utility, which also allows you to manually tweak your settings. The Premium Pack part of the official name indicates that Gigabyte's Xtreme VR Link header panel is included with the card; you can install it in the front of your case to provide easy access to two HDMI connectors and two USB 3.0 ports for a VR headset.
Pop on over to [H]ard|OCP to see how much more they could get out of the card as well as the effect it had on gameplay.
"GIGABYTE’s GeForce GTX 1080 Xtreme Gaming Premium Pack is one premium package of goodness. Not only have we got one of the fastest GeForce GTX 1080 video cards, but GIGABYTE has thrown in the kitchen sink in this Premium Package with enthusiast oriented gaming as the focus."
Here are some more Graphics Card articles from around the web:
- MSI GTX 1070 Gaming Z 8 GB @ techPowerUp
- MSI GeForce GTX 1080 and GTX 1070 Gaming X 8G Review @ Neoseeker
- ASUS GTX 1080 & GTX 1070 STRIX OC Review @ Hardware Canucks
- ASUS GTX 1060 STRIX OC 6 GB @ techPowerUp