EVGA iCX Technology - Adding Thermal Sensors and Improved Cooling

Manufacturer: EVGA

The new EVGA GTX 1080 FTW2 with iCX Technology

Back in November of 2016, EVGA had a problem on its hands. A batch of GTX 10-series graphics cards using the new ACX 3.0 cooler left the warehouse missing the thermal pads required to keep the power management hardware within reasonable temperature margins. To its credit, the company took the oversight seriously and offered consumers a set of solutions to choose from: an RMA, a new VBIOS that increased fan speeds, or thermal pads to install manually. Still, as with any product quality lapse of that sort, there were (and are) lingering questions about EVGA's ability to ship reliable products while adding features and options that don't compromise the basics.

Internally, the drive to correct these lapses was…strong. From the very top of the food chain on down, it was hammered home that something like this simply couldn't occur again; more than that, EVGA was to develop and showcase a new feature set and product lineup demonstrating its ability to innovate. Thus was born, and accelerated, the EVGA iCX Technology infrastructure. While iCX had been in the pipeline for some time already, it was moved up to counter any negative bias that might have formed around EVGA's graphics cards over the last several months. The goal was simple: prove that EVGA was the leader in graphics card design and that it had learned from previous mistakes.

EVGA iCX Technology

Previous issues aside, the creation of iCX Technology is built around one simple question: is one GPU temperature sensor enough? For nearly all of today's graphics cards, cooling is based on the temperature of the GPU silicon itself, as measured by the sensor NVIDIA integrates into the chip (used on all of EVGA's cards). This is how fan curves are built, how GPU clock speeds are handled with GPU Boost, how noise profiles are created, and more. But as process technology has improved and GPU designs have been weighted towards power efficiency, the GPU itself is often no longer the thermally limiting factor.

As it turns out, converting 12V (from the power supply) to ~1V (necessary for the GPU) is a simple process that creates a lot of excess heat. The thermal images above clearly demonstrate that, and EVGA isn't the only card vendor to take notice. In fact, EVGA's product issue from last year was related to this very point – the fans were only spinning fast enough to keep the GPU cool and did not take into account the temperature of the memory or power delivery.
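
To put rough numbers on that conversion loss, here is an illustrative sketch (my own assumed figures and function names, not EVGA measurements) of how much heat a VRM stage sheds while stepping 12V down for the GPU core:

```python
# Illustrative sketch (assumed values, not EVGA data): estimating the heat
# a VRM stage dissipates when a buck converter steps 12V from the PSU down
# to ~1V for the GPU core.
def vrm_heat_watts(gpu_power_w: float, efficiency: float) -> float:
    """Heat dissipated by the VRM for a given GPU load.

    gpu_power_w -- power actually delivered to the GPU core
    efficiency  -- converter efficiency; 0.85-0.92 is a plausible range
                   for a multi-phase buck design (assumed, not measured)
    """
    input_power = gpu_power_w / efficiency
    return input_power - gpu_power_w

# A card drawing 180 W at the core through a 90%-efficient VRM sheds
# roughly 20 W as heat in the power-delivery components alone.
print(round(vrm_heat_watts(180, 0.90), 1))  # → 20.0
```

Even at high efficiency, tens of watts end up concentrated in a handful of small power components, which is exactly why a GPU-only fan curve can leave them running hot.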

The fix from EVGA is to ratchet up the number of sensors on the card PCB and wrap them with intelligence in the form of MCUs, updated Precision XOC software, and user-viewable LEDs on the card itself.

EVGA graphics cards with iCX Technology will include 9 total thermal sensors on the board, independent of the GPU temperature sensor directly integrated by NVIDIA. There are three sensors for memory, five for power delivery and an additional sensor for the GPU temperature. Some are located on the back of the PCB to avoid any conflicts with trace routing between critical components, including the secondary GPU sensor.
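
As a rough mental model of the layout described above (the sensor values and grouping below are hypothetical, not pulled from EVGA's firmware), the nine readings can be grouped by zone and reduced to a hottest-point value per zone:

```python
# Hypothetical sketch of the iCX sensor layout: three memory sensors, five
# power-delivery sensors, and one extra GPU sensor independent of NVIDIA's
# on-die reading. Temperatures are made-up example values.
SENSORS_C = {
    "memory": [68.0, 74.0, 70.0],              # 3 memory sensors
    "power":  [71.0, 73.0, 76.0, 72.0, 70.0],  # 5 power-delivery sensors
    "gpu":    [65.0],                          # secondary GPU sensor
}

def hottest_per_zone(sensors: dict) -> dict:
    """Reduce each zone to its hottest reading -- the value a display or
    fan controller would most plausibly act on."""
    return {zone: max(temps) for zone, temps in sensors.items()}

print(hottest_per_zone(SENSORS_C))
# → {'memory': 74.0, 'power': 76.0, 'gpu': 65.0}
```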

These new sensors are monitored through the EVGA Precision XOC software in a new window that clearly shows their positions and current temperatures. You can also track them through the logging and graphing tools included with XOC.

The fans on iCX-enabled graphics cards will run asynchronously, with the left fan spinning based on the GPU temperature and the right fan based on the memory and power delivery temperatures. Both fans can be controlled independently in the software with uniquely created profiles at the user's discretion.
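
A minimal sketch of that asynchronous scheme (the curve breakpoints and temperatures here are assumptions for illustration, not EVGA's actual tuning):

```python
# Generic fan-curve sketch: each fan gets its own driving temperature.
# Curve breakpoints are assumed placeholders, not EVGA's firmware values.
def fan_duty(temp_c: float, curve: list) -> float:
    """Linearly interpolate a fan duty cycle (0-100%) from a sorted
    (temperature, duty) curve."""
    pts = sorted(curve)
    if temp_c <= pts[0][0]:
        return pts[0][1]
    if temp_c >= pts[-1][0]:
        return pts[-1][1]
    for (t0, d0), (t1, d1) in zip(pts, pts[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

CURVE = [(40, 30), (60, 50), (80, 100)]  # (temp C, duty %) -- assumed

gpu_temp = 65.0
mem_power_temp = max(74.0, 76.0)  # hottest memory/power sensor

left_duty = fan_duty(gpu_temp, CURVE)         # GPU-side fan
right_duty = fan_duty(mem_power_temp, CURVE)  # memory/power-side fan
print(left_duty, right_duty)  # → 62.5 90.0
```

The key design point is simply that the two fans answer to different sensor groups, so a hot VRM can ramp the right fan without forcing the GPU-side fan to follow.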

On top of the card you'll find the Thermal Display System, a fancy name for an area where three LEDs indicate the current temperature status of the three different locations. Adjustable and customizable through the Precision XOC software, the default behavior shifts between blue, green, and red as different temperature thresholds are crossed. The display has a G for the GPU temp, P for the PWM, and M for memory.
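
That default behavior amounts to a simple threshold map, sketched below (the 83C red zone matches the stock setting I saw in testing; the blue/green split is an assumed placeholder):

```python
# Sketch of the Thermal Display System's default color logic. The 83C red
# threshold reflects the stock Precision XOC setting observed in testing;
# the 60C blue/green boundary is an assumption for illustration.
def led_color(temp_c: float, warn_c: float = 60.0, red_c: float = 83.0) -> str:
    """Map a zone temperature to the default blue/green/red indicator."""
    if temp_c >= red_c:
        return "red"
    if temp_c >= warn_c:
        return "green"
    return "blue"

# One LED per zone: G (GPU), P (PWM), M (memory) -- example temperatures
zones = {"G": 65.0, "P": 76.0, "M": 74.0}
print({led: led_color(t) for led, t in zones.items()})
# → {'G': 'green', 'P': 'green', 'M': 'green'}
```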

This gives users with windowed chassis or open-air test beds an easy place to look to check on the status of their GPU and see whether any components are running hotter than expected. It also means that if you have a defect in your product, or you are applying too much voltage to a component, you may get a heads-up before any larger issues arise.

And, to be blunt, this kind of system would have made preventing last year’s ACX 3.0 issue a breeze.

iCX Technology also includes a new cooler, with high-contact base plates and back plates.

Trust me, this thing touches EVERYTHING it can, including the memory, PWM componentry, and more, while integrating unique touches like interlaced pin fins to increase overall cooling capacity with more surface area. The back plate is split between the GPU and PWM sections, minimizing thermal crossover between the two heat sources.

EVGA has updated the primary cooler with small holes along the fins, and fins that open up halfway down where they contact the heatpipe itself. The result is air that can more easily move down over and between the fins, improving overall airflow and reducing fan bounce-back.

A new safety fuse has been put in place on iCX designs between the motherboard and the graphics card, with a 10A rating, meant to prevent any potential damage to your graphics card from finding its way to the rest of your system. This is not a user replaceable part – when it blows you have an RMA on your hands anyway.

I don’t have like-for-like comparison data for the hardware in-house, but EVGA did provide me with this table to showcase the advantages of the new cooler and how it can improve the temperatures in all areas. GPU, memory and power delivery are all running cooler and at deltas of 1-6C.

Testing EVGA iCX

In my time with an EVGA GeForce GTX 1080 FTW2 iCX this week, everything seemed to be working exactly as EVGA had explained. The new version of Precision XOC shows the temperatures of the GPU, power, and memory directly on the first screen, taking the highest of the current readings.

Hitting the sensor button on the left reveals the detailed imagery, showing temps for all nine additional sensors. There is variance among the various locations on the PCB, though I tend to see that one is always hotter than the rest. For example, the memory reading to the right of the GPU is higher than the others due to its proximity to the power components. It seems likely that the software and MCUs will always base their algorithms on that hottest data point, but having a fallback for oddities or dead/incorrect sensors is a positive.

Despite my best efforts, at stock settings in Precision XOC, I was never able to get any of the temperature indicators to break into the “red” zones of 83C – the cooler on the FTW2 card kept everything at 75C or below even after extended use.

I did cheat a little by lowering the "red" zones to 70C to make sure the functionality was complete, and it was; no issues to be found.

Including benchmarks or graphs here seems a little counterproductive – we don’t have identical matching ACX 3.0 and iCX card options so the performance, temperatures and clock speeds are obviously going to be different. And whether or not the new cooler has anything to do with it would be nearly impossible to tell anyway. I can say that the noise levels for both our GTX 1080 FTW2 with iCX and the GTX 1080 SC with ACX 3.0 were identical in practice.

Also, I used our FLIR One to look at the front and back of each card and didn’t see anything that stood out.

EVGA GTX 1080 FTW2 with iCX

EVGA GTX 1080 FTW2 with iCX

EVGA GTX 1080 SC with ACX 3.0

EVGA GTX 1080 SC with ACX 3.0

The split backplate definitely has some effect in keeping the heat in one zone from spreading to the other on the iCX design.

Final Thoughts

Overall, I think the new EVGA iCX Technology is the best solution for graphics card monitoring and cooling on the market today. The ability to monitor not just GPU temperature, but also VRM and memory temps, without having to add your own thermocouples or monitoring hardware, will be a great advantage for enthusiasts and overclockers who like to push the edge. If you are curious about swapping out coolers for an AIO water cooler or even a custom water block, having access to this data is even more important.

The problem for EVGA is that, specific defects aside, it has made such excellent coolers for the last generation or two of parts that the direct advantages to the average consumer are going to be modest. Unless you want to monitor temperatures to prevent or diagnose failures, the ACX 3.0 cooler design is great and keeps the memory, power components, and GPU well under recommended spec.

EVGA expects the iCX Technology and upgraded cooler to add a $30 premium to the current GTX 10-series lineup when compared to the ACX 3.0 offerings, which will continue to co-exist. For current EVGA customers with a 10-series card using the ACX 3.0 cooler, the company will offer a one-time upgrade to the new iCX iteration for $99. Considering the cost difference, and that EVGA will have to resell the returned cards as refurbished units, that seems like a reasonable asking price.

The new EVGA iCX Technology is impressive and will likely serve as a roadmap for future GPU development from competing AICs as well as reference designs coming from both NVIDIA and AMD. If you are upgrading to a new system and GPU in the immediate time frame, then the iCX options will be great solutions. But I would imagine many of our readers and gamers would be prepared to wait for this technology integration on the next generation of graphics chips. 

February 10, 2017 | 12:52 PM - Posted by Anonymous (not verified)

Is that whole "only one sensor on most graphics cards and only in the GPU chip itself" even true?

I was under the impression that there were several sensors on modern graphics cards.

If there's only one and it's in the GPU silicon itself, no wonder consumer-level GPUs don't last worth a shit.

This would be especially bad if you have a semi-passive design, and why I hate such designs. If the GPU is 60C, but there are no sensors on the VRM, that's a recipe for failure.

February 11, 2017 | 04:20 AM - Posted by Anonymous (not verified)

Ok, well, after googling a bit I found that there ARE indeed NVIDIA GPUs that have had VRM temperature sensors for years, particularly MSI's, which I have been under the impression are the best AIBs anyway.

February 11, 2017 | 07:10 AM - Posted by Anonymous (not verified)

Yes, EVGA should offer upgrade to a different brand. It would be worth 99USD.

February 10, 2017 | 12:55 PM - Posted by Anonymous (not verified)

Oh, and the article also says they made great coolers. Didn't their 970 have a heatpipe not contacting the GPU at all? I know only Gigabyte 970s had a properly cooled VRM on their way-too-long Windforce.

February 10, 2017 | 04:02 PM - Posted by FallenBytes (not verified)

They also don't make great fan controllers. Their 1080s have been plagued with fan ramping, stuttering, and instability (I myself have experienced this). RMA didn't help because the fan controller itself is flawed. This is what happens to the card under load: the fan RPM goes crazy any time the card uses more than 10% of its power limit.

February 10, 2017 | 01:03 PM - Posted by Anonymous (not verified)

With VRAM not pumping out a huge amount of energy (unless you're overclocking and overvolting) and the power FETs having a hilariously high heat tolerance compared to large chips (~150°C Tjmax is not unusual), this seems like massive overkill for what is really a non-issue.

February 10, 2017 | 02:48 PM - Posted by Anonymous (not verified)

If it's a non-issue, why do power components seem to fail more often than anything else on a GPU?

February 10, 2017 | 01:24 PM - Posted by Master Chen (not verified)

Fooled me once...

February 10, 2017 | 02:22 PM - Posted by willmore

Nice article Ryan!

I'm concerned by the two fans running at different speeds. Even when fans run 'at the same speed', they never really run at the exact same speed, so you get some 'zero beating' where there is a side tone produced from the difference in fan frequency--that's the droning sound you hear from some video cards.

Now that the fans are allowed to run at completely different speeds, I'm concerned that the artifact will creep up into the audible range. Did you notice anything like that in your testing? Do you have any recordings that we could feed into a spectral analyzer? I'd be glad to do that.

February 10, 2017 | 02:34 PM - Posted by Ryan Shrout

I'd have to do some more research, but my understanding was that having one fan running at a different frequency than the other might actually help drone out the noise of the slower fan, because the frequencies are non-additive.

February 10, 2017 | 05:35 PM - Posted by willmore

The sounds the two fans produce will add and subtract in the normal way that sound waves do, but they'll also mix--which is a multiplicative effect--whenever they encounter something non-linear. A handy non-linear area is the compressed air in front of a fan blade. :)

They're going to mix with each other. The only question is whether the mixing products will be more or less annoying/noticeable than they currently are with fans of about the same speed.

After all, they don't drive the fans to the same speed when they twin them; they drive them with the same PWM value, but due to differences in airflow causing different back pressure, they end up running at different speeds. They could, if they wanted to, control the fans' actual speeds and try to avoid that.

February 10, 2017 | 03:48 PM - Posted by KevinM (not verified)

I really like this. I wish EVGA had done these coolers from the beginning of the Nvidia 10 series.

I have not bought a 1060 yet, and am not sure if I will wait until the next gen Nvidia products are released. However, if I do purchase a 1060, EVGA iCX would be at the top of the list.

Very happy with my EVGA 750W G2 power supply. So happy that I bought sis a 650W G2 for Xmas! My impression is that EVGA is a good company that cares about its customers and PC gaming.

Thanks Ryan for the article and information.

February 10, 2017 | 03:48 PM - Posted by TeraFloppy64 (not verified)

One of the best articles I had the pleasure to read... nice attention to the details, well done Ryan.

February 10, 2017 | 07:21 PM - Posted by Phocks (not verified)

"There may have been reports of damage"?
There have been multiple instances of the cards exploding. Really disappointed in you Ryan for giving EVGA a free pass on this issue.

February 11, 2017 | 02:36 AM - Posted by KevinM (not verified)

Exploding? I thought I read some cards were damaged, and I am sure if any were that EVGA replaced them.

But yes it should have never been a problem in the first place. I still think they are a good company that makes quality PC components, although obviously not perfect.

February 11, 2017 | 07:31 PM - Posted by CNote

None of them "exploded" so much as caps overheated and popped. Some caught fire, but that's not an explosion. The only one I could find that was "explosion-esque" was one that made a loud bang and messed up the mesh on the side of the card.

February 12, 2017 | 06:31 PM - Posted by KevinM (not verified)

Ok well that is technically a small explosion. I hope EVGA did right by their customers who had this problem and replaced the video card and any damaged components.

Not sure how something like this could happen. Quality control, poor design, cost cutting? Someone who should have been paying attention wasn't.

February 12, 2017 | 08:55 PM - Posted by CNote

Usually cost cutting at both ends, EVGA and whoever makes them in China.

February 13, 2017 | 01:26 AM - Posted by Anonymous (not verified)

Pretty sure my Gigabyte was made in Taiwan.

February 10, 2017 | 08:27 PM - Posted by Anonymous (not verified)

Most important question of all was left out

Q: How many sensors are dedicated to the RGB lighting

February 14, 2017 | 04:32 PM - Posted by Anonymous (not verified)

The article says "10 series", but only the 1080 is announced here. Is this only a halo product for the 1080, or available on midrange too?

February 14, 2017 | 07:41 PM - Posted by Anonymous (not verified)

"10 series" would indicate all SKUs, not just the 1080, that use the "10-XX" nomenclature. So anything from the entry-level GTX 1050 to the rumored GTX 1080 Ti would be eligible.

That being said, I speculate that this enthusiast level cooling solution will be reserved only for their enthusiast level cards. I wouldn't expect to see this solution being implemented on anything lower than a 1070, and perhaps not even then. Just my two cents tho

February 14, 2017 | 08:10 PM - Posted by KevinM (not verified)

If you go to EVGA's website, they list two GTX 1060's as having the iCX cooling, although I am guessing they are not available yet.

I'd post link if I could but their website seems to be down or just not loading for me at the moment.
