Manufacturer: EVGA

The new EVGA GTX 1080 FTW2 with iCX Technology

Back in November of 2016, EVGA had a problem on its hands. A batch of its GTX 10-series graphics cards using the new ACX 3.0 cooler had left the warehouse missing the thermal pads required to keep the power management hardware on the cards within reasonable temperature margins. To its credit, the company took the oversight seriously and offered consumers a set of solutions to choose from: an RMA, a new VBIOS to increase fan speeds, or thermal pads to install on the hardware manually. Still, as is the case with any product quality lapse of that kind, there were (and are) lingering questions about EVGA's ability to deliver reliable products with features and new options that don't compromise the basics.

Internally, the drive to correct these lapses was…strong. From the very top of the food chain on down, it was hammered home that something like this simply couldn't occur again and, even more so, that EVGA was to develop and showcase a new feature set and product lineup demonstrating its ability to innovate. Thus was born, and accelerated, the EVGA iCX Technology infrastructure. While this had been in the pipeline for some time already, it was moved up to counter any negative bias that might have formed against EVGA's graphics cards over the last several months. The goal was simple: prove that EVGA is the leader in graphics card design and that it has learned from previous mistakes.

EVGA iCX Technology

Previous issues aside, the creation of iCX Technology is built around one simple question: is one GPU temperature sensor enough? For nearly all of today's graphics cards, cooling is based on the temperature of the GPU silicon itself, as measured by NVIDIA's integrated sensor (the case for all of EVGA's cards). This is how fan curves are built, how GPU clock speeds are handled with GPU Boost, how noise profiles are created, and more. But as process technology has improved, and GPU design has shifted toward power efficiency, the GPU itself is often no longer the thermally limiting factor.

slides05.jpg

As it turns out, converting 12V (from the power supply) down to the ~1V the GPU needs is a lossy process that creates a lot of excess heat. The thermal images above clearly demonstrate that, and EVGA isn't the only card vendor to take notice. EVGA's product issue from last year was related to exactly this: the fans were spinning only fast enough to keep the GPU cool and did not take into account the temperature of the memory or power delivery hardware.

EVGA's fix is to ratchet up the number of sensors on the card PCB and wrap them with intelligence in the form of MCUs, updated Precision XOC software, and user-viewable LEDs on the card itself.

slides10.jpg

EVGA graphics cards with iCX Technology will include 9 thermal sensors on the board, independent of the GPU temperature sensor integrated by NVIDIA: three for memory, five for power delivery, and an additional sensor for GPU temperature. Some of them, including the secondary GPU sensor, are located on the back of the PCB to avoid conflicts with trace routing between critical components.
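
To make the multi-sensor idea concrete, here is a minimal sketch of the kind of logic involved: each fan zone follows the hottest sensor assigned to it rather than the GPU diode alone. The sensor names, zone groupings, and curve breakpoints below are illustrative assumptions, not EVGA's actual firmware.

```python
# Minimal sketch of multi-sensor fan control. Sensor names, zones, and
# curve points are illustrative assumptions, not EVGA's firmware logic.

# Hypothetical readings (degrees C) from the 9 board sensors
sensors = {
    "gpu2": 62.0,
    "mem1": 71.0, "mem2": 69.0, "mem3": 70.0,
    "pwr1": 84.0, "pwr2": 88.0, "pwr3": 86.0, "pwr4": 83.0, "pwr5": 85.0,
}

# Linear fan curve as (temperature C, fan duty %) breakpoints
CURVE = [(40, 30), (60, 45), (80, 70), (95, 100)]

def fan_duty(temp_c):
    """Interpolate a duty cycle from the curve breakpoints."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

# Each fan tracks the hottest sensor in its zone, not just the GPU diode
zones = {
    "gpu_fan": ["gpu2"],
    "pwr_mem_fan": [name for name in sensors if name != "gpu2"],
}
for fan, names in zones.items():
    hottest = max(sensors[n] for n in names)
    print(f"{fan}: hottest sensor {hottest:.0f} C -> {fan_duty(hottest):.0f}% duty")
```

With readings like these, the power/memory fan would spin up to roughly 86% duty even though the GPU itself only warrants about 48%, which is exactly the failure mode a single-sensor curve misses.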

Continue reading about EVGA iCX Technology!

Manufacturer: NVIDIA

NVIDIA P100 comes to Quadro

At the start of the SOLIDWORKS World conference this week, NVIDIA took the cover off of a handful of new Quadro cards targeting professional graphics workloads. Though the bulk of NVIDIA’s discussion covered lower cost options like the Quadro P4000, P2000, and below, the most interesting product sits at the high end, the Quadro GP100.

As you might guess from the name alone, the Quadro GP100 is based on the GP100 GPU, the same silicon used in the Tesla P100 announced back in April of 2016. At the time, the GP100 GPU was specifically billed as an HPC accelerator for servers, with a unique form factor and a passive cooler that required additional chassis fans. Just a couple of months later, a PCIe version of the Tesla P100 was released with similar specifications.

quadro2017-2.jpg

Today that GPU hardware gets a third iteration as the Quadro GP100. Let’s take a look at the Quadro GP100 specifications and how it compares to some recent Quadro offerings.

| | Quadro GP100 | Quadro P6000 | Quadro M6000 | Full GP100 |
|---|---|---|---|---|
| GPU | GP100 | GP102 | GM200 | GP100 (Pascal) |
| SMs | 56 | 60 | 48 | 60 |
| TPCs | 28 | 30 | 24 | (30?) |
| FP32 CUDA Cores / SM | 64 | 64 | 64 | 64 |
| FP32 CUDA Cores / GPU | 3584 | 3840 | 3072 | 3840 |
| FP64 CUDA Cores / SM | 32 | 2 | 2 | 32 |
| FP64 CUDA Cores / GPU | 1792 | 120 | 96 | 1920 |
| Base Clock | 1303 MHz | 1417 MHz | 1026 MHz | TBD |
| GPU Boost Clock | 1442 MHz | 1530 MHz | 1152 MHz | TBD |
| FP32 TFLOPS (SP) | 10.3 | 12.0 | 7.0 | TBD |
| FP64 TFLOPS (DP) | 5.15 | 0.375 | 0.221 | TBD |
| Texture Units | 224 | 240 | 192 | 240 |
| ROPs | 128? | 96 | 96 | 128? |
| Memory Interface | 1.4 Gbps 4096-bit HBM2 | 9 Gbps 384-bit GDDR5X | 6.6 Gbps 384-bit GDDR5 | 4096-bit HBM2 |
| Memory Bandwidth | 716 GB/s | 432 GB/s | 316.8 GB/s | ? |
| Memory Size | 16GB | 24GB | 12GB | 16GB |
| TDP | 235 W | 250 W | 250 W | TBD |
| Transistors | 15.3 billion | 12 billion | 8 billion | 15.3 billion |
| GPU Die Size | 610 mm² | 471 mm² | 601 mm² | 610 mm² |
| Manufacturing Process | 16nm | 16nm | 28nm | 16nm |

There are some interesting stats here that may not be obvious at first glance. Most interesting is that, despite the pricing and segmentation, the GP100 is not automatically the fastest Quadro card from NVIDIA; it depends on your workload. With 3584 CUDA cores running at around 1400 MHz at Boost speeds, the single precision (32-bit) rating for the GP100 is 10.3 TFLOPS, less than the recently released P6000. Based on GP102, the P6000 has 3840 CUDA cores running at around 1500 MHz for a total of 12 TFLOPS.
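
Those TFLOPS ratings fall straight out of the math: each CUDA core can retire one fused multiply-add (two floating-point operations) per clock, so peak throughput is cores × 2 × clock. A quick sanity check against the table (small differences come from NVIDIA rounding its boost clocks):

```python
# Peak throughput = cores x 2 FLOPs per clock (one FMA) x boost clock
def peak_tflops(cores, boost_mhz):
    return cores * 2 * boost_mhz * 1e6 / 1e12

print(f"Quadro GP100 FP32: {peak_tflops(3584, 1442):.1f} TFLOPS")  # ~10.3
print(f"Quadro P6000 FP32: {peak_tflops(3840, 1530):.1f} TFLOPS")  # ~11.8 (quoted as 12.0)
print(f"Quadro GP100 FP64: {peak_tflops(1792, 1442):.2f} TFLOPS")  # ~5.17 (quoted as 5.15)
```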

gp102-blockdiagram.jpg

GP100 (full) Block Diagram

Clearly the placement of the Quadro GP100 is based on its 64-bit, double precision performance and its ability to run real-time simulations on more complex workloads than other Pascal-based Quadro cards can handle. The Quadro GP100 offers a 1/2-rate DP compute ratio, totaling 5.2 TFLOPS. The P6000, on the other hand, is only capable of 0.375 TFLOPS at the standard, consumer-level 1/32 DP rate. The GP100 also includes ECC memory support, something no other recent Quadro card has.

quadro2017-3.jpg

Raw graphics performance and throughput remain an open question until someone does some testing, but it seems likely that the Quadro P6000 will still be the best solution there, by at least a slim margin. With a higher CUDA core count, higher clock speeds, and an equivalent architecture, the P6000 should run games, graphics rendering, and design applications very well.

There are other important differences in the GP100. The memory system is built around a 16GB HBM2 implementation, which means more total memory bandwidth but lower capacity than the 24GB Quadro P6000. That 66% advantage in memory bandwidth gives the GP100 an edge in applications that are pixel-throughput bound, as long as the compute capability keeps up on the back end.
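
The bandwidth figures are similarly mechanical: bandwidth is the per-pin data rate multiplied by the bus width in bytes. A quick check of the two memory systems from the table:

```python
# Bandwidth (GB/s) = data rate per pin (Gbps) x bus width (bits) / 8
def bandwidth_gbs(gbps_per_pin, bus_bits):
    return gbps_per_pin * bus_bits / 8

print(bandwidth_gbs(1.4, 4096))  # Quadro GP100, HBM2:   716.8 GB/s
print(bandwidth_gbs(9.0, 384))   # Quadro P6000, GDDR5X: 432.0 GB/s
```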


Continue reading our preview of the new Quadro GP100!

Gigabyte Shows Off Half Height GTX 1050 and GTX 1050 Ti Graphics Cards

Subject: Graphics Cards | January 18, 2017 - 03:31 AM |
Tagged: SFF, pascal, low profile, GTX 1050 Ti, gtx 1050, gigabyte

Without much fanfare, Gigabyte recently launched two new low-profile, half-height graphics cards packing factory overclocked GTX 1050 and GTX 1050 Ti GPUs. The new cards measure 6.6” x 2.7” x 1.5” (167mm long) and are cooled by a small, shrouded single-fan cooler.
 
Gigabyte GTX 1050 OC Low Profile 2G.png
 
Around back, both the Gigabyte GTX 1050 OC Low Profile 2G and GTX 1050 Ti OC Low Profile 4G offer four display outputs in the form of two HDMI 2.0b, one DisplayPort 1.4, and one dual-link DVI-D. It appears that Gigabyte is using the same cooler for both cards. There is not much information on it, but it utilizes an aluminum heatsink and what looks like a ~50mm fan. Note that while the cards are half-height, they use a dual-slot design, which may limit the cases they can be used in.
 
The GTX 1050 OC Low Profile 2G features 640 Pascal-based CUDA cores clocked at 1366 MHz base and 1468 MHz boost out of the box (1392 MHz base and 1506 MHz boost in OC Mode using Gigabyte’s software) and 2GB of GDDR5 memory at 7008 MHz (7GT/s). For comparison, the GTX 1050 reference clock speeds are 1354 MHz base and 1455 MHz boost.
 
Meanwhile, the GTX 1050 Ti OC Low Profile 4G has 768 cores clocked at 1303 MHz base and 1417 MHz boost by default and 1328 MHz base and 1442 MHz boost in OC Mode. The GPU is paired with 4GB of GDDR5 memory at 7GT/s. NVIDIA’s reference GPU clocks are 1290 MHz base and 1392 MHz boost.
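
Neither card's bus width is listed here, but NVIDIA's reference GTX 1050 and GTX 1050 Ti both use a 128-bit memory interface, so the 7 GT/s GDDR5 works out to the same bandwidth on each. This is a back-of-the-envelope check that assumes Gigabyte keeps the reference bus width:

```python
# Bandwidth = effective data rate (GT/s) x bus width (bits) / 8
# Assumes Gigabyte keeps NVIDIA's reference 128-bit interface on both cards.
print(7.0 * 128 / 8)  # 112.0 GB/s for both the 1050 and the 1050 Ti
```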
 
The pint-sized graphics cards would certainly allow for gaming on your SFF home theater or other desktop PC, as well as being an easy upgrade to make a tiny OEM PC gaming-capable (think of those slim towers HP, Lenovo, and Dell like to use).
 
Of course, Gigabyte is not yet talking pricing, and availability has only been narrowed down to a general Q1 2017 time frame. I would expect the cards to hit retailers within a month or so, priced somewhere around $135 for the half-height GTX 1050 OC LP 2G and approximately $155 for the faster GTX 1050 Ti variant. That is to say, the low-profile cards should command a slight premium over the company's larger GTX 1050 and GTX 1050 Ti graphics cards.
Source: Gigabyte

CES 2017: Gigabyte Shows Off First Aorus Branded Graphics Card

Subject: Graphics Cards | January 11, 2017 - 03:11 AM |
Tagged: CES, CES 2017, aorus, gigabyte, xtreme gaming, GTX 1080, pascal

One interesting development from Gigabyte at this year’s CES was the expansion of its Aorus branding and the transition away from Xtreme Gaming. Initially used on its RGB LED-equipped motherboards, the brand is being rolled out to the company's other higher end products, including laptops and graphics cards. While it appears that Xtreme Gaming is not going away entirely, Aorus is taking the spotlight with the introduction of the first Aorus branded graphics card: the GTX 1080.

Aorus GTX 1080 CES 2017 Pauls Hardware.png

Paul's Hardware got hands-on with the new card (video) at the Gigabyte CES booth.

Featuring a similar triple 100mm fan cooler to the GTX 1080 Xtreme Gaming 8G, the Aorus GTX 1080 comes with X-patterned LED lighting as well as a backlit Aorus logo on the side and a backlit eagle on the backplate. The cooler consists of three 100mm double-stacked fans (the center fan is recessed and spins in the opposite direction of the side fans) over a shrouded, angled aluminum fin stack that connects to the GPU through five large copper heatpipes.

The graphics card is powered by two 8-pin PCI-E power connectors.

In an interesting twist, the card has two HDMI ports on the back that are intended to be used to hook up front panel HDMI outputs for things like VR headsets. Another differentiator between the upcoming card and the Xtreme Gaming 8G is the backplate, which has a large copper plate secured over the underside of the GPU. Several sites are reporting that this area can be used for watercooling, but I am skeptical: if you are going to go out and buy a waterblock for your graphics card, you might as well buy a block to put on top of the GPU, not on the area of the PCB opposite it. As is, the copper plate on the backplate certainly won’t hurt cooling, and it looks cool, but that’s all I suspect it is.

Think Computers also checked out the Aorus graphics card. (video above)

Naturally, Gigabyte is not talking clock speeds on this new card, but I expect it to hit at least the same clocks as its Xtreme Gaming 8G predecessor, which was clocked at 1759 MHz base and 1848 MHz boost out of the box, and 1784 MHz base and 1936 MHz boost in OC Mode. Gigabyte also overclocked the memory on that card, up to 10400 MHz in OC Mode.

Gigabyte also had new SLI HB bridges on display bearing the Aorus logo to match the Aorus GPU. The company showed Xtreme Gaming SLI HB bridges as well, though, which further suggests that it is not completely retiring that branding (at least not yet).

Pricing has not been announced, but the card will be available in February.

Gigabyte has yet to release official photos of the card or a product page, but it should show up on the company's website shortly. In the meantime, Paul's Hardware and Think Computers shot some video of the card on the show floor, which I have linked above if you are interested. Looking on Amazon, the Xtreme Gaming 1080 8GB is approximately $690 before rebate, so I would guess that the Aorus card will come out at a slight premium over that, if only because it is a newer release with a more expensive backplate and additional RGB LED backlighting.

What are your thoughts on the move to everything-Aorus? 

Coverage of CES 2017 is brought to you by NVIDIA!

Follow all of our coverage of the show at https://pcper.com/ces!

Serious Sam VR, now with tag teaming NVIDIA cards

Subject: General Tech | November 30, 2016 - 08:31 PM |
Tagged: serious sam vr, nvidia, gaming, pascal

Having already looked at AMD's performance with two RX 480s in a system, the recent patch enabling support for multiple NVIDIA GPUs has dragged [H]ard|OCP back into the game.  Lacking a pair of Titan X cards, they tested the performance of a pair of GTX 1080s and 1070s; the GTX 1060 will not be receiving support from Croteam.  It would seem that adding a second Pascal card to your system will benefit you; however, the scaling they saw was nowhere near as impressive as with the AMD RX 480s, which saw a 36% boost.  Check out the full results here, and yes ... in this case the m in mGPU indicates multiple GPUs, not mobile.

1480447611nZ8LrzvbVG_5_1.jpg

"Serious Sam VR was the first commercial enthusiast gaming title to include multi-GPU support with AMD's RX 480 GPU. Now the folks at Croteam have added mGPU support for NVIDIA cards as well. We take a look at how well NVIDIA's VRSLI technology fares in this VR shooter title."

Source: [H]ard|OCP
Manufacturer: NVIDIA

A Holiday Project

A couple of years ago, I performed an experiment around the GeForce GTX 750 Ti graphics card to see if we could upgrade basic OEM, off-the-shelf computers to become competent gaming PCs. The key to this potential upgrade was that the GTX 750 Ti offered a great amount of GPU horsepower (at the time) without the need for an external power connector. Lower power requirements on the GPU meant that even the most basic of OEM power supplies should be able to do the job.

That story was a success, both in terms of the result in gaming performance and the positive feedback it received. Today, I am attempting to do that same thing but with a new class of GPU and a new class of PC games.

The goal for today’s experiment remains pretty much the same: can a low-cost, low-power GeForce GTX 1050 Ti graphics card that also does not require any external power connector offer enough gaming horsepower to upgrade current shipping OEM PCs to "gaming PC" status?

Our target PCs for today come from Dell and ASUS. I went into my local Best Buy just before the Thanksgiving holiday and looked for two machines that varied in price and relative performance.

01.jpg

| | Dell Inspiron 3650 | ASUS M32CD-B09 |
|---|---|---|
| Processor | Intel Core i3-6100 | Intel Core i7-6700 |
| Motherboard | Custom | Custom |
| Memory | 8GB DDR4 | 12GB DDR4 |
| Graphics Card | Intel HD Graphics 530 | Intel HD Graphics 530 |
| Storage | 1TB HDD | 1TB Hybrid HDD |
| Case | Custom | Custom |
| Power Supply | 240 watt | 350 watt |
| OS | Windows 10 64-bit | Windows 10 64-bit |
| Total Price | $429 (Best Buy) | $749 (Best Buy) |

The specifications of these two machines are relatively modern for OEM computers. The Dell Inspiron 3650 uses a modest dual-core Core i3-6100 processor with a fixed clock speed of 3.7 GHz, a 1TB standard hard drive, and a 240 watt power supply. The ASUS M32CD-B09 has a quad-core, Hyper-Threaded Core i7-6700 with a 4.0 GHz maximum Turbo clock, a 1TB hybrid hard drive, and a 350 watt power supply. Both CPUs share the same Intel integrated graphics, the HD Graphics 530. You’ll see in our testing that not only is this integrated GPU unqualified for modern PC gaming, but it also performs quite differently based on the CPU it is paired with.
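
A rough power budget explains why this experiment is plausible at all: a card with no external power connector can draw at most 75W from the PCIe slot, which even the Dell's 240 watt supply should absorb. The system-draw figures below are ballpark assumptions for illustration, not measurements:

```python
# Back-of-the-envelope PSU budget for the Dell Inspiron 3650 (estimates only)
psu_watts       = 240  # Dell's stock power supply
pcie_slot_limit = 75   # max draw for a card with no external power connector
cpu_tdp         = 51   # Intel's rated TDP for the Core i3-6100
est_rest_of_sys = 40   # assumed motherboard, RAM, HDD, and fans under load

headroom = psu_watts - (pcie_slot_limit + cpu_tdp + est_rest_of_sys)
print(f"Worst-case headroom: ~{headroom} W")  # ~74 W to spare
```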

Continue reading our look at upgrading an OEM machine with the GTX 1050 Ti!!

Manufacturer: MSI

We have a lot of gaming notebooks

Back in April I did a video with MSI that looked at all of the gaming notebook lines it built around the GTX 900-series of GPUs. Today we have stepped it up a notch, and again are giving you an overview of MSI's gaming notebook lines that now feature the ultra-powerful GTX 10-series using NVIDIA's Pascal architecture. That includes the GTX 1060, GTX 1070 and GTX 1080.

What differentiates the various series of notebooks from MSI? The GE series is for entry-level notebook gaming, the GS series offers slim options, and the GT series is the ultimate mobile PC gaming platform.

| | GE series | GS series | GT62/72 series | GT73/83 series |
|---|---|---|---|---|
| MSRP | $1549-1749 | $1499-2099 | $1499-2599 | $2199-4999 |
| Screen | 15.6" and 17.3", 1080p | 14", 15.6" and 17.3", 1080p and 4K | 15.6" and 17.3", 1080p, G-Sync | 17.3" and 18", 1080p, 4K, G-Sync (varies) |
| CPU | Core i7-6700HQ | Core i7-6700HQ | Core i7-6700HQ | Core i7-6820HK, Core i7-6920HQ |
| GPU | GTX 1060 6GB | GTX 1060 6GB | GTX 1060 6GB, GTX 1070 8GB | GTX 1070 8GB (SLI option), GTX 1080 8GB (SLI option) |
| RAM | 12-16GB | 16-32GB | 12-32GB | 16-64GB |
| Storage | 128-512GB M.2 SATA, 1TB HDD | 128-512GB M.2 SATA, 1TB HDD | 128-512GB PCIe and SATA, 1TB HDD | Up to 1TB SSD (SATA, NVMe), 1TB HDD |
| Optical | DVD Super-multi | None | Yes (GT72 only) | Blu-ray burner (GT83 only) |
| Features | Killer E2400 LAN, USB 3.1 Type-C, SteelSeries RGB keyboard | Killer E2400 LAN, Killer 1535 WiFi, Thunderbolt 3 | Killer E2400 LAN, Killer 1535 WiFi, USB 3.1 Type-C, 3x USB 3.0 | Killer E2400 LAN, Killer 1535 WiFi, Thunderbolt 3, 5x USB 3.0, SteelSeries RGB keyboard (GT73), mechanical keyboard (GT83) |
| Weight | 5.29-5.35 lbs | 3.75-5.35 lbs | 6.48-8.33 lbs | 8.59-11.59 lbs |

Our video below will break down the differences and help point you toward the right notebook for you based on the three key pillars of performance, price and form factor.

Thanks goes out to CUK, Computer Upgrade King, for supplying the 9 different MSI notebooks for our testing and evaluation!

The down and dirty on the hot and bothered ACX 3.0 cards

Subject: Graphics Cards | November 2, 2016 - 11:10 PM |
Tagged: pascal, nvidia, GTX1070, GTX1060, GTX 1080, fail, evga, ACX 3.0

models.PNG

Checklist time readers, do you have the following:

  • A GTX 1060/1070/1080
  • Which is from EVGA
  • With an ACX 3.0 cooler
  • With one of the model numbers above

If not, make like Bobby McFerrin.

If so, you have a reason to be concerned, and EVGA offers its apologies and, more importantly, a fix.  EVGA's tests, which emulate the ones performed at Tom's Hardware, show that the temperatures of the PWM and memory were only marginally within spec.  That is a fancy way of saying that in certain circumstances the PWM was running just short of causing a critical thermal incident, also known as catching on fire and letting out the magic smoke.  EVGA claims this happened because its testing focused on GPU temperature and the lowest acoustic levels possible and did not involve measuring the heat produced by the memory or the VRM, which is, as they say, a problem.

Nvidia-GTX-1080-EVGA-FTW-Catches-Fire-840x473.jpg

You have several choices of remedy from EVGA; please remember that you should reach out directly to EVGA support, not NVIDIA's.  You can try requesting a refund from the store where you purchased the card, but your best bet is EVGA.

The first option is a cross-ship RMA.  Contact EVGA as a guest or with your account to set up an RMA and they will ship you a replacement card with a new VBIOS that does not have this issue; you won't need to send yours back until the replacement arrives.

You can also flash to the new VBIOS yourself, which adjusts the fan-speed curve to ensure that your fans run above 30% and provide sufficient cooling to the portions of the card beyond the GPU.  Your card will be louder, but it will also be less likely to commit suicide in a dramatic fashion.
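
To illustrate what that VBIOS change amounts to (the real curve is EVGA's and has not been published; the breakpoints here are made up), the update effectively clamps the fan's minimum duty cycle so the VRM and memory get airflow even when the GPU diode reads cool:

```python
# Illustrative only: the updated VBIOS enforces a fan-speed floor so the
# VRM/memory are cooled even when the GPU itself is running cool.
def fan_duty(gpu_temp_c, min_duty):
    # Hypothetical linear curve from 40 C to 85 C; EVGA's real curve differs
    duty = (gpu_temp_c - 40) / (85 - 40) * 100
    return max(min_duty, min(100, duty))

for temp in (30, 50, 70, 90):
    old = fan_duty(temp, 0)    # old behavior: fans could idle near 0%
    new = fan_duty(temp, 30)   # updated VBIOS: never below ~30%
    print(f"{temp} C: old {old:.0f}% -> new {new:.0f}%")
```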

Lastly, you can request a thermal pad kit, which EVGA suggests is unnecessary but which certainly sounds like a good idea, especially as it is free, although it requires signing up for an EVGA account.  Hopefully, in the spare seconds currently available to the team, we can get our hands on an ACX 3.0-cooled Pascal card with the VBIOS update and thermal pads so we can verify this for you.

This issue should not have happened, and it does reflect badly on certain aspects of EVGA's testing.  On the other hand, the company's response has been very appropriate: if you are affected, you can get a replacement card with no issues or fix the problem yourself.  Any cards shipped, though not necessarily purchased, after Nov. 1st will have the new VBIOS, so check which one you have if you are picking up a new EVGA Pascal card.

Source: EVGA

Pascal drops some weight; GTX 1050 and 1050 Ti fight in the under $150 category

Subject: Graphics Cards | October 25, 2016 - 05:21 PM |
Tagged: pascal, nvidia, msi, GTX 1050 Ti, gtx 1050, GP107

The Guru of 3D tested out MSI's GeForce GTX 1050 and 1050 Ti, with MSRPs of $109 and $139 respectively.  The non-Ti version has the lowest texture mapping unit count of this generation but a higher GPU frequency than the Ti model; it also has the smallest amount of memory at 2GB, though at least the memory is clocked the same in both models.  DirectX 12 testing offers variable results: in many games the two cards bookend the RX 460, with the GTX 1050 a bit slower and the 1050 Ti a bit faster, but this does not hold true in all titles.  DirectX 11 results were more favourable for this architecture; the two cards climbed in the rankings, with the 1050 Ti offering acceptable performance.  Check out their full review here.

specs.PNG

"Last week Nvidia announced the GeForce GTX 1050 series, with two primary models. In this article we'll review the MSI GeForce GTX 1050 and 1050 Ti Gaming X, two graphics cards aimed at the budget minded consumer. We say budget minded as these cards are very affordable and positioned in an attractive 109 and 139 dollar (US) segment."

Source: Guru of 3D

NVIDIA GTX 1050 and GTX 1050 Ti Launch on October 25th

Subject: General Tech | October 18, 2016 - 01:01 PM |
Tagged: pascal, nvidia, GTX 1050 Ti, gtx 1050

NVIDIA has just announced that the GeForce GTX 1050 ($109) and GeForce GTX 1050 Ti ($139) will launch on October 25th. Both of these Pascal-based cards target the 75W thermal point, which allows them to be powered by the PCIe bus without being tethered directly to the power supply. Like the GTX 750 Ti before them, this allows users to drop one into many existing desktops, upgrading them with discrete graphics.

nvidia-2016-gtx1050-01-bench.jpg

Most of NVIDIA's press deck focuses on the standard GTX 1050. This $109 SKU contains 2GB of GDDR5 memory and 640 CUDA cores, although the core frequency has not been announced at the time of writing. Instead, NVIDIA has provided a handful of benchmarks, comparing the GTX 1050 to the earlier GTX 650 and the Intel Core i5-4760k integrated graphics.

nvidia-2016-gtx1050-02-gfe.jpg

It should be noted that, to hit their >60FPS targets, Gears of War 4 and Grand Theft Auto V needed to be run at medium settings, and Overwatch was set to high. (DOTA2 and World of Warcraft were maxed out, though.) As you might expect, NVIDIA reminded the press about GeForce Experience's game optimization setting just a few slides later. The implication seems to be that, while it cannot max out these games at 1080p, NVIDIA will at least make it easy for users to experience its best-case scenario, while maintaining 60FPS.

So yes, while it's easy to claim 60 FPS if you're able to choose the settings that fit this goal, it's a much better experience than the alternative parts they list. On the GTX 650, none of these titles are able to hit an average of 30 FPS, and integrated graphics cannot even hit 15 FPS. This card seems to be intended for users who are interested in playing eSports titles maxed out at 1080p60, while enjoying newer blockbusters, albeit at reduced settings, but who have an old, non-gaming machine they can salvage.

nvidia-2016-gtx1050-03withti.jpg

Near the end of their slide deck, they also mention that the GTX 1050 Ti exists. It's basically the same use case as above, with its 75W TDP and all, but with $30 more performance. The VRAM doubles from 2GB to 4GB, which should allow higher texture resolutions and more mods, albeit still targeting 1080p. It also adds another 128 CUDA cores, a 20% increase, although, again, that is somewhat meaningless until we find out what the card is clocked at.

Update: Turns out we did find clock speeds! The GTX 1050 will have a base clock of 1354 MHz and a Boost clock of 1455 MHz while the GTX 1050 Ti will run at 1290/1392 MHz respectively.
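
With those clocks, the paper math (cores × 2 FLOPs per clock × boost clock) suggests the Ti's 20% core advantage nets out to roughly 15% more theoretical throughput once its lower clocks are factored in:

```python
# Peak FP32 throughput = cores x 2 FLOPs per clock (one FMA) x boost clock
def peak_tflops(cores, boost_mhz):
    return cores * 2 * boost_mhz * 1e6 / 1e12

gtx_1050    = peak_tflops(640, 1455)  # ~1.86 TFLOPS
gtx_1050_ti = peak_tflops(768, 1392)  # ~2.14 TFLOPS
print(f"1050: {gtx_1050:.2f}, 1050 Ti: {gtx_1050_ti:.2f}, "
      f"delta: {gtx_1050_ti / gtx_1050 - 1:.0%}")  # ~+15%
```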

NVIDIA's promotional video

Obviously, numbers from a vendor are one thing, and a third-party benchmark is something else entirely (especially when the vendor benchmarks do not compare their product to the latest generation of their competitor). Keep an eye out for reviews.

Source: NVIDIA