A good year to sell GPUs

Subject: General Tech | February 21, 2017 - 01:18 PM |
Tagged: jon peddie, marketshare, graphics cards

The GPU market grew 5.6% from Q3 to Q4 of 2016, handily beating the historical average of -4.7% for that quarter; over the full year, shipments increased 21.1%.  That increase is even more impressive when you consider that the total PC market shrank 10.1% over the same period, suggesting that far more consumers chose to upgrade their existing machines instead of buying new ones.  This makes sense, as neither Intel nor AMD offered a compelling reason to replace a processor and motherboard purchased in the last two or three years.

AMD saw a nice amount of growth, grabbing almost 8% of the total market from NVIDIA over the year, though it lost a tiny bit of ground between Q3 and Q4 of 2016.  Jon Peddie's sample includes workstation-class GPUs as well as gaming models, and a fair number of users chose to upgrade those machines too, as the workstation segment grew just over 19% in 2016.


"The graphics add-in board market has defied gravity for over a year now, showing gains while the overall PC market slips. The silly notion of integrated graphics "catching up" with discrete will hopefully be put to rest now," said Dr. Jon Peddie, president of Jon Peddie Research, the industry's research and consulting firm for graphics and multimedia.

Here is some more Tech News from around the web:

Tech Talk

Faster than a speeding Gigabyte, the Aorus GTX 1080 XE

Subject: Graphics Cards | February 20, 2017 - 02:54 PM |
Tagged: nvidia, gtx 1080 Xtreme Edition, GTX 1080, gigabyte, aorus

Gigabyte created its Aorus line of products to attract enthusiasts away from some of the competition's sub-brands, such as ASUS ROG.  This card is somewhat similar to the Gigabyte Xtreme Edition released last year, but there are some differences, such as the large copper heatsink attached to the underside of the GPU.  The stated clock speeds are the same as last year's model, and it also sports the two HDMI connections on the front of the card to connect to Gigabyte's VR Extended Front panel.  The Tech Report manually overclocked the card and saw the Aorus reach the highest frequencies they have seen from a GP104 chip, albeit by a small margin.  Check out the full review right here.


"Aorus is expanding into graphics cards today with the GeForce GTX 1080 Xtreme Edition 8G, a card that builds on the strong bones of Gigabyte's Editor's Choice-winning GTX 1080 Xtreme Gaming. We dig in to see whether Aorus' take on a GTX 1080 is good enough for a repeat."

Here are some more Graphics Card articles from around the web:

Graphics Cards

NVIDIA Releases GeForce 378.72 Hotfix (Bonus: a Discussion)

Subject: Graphics Cards | February 17, 2017 - 07:42 AM |
Tagged: nvidia, graphics drivers

Just a couple of days after publishing 378.66, NVIDIA released the GeForce 378.72 Hotfix drivers. These fix a bug with video encoding in Steam’s In-Home Streaming, as well as PhysX not being enabled on the GPU under certain conditions. Normally, hotfix drivers solve large-enough issues that were introduced with the previous release. This time, as far as I can tell, is a little different: these fixes seem to have been intended for 378.66 but, for one reason or another, couldn’t be integrated and tested in time for the driver to be available for the game launches.


This is an interesting effect of the Game Ready program. There is value in having a graphics driver available on the same day (or early) as a major game releases, so that people can enjoy the title as soon as it is available. There is also value in having as many fixes as the vendor can provide. These conditions oppose each other to some extent.

From a user standpoint, driver updates are cumulative, so users can skip a driver or two if they are not affected by any given issue. AMD has adopted a similar structure, sometimes releasing three or four drivers in a month with perhaps only one of them WHQL-certified. For these reasons, I tend to lean on the side of “release ‘em as you got them”. Still, I can see people feeling a little uneasy about a driver being released incomplete to hit a due date.

But, again, that due date has value.

It’s interesting. I’m personally glad that AMD and NVIDIA are on a rapid-release schedule, but I can see where complaints could arise. What’s your opinion?

Source: NVIDIA

MSI's wee AERO ITX family of NVIDIA graphics cards

Subject: Graphics Cards | February 16, 2017 - 03:35 PM |
Tagged: msi, AERO ITX, gtx 1070, gtx 1060, gtx 1050, GTX 1050 Ti, SFF, itx

MSI has just released its new series of ITX-compatible GPUs, covering NVIDIA's latest series of cards from the GTX 1050 through to the GTX 1070; the GTX 1080 is not available in this form factor.  The GTX 1070 and 1060 are available in both factory-overclocked and standard versions.


All models share a similar design, with a single TORX fan with 8mm Super Pipes and the Zero Frozr feature, which stops the fan for silent operation when temperatures are below 60°C.  They are all compatible with the Afterburner overclocking utility, including recording via Predator and wireless control from your phone.

The overclocked cards run slightly over reference, from the GTX 1070 at 1721MHz boost and 1531MHz base with the GDDR5 at 8GHz, down to the GTX 1050 at 1518MHz boost and 1404MHz base with the GDDR5 at 7GHz.  The models which do not bear the OC moniker run at NVIDIA's reference clocks, even if they are not quite fully grown.

Source: MSI

NVIDIA Releases GeForce 378.66 Drivers with New Features

Subject: Graphics Cards | February 14, 2017 - 09:29 PM |
Tagged: opencl 2.0, opencl, nvidia, graphics drivers

While the headline of the GeForce 378.66 graphics driver release is support for For Honor, Halo Wars 2, and Sniper Elite 4, NVIDIA has snuck something major into the 378 branch: OpenCL 2.0 is now available for evaluation. (I double-checked 378.49 release notes and confirmed that this is new to 378.66.)


OpenCL 2.0 support is not complete yet, but at least NVIDIA is now clearly intending to roll it out to end-users. Among other benefits, OpenCL 2.0 allows kernels (think shaders) to, without the host intervening, enqueue work onto the GPU. This saves one (or more) round-trips to the CPU, especially in workloads where you don’t know which kernel will be required until you see the results of the previous run, like recursive sorting algorithms.
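To make the round-trip saving concrete, here is a toy Python model (hypothetical bookkeeping, not real OpenCL code) of a data-dependent recursive workload: without device-side enqueue the host must dispatch a kernel and read results back at every level of the recursion, while with OpenCL 2.0-style device enqueue it only pays for the initial dispatch.

```python
# Toy model (hypothetical bookkeeping, not a real OpenCL API): count
# host<->GPU round-trips for a data-dependent recursive workload,
# e.g. a recursive sort that halves its input each pass.

def run_without_device_enqueue(n, round_trips=0):
    """OpenCL 1.x style: the host dispatches each kernel, then reads
    the result back to decide whether another launch is needed."""
    round_trips += 1                # one host dispatch + readback
    if n > 1:                       # result shows more work remains
        return run_without_device_enqueue(n // 2, round_trips)
    return round_trips

def run_with_device_enqueue(n):
    """OpenCL 2.0 style: the first kernel enqueues its own children
    on the GPU, so the host only pays for the initial dispatch."""
    host_round_trips = 1            # single host dispatch
    while n > 1:                    # "recursion" happens on-device
        n //= 2
    return host_round_trips

print(run_without_device_enqueue(1024))  # 11 round-trips
print(run_with_device_enqueue(1024))     # 1 round-trip
```

The exact counts depend on the workload, of course; the point is that host involvement drops from once per recursion level to once overall.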

So yeah, that’s good, although you usually see big changes like this at the start of a version branch.

Another major addition is Video SDK 8.0. This version allows 10- and 12-bit decoding of VP9 and HEVC video. So... yeah. Applications that want to accelerate video encoding or decoding can now hook up to NVIDIA GPUs for more codecs and features.

NVIDIA’s GeForce 378.66 drivers are available now.

Source: NVIDIA

AMD Releases Radeon Software Crimson ReLive 17.2.1

Subject: Graphics Cards | February 14, 2017 - 05:57 PM |
Tagged: amd, graphics drivers

Just in time for For Honor and Sniper Elite 4, AMD has released a new set of graphics drivers, Radeon Software Crimson ReLive 17.2.1, that targets these games. The performance improvements they quote are in the 4-5% range compared to their previous driver on the RX 480, which works out to saving most of a millisecond per frame at 60 FPS. (This is just for mathematical reference; I don’t know what performance users should expect with an RX 480.)
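As a sanity check on that framing (my own arithmetic, assuming the 4-5% figure refers to frame rate at a 60 FPS baseline):

```python
# Frame-time saved by a given FPS gain at a 60 FPS baseline.
base_fps = 60.0
base_frame_time_ms = 1000.0 / base_fps                 # ~16.67 ms

for gain in (0.04, 0.05):
    new_frame_time_ms = 1000.0 / (base_fps * (1 + gain))
    saved_ms = base_frame_time_ms - new_frame_time_ms
    print(f"{gain:.0%} faster -> {saved_ms:.2f} ms saved per frame")
# prints roughly 0.64 ms for 4% and 0.79 ms for 5%
```

Strictly speaking, 4-5% works out to about 0.6-0.8 ms per frame; a full millisecond at 60 FPS would correspond to roughly a 6% gain.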


Beyond driver overhead improvements, you will now be able to utilize multiple GPUs in CrossFire (for DirectX 11) on both titles.

Also, several issues have been fixed with this version. If you have a FreeSync monitor, and some games fail to activate variable refresh mode, then this driver might solve this problem for you. Scrubbing through some videos (DXVA H.264) should no longer cause visible corruption. A couple applications, like GRID and DayZ, should no longer crash under certain situations. You get the idea.

If you have an AMD GPU on Windows, pick up these drivers from their support page.

Source: AMD

New graphics drivers? Fine, back to benchmarking.

Subject: Graphics Cards | February 9, 2017 - 02:46 PM |
Tagged: amd, nvidia

New graphics drivers are a boon to everyone who isn't a hardware reviewer, especially one who has just wrapped up benchmarking a new card the same day a driver is released.  To see what changes AMD and NVIDIA have implemented in their last few releases, [H]ard|OCP tested a slew of recent drivers from both companies.  The performance of AMD's past releases, up to and including the AMD Crimson ReLive Edition 17.1.1 Beta, can be found here.  For NVIDIA users, recent drivers covering up to the 378.57 Beta Hotfix are right here.  The tests show both companies generally increasing the performance of their drivers; however, the changes are so small that you are unlikely to notice a difference.


"We take the AMD Radeon R9 Fury X and AMD Radeon RX 480 for a ride in 11 games using drivers from the time of each video card’s launch date, to the latest AMD Radeon Software Crimson ReLive Edition 17.1.1 Beta driver. We will see how performance in old and newer games has changed over the course of 2015-2017 with new drivers. "

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Palit Introduces Fanless GeForce GTX 1050 Ti KalmX GPU

Subject: Graphics Cards | February 6, 2017 - 11:43 AM |
Tagged: video card, silent, Passive, palit, nvidia, KalmX, GTX 1050 Ti, graphics card, gpu, geforce

Palit is offering a passively-cooled GTX 1050 Ti option with their new KalmX card, which features a large heatsink and (of course) zero fan noise.


"With passive cooler and the advanced powerful Pascal architecture, Palit GeForce GTX 1050 Ti KalmX - pursue the silent 0dB gaming environment. Palit GeForce GTX 1050 Ti gives you the gaming horsepower to take on today’s most demanding titles in full 1080p HD @ 60 FPS."


The specs are identical to a reference GTX 1050 Ti (4GB GDDR5 @ 7 Gb/s, Base 1290/Boost 1392 MHz, etc.), so expect the full performance of this GPU - with some moderate case airflow, no doubt.


We don't have specifics on pricing or availability just yet.

Source: Palit

Micron Planning To Launch GDDR6 Graphics Memory In 2017

Subject: Graphics Cards | February 4, 2017 - 03:29 PM |
Tagged: micron, graphics memory, gddr6

This year is shaping up to be a good one for memory, with the promise of 3D XPoint (Intel/Micron), HBM2 (SK Hynix and Samsung), and now GDDR6 graphics memory from Micron. While GDDR6 was originally planned to launch next year, Micron recently announced its intention to start producing the memory chips in the latter half of 2017, much earlier than previously expected.


Computer World reports that Micron cites the rise of e-sports and gaming, which it says is driving the computer market toward three-year upgrade cycles rather than five-year ones (I am not sure how accurate that is, as PCs seem to be staying relevant longer between upgrades, but I digress), as the primary reason for shifting GDDR6 production into high gear and moving up the launch window. The company expects the e-sports market to grow to 500 million fans by 2020, and it is a growing market that Micron wants to stay relevant in.

If you missed our previous coverage, GDDR6 is the successor to GDDR5 and offers twice the bandwidth at 16 Gb/s (gigabits per second) per pin. It is also faster than GDDR5X (12 Gb/s) and uses 20% less power, which the gaming laptop market will appreciate. HBM2 still holds the bandwidth crown, though, as it offers 256 GB/s per stack and up to 1 TB/s with four stacks connected to a GPU on-package.

As such, High Bandwidth Memory (HBM2 and then HBM3) will power high-end gaming and professional graphics cards, GDDR6 will become the memory used for mid-range cards, and GDDR5X (which is capable of going faster, but will likely not be pushed much past 12 Gb/s if GDDR6 does come out this soon) will replace GDDR5 on most if not all lower-end products.
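To put those per-pin rates in card-level terms, here is a rough sketch assuming a 256-bit bus (an illustrative assumption on my part; actual bus widths vary by product):

```python
# Aggregate bandwidth = per-pin data rate (Gb/s) x bus width / 8.
def bandwidth_gbps(per_pin_gbit_s, bus_width_bits):
    """Return total memory bandwidth in GB/s."""
    return per_pin_gbit_s * bus_width_bits / 8

BUS = 256  # bits; an assumed typical bus width, varies by card
print(bandwidth_gbps(8,  BUS))   # GDDR5  @ 8 Gb/s  -> 256.0 GB/s
print(bandwidth_gbps(12, BUS))   # GDDR5X @ 12 Gb/s -> 384.0 GB/s
print(bandwidth_gbps(16, BUS))   # GDDR6  @ 16 Gb/s -> 512.0 GB/s

# HBM2 for comparison: 256 GB/s per stack, up to four stacks.
print(256 * 4)                   # -> 1024 GB/s, i.e. ~1 TB/s
```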

I am not sure whether Micron’s reasoning of e-sports, faster upgrade cycles, and VR as the motivating factor(s) for ramping up production early is sound, but I will certainly take the faster memory coming out sooner rather than later! Depending on exactly when in 2017 the chips start rolling off the fabs, we could see graphics cards using the new memory technology as soon as early 2018 (just in time for CES announcements? oh boy I can see the PR flooding in already! hehe).

Will Samsung change course and try for a 2017 release of its GDDR6 memory as well?

Are you ready for GDDR6?

NVIDIA Releases Vulkan Developer 376.80 Beta Drivers

Subject: Graphics Cards | February 3, 2017 - 05:58 PM |
Tagged: nvidia, graphics drivers, vulkan

On February 1st, NVIDIA released a new developer beta driver that fixes a couple of issues with their Vulkan API implementation. Despite what some sites have been reporting, you should not download it to play games that use the Vulkan API, like DOOM. In short, it is designed for developers, not end users. The goal is to provide correct results when software interacts with the driver, not the best gaming performance or anything like that.


In a little more detail, it looks like 376.80 implements the Vulkan 1.0.39.1 SDK. This update addresses two issues with accessing devices and extensions, under certain conditions, when using the 1.0.39.0 SDK. 1.0.39.0 was released on January 23rd, and thus it will not even be a part of current video games. Even worse, this driver, like most graphics drivers for software developers, is based on the older GeForce 376 branch, so it won’t even have NVIDIA’s most recent fixes and optimizations. NVIDIA does this so they can add or change the features that Vulkan developers require without needing to roll in patches every time they make a "Game Ready" optimization or something. There is no reason to use this driver unless you are developing Vulkan applications and want to try out the new extensions. It will eventually make it to end users... when it's time.

If you wish to develop software using Vulkan’s bleeding-edge features, then check out NVIDIA’s developer portal to pick up the latest drivers. Basically everyone else should use 378.49 or its 378.57 hotfix.

Source: NVIDIA

Radeon Software Crimson ReLive 17.1.2 Drivers Released

Subject: Graphics Cards | February 2, 2017 - 07:02 AM |
Tagged: graphics drivers, amd

A few days ago, AMD released their second graphics driver of January 2017: Radeon Software Crimson ReLive 17.1.2. The main goal of these drivers is to support the early access of Conan Exiles as well as tomorrow’s closed beta for Tom Clancy’s Ghost Recon Wildlands. Optimizations that AMD had been working on prior to release, for either game, are included in this version.


Beyond game-specific optimizations, a handful of bugs are also fixed, ranging from crashes to rendering artifacts. There was also an issue with configuring WattMan on a system with multiple monitors, where the memory clock would drop or bounce around. The driver also has a bunch of known issues, including a couple of hangs and crashes under certain situations.

Radeon Software Crimson ReLive Edition 17.1.2 is available at AMD’s website.

Source: AMD

NVIDIA Releases GeForce 378.57 Hotfix Drivers

Subject: Graphics Cards | February 2, 2017 - 07:01 AM |
Tagged: nvidia, graphics drivers

If you were having issues with Minecraft on NVIDIA’s recent 378.49 drivers, then you probably want to try out their latest hotfix. This version, numbered 378.57, will not be pushed out through GeForce Experience, so you will need to grab it from NVIDIA’s customer support page.


Beyond Minecraft, this also fixes an issue with “debug mode”. For some Pascal-based graphics cards, the option in NVIDIA Control Panel > Help > Debug Mode might be on by default. This option reduces factory-overclocked GPUs down to NVIDIA’s reference speeds, which is useful for eliminating stability issues in testing, but pointlessly slow if you’re already stable. I mean, you bought the factory overclock, right? I’m guessing someone at NVIDIA used it to test 378.49 during its development, fixed an issue, and accidentally committed the config file with the rest of the fix. Either way, someone caught it, and it’s now fixed, even though you should be able to just untick the option if you have a factory-overclocked GPU.

Source: NVIDIA

Win our RX 460 Budget Gaming System!!

Subject: Graphics Cards | January 31, 2017 - 11:18 AM |
Tagged: rx 460, radeon, giveaway, contest, buildapc, amd

As part of our partnership with AMD to take a look at the Radeon RX 460 as a budget gaming graphics solution, we are giving away the computer we built for our testing. If you missed our previous stories, shame on you. Check them out here:

Check out the embedded block below to see how you can win our system. It is a global giveaway, so feel free to enter no matter where you live! Thanks again to AMD for providing the hardware for this build!

Radeon RX 460 Budget System Giveaway (sponsored by AMD)

Source: AMD

DirectX Intermediate Language Announced... via GitHub

Subject: Graphics Cards | January 27, 2017 - 09:19 PM |
Tagged: microsoft, DirectX, llvm, dxil, spir-v, vulkan

Over the holidays, Microsoft published the DirectX Shader Compiler on GitHub. The interesting part is that it compiles HLSL into DirectX Intermediate Language (DXIL) bytecode, which can be ingested by GPU drivers and executed on graphics devices. Why is that interesting? DXIL is based on LLVM, which might sound familiar if you have been following along with The Khronos Group and their announcements regarding Vulkan, OpenCL, and SPIR-V.

As it turns out, they were on to something, and Microsoft is working on a DirectX analogue of it.


The main advantage of LLVM-based bytecode is that you can eventually support multiple languages (and the libraries of code developed in them). When SPIR-V was announced with Vulkan, the first thing that came to my mind was compiling to it from HLSL, which would be useful for existing engines, as they are typically written in HLSL and transpiled to the target platform when used outside of DirectX (like GLSL for OpenGL). So, in Microsoft’s case, it makes sense that they start there (since they own the language), but I doubt that is the end goal. The most seductive outcome for game engine developers would be single-source C++, but there are a lot of steps between here and there.

Another advantage, albeit to a lesser extent, is that you might be able to benefit from performance optimizations, both on the LLVM / language side as well as on the driver’s side.

According to the readme, the minimum supported target will be HLSL Shader Model 6. This is the most recent shading model, and it introduces some interesting instructions, typically for GPGPU applications, that allow multiple GPU threads to interact, like balloting. Ironically, while DirectCompute and C++ AMP don’t seem to be too popular, this would nudge DirectX 12 toward being a fairly competent GPU compute API.

DXIL support is limited to Windows 10 Build 15007 and later, so you will need to either switch one (or more) workstation(s) to Insider, or wait until it launches with the Creators Update (unless something surprising holds it back).

NVIDIA Releases GeForce 378.49 Drivers

Subject: Graphics Cards | January 26, 2017 - 09:38 PM |
Tagged: nvidia, graphics drivers

Update: There are multiple issues being raised in our comments, including a Steam post by Sam Lantinga (Valve) about this driver breaking In-Home Streaming. Other complaints include certain applications crashing and hardware acceleration issues.

Original Post Below

Now that the holidays are over, we’re ready for the late-winter rush of “AAA” video games. Three of them, Resident Evil VII, the early access of Conan Exiles, and the closed beta of For Honor, are targeted by NVIDIA’s GeForce 378.49 Game Ready drivers. Unless we get a non-Game Ready driver in the interim, I am guessing that this will cover us until mid-February, before the full release of For Honor alongside Sniper Elite 4, followed by Halo Wars 2 the next week.


Beyond game-specific updates, the 378-branch of drivers includes a bunch of SLI profiles, including Battlefield 1. It also paves the way for GTX 1050- and GTX 1050 Ti-based notebooks; this is their launch driver whenever OEMs begin to ship the laptops they announced at CES.

This release also contains a bunch of bug fixes (pdf), including a reboot bug with Wargame: Red Dragon and a TDR (driver time-out) with the Windows 10 Anniversary Update. I haven’t experienced any of these, but it’s good to see them fixed regardless.

You can pick up the new drivers from their website if, you know, GeForce Experience hasn’t already notified you.

Source: NVIDIA

The Gigabyte GTX 1060 G6, a decent card with a bit of work

Subject: Graphics Cards | January 25, 2017 - 03:37 PM |
Tagged: windforce, factory overclocked, GTX 1060 G1 GAMING 6G, GeForce GTX 1060, gigabyte

In their testing, [H]ard|OCP proved that the Windforce cooler is not the limiting factor when overclocking Gigabyte's GTX 1060 G1 Gaming 6G; even at their top overclock of 2.1GHz on the GPU and 9.4GHz on the memory, the temperature never reached 60°C.  They did have some obstacles reaching those speeds: the card's onboard Gaming mode offered an anemic boost, and to start manually overclocking this card you will need to install the XTREME ENGINE VGA Utility.  Once you have that, you can increase the voltage and clocks to find the limits of your particular card, which should offer a noticeable improvement over its performance straight out of the box.


"We’ve got the brand new GIGABYTE GeForce GTX 1060 G1 GAMING 6G video card to put through the paces and find out how well it performs in games and overclocks. We will compare its highest overclock with one of the best overclocks we’ve achieved on AMD Radeon RX 480 to put it to the test. How will it stand up? Let’s find out."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Sapphire Releases AMD Radeon RX460 with 1024 Shaders

Subject: Graphics Cards | January 18, 2017 - 08:43 PM |
Tagged: video, unlock, shaders, shader cores, sapphire, radeon, Polaris, graphics, gpu, gaming, card, bios, amd, 1024

As reported by WCCFtech, AMD partner Sapphire has listed a new 1024-stream-processor version of the RX460 on its site (in Chinese). This product reveal of course comes after it became known that RX460 graphics cards had the potential to have their stream processor count unlocked from 896 to 1024 via a BIOS update.


Sapphire RX460 1024SP 4G D5 Ultra Platinum OC (image credit: Sapphire)

The Sapphire RX460 1024SP edition offers a full Polaris 11 core operating at 1250 MHz, and it otherwise matches the specifications of a stock RX460 graphics card. Whether this product will be available outside of China is unknown, as is the potential pricing should it come to the USA. A 4GB Radeon RX460 retails for $99, while the current step-up option is the RX470, which doubles this 1024SP RX460's shader count to 2048 at a price about 70% higher ($169).
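That comparison checks out roughly as follows (list prices from the paragraph above; since US pricing for the 1024SP card is unknown, this assumes the stock RX460 price as a stand-in):

```python
# List prices and shader counts from the text above; the US price
# for the 1024SP card is unknown, so this assumes RX460 pricing.
rx460_1024sp = {"price_usd": 99, "shaders": 1024}
rx470        = {"price_usd": 169, "shaders": 2048}

price_increase = rx470["price_usd"] / rx460_1024sp["price_usd"] - 1
shader_ratio = rx470["shaders"] / rx460_1024sp["shaders"]

print(f"price increase: {price_increase:.0%}")  # ~71%
print(f"shader ratio:   {shader_ratio:.0f}x")   # 2x
```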


AMD Polaris GCN 4.0 GPU lineup (Credit WCCFtech)

As you may note from the chart above, there is also an RX470D option between these cards that features 1792 shaders, though this option is also China-only.

Source: WCCFtech

Gigabyte Shows Off Half Height GTX 1050 and GTX 1050 Ti Graphics Cards

Subject: Graphics Cards | January 17, 2017 - 10:31 PM |
Tagged: SFF, pascal, low profile, GTX 1050 Ti, gtx 1050, gigabyte

Without much fanfare, Gigabyte recently launched two new low-profile, half-height graphics cards packing factory-overclocked GTX 1050 and GTX 1050 Ti GPUs. The new cards measure 6.6” x 2.7” x 1.5” (167mm long) and are cooled by a small shrouded single-fan cooler.
 
 
Around back, both the Gigabyte GTX 1050 OC Low Profile 2G and GTX 1050 Ti OC Low Profile 4G offer four display outputs: two HDMI 2.0b, one DisplayPort 1.4, and one dual-link DVI-D. It appears that Gigabyte is using the same cooler for both cards. There is not much information on this cooler, but it utilizes an aluminum heatsink and what looks like a ~50mm fan. Note that while the cards are half-height, they use a dual-slot design, which may limit the cases they can be used in.
 
The GTX 1050 OC Low Profile 2G features 640 Pascal-based CUDA cores clocked at 1366 MHz base and 1468 MHz boost out of the box (1392 MHz base and 1506 MHz boost in OC Mode using Gigabyte’s software) and 2GB of GDDR5 memory at 7008 MHz (7GT/s). For comparison, the GTX 1050 reference clock speeds are 1354 MHz base and 1455 MHz boost.
 
Meanwhile, the GTX 1050 Ti OC Low Profile 4G has 768 cores clocked at 1303 MHz base and 1417 MHz boost by default and 1328 MHz base and 1442 MHz boost in OC Mode. The GPU is paired with 4GB of GDDR5 memory at 7GT/s. NVIDIA’s reference GPU clocks are 1290 MHz base and 1392 MHz boost.
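Those factory overclocks are modest; relative to NVIDIA's reference boost clocks they work out to a percent or two (my own arithmetic, using the boost figures above):

```python
# Factory boost clocks vs. NVIDIA reference boost clocks (MHz),
# taken from the paragraphs above.
cards = {
    "GTX 1050 OC Low Profile 2G":    (1468, 1455),
    "GTX 1050 Ti OC Low Profile 4G": (1417, 1392),
}
for name, (factory, reference) in cards.items():
    pct = (factory / reference - 1) * 100
    print(f"{name}: +{pct:.1f}% over reference")
# prints +0.9% and +1.8% respectively
```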
 
The pint-sized graphics cards would certainly allow for gaming on your SFF home theater or other desktop PC, as well as being an easy upgrade to make a tiny OEM PC gaming-capable (think of those thin towers HP, Lenovo, and Dell like to use).
 
Of course, Gigabyte is not yet talking pricing, and availability has only been narrowed down to a general Q1 2017 time frame. I would expect the cards to hit retailers within a month or so, at somewhere around $135 for the half-height GTX 1050 OC LP 2G and approximately $155 for the faster GTX 1050 Ti variant. That is to say, the low-profile cards should be available at a slight premium over the company's larger GTX 1050 and GTX 1050 Ti graphics cards.

Source: Gigabyte

CES 2017: Gigabyte Shows Off First Aorus Branded Graphics Card

Subject: Graphics Cards | January 10, 2017 - 10:11 PM |
Tagged: CES, CES 2017, aorus, gigabyte, xtreme gaming, GTX 1080, pascal

One interesting development from Gigabyte at this year’s CES was the expansion of its Aorus branding and the transition away from Xtreme Gaming. Initially used on its RGB LED-equipped motherboards, the company is rolling out the brand to its other higher-end products, including laptops and graphics cards. While it appears that Xtreme Gaming is not going away entirely, Aorus is taking the spotlight with the introduction of the first Aorus-branded graphics card: the GTX 1080.


Paul's Hardware got hands on with the new card (video) at the Gigabyte CES booth.

Featuring a triple 100mm fan cooler similar to that of the GTX 1080 Xtreme Gaming 8G, the Aorus GTX 1080 comes with X-patterned LED lighting as well as a backlit Aorus logo on the side and a backlit eagle on the backplate. The cooler comprises three 100mm double-stacked fans (the center fan is recessed and spins in the opposite direction of the side fans) over a shrouded, angled aluminum fin stack that connects to the GPU via five large copper heatpipes.

The graphics card is powered by two 8-pin PCI-E power connectors.

In an interesting twist, the card has two HDMI ports on the back that are intended to be used to hook up front-panel HDMI outputs for things like VR headsets. Another differentiator between the upcoming card and the Xtreme Gaming 8G is the backplate, which has a large copper plate secured over the underside of the GPU. Several sites are reporting that this area can be used for watercooling, but I am skeptical: if you are going to go out and buy a waterblock for your graphics card, you might as well put it on top of the GPU, not on the area of the PCB opposite the GPU. As is, the copper plate on the backplate certainly won’t hurt cooling, and it looks cool, but that’s all I suspect it is.

Think Computers also checked out the Aorus graphics card. (video above)

Naturally, Gigabyte is not talking clock speeds on this new card, but I expect it to hit at least the same clocks as its Xtreme Gaming 8G predecessor, which was clocked at 1759 MHz base and 1848 MHz boost out of the box, and 1784 MHz base and 1936 MHz boost in OC Mode. Gigabyte also overclocked the memory on that card, up to 10400 MHz in OC Mode.

Gigabyte also had new SLI HB bridges on display bearing the Aorus logo to match the Aorus GPU. The company had Xtreme Gaming SLI HB bridges as well, though, which further suggests that it is not completely retiring that branding (at least not yet).

Pricing has not been announced, but the card will be available in February.

Gigabyte has yet to release official photos of the card or a product page, but it should show up on their website shortly. In the meantime, Paul's Hardware and Think Computers shot some video of the card on the show floor, which I have linked above if you are interested. Looking on Amazon, the Xtreme Gaming 1080 8GB is approximately $690 before rebate, so I would guess that the Aorus card will come out at a slight premium over that, if only because it is a newer release with a more expensive backplate and additional RGB LED backlighting.

What are your thoughts on the move to everything-Aorus? 

Coverage of CES 2017 is brought to you by NVIDIA!


Follow all of our coverage of the show at https://pcper.com/ces!

A new Radeon Software Crimson Edition for the New Year

Subject: Graphics Cards | January 9, 2017 - 02:32 PM |
Tagged: Crimson Edition 16.12.2

Radeon Software Crimson Edition 16.12.2 is now live and WHQL-certified, ready for you to grab here or through the version you already have installed, which supports the recommended clean-installation option.


This particular update addresses FreeSync issues with borderless fullscreen applications, as well as compatibility issues with Battlefield 1 and DOTA 2.  There are also numerous optimizations and fixes for issues with Radeon ReLive, which you can read about in more detail under the Release Notes tab.


Source: AMD