Report: AMD to Launch Radeon RX 500 Series GPUs in April

Subject: Graphics Cards | March 1, 2017 - 05:04 PM |
Tagged: video card, RX 580, RX 570, RX 560, RX 550, rx 480, rumor, report, rebrand, radeon, graphics, gpu, amd

According to a report from VideoCardz.com, we can expect AMD Radeon RX 500-series graphics cards next month, with an April 4th launch of the RX 580 and RX 570, followed by the RX 560/550 on April 11. The bad news? According to the report, "all cards, except RX 550, are most likely rebranded from Radeon RX 400 series".

Polaris10.jpg

AMD Polaris 10 GPU (Image credit: Heise Online)

Until official confirmation of specs arrives, this is still speculative; however, if Vega is not ready for an April launch and AMD will indeed be refreshing their Radeon lineup, an R9 300-series-style speed bump/rebrand is not out of the realm of possibility. VideoCardz offers (unconfirmed, at this point) specs of the upcoming RX 500-series cards, with RX 400 numbers for comparison:

videocardz_chart_1.png

Chart credit: VideoCardz.com

The first chart shows an increased GPU boost clock of ~1340 MHz for the rumored RX 580, against the existing RX 480's 1266 MHz. Both would be Polaris 10 GPUs with otherwise identical specs. The same largely holds for the rumored specs of the RX 570, though this GPU would presumably ship with faster memory clocks as well. On the RX 560 side, however, the Polaris 11-powered replacement for the RX 460 might be based on the 1024-core variant we have seen in the Chinese market.
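Taking the rumored numbers at face value, the clock bump works out to a modest single-digit uplift; a quick sketch:

```python
# Rumored boost clocks from the VideoCardz chart (MHz).
rx_480_boost = 1266
rx_580_boost = 1340  # approximate ("~1340 MHz")

uplift = (rx_580_boost - rx_480_boost) / rx_480_boost * 100
print(f"RX 580 boost clock uplift over RX 480: {uplift:.1f}%")  # 5.8%
```

So, clocks alone would buy roughly a 6% bump before any memory-speed differences are factored in.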

videocardz_chart_2.png

Chart credit: VideoCardz.com

No specifics on the RX 550 are yet known, which VideoCardz says "is most likely equipped with Polaris 12, a new low-end GPU". These rumors come via heise.de (German language), who state that those "hoping for Vega-card will be disappointed - the cards are intended to be rebrands with known GPUs". We will have to wait until next month to know for sure, but even if this is the case, expect faster clocks and better performance for the same money.

Source: VideoCardz

NVIDIA Announces GeForce GTX 1080 Ti 11GB Graphics Card, $699, Available Next Week

Subject: Graphics Cards | February 28, 2017 - 10:59 PM |
Tagged: pascal, nvidia, gtx 1080 ti, gp102, geforce

Tonight at a GDC party hosted by CEO Jen-Hsun Huang, NVIDIA announced the GeForce GTX 1080 Ti graphics card, coming next week for $699. Let’s dive right into the specifications!

card1.jpg

|  | GTX 1080 Ti | Titan X (Pascal) | GTX 1080 | GTX 980 Ti | TITAN X | GTX 980 | R9 Fury X | R9 Fury | R9 Nano |
|---|---|---|---|---|---|---|---|---|---|
| GPU | GP102 | GP102 | GP104 | GM200 | GM200 | GM204 | Fiji XT | Fiji Pro | Fiji XT |
| GPU Cores | 3584 | 3584 | 2560 | 2816 | 3072 | 2048 | 4096 | 3584 | 4096 |
| Base Clock | 1480 MHz | 1417 MHz | 1607 MHz | 1000 MHz | 1000 MHz | 1126 MHz | 1050 MHz | 1000 MHz | up to 1000 MHz |
| Boost Clock | 1600 MHz | 1480 MHz | 1733 MHz | 1076 MHz | 1089 MHz | 1216 MHz | - | - | - |
| Texture Units | 224 | 224 | 160 | 176 | 192 | 128 | 256 | 224 | 256 |
| ROP Units | 88 | 96 | 64 | 96 | 96 | 64 | 64 | 64 | 64 |
| Memory | 11GB | 12GB | 8GB | 6GB | 12GB | 4GB | 4GB | 4GB | 4GB |
| Memory Clock | 11000 MHz | 10000 MHz | 10000 MHz | 7000 MHz | 7000 MHz | 7000 MHz | 500 MHz | 500 MHz | 500 MHz |
| Memory Interface | 352-bit | 384-bit G5X | 256-bit G5X | 384-bit | 384-bit | 256-bit | 4096-bit (HBM) | 4096-bit (HBM) | 4096-bit (HBM) |
| Memory Bandwidth | 484 GB/s | 480 GB/s | 320 GB/s | 336 GB/s | 336 GB/s | 224 GB/s | 512 GB/s | 512 GB/s | 512 GB/s |
| TDP | 250 watts | 250 watts | 180 watts | 250 watts | 250 watts | 165 watts | 275 watts | 275 watts | 175 watts |
| Peak Compute | 10.6 TFLOPS | 10.1 TFLOPS | 8.2 TFLOPS | 5.63 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 8.60 TFLOPS | 7.20 TFLOPS | 8.19 TFLOPS |
| Transistor Count | 12.0B | 12.0B | 7.2B | 8.0B | 8.0B | 5.2B | 8.9B | 8.9B | 8.9B |
| Process Tech | 16nm | 16nm | 16nm | 28nm | 28nm | 28nm | 28nm | 28nm | 28nm |
| MSRP (current) | $699 | $1,200 | $599 | $649 | $999 | $499 | $649 | $549 | $499 |

The GTX 1080 Ti looks a whole lot like the TITAN X launched in August of last year. Based on the 12B-transistor GP102 chip, the new GTX 1080 Ti will have 3,584 CUDA cores with a 1.60 GHz Boost clock. That gives it the same processor count as the Titan X but slightly higher clock speeds, which should make the new GTX 1080 Ti faster by at least a few percentage points, and it has a 4.7% edge in base-clock compute capability. It has 28 SMs, 28 geometry units, and 224 texture units.
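As a sanity check on the Peak Compute row above, peak FP32 throughput is just CUDA cores × 2 FLOPs per clock (one FMA) × clock speed; a quick back-of-the-envelope sketch, using the standard FMA accounting rather than anything NVIDIA published:

```python
def peak_fp32_tflops(cuda_cores, clock_mhz):
    # Each CUDA core can retire one FMA (2 FLOPs) per clock.
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

print(f"GTX 1080 Ti: {peak_fp32_tflops(3584, 1480):.1f} TFLOPS")       # 10.6 TFLOPS
print(f"Titan X (Pascal): {peak_fp32_tflops(3584, 1417):.1f} TFLOPS")  # 10.2 TFLOPS
print(f"Base-clock edge: {(1480 / 1417 - 1) * 100:.1f}%")              # 4.4%
```

Rounding lands slightly differently than the table's 10.1 TFLOPS figure for the Titan X, and the clock-only edge comes out closer to 4.4% than 4.7%, so treat the quoted figures as approximate.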

archoverview.jpg

Interestingly, the memory system on the GTX 1080 Ti gets adjusted – NVIDIA has disabled a single 32-bit memory controller, giving the card a 352-bit-wide bus and an odd-sounding 11GB memory capacity. The ROP count also drops to 88 units. Speaking of 11, the G5X memory on the GTX 1080 Ti will now run at 11 Gbps, a boost available to NVIDIA thanks to a chip revision from Micron and improvements to equalization and reverse signal distortion.
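The 484 GB/s figure follows directly from the bus width and per-pin data rate; a quick check:

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # Total bandwidth (GB/s) = (bus width / 8 bits per byte) * per-pin rate (Gb/s).
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(352, 11))  # GTX 1080 Ti: 484.0 GB/s
print(bandwidth_gbs(384, 10))  # Titan X (Pascal): 480.0 GB/s
print(bandwidth_gbs(256, 10))  # GTX 1080: 320.0 GB/s
```

So despite losing a 32-bit controller, the 11 Gbps memory keeps the 1080 Ti slightly ahead of the Titan X on paper.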

memoryeye.jpg

The TDP of the new part is 250 watts, which is an interesting choice considering that GP102 already runs at 250 watts in the Titan X; the new card's power envelope is identical to the Titan product rather than falling between it and the GTX 1080. The cooler has been improved compared to the GTX 1080, offering quieter fan speeds and lower temperatures when operating at the same power envelope.

coolerperf.jpg

Performance estimates from NVIDIA put the GTX 1080 Ti about 35% faster than the GTX 1080, the largest "kicker" performance increase we have seen from a flagship Ti launch.

perf.jpg

Pricing is going to be set at $699, so don't expect to find this in any budget builds. But for the top-performing GeForce card on the market, it's what we expect. It should be on virtual shelves starting next week.

(Side note, with the GTX 1080 getting a $100 price drop tonight, I think we'll find this new lineup very compelling to enthusiasts.)

card2.jpg

card3.jpg

NVIDIA did finally detail its tiled caching rendering technique. We'll be diving more into that in a separate article with a little more time for research.

One more thing…

In another interesting move, NVIDIA is going to be offering “overclocked” versions of the GTX 1080 and GTX 1060 with +1 Gbps memory speeds. Partners will be offering them with some undisclosed price premium.

1080oc.jpg

I don’t know how much performance this will give us but it’s clear that NVIDIA is preparing its lineup for the upcoming AMD Vega release.

GeForce_GTX_1080ti_3qtr_Front_Left_1488313915.jpg

We’ll have more news from NVIDIA and GDC as it comes!

Source: NVIDIA

GDC: NVIDIA Announces GTX 1080 Price Drop to $499

Subject: Graphics Cards | February 28, 2017 - 10:55 PM |
Tagged: pascal, nvidia, GTX 1080, GDC

Update Feb 28 @ 10:03pm: It's official, NVIDIA launches the $699 GTX 1080 Ti.

NVIDIA is hosting a "Gaming Celebration" live event during GDC 2017 to talk PC gaming and possibly launch new hardware (if rumors are true!). During the event, NVIDIA CEO Jen-Hsun Huang made a major announcement regarding its top-end GTX 1080 graphics card: a price drop to $499, effective immediately.

NVIDIA 499 GTX 1080.png

The NVIDIA GTX 1080 is a Pascal-based graphics card with 2560 CUDA cores paired with 8GB of GDDR5X memory. Graphics cards based on this GP104 GPU are currently selling for around $580 to $700 (most around $650, give or take), with the "Founders Edition" carrying a $699 MSRP. The $499 price teased at the live stream represents a significant drop from what the graphics cards are going for now. NVIDIA did not specify whether the new $499 MSRP applies to the Founders Edition or is an average price that includes partner cards, but even if it only applies to the reference cards, the partners would have to adjust their prices downward accordingly to compete.

I suspect that NVIDIA is making such a bold move to make room in their lineup for a new product (the long-rumored 1080 Ti perhaps?) as well as a pre-emptive strike against AMD and their Radeon RX Vega products. This move may also be good news for GTX 1070 pricing as they may also see price drops to make room for cheaper GTX 1080 partner cards that come in below the $499 price point.

If you have been considering buying a new graphics card, NVIDIA has sweetened the pot a bit, especially if you had already been eyeing a GTX 1080. (Note that while the price drop is said to be effective immediately, at the time of writing Amazon was still showing typical prices for the cards. Enthusiasts might have to wait a few hours or days for the retailers to catch up and update their sites.)

This makes me a bit more excited to see what AMD will have to offer with Vega as well as the likelihood of a GTX 1080 Ti launch happening sooner rather than later!

Source: NVIDIA

Futuremark at GDC and MWC

Subject: General Tech, Graphics Cards | February 27, 2017 - 03:39 PM |
Tagged: MWC, GDC, VRMark, Servermark, OptoFidelity, cyan room, benchmark

Futuremark are showing off new benchmarks at GDC and MWC, the two conferences both happening this week. We will have quite a bit of coverage as we try to keep up with simultaneous news releases and presentations.

vrmark.jpg

First up is a new benchmark in their recently released DX12 VRMark suite: the new Cyan Room, which sits between the existing two tests. The Orange Room tests whether your system can provide an acceptable VR experience or falls somewhat short of the minimum requirements, while the Blue Room shows off what a system exceeding the recommended specs can manage. The Cyan Room will be for those who know their system can handle most VR and need to tune their settings. If you don't have the test suite, Humble Bundle has a great deal on it and several other tools, if you act quickly.

unnamed.jpg

Next up is a new suite to test Google Daydream, Google Cardboard, and Samsung Gear VR performance and capability. There is more than just performance to test when you are using your phone to view VR content, such as avoiding setting your eyeholes on fire. The tests will help you determine how long your device can run VR content before overheating becomes an issue and interferes with performance, as well as help you estimate your battery life.

latency.jpg

VR latency testing is next in the list of announcements, and it is very important when it comes to VR, as high or unstable latency is the reason some users need to add a bucket to their list of VR essentials. Futuremark have partnered with OptoFidelity to produce the VR Multimeter, hardware-based HMD testing. This allows you, and hopefully soon PCPer as well, to test motion-to-photon latency, display persistence, and frame jitter, as well as audio-to-video synchronization and motion-to-audio latency, all of which could lead to a bad time.

servermark.jpg

Last up is the brand new Servermark, which tests the performance you can expect from virtual servers, media servers, and other common workloads. The VDI test lets you determine whether a virtual machine has been provisioned at a level commensurate with its assigned task, so you can adjust it as required. The Media Transcode portion determines the maximum number of concurrent streams, and the maximum quality of those streams, which your server can handle; very nice for those hosting media for an audience.

Expect to hear more as we see the new benchmarks in action.

Source: Futuremark

A good year to sell GPUs

Subject: General Tech | February 21, 2017 - 01:18 PM |
Tagged: jon peddie, marketshare, graphics cards

The GPU market increased 5.6% from Q3 to Q4 of 2016, beating the historical average of -4.7% by quite a large margin; over the year we saw an increase of 21.1%. That increase is even more impressive when you consider that the total PC market dropped 10.1% over the same period, showing that far more consumers chose to upgrade their existing machines instead of buying new ones. This makes sense, as neither Intel nor AMD offered a compelling reason to upgrade the processor and motherboard for anyone who purchased one in the last two or three years.

AMD saw a nice amount of growth, grabbing almost 8% of the total market from NVIDIA over the year, though they lost a tiny bit of ground between Q3 and Q4 of 2016. Jon Peddie's sample includes workstation-class GPUs as well as gaming models, and it seems a fair number of users chose to upgrade their workstations, as that market increased just over 19% in 2016.

unnamed.png

"The graphics add-in board market has defied gravity for over a year now, showing gains while the overall PC market slips. The silly notion of integrated graphics "catching up" with discrete will hopefully be put to rest now," said Dr. Jon Peddie, president of Jon Peddie Research, the industry's research and consulting firm for graphics and multimedia.

Here is some more Tech News from around the web:

Tech Talk

Faster than a speeding Gigabyte, the Aorus GTX 1080 XE

Subject: Graphics Cards | February 20, 2017 - 02:54 PM |
Tagged: nvidia, gtx 1080 Xtreme Edition, GTX 1080, gigabyte, aorus

Gigabyte created their Aorus line of products to attract enthusiasts away from some of the competition's sub-brands, such as ASUS ROG. This card is quite similar to the Gigabyte Xtreme Edition released last year, but there are some differences, such as the large copper heatsink attached to the bottom of the GPU. The stated clock speeds are the same as last year's model, and it also sports the two HDMI connections on the front of the card to connect to Gigabyte's VR Extended Front panel. The Tech Report manually overclocked the card and saw the Aorus reach the highest frequencies they have seen from a GP104 chip, albeit by a small margin. Check out the full review right here.

back34.jpg

"Aorus is expanding into graphics cards today with the GeForce GTX 1080 Xtreme Edition 8G, a card that builds on the strong bones of Gigabyte's Editor's Choice-winning GTX 1080 Xtreme Gaming. We dig in to see whether Aorus' take on a GTX 1080 is good enough for a repeat."

Here are some more Graphics Card articles from around the web:

Graphics Cards

NVIDIA Releases GeForce 378.72 Hotfix (Bonus: a Discussion)

Subject: Graphics Cards | February 17, 2017 - 07:42 AM |
Tagged: nvidia, graphics drivers

Just a couple of days after publishing 378.66, NVIDIA released the GeForce 378.72 Hotfix drivers. This fixes a bug encoding video in Steam's In-Home Streaming, and it also fixes PhysX not being enabled on the GPU under certain conditions. Normally, hotfix drivers solve large-enough issues that were introduced with the previous release. This time, as far as I can tell, is a little different: these fixes seem to have been intended for 378.66 but, for one reason or another, couldn't be integrated and tested in time for the driver to be available for the game launches.

nvidia-2015-bandaid.png

This is an interesting effect of the Game Ready program. There is value in having a graphics driver available on the same day (or early) as a major game releases, so that people can enjoy the title as soon as it is available. There is also value in having as many fixes as the vendor can provide. These conditions oppose each other to some extent.

From a user standpoint, driver updates are cumulative, so users can skip a driver or two if they are not affected by any given issue. AMD has taken up a similar structure, sometimes releasing three or four drivers in a month with only one of them WHQL-certified. For these reasons, I tend to lean on the side of "release 'em as you got them". Still, I can see people feeling a little uneasy about a driver being released incomplete to hit a due-date.

But, again, that due-date has value.

It’s interesting. I’m personally glad that AMD and NVIDIA are on a rapid-release schedule, but I can see where complaints could arise. What’s your opinion?

Source: NVIDIA

MSI's wee AERO ITX family of NVIDIA graphics cards

Subject: Graphics Cards | February 16, 2017 - 03:35 PM |
Tagged: msi, AERO ITX, gtx 1070, gtx 1060, gtx 1050, GTX 1050 Ti, SFF, itx

MSI have just released their new series of ITX-compatible GPUs, covering NVIDIA's latest series of cards from the GTX 1050 through to the GTX 1070; the GTX 1080 is not available in this form factor. The GTX 1070 and 1060 are available in both factory-overclocked and standard versions.

ITX series.png

All models share a similar design, with a single TORX fan with 8mm Super Pipes and the Zero Frozr feature, which stops the fan for silent operation when temperatures are below 60C. They are all compatible with the Afterburner overclocking utility, including recording via Predator and wireless control from your phone.

The overclocked cards run slightly above reference, from the GTX 1070 at a 1721MHz boost and 1531MHz base with the GDDR5 at 8GHz, down to the GTX 1050 at a 1518MHz boost and 1404MHz base with the GDDR5 at 7GHz. The models which do not bear the OC moniker run at NVIDIA's reference clocks, even if they are not quite fully grown.

Source: MSI

NVIDIA Releases GeForce 378.66 Drivers with New Features

Subject: Graphics Cards | February 14, 2017 - 09:29 PM |
Tagged: opencl 2.0, opencl, nvidia, graphics drivers

While the headline of the GeForce 378.66 graphics driver release is support for For Honor, Halo Wars 2, and Sniper Elite 4, NVIDIA has snuck something major into the 378 branch: OpenCL 2.0 is now available for evaluation. (I double-checked 378.49 release notes and confirmed that this is new to 378.66.)

nvidia-geforce.png

OpenCL 2.0 support is not complete yet, but at least NVIDIA is now clearly intending to roll it out to end-users. Among other benefits, OpenCL 2.0 allows kernels (think shaders) to, without the host intervening, enqueue work onto the GPU. This saves one (or more) round-trips to the CPU, especially in workloads where you don’t know which kernel will be required until you see the results of the previous run, like recursive sorting algorithms.
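As a rough illustration of what device-side enqueue looks like (this is generic OpenCL C 2.0, not NVIDIA sample code, and `follow_up_pass` is a hypothetical second stage), a kernel can schedule its own follow-up work without returning control to the CPU:

```c
// Illustrative OpenCL C 2.0 sketch: the first pass enqueues the next
// pass itself, skipping the usual host round-trip.
void follow_up_pass(global int *data);  // hypothetical second stage

kernel void first_pass(global int *data)
{
    data[get_global_id(0)] *= 2;  // stand-in for real work

    // A single work-item decides, on the device, what runs next.
    if (get_global_id(0) == 0) {
        enqueue_kernel(get_default_queue(),
                       CLK_ENQUEUE_FLAGS_WAIT_KERNEL,  // run after this kernel
                       ndrange_1D(get_global_size(0)),
                       ^{ follow_up_pass(data); });
    }
}
```

In a recursive sort, for example, the kernel that finishes a partition step could inspect its result and enqueue the next step directly, rather than reporting back to the CPU after every pass.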

So yeah, that's good, although you usually see big changes at the start of version branches.

Another major addition is Video SDK 8.0. This version allows 10- and 12-bit decoding of VP9 and HEVC video. So... yeah. Applications that want to accelerate video encoding or decoding can now hook up to NVIDIA GPUs for more codecs and features.

NVIDIA’s GeForce 378.66 drivers are available now.

Source: NVIDIA

AMD Releases Radeon Software Crimson ReLive 17.2.1

Subject: Graphics Cards | February 14, 2017 - 05:57 PM |
Tagged: amd, graphics drivers

Just in time for For Honor and Sniper Elite 4, AMD has released a new set of graphics drivers, Radeon Software Crimson ReLive 17.2.1, targeting these games. The performance improvements they quote are in the 4-5% range compared to the previous driver on the RX 480, which would be equivalent to saving roughly a millisecond per frame at 60 FPS. (This is just for mathematical reference; I don't know what performance users should expect with an RX 480.)
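For reference, the frame-time arithmetic behind that claim, assuming the quoted 4-5% applies to frame rate starting from 60 FPS:

```python
base_fps = 60
base_frame_time = 1000 / base_fps  # ~16.67 ms per frame at 60 FPS

for gain in (0.04, 0.05):
    new_frame_time = 1000 / (base_fps * (1 + gain))
    saved = base_frame_time - new_frame_time
    print(f"{gain:.0%} faster: saves {saved:.2f} ms per frame")
```

That works out to roughly 0.6-0.8 ms per frame, so "about a millisecond" is in the right ballpark.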

amd-2016-crimson-relive-logo.png

Beyond driver overhead improvements, you will now be able to utilize multiple GPUs in CrossFire (for DirectX 11) on both titles.

Also, several issues have been fixed with this version. If you have a FreeSync monitor, and some games fail to activate variable refresh mode, then this driver might solve this problem for you. Scrubbing through some videos (DXVA H.264) should no longer cause visible corruption. A couple applications, like GRID and DayZ, should no longer crash under certain situations. You get the idea.

If you have an AMD GPU on Windows, pick up these drivers from their support page.

Source: AMD

New graphics drivers? Fine, back to benchmarking.

Subject: Graphics Cards | February 9, 2017 - 02:46 PM |
Tagged: amd, nvidia

New graphics drivers are a boon to everyone who isn't a hardware reviewer, especially one who has just wrapped up benchmarking a new card the same day a driver is released. To see what changes have been implemented by AMD and NVIDIA in their last few releases, [H]ard|OCP tested a slew of recent drivers from both companies. The performance of AMD's past releases, up to and including the AMD Crimson ReLive Edition 17.1.1 Beta, can be found here. For NVIDIA users, recent drivers covering up to the 378.57 Beta Hotfix are right here. The tests show both companies generally increasing the performance of their drivers, though the changes are so small that you are unlikely to notice a difference.

0chained-to-office-man-desk-stick-figure-vector-43659486-958x1024.jpg

"We take the AMD Radeon R9 Fury X and AMD Radeon RX 480 for a ride in 11 games using drivers from the time of each video card’s launch date, to the latest AMD Radeon Software Crimson ReLive Edition 17.1.1 Beta driver. We will see how performance in old and newer games has changed over the course of 2015-2017 with new drivers. "

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Palit Introduces Fanless GeForce GTX 1050 Ti KalmX GPU

Subject: Graphics Cards | February 6, 2017 - 11:43 AM |
Tagged: video card, silent, Passive, palit, nvidia, KalmX, GTX 1050 Ti, graphics card, gpu, geforce

Palit is offering a passively-cooled GTX 1050 Ti option with their new KalmX card, which features a large heatsink and (of course) zero fan noise.

kalmx_1.jpg

"With passive cooler and the advanced powerful Pascal architecture, Palit GeForce GTX 1050 Ti KalmX - pursue the silent 0dB gaming environment. Palit GeForce GTX 1050 Ti gives you the gaming horsepower to take on today’s most demanding titles in full 1080p HD @ 60 FPS."

kalmx_3.jpg

The specs are identical to a reference GTX 1050 Ti (4GB GDDR5 @ 7 Gb/s, Base 1290/Boost 1392 MHz, etc.), so expect the full performance of this GPU - with some moderate case airflow, no doubt.

kalmx_2.jpg

We don't have specifics on pricing or availability just yet.

Source: Palit

Micron Planning To Launch GDDR6 Graphics Memory In 2017

Subject: Graphics Cards | February 4, 2017 - 03:29 PM |
Tagged: micron, graphics memory, gddr6

This year is shaping up to be a good year for memory, with the promise of 3D XPoint (Intel/Micron), HBM2 (SK Hynix and Samsung), and now GDDR6 graphics memory from Micron launching this year. While GDDR6 was originally planned to launch next year, Micron recently announced its intention to start producing the memory chips in the latter half of 2017, which would put it much earlier than previously expected.

Micron Logo.png

Computer World reports that Micron cites the rise of e-sports and gaming, which it says has driven the computer market to three-year upgrade cycles rather than five-year cycles, as the primary reason for shifting GDDR6 production into high gear and moving up the launch window. (I am not sure how accurate that is; it seems like PCs are actually staying relevant longer between upgrades, but I digress.) The company expects the e-sports market to grow to 500 million fans by 2020, and it is a growing market that Micron wants to stay relevant in.

If you missed our previous coverage, GDDR6 is the successor to GDDR5 and offers twice the bandwidth, at 16 Gb/s (gigabits per second) per pin. It is also faster than GDDR5X (12 Gb/s) and uses 20% less power, which the gaming laptop market will appreciate. HBM2 still holds the bandwidth crown, though, as it offers 256 GB/s per stack and up to 1TB/s with four stacks connected to a GPU on package.
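To put those per-pin rates in card-level terms, here is a rough comparison on a 256-bit bus (a hypothetical mid-range configuration for illustration, not an announced product):

```python
def card_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    # Total bandwidth (GB/s) = (bus width / 8 bits per byte) * per-pin rate.
    return bus_width_bits / 8 * gbps_per_pin

BUS = 256  # hypothetical mid-range bus width, in bits
for name, rate in [("GDDR5", 8), ("GDDR5X", 12), ("GDDR6", 16)]:
    print(f"{name} @ {rate} Gb/s: {card_bandwidth_gbs(BUS, rate):.0f} GB/s")

# HBM2: 256 GB/s per stack, four stacks on package
print(f"HBM2, four stacks: {4 * 256} GB/s")
```

On the same bus width, GDDR6 would double GDDR5's bandwidth (256 vs. 512 GB/s), while a four-stack HBM2 package still sits comfortably on top.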

As such, High Bandwidth Memory (HBM2 and then HBM3) will power high-end gaming and professional graphics cards, while GDDR6 will become the memory used for mid-range cards. GDDR5X (which is capable of going faster, but will likely not be pushed much past 12 Gbps if GDDR6 does come out this soon) will replace GDDR5 on most if not all of the lower-end products.

I am not sure whether Micron's reasoning of e-sports, faster upgrade cycles, and VR as the motivating factor(s) for ramping up production early is sound, but I will certainly take faster memory coming out sooner rather than later! Depending on exactly when in 2017 the chips start rolling off the fabs, we could see graphics cards using the new memory technology as soon as early 2018 (just in time for CES announcements? Oh boy, I can see the PR flooding in already! hehe).

Will Samsung change course as well and try for a 2017 release of its GDDR6 memory?

Are you ready for GDDR6?

NVIDIA Releases Vulkan Developer 376.80 Beta Drivers

Subject: Graphics Cards | February 3, 2017 - 05:58 PM |
Tagged: nvidia, graphics drivers, vulkan

On February 1st, NVIDIA released a new developer beta driver, which fixes a couple of issues with their Vulkan API implementation. Unlike what some sites have been reporting, you should not download it to play games that use the Vulkan API, like DOOM. In short, it is designed for developers, not end-users. The goal is to provide correct results when software interacts with the driver, not the best gaming performance or anything like that.

khronos-2016-vulkanlogo2.png

In a little more detail, it looks like 376.80 implements the Vulkan 1.0.39.1 SDK. This update addresses two issues with accessing devices and extensions, under certain conditions, when using the 1.0.39.0 SDK. 1.0.39.0 was released on January 23rd, and thus will not even be a part of current video games. Even worse, like most graphics drivers for software developers, it is based on the older GeForce 376 branch, so it won't have NVIDIA's most recent fixes and optimizations. NVIDIA does this so they can add or change the features that Vulkan developers require without needing to roll in patches every time they make a "Game Ready" optimization or something. There is no reason to use this driver unless you are developing Vulkan applications and want to try out the new extensions. It will eventually make it to end users... when it's time.

If you wish to develop software using Vulkan's bleeding-edge features, check out NVIDIA's developer portal to pick up the latest drivers. Basically everyone else should use 378.49 or its 378.57 hotfix.

Source: NVIDIA

Radeon Software Crimson ReLive 17.1.2 Drivers Released

Subject: Graphics Cards | February 2, 2017 - 07:02 AM |
Tagged: graphics drivers, amd

A few days ago, AMD released their second graphics driver of January 2017: Radeon Software Crimson ReLive 17.1.2. The main goal of these drivers is to support the early access of Conan Exiles as well as tomorrow's closed beta for Tom Clancy's Ghost Recon Wildlands. Optimizations that AMD has been working on for either game prior to release are targeted at this version.

amd-2016-crimson-relive-logo.png

Beyond game-specific optimizations, a handful of bugs are also fixed, ranging from crashes to rendering artifacts. There was also an issue with configuring WattMan on a system with multiple monitors, where the memory clock would drop or bounce around. The driver also has a bunch of known issues, including a couple of hangs and crashes under certain situations.

Radeon Software Crimson ReLive Edition 17.1.2 is available at AMD’s website.

Source: AMD

NVIDIA Releases GeForce 378.57 Hotfix Drivers

Subject: Graphics Cards | February 2, 2017 - 07:01 AM |
Tagged: nvidia, graphics drivers

If you were having issues with Minecraft on NVIDIA's recent 378.49 drivers, then you probably want to try out their latest hotfix. This version, numbered 378.57, will not be pushed out via GeForce Experience, so you will need to grab it from NVIDIA's customer support page.

nvidia-2015-bandaid.png

Beyond Minecraft, this also fixes an issue with "debug mode". For some Pascal-based graphics cards, the option in NVIDIA Control Panel > Help > Debug Mode might be on by default. This option reduces factory-overclocked GPUs down to NVIDIA's reference speeds, which is useful to eliminate stability issues in testing, but pointlessly slow if you're already stable. I mean, you bought the factory overclock, right? I'm guessing someone at NVIDIA used it to test 378.49 during development, fixed an issue, and accidentally committed the config file with the rest of the fix. Either way, someone caught it, and it's now fixed, even though you should be able to simply untick the option if you have a factory-overclocked GPU.

Source: NVIDIA

Win our RX 460 Budget Gaming System!!

Subject: Graphics Cards | January 31, 2017 - 11:18 AM |
Tagged: rx 460, radeon, giveaway, contest, buildapc, amd

As part of our partnership with AMD to take a look at the Radeon RX 460 as a budget gaming graphics solution, we are giving away the computer we built for our testing. If you missed our previous stories, shame on you. Check them out here:

Check out the embedded block below to see how you can win our system. It is a global giveaway, so feel free to enter no matter where you live! Thanks again to AMD for providing the hardware for this build!

Radeon RX 460 Budget System Giveaway (sponsored by AMD)

Source: AMD

DirectX Intermediate Language Announced... via GitHub

Subject: Graphics Cards | January 27, 2017 - 09:19 PM |
Tagged: microsoft, DirectX, llvm, dxil, spir-v, vulkan

Over the holidays, Microsoft published the DirectX Shader Compiler on GitHub. The interesting part is that it compiles HLSL into DirectX Intermediate Language (DXIL) bytecode, which can be ingested by GPU drivers and executed on graphics devices. This is interesting because DXIL is based on LLVM, which might sound familiar if you have been following along with The Khronos Group and their announcements regarding Vulkan, OpenCL, and SPIR-V.

As it turns out, they were on to something, and Microsoft is working on a DirectX analogue of it.

microsoft-2015-directx12-logo.jpg

The main advantage of LLVM-based bytecode is that you can eventually support multiple languages (and the libraries of code developed in them). When SPIR-V was announced with Vulkan, the first thing that came to my mind was compiling to it from HLSL, which would be useful for existing engines, as they are typically written in HLSL and transpiled to the target platform when used outside of DirectX (like GLSL for OpenGL). So, in Microsoft's case, it makes sense that they start there (since they own the thing), but I doubt that is the end goal. The most seductive outcome for game engine developers would be single-source C++, but there are a lot of steps between here and there.

Another advantage, albeit to a lesser extent, is that you might be able to benefit from performance optimizations, both on the LLVM / language side as well as on the driver’s side.

According to their readme, the minimum support will be HLSL Shader Model 6. This is the most recent shading model, and it introduces some interesting instructions, typically for GPGPU applications, that allow multiple GPU threads to interact, like balloting. Ironically, while DirectCompute and C++AMP don’t seem to be too popular, this would nudge DirectX 12 into a somewhat competent GPU compute API.

DXIL support is limited to Windows 10 Build 15007 and later, so you will need to either switch one (or more) workstation(s) to Insider, or wait until it launches with the Creators Update (unless something surprising holds it back).

NVIDIA Releases GeForce 378.49 Drivers

Subject: Graphics Cards | January 26, 2017 - 09:38 PM |
Tagged: nvidia, graphics drivers

Update: There are multiple issues being raised in our comments, including a Steam post by Sam Lantinga (Valve) about this driver breaking In-Home Streaming. Other complaints include certain applications crashing and hardware acceleration issues.

Original Post Below

Now that the holidays are over, we're ready for the late-winter rush of "AAA" video games. Three of them, Resident Evil VII, the early access of Conan Exiles, and the closed beta of For Honor, are targeted by NVIDIA's GeForce 378.49 Game Ready drivers. Unless we get a non-Game Ready driver in the interim, I am guessing that this will cover us until mid-February, before the full release of For Honor alongside Sniper Elite 4, followed by Halo Wars 2 the next week.

nvidia-geforce.png

Beyond game-specific updates, the 378-branch of drivers includes a bunch of SLI profiles, including Battlefield 1. It also paves the way for GTX 1050- and GTX 1050 Ti-based notebooks; this is their launch driver whenever OEMs begin to ship the laptops they announced at CES.

This release also contains a bunch of bug fixes (pdf), including a reboot bug with Wargames: Red Dragon and TDR (driver time-out) with Windows 10 Anniversary Update. I haven’t experienced any of these, but it’s good to be fixed regardless.

You can pick up the new drivers from their website if, you know, GeForce Experience hasn’t already notified you.

Source: NVIDIA

The Gigabyte GTX 1060 G6, a decent card with a bit of work

Subject: Graphics Cards | January 25, 2017 - 03:37 PM |
Tagged: windforce, factory overclocked, GTX 1060 G1 GAMING 6G, GeForce GTX 1060, gigabyte

In their testing, [H]ard|OCP proved that the Windforce cooler is not the limiting factor when overclocking Gigabyte's GTX 1060 G1 Gaming 6G; even at their top overclock of 2.1GHz GPU and 9.4GHz memory, the temperature never reached 60C. They did have some obstacles reaching those speeds: the card's onboard Gaming mode offered an anemic boost, and in order to start manually overclocking this card you will need to install the XTREME ENGINE VGA Utility. Once you have that, you can increase the voltage and clocks to find the limits of the card you have, which should offer a noticeable improvement over its performance straight out of the box.

1484519953uTUJTeH5LD_1_1.png

"We’ve got the brand new GIGABYTE GeForce GTX 1060 G1 GAMING 6G video card to put through the paces and find out how well it performs in games and overclocks. We will compare its highest overclock with one of the best overclocks we’ve achieved on AMD Radeon RX 480 to put it to the test. How will it stand up? Let’s find out."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP