AMD Radeon Crimson Edition 16.4.2 hits the streets

Subject: Graphics Cards | April 25, 2016 - 04:41 PM |
Tagged: graphics driver, crimson, amd

AMD's new Crimson driver has just been released with new features including official support for the new Radeon Pro Duo as well as both the Oculus Rift and HTC Vive VR headsets.  It also adds enhanced support for AMD's XConnect technology for external GPUs connected via a Thunderbolt 3 interface.  CrossFire profile updates cover Hitman, Elite Dangerous and Need for Speed, and AMD has also resolved the ongoing issue with the internal update procedure not detecting the newest drivers.  If you are having issues with games crashing to desktop on launch, you will unfortunately still need to disable the AMD Gaming Evolved overlay.

Get 'em right here!


"The latest version of Radeon Software Crimson Edition is here with 16.4.2. With this version, AMD delivers many quality improvements, updated/introduced new CrossFire profiles and delivered full support for AMD’s XConnect technology (including plug’n’play simplicity for Thunderbolt 3 eGFX enclosures configured with Radeon R9 Fury, Nano or 300 Series GPUs.)  Best of all, our DirectX 12 leadership continues to be strong, as shown by the performance numbers below."


Source: AMD

April 25, 2016 | 04:43 PM - Posted by Anonymous (not verified)

16.4.2

April 25, 2016 | 04:44 PM - Posted by Ophelos

It's 16.4.2, not 16.3.2

April 25, 2016 | 05:11 PM - Posted by Jeremy Hellstrom

Sorry, I copied from the release notes ... not double-checking the source's accuracy.

April 25, 2016 | 06:27 PM - Posted by Anonymous (not verified)

AMD drivers have been great for the six months or so I've had my Radeon GPU.
I can say without a doubt that, after owning Nvidia GPUs for almost five years straight, I much prefer Radeon Crimson and the driver releases.
Maybe I just hopped on the bandwagon at a good time, but I was expecting the worst driver-wise and I've been blown away so far.

Hopefully AMD keeps it up, I'd love to see some Overwatch drivers released.

April 25, 2016 | 10:44 PM - Posted by JohnL (not verified)

Negative scaling for The Division. What does AMD mean by fixed?

April 26, 2016 | 12:25 AM - Posted by Anonymous (not verified)

Do not expect anything for The Division. GameWorks is too much for AMD in this one XD

April 26, 2016 | 10:08 AM - Posted by Anonymous (not verified)

Not that it matters with The Division; so many hackers and griefers it's nearly impossible to play.

April 25, 2016 | 11:37 PM - Posted by StephanS

It would be good to do a review refresh of GPU cards...
Things have dramatically changed.

Many data points seem to show that GCN-optimized titles now put the 290X neck and neck with the 980 Ti.

And those went up in value on eBay, but they can still be found for around $250.

It's puzzling to see AMD charging $420 for an 8GB version when the extra 4GB is nearly useless.

April 26, 2016 | 03:03 AM - Posted by Anonymous (not verified)

AMD does seem to have more DX12 ready hardware. It makes sense since they presumably started work on Mantle quite a few years ago. It would be interesting to know the inside story of these things. I read an article a while ago about AMD adding more asynchronous compute abilities to their Pitcairn-class GPU to make the PS4 APU. The article indicated that Sony may have wanted more asynchronous compute. I can't find the article that it was from though. Looking at the Wikipedia article for Jaguar, it indicates that the PS4 chip has 8 ACEs (asynchronous compute engines) while the XB1 only has 2 ACEs. I believe current high end AMD GPUs also use 8 ACEs, but they also have the ability to issue commands from 8 queues at a time for each ACE. I don't know if the PS4 ACEs can issue from multiple queues since it was added in GCN 1.1. Since it is 8 ACEs, the PS4 chip may be 1.1, I guess. It was limited to 2 in 1.0, so the Xbox1 chip may be GCN 1.0. I am curious as to how, or if, the PS4 APU development tied into development of their other GPUs and possibly Mantle.

This seems like it is a bit of a mess for developers though. The Xbox One has less asynchronous compute ability than the PS4. It will be interesting to see what happens if Microsoft releases an updated Xbox One that has much increased asynchronous compute ability. For the PS4, upgraded hardware will mostly just be more powerful, without any significant changes in architecture, since the current PS4 already has similar capabilities to AMD's top-end GPUs. Nvidia cards, with their large installed base, don't seem to have much of a usable implementation at all. This seems like Nvidia is holding the industry back in this case. They seem to have preferred not to implement ACEs in hardware, since it would have consumed a lot more power to implement the added scheduling resources.

People gave Nvidia so much credit for their low power consumption, but that seems to have mostly come from cutting out 64-bit capabilities and possibly not including hardware asynchronous compute scheduling resources. I guess it is built-in obsolescence, since people will probably need to upgrade from their 970s once Nvidia comes out with an asynchronous-compute-optimized GPU. This reminds me of the early days of the Pentium 4 release. AMD switched to DDR for higher bandwidth, but Intel would not support it since it would help AMD. Rambus memory never caught on, so Intel ended up selling Pentium 4s with SDRAM for a while. With SDRAM, they performed worse than a Pentium 3 for many applications. Such anti-competitive practices just hurt consumers.

April 26, 2016 | 11:24 AM - Posted by Anonymous (not verified)

Yes, Nvidia gimped their async compute, and their marketing sold the whole market on low power usage at the expense of performance, and Joe bought into Nvidia's marketing hook, line and sinker! Nvidia has had to fix some of their lack of async compute with their Pascal micro-architecture. Nvidia will have to reach parity with AMD on the fully-in-hardware async compute front, as VR gaming will make use of async compute to its fullest!

Intel has had a part in the gimping down of the entire OEM laptop market with its Ultrabook initiative, where Intel has reduced the laptop SKUs on the market to mostly dual-core i7 U/M-based SoC offerings, and even AMD has been forced by Intel's meddling with laptop OEMs into supplying its parts only in gimped-down laptop SKUs devoid of any thermal headroom or options for dual-channel memory, high-resolution screens, etc., etc.! The overall PC/laptop market is shrinking, and even retailers' remaining stocks of older-model laptops are reduced to mostly Ultrabook-style SKUs gimped of usability, with very few quad-core i7s still remaining for purchase and none with Windows 7 or 8.1 Pro, which can be downgraded (really upgraded) to Windows 7! Hopefully the Zen/Polaris laptop SKUs that come online before 2020 will be offered for sale in some Linux-based laptop OEM's lines of SKUs.

AMD will need to realise that getting some of their APU SKUs into Linux-based OEM laptops will allow for more sales come 2020 when Windows 7 goes EOL. I really want a Linux OEM laptop where I can avoid the entire WinTel ecosystem and have a GPU alternative to Nvidia's higher-priced graphics.

April 26, 2016 | 12:07 PM - Posted by Anonymous (not verified)

"People gave Nvidia so much credit for their low power consumption, but that seems to have mostly come from cutting out 64-bit capabilities and possibly not including hardware asynchronous compute scheduling resources. I guess it is built in obsolescence, since people will probably need to upgrade from their 970s once Nvidia comes out with an asynchronous compute optimized GPU. This reminds me of the early days of the Pentium 4 release. AMD switched to DDR for higher bandwidth, but Intel would not support it since it would help AMD. Rambus memory never caught on, so Intel ended up selling Pentium 4s with SDRAM for a while. With SDRAM, they performed worse than a Pentium 3 for many applications. Such anti-competitive practices just hurt consumers."

Perfect example!

More proof that with enough marketing people will buy anything.

April 27, 2016 | 02:00 PM - Posted by Anonymous (not verified)

Intel is the main force pushing its terrible graphics at high prices onto the entire PC/laptop market, so hopefully AMD will have better IPC from its Zen offerings. Not that the single-core IPC metric is going to be as important with the graphics/HSA-like APIs Vulkan/DX12, which can even accelerate gaming physics and other gaming calculations on GPUs that have asynchronous compute fully in hardware.

VR gaming will rely very much on doing more of a game's non-graphics calculations on the GPU, to keep the latency of CPU-to-GPU communication to a minimum! VR is all about doing as much of the game on the GPU as possible to reduce latency, so CPUs are going to have their gaming workloads reduced. There are already some VR headsets that will have their own SoC/APU-powered functionality, to help boost laptops and other systems that may not have enough power on their own to drive VR gaming. I can even see future discrete GPU gaming systems getting their own specialized CPU cores on the interposer package, along with the HBM and GPU, to accelerate VR gaming and keep latency to a minimum. AMD's Zen-based APUs on an interposer for the server/HPC market will lead to consumer variants that combine separately fabricated Zen cores on a single die with a separately fabricated GPU on its own die, plus HBM, to offer the highest bandwidth at the lowest power usage!

April 28, 2016 | 09:44 AM - Posted by Irishgamer01

OK. This driver was supposed to fix the graphics glitches in The Division on some AMD cards. It did, but now I have sound issues with the game... crazy.

Didn't they say they had solved their driver issues?
It's 7 weeks since The Division launched.

Lucky I also have Nvidia (got the game free).

May 3, 2016 | 11:25 AM - Posted by Puiu (not verified)

Sound issues are not from AMD. Even the console version has sound related bugs. The game is just a buggy mess.
