AMD Catalyst 15.5 to 15.15 Performance Check - Validating AMD R9 390 Testing

Subject: Graphics Cards | June 19, 2015 - 06:25 PM
Tagged: radeon, r9 390, hawaii, catalyst, amd, 15.15

During the course of our review of the new Sapphire Nitro R9 390 8GB card earlier this week, a question came up on driver support. For testing the R9 300-series as well as the Fury X cards, AMD provided a new Catalyst 15.15 beta driver. The problem is that these drivers would not install on the Radeon R9 200-series cards. That's not totally uncommon on new GPU releases but it does seem a bit odd considering the similarities between the R9 390 and the R9 290, for example.

That meant that in our review we had to use the Catalyst 15.5 beta for the Radeon R9 290X and Radeon R9 290 while using the newer Catalyst 15.15 beta for the Sapphire Nitro R9 390. Eyebrows were raised, as you would expect, since any performance differences between the new cards and the old cards would have to take the driver changes into account as well. But since we couldn't install the new driver on the old hardware, we were stuck, and published what we had.


Since then, a driver with INI modifications that allow Catalyst 15.15 to install on Radeon R9 290X/290 hardware was posted on the Guru3D Forums. Today I installed it on the XFX Radeon R9 290 4GB card used in our R9 390 review and re-ran a few game tests to see what changes we saw, if any. This should address any concern that the updated driver, rather than the hardware, was responsible for performance changes.

(Note: I realize that using an INI hacked driver isn't exactly going to pass QA with AMD, but I think we are seeing results that are close enough.)

First up, let's look at Grand Theft Auto V.


In GTA V we see that the average frame rate at 2560x1440 goes from 39.5 FPS to 40.5 FPS, an increase of about 2.5%. That's minimal, but it is interesting to see how frame rate consistency changes as we move down the sliding scale; pay attention to the orange and pink lines in the FPS by Percentile graph to see what I am referencing. As you move into the slower frame times in our testing, the gap between the 15.5 and 15.15 drivers begins to widen slightly, indicating a little more frame time consistency in the 15.15 release.
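For readers curious how an "FPS by Percentile" line is derived from raw frame times, here's a minimal sketch. The frame times below are made up for illustration (real data comes from a frame capture tool such as FCAT), and the nearest-rank percentile method is just one reasonable choice:

```python
# Sketch: converting captured frame times (ms) into an FPS-by-percentile curve.

def fps_by_percentile(frame_times_ms, percentiles):
    """Sort frame times ascending, then report the frame rate equivalent
    (1000 / frame time) at each percentile. Higher percentiles expose the
    slowest frames, where driver consistency differences show up."""
    ordered = sorted(frame_times_ms)
    results = {}
    for p in percentiles:
        # Index of the frame time at percentile p (nearest-rank method).
        idx = min(len(ordered) - 1, round(p / 100.0 * (len(ordered) - 1)))
        results[p] = 1000.0 / ordered[idx]
    return results

# Hypothetical 2560x1440 capture: mostly ~25 ms frames with a few slow ones.
times = [24.7, 25.1, 24.9, 26.3, 25.0, 31.8, 25.2, 28.4, 24.8, 25.5]
for p, fps in fps_by_percentile(times, [50, 90, 99]).items():
    print(f"{p}th percentile: {fps:.1f} FPS")
```

A flatter curve toward the high percentiles means more consistent frame delivery, which is exactly the behavior the 15.15 driver shows in the GTA V graph.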

But what about BF4 or Metro: Last Light?



In Battlefield 4 there is no difference in performance on the R9 290 4GB card when moving from the 15.5 driver to the 15.15.


The same is true for Metro: Last Light - performance is essentially identical between the new and older Catalyst beta driver on the Radeon R9 290.

So what's the takeaway? While in Grand Theft Auto V there is some performance delta between Catalyst 15.5 and Catalyst 15.15 on the R9 290, that is not the case in Battlefield 4 or Metro: Last Light. (Note: the 4K results, which I did run, showed identical behavior to the 2560x1440 testing shown above.) It makes sense that GTA V would see some improvement with a newer driver, as it is still a new title receiving updates and fixes from both AMD and NVIDIA. BF4 and Metro: LL are quite a bit longer in the tooth, so the lack of change is expected.

But, there does not appear to be any kind of smoking gun to point to that would indicate AMD was purposefully attempting to improve its stance through driver manipulation. And that's all I wanted to make sure of with this testing today and this story. I obviously didn't test every game that users are playing today, but all indications are that AMD is in the clear.

June 19, 2015 | 06:43 PM - Posted by El Tech Gato (not verified)

Bleh. I guess it's what everybody expected. I sincerely hope the FuryX delivers.

June 19, 2015 | 09:04 PM - Posted by Anonymous (not verified)

Dial the core clock and memory to the same speeds and you get the same tested performance!

June 19, 2015 | 09:07 PM - Posted by Anonymous (not verified)

Which means absolutely zero improvement. AMD lied: they just increased the clocks, fitted higher-rated RAM modules, and called it a new product.

June 20, 2015 | 01:00 AM - Posted by Titan_V (not verified)

Nvidia LIED about the 3.5 GB 970.
Nvidia LIED about bumpgate.

You were saying?

June 20, 2015 | 08:35 AM - Posted by svnowviwvn

That you are charlie's sock puppet.

June 20, 2015 | 01:36 PM - Posted by Anonymous (not verified)

You are one to talk, as you also have a big arm shoved up your tuchus, look it moves its fingers and your sock mouth moves!

June 20, 2015 | 03:04 AM - Posted by Alamo

i don't get what you're upset about?
there is improvement, many other tests suggest it. the 390 is better than the 290 even in these tests, and bigger gains show up in games using heavy tessellation like The Witcher 3 (basically GameWorks, or more specifically HairWorks), where the 300 series gets a 10% boost on other sites. so AMD didn't lie: the 300 series got faster tessellation, slightly more compute, a 50MHz clock bump, and double the memory.
what the PCPer test lacks is a GameWorks title using heavy tessellation, like The Witcher 3 or Crysis 3.

June 20, 2015 | 03:57 AM - Posted by Anonymous (not verified)

PCgameshardware tested the 15.5 and 15.15 drivers with a 290X to look specifically at tessellation differences and found that the 15.15 drivers improved 290X tessellation noticeably. That could explain why the 390 cards do better on Gameworks and tessellation titles.

June 20, 2015 | 10:56 PM - Posted by annoyingmoose (not verified)

and look at that, the R9 285 now bitch-slapping them all upwards of x16 tessellation with that "3xx series" beta driver.

so, this should probably explain the 40% performance improvement for the 390X (15.15 beta) over the 290X (15.5 beta) in Witcher 3 seen at HardOCP.

June 21, 2015 | 01:08 PM - Posted by Anonymous (not verified)

The R9 285 has a newer chip design than the R9 390X: Tonga (GCN gen 3) vs. Hawaii (GCN gen 2). Tonga has a more powerful geometry engine than Hawaii, so it always had more tessellation performance. It doesn't have much to do with drivers.

June 21, 2015 | 03:46 PM - Posted by annoyingmoose (not verified)

right, although it does benefit the most (by quite a big margin) from the new driver, according to the PCGH tessellation graph.

June 21, 2015 | 01:04 PM - Posted by Anonymous (not verified)

You are clueless. You get more performance, lower temps, less noise and more VRAM than a reference R9 390. It's actually quite a different product. No one cares about running it at the same clocks because no customer will do so. The only thing that hasn't changed is the basic chip design, and there's nothing wrong with that. Nvidia did the same in the past, e.g. GTX 680 -> GTX 770. It's more important for AMD to focus on the 14/16nm next generation. Fiji is more than enough for now.

June 22, 2015 | 10:30 AM - Posted by Anonymous (not verified)

Not really, they haven't even changed the VRAM modules. They simply overclocked them from 5 GHz to 6 GHz; I've seen photos posted around, and the modules they are using are still rated for 5 GHz. That leaves buyers with even less overclocking headroom.

Nvidia instead uses 7 GHz modules that we can overclock above 8 GHz. I wonder why AMD doesn't use those? I guess they cost more and AMD wants to lower production costs; nothing else comes to mind.

June 19, 2015 | 06:51 PM - Posted by Anonymous (not verified)

All AMD had to do was say it was a bug, like Nvidia did with Kepler cards, and everyone would have believed them, right?

When are we going to see the Kepler test?

June 19, 2015 | 07:18 PM - Posted by Anonymous (not verified)

When Nvidia did it it was perfectly okay and everyone believed them.

If AMD had done that there would have been OUTRAGE. Mostly from Nvidia fans.

June 19, 2015 | 07:50 PM - Posted by El Tech Gato (not verified)

Well when you have the best product people give you more leeway...Right? I don't think there's as much fanboyism as many seem to believe.

June 19, 2015 | 07:54 PM - Posted by interbet (not verified)

Yeah, no one has seemingly come to the defense of Kepler owners, whose cards seem to be getting slower by the year. The 780 was supposed to compete with a 290X, but it's lagging behind even the 280X in the latest tests.

June 21, 2015 | 02:57 AM - Posted by Klimax (not verified)

Because optimizations for Kepler were not yet present. Support comes first for the current tech, then for the previous generation.

It's the old trade-off between complexity and efficiency: either you build complex, massive hardware that can somewhat handle new situations but isn't that efficient in any particular case, or you build highly efficient, relatively simple hardware that depends on software to handle new situations.

Also, no evidence was ever provided for the alleged crippling by Nvidia. Just a whole lot of noise and stupidity from the ignorant.

June 19, 2015 | 07:28 PM - Posted by JohnGR

I think many Nvidia users on 700-series cards would love to see a driver comparison like the one above, comparing a few of the latest GeForce drivers and how they perform with 700-series cards: for example, whether there is a performance decrease, or whether the problems with some GameWorks titles like The Witcher 3 were fixed as promised.

June 21, 2015 | 09:19 AM - Posted by Martin Trautvetter

How about you use your Kepler card, do the tests, and share your results?

You do own a Kepler card, right?

June 21, 2015 | 05:07 PM - Posted by JohnGR

Oh! That rage!
I am guessing 12-15 years old. Am I right?

June 22, 2015 | 01:56 AM - Posted by Martin Trautvetter

So, no, you don't own a Kepler GPU, or no, you're too lazy to run a few tests yourself, but not lazy enough to ask someone else to run them for you?

June 22, 2015 | 08:42 AM - Posted by Anonymous (not verified)

lol, it's literally PCpers business to run test...literally. I own a GTX780 and wouldn't run a test for any of you, even if it would save your life.

So, go buy one and you run the test, or are you too lazy? Waffles

June 22, 2015 | 12:44 PM - Posted by Martin Trautvetter

"lol, it's literally PCpers business to run test...literally."

Don't think I agree, but I'm sure if JohnGR contracts them to run tests for him, they'll do just that.

"I own a GTX780"

Good on you, mate!

"and wouldn't run a test for any of you, even if it would save your life."

I'm sure JohnGR is going to be devastated by your disregard for his life.

"So, go buy one and you run the test, or are you too lazy?"

My point exactly.



June 22, 2015 | 02:55 PM - Posted by snook

skirts don't look good on men.

im not here for johnGR's health or yours for that matter.

June 19, 2015 | 07:34 PM - Posted by Anonymous (not verified)

Driver manipulation seems more like an Nvidia thing, but still, thanks for checking and being thorough.

June 21, 2015 | 02:59 AM - Posted by Klimax (not verified)

Nope. (Apart from title-specific fun, which both AMD and NVidia have had, I never saw any evidence of such a thing.)

June 21, 2015 | 12:00 PM - Posted by Anonymous (not verified)

then you have not looked hard enough...they have both been caught.

June 19, 2015 | 09:22 PM - Posted by tbone (not verified)

i'd just like to say that there is another particular website...and i'll be honest...they have been slacking severely lately. i don't even go there anymore because they are slow with articles, or post no articles at all. not sure what's going on over there. it's pretty pathetic for a top-tier tech website.

PCPer has been on point with many reviews, tests, and extras like this one, feeding my need for PC tech coverage in this ever-so-fast industry. keep up the great work. you guys have come a long way and are, IMO, pretty much the best tech website there is now.

June 20, 2015 | 02:42 AM - Posted by Anonymous (not verified)

" image quality mode that allows for a 4K@60Hz signal to fit within HDMI 1.4’s 8.16Gbps bandwidth limit. "

no THIS card not have HDMI 2.0
Without appropriate Chip .
they not have SiI9777 Chip

to speak from video card to screen display or Blu-ray is a need to Chip in any device
the chip need Protocol of the HDMI 2.0
ONLY Silicon Image make that chip for HDMI Organization

nvidia 960/970/980 not have SiI9777 Chip
no hdcp2.2
no HDR
no bandwidth 18GBPS

this the card 980TI where the SiI9777 Chip ?

This is very easy to prove with a simple detailed image for 4:4:4 and only a blind man cannot tell the different between 30 and 60hz on a PC.
yes this very easy to prove
ask in written confirmation from NVIDIA video card that thay have :
SiI9777 Chip on video card
andwidth 18GBPS

i try 3 time from nvidia and no reply !
what they have to hide ?

you bringing article on Sony's XBR 55X900A this tv make in 2013 when he was only HDMI 1.4
with out chip AND the can play with HDMI 1.4 only 30HZ 4k
so how it can play HDMI 2.0 how it can have HDCP 2.2 ? how it can have 18 gbps ?
Can you connect this screen to NETFLIX ?
this TV was manufactured before the chip SiI9777 was invented in 2014

Lacking the available bandwidth to fully support 4K@60Hz until the arrival of HDMI 2.0, the latest crop of 4K TVs such as the Sony XBR 55X900A and Samsung UE40HU6900 have implemented what amounts to a lower image quality mode that allows for a 4K@60Hz signal to fit within HDMI 1.4’s 8.16Gbps bandwidth limit.
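For context, the arithmetic behind that 8.16Gbps figure can be sketched quickly. This is a rough illustration: it ignores blanking intervals (which push real-world requirements somewhat higher) and assumes the commonly cited figure of HDMI 1.4's 10.2Gbps link rate minus 8b/10b encoding overhead:

```python
# Back-of-the-envelope check of why 4K@60Hz needs chroma subsampling on HDMI 1.4.

def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel-data bandwidth in Gbps, ignoring blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_14_USABLE_GBPS = 8.16  # 10.2 Gbps link rate minus 8b/10b encoding overhead

full_444 = video_bandwidth_gbps(3840, 2160, 60, 24)  # 8 bits x 3 channels
sub_420  = video_bandwidth_gbps(3840, 2160, 60, 12)  # 4:2:0 halves average bpp

print(f"4K60 4:4:4 needs ~{full_444:.2f} Gbps (over the {HDMI_14_USABLE_GBPS} Gbps limit)")
print(f"4K60 4:2:0 needs ~{sub_420:.2f} Gbps (fits)")
```

Halving the average bits per pixel is exactly what 4:2:0 chroma subsampling does: luma is kept at full resolution while the two chroma channels are stored at quarter resolution, which is the "lower image quality mode" the quoted passage describes.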

June 20, 2015 | 10:11 AM - Posted by Ryan Shrout

Maxwell, son, Maxwell.

June 22, 2015 | 03:10 AM - Posted by Anonymous (not verified)

I have attached a letter from AMD confirming it has HDMI 2.0.

Unlike NVIDIA's fakes!

HDCP 2.2?
DCI-P3 color?
SiI9777 chip?
18Gbps bandwidth?

Reviews without checking the truth are not reviews!

June 22, 2015 | 10:41 AM - Posted by Anonymous (not verified)

WTF are you talking about? That image you posted is about an ADAPTER from DP 1.2a to HDMI 2.0. Nowhere does it say the AMD card has native HDMI 2.0 support.

This same adapter can be used with ANY card supporting DP 1.2.

June 22, 2015 | 11:12 AM - Posted by rl (not verified)

He's right on one point though: Maxwell can't do HDCP 2.2 (except for GM206/GTX 960).

June 23, 2015 | 12:01 AM - Posted by Anonymous (not verified)

No HDCP 2.2 on the GTX 960/970/980 means they can't play Ultra HD Blu-ray or NETFLIX.

And the adapter can be used with DP 1.2a, not DP 1.2. It clearly says 1.2a.

June 23, 2015 | 03:06 AM - Posted by Anonymous (not verified)

MSI is saying their 380, 390 and 390X all have HDMI 2.0.

AMD, meanwhile, admits it is limited to HDMI 1.4 bandwidth:


Dear Dave,

Your service request : SR #{ticketno:[]} has been reviewed and updated.

Response and Service Request History:

Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.

Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.

In order to update this service request, please respond, leaving the service request reference intact.

Best regards,

AMD Global Customer Care

June 20, 2015 | 05:46 AM - Posted by Mac (not verified)

Quote: But, there does not appear to be any kind of smoking gun to point to that would indicate AMD was purposefully attempting to improve its stance through driver manipulation. And that's all I wanted to make sure of with this testing today and this story. I obviously didn't test every game that users are playing today, but all indications are that AMD is in the clear.

A lot of disappointed Kepler users out there feel their performance is being negatively manipulated through drivers. Is PCPer going to look into that too?

June 20, 2015 | 05:55 AM - Posted by Alamo

kepler cards are not negatively manipulated by Nvidia.
they release a new lineup, they work on it; people who want better drivers need to upgrade.
i think there is a good angle to AMD's rebrands: users get to keep driver support longer :D on their cards, since it takes AMD four years to phase them out xD.
no but seriously, most benchmarks put the 290X way ahead of the 780 Ti and Titan. funny.

June 22, 2015 | 09:14 AM - Posted by snook

The answer is no, they will not look into nvidia's latest maxwell drivers causing Kepler issues.

Posit your own reason.

June 20, 2015 | 08:33 AM - Posted by svnowviwvn

Quote: During the course of our review of the new Sapphire Nitro R9 390 8GB card earlier this week, a question came up on driver support.

The unending whining of AMD fanboys about a great conspiracy gets sugar-coated into "a question".

June 20, 2015 | 10:00 AM - Posted by Anonymous (not verified)

I don't think AMD has purposefully tried to nerf 200 series benchmarks, but based on my testing there is variance. This is a different driver where the performance increase varies from nothing (BF4) to margin of error stuff (GTA) to... well, something more.

I could be completely wrong of course, but as a friendly recommendation to PCPer, I'd suggest trying AC Unity at ultra high 1440p with FXAA on both drivers. I see a 15% increase on 15.15 vs 15.5 beta on my R9 290X. A lot of the stutter seems considerably reduced.

Haven't got the numbers to hand but Far Cry 4 is also worth looking at.

June 20, 2015 | 02:09 PM - Posted by Anonymous (not verified)

Well, AMD gave the market its new Fury, and spent its limited engineering budget on getting a new, competitive performance SKU out there, performing with 4GB of memory. AMD does not have the revenue to rework its entire product stack top to bottom in a single stroke. AMD has offered some improvements on its older microarchitectures, and it will take some time for the Fury technology to work its way down the product stack. Give AMD time to compete with the limited funds and resources they have, in both GPU and CPU engineering; AMD is doing a great job with the little R&D budget it has! If there were a metric for year-over-year innovation per dollar of R&D spent, AMD would be the winner in that category. AMD does more engineering while hanging on by a thread just to stay in business than anyone in the history of the PC market.

Fury is at market, Zen is incoming, and there are some Carrizo-based laptops being offered; they had better get some better screen resolution options, or we'll know the fix was in from a certain CPU/SoC monopoly! AMD's pricing for its Fury line is definitely more competitive, for damn sure, and for the consumer that is a good thing.

P.S. Blender 3D is getting support for the latest AMD GCN graphics hardware, so GPU Cycles rendering support will make Carrizo-based laptops more attractive for Blender rendering on AMD APUs. I know from a mesh modeling standpoint that AMD's and Nvidia's GPUs handle high-resolution mesh models/scenes much better than Intel's GPU/SoC SKUs with their limited SP/execution-unit counts. So I'm going to be looking at laptops with Carrizo APUs, and even laptops pairing a Carrizo APU with an AMD discrete mobile GPU, since once sufficient Blender rendering can be done on the GPU (ray tracing included) there will be no need for an expensive quad-core i7 laptop for Blender 3D use and medium graphics workloads.

June 21, 2015 | 07:19 AM - Posted by skline00

Ryan, thank you for the testing.

June 22, 2015 | 03:02 AM - Posted by Irishgamer01


July 20, 2015 | 12:40 AM - Posted by Anonymous (not verified)

GTX 970 EVGA or R9 390 MSI??????

July 20, 2015 | 12:41 AM - Posted by KAKAROTO (not verified)

GTX 970 EVGA or R9 390 MSI
