AMD responds to the variability issue

Subject: General Tech | December 6, 2013 - 12:53 PM |
Tagged: amd, R9 290X, leafblower

The Tech Report has posted a statement from AMD about the variability that sites have seen when comparing retail R9 290X cards to the press samples sent out to review sites.  At the moment AMD is citing heat issues and the fact that the performance delta shrinks under Uber mode, but it will be investigating other possible causes.  With third-party coolers arriving soon we will be able to get a better sense of how much insufficient cooling contributes to these issues; it is also possible that the golden-sample theory is at least partially correct.  The big win for consumers is AMD's attitude adjustment and its admission that there is an issue worth investigating; if AMD can get your R9 290X running faster, you are the one who wins after all.


"The range of performance differential is not expected to meaningfully change the user experience but we’ve taken note of recent reports that the degree of variability is higher than expected. Reasonably we would expect the variability to occur both above and below the performance of the press samples, however it appears that most reported performances are biased towards the low side."
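AMD's point about the skew can be made concrete with a back-of-the-envelope calculation: if sample variation were symmetric, retail cards should land above and below the press baseline in roughly equal numbers. The sketch below uses invented FPS figures (not measured data) purely to illustrate the arithmetic.

```python
# Hypothetical sketch: how retail-card results sit relative to a press-sample
# baseline. All numbers below are invented for illustration, not measured data.
from statistics import mean

press_fps = 62.0  # average FPS of the press sample in some fixed benchmark

# Invented retail-card results for the same benchmark run
retail_fps = [57.5, 58.9, 60.2, 61.8, 56.4]

# Percent delta of each retail card versus the press baseline
deltas = [100.0 * (fps - press_fps) / press_fps for fps in retail_fps]

below = sum(1 for d in deltas if d < 0)
print(f"mean delta: {mean(deltas):+.1f}%")
print(f"{below} of {len(deltas)} cards below the press sample")
```

With these made-up numbers every card lands below the baseline, which is the pattern reviewers reported; symmetric variation would put roughly half the deltas on each side.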


December 6, 2013 | 04:01 PM - Posted by drbaltazar (not verified)

ROFL! Clue 1: cooling capacity is less than GPU heat output!
Clue 2: Message Signaled Interrupts Extended (MSI-X) is set at one per CPU socket; for a hardwired computer (not running on battery) MSI-X is supposed to be set at one per CPU core. Hopefully AMD will supply gamers with the option (OK, OK, with a warning mentioning a potential rise in energy consumption). ROFL

December 6, 2013 | 04:39 PM - Posted by Anonymous (not verified)

Incoming Ryan rant in 3,2,1...

December 6, 2013 | 05:27 PM - Posted by arbiter

Reading the article on The Tech Report, it sounds like AMD expected all benchmarks to be run using the Uber mode setting. The problem with that, for me as a gamer, is that when I am playing a game I don't want my GPU fan sounding like a jet engine near my head the whole time. Then you throw in the whole "up to" nonsense Ryan hit on: you get a card rated for up to a certain speed, but given the thermals with the stock cooler it averages what, 15-20% slower than what AMD claims. So in cold-start benchmarks the 290(X) is faster than the GTX 780, but after say 30 minutes, once heat soak sets in, is it still faster?
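The cold-start versus heat-soak distinction above amounts to a benchmarking protocol: run the workload long enough for clocks to settle before recording a result. A minimal sketch of that idea, where `run_scene()` is a hypothetical stand-in for whatever actually renders and times a benchmark pass:

```python
# Sketch of a heat-soak-aware benchmark loop. run_scene() is a hypothetical
# placeholder for a real benchmark pass; it returns an invented FPS value.
import time

def run_scene() -> float:
    """Placeholder for a real benchmark pass; returns average FPS."""
    return 60.0  # invented value

def heat_soaked_fps(warm_up_seconds: float = 30 * 60) -> float:
    """Run the scene repeatedly to heat-soak the card, then measure once."""
    start = time.monotonic()
    while time.monotonic() - start < warm_up_seconds:
        run_scene()  # discard warm-up results; they exist only to build heat
    return run_scene()  # measure only after clocks have reached steady state
```

A cold-start number is what you get by calling `run_scene()` immediately; the heat-soaked number, after a long warm-up, is the one that reflects sustained gaming.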

December 6, 2013 | 05:52 PM - Posted by Anonymous (not verified)

Did you not read the article?

He used 3 press samples and 3 retail samples. One of the press samples was equal to the retail cards, which contradicts everything Ryan has been shouting about.

The only solid fact is that there are variations, but no one knows to what degree, because people are too busy jumping to conclusions about the cause or excluding factors that other sites have pointed to.

When is the last time someone bought an enthusiast GPU for low noise over FPS? The fact that you refer to a jet engine near your head shows you are ignorant on the matter. There is distance and a case enclosure, but it seems you believe open-air test benches and noise tests reflect the setup in your home.

December 7, 2013 | 03:42 AM - Posted by arbiter

BECAUSE AMD has no set minimum clock the card will run at, hence why there is a problem. Chips are not all equal; some need more voltage than others to run at a certain clock. With that bare-bones cooler it only gets worse. Shows how little you really know, because when you pull your head out of you know where: an enclosed case makes it worse, since the heat build-up in the case makes the GPU fan run harder to cope, hence even louder than most benchmarks show.

December 6, 2013 | 07:32 PM - Posted by Anonymous (not verified)

Maybe someone should go ride inside an M1A2, damn fan noise. Stop the battle, this damn fan is ruining the helicopter sound effects; damn whining fan noise/hydraulic pump noise, ruining my realistic battle, crap!

http://www.liveleak.com/view?i=de6_1329722694

December 7, 2013 | 05:40 AM - Posted by JohnGR (not verified)

What I would like to see is an article about a retail card that is better than the press sample used for the review.

There are two possibilities.

The first is that it is extremely rare to find a retail card that is better than the press samples.

The second possibility is that no site is going to publish an article showing that retail cards better than the press samples do exist. They will make a fuss about retail cards that perform worse than the press samples, but say nothing about retail cards that perform better.

Well, whatever the case - the first is of course more likely - AMD should be more careful in the future. They almost blew it like with the 2900 XT. Fortunately the Hawaii chip is a very good GPU.

December 7, 2013 | 10:50 AM - Posted by Anonymous (not verified)

LOL. At least there's LTC mining, right? I have a stupid AMD GPU rig, out of sight and out of mind, happily mining LTC for me and funding all of my Intel + NVIDIA GPU and other upgrades. There is no way in hell I would ever game on an AMD rig; when you buy AMD you deal with BS issues like this. The fact of the matter is that NVIDIA gives you consistent performance with the GK110, without the stupid jet-engine fan noise. And yes, I've heard a friend's 290X, and that thing is stupid loud. Come on, AMD fanboys, tell me it "isn't that bad." Keep telling yourself that.

I used to really like ATI GPUs, and the 5870 was great. AMD has just taken a proverbial dump on the ATI name, though: once a great company, now destined for mediocrity by being paired with AMD.

NVIDIA may cost more, but the quality is also higher. Without the variance. Without the nonsense. Without the driver issues. Without the dust-buster noise. When AMD comes to its senses and can make a "complete" product balanced by all metrics, I'll reconsider.

December 7, 2013 | 10:51 AM - Posted by Anonymous (not verified)

By the way, did AMD ever fix CrossFire + Eyefinity on the 280X and below, and on the 79xx and below cards? I know it's fixed on the 290, but last I heard it wasn't fixed for anything else.

Is this true? Gotta love those AMD issues... issues that you DON'T GET with NVIDIA.

December 7, 2013 | 07:26 PM - Posted by JohnGR (not verified)

You are using an AMD card for mining because your perfect NVIDIA card is a useless piece of junk at compute. What else do you want to hear? Really, you are admitting it yourself. No help for AMD fanboys is necessary here.

December 9, 2013 | 10:40 AM - Posted by Anonymous (not verified)

Please, where is the test showing NVIDIA's variance is better?

December 10, 2013 | 01:27 PM - Posted by Anonymous (not verified)

Nothing is going to stop AMD from selling every single card above the 270 (7950, 7970, 7990, 280X, 290, 290X). In fact, almost all are sold out, LOL.

Take that, NVIDIA fanboys... some NVIDIA fanboys are buying AMD to mine; that's hilarious, lol.

December 10, 2013 | 01:35 PM - Posted by Anonymous (not verified)

This is not only about gamers anymore. AMD cards have become a way to make money (free cards) via mining digital currency. Nobody buying these cards comes to sites to see what the performance is like for reference cards; they just go by the links below and that's the end of the story.

https://litecoin.info/Mining_Hardware_Comparison#AMD_.28ATI.29

http://coinmarketcap.com/
