AMD Releases First 5 GHz Processor for Consumers, FX-9590 and FX-9370

Subject: Processors | June 11, 2013 - 11:13 AM
Tagged: vishera, TWKR, piledriver, FX-9590, FX-9370, Centurion, amd

We have all heard the rumors, and it appears they are true.  We had originally heard about a “Centurion” product aimed at extreme overclockers on the AMD side, running at 5 GHz with a 220 watt TDP.  Now we finally get to see what all the fuss is about.  AMD is releasing two new Vishera-based processors that, for the time being, will be limited to system integrators and will be available later this summer.


The top-end product is the FX-9590, which has a top turbo speed of 5 GHz.  This is a full four-module implementation with 8 MB of L3 cache.  AMD did not give any other details for this particular part.  We do not know the base clock, we do not know the TDP, and we can only assume that the northbridge/L3 cache will be clocked at the standard 2.2 GHz we have seen on previous Vishera parts.

The second product is the FX-9370, again a four-module part, with a top turbo speed of 4.7 GHz.  Remember that the four modules each contain two “cores”, so it is still considered an eight-core part.  These processors are unlocked, so they can be further overclocked if one so desires.  TDP and other details were again skipped for this particular part.

These parts will go to system integrators first, and I am not entirely sure that AMD will sell them directly to consumers.  If AMD does in fact sell to consumers (not implied at all in the press release), it will likely have to bundle them with a very robust cooler, probably something along the lines of what we saw with the original FX-8150 LCS bundle.

Considering that the FX-8350 is a 4 GHz base clock product with a max turbo of 4.2 GHz and a TDP of 125 watts, we probably have to assume that the 220 watt number bandied about is accurate.  A pretty beefy air cooler would be required, or the aforementioned liquid cooling system.  AMD also likely had GLOBALFOUNDRIES change the “mix” when fabricating these parts.  These batches probably feature leakier transistors that can achieve higher speeds without an extreme amount of voltage.

This is an interesting move by AMD.  Remember those TWKR chips they released, designed for LN2 use?  There were a very limited number of those units, and we can imagine that while the FX-9000 series will ship in greater numbers, these chips still will not be commonplace on the retail market.  SIs like Maingear will be introducing systems this summer featuring these chips.  Performance will be good with these solutions, but the tradeoff is of course power consumption and heat production as compared to similarly performing (and stock clocked) Intel i7 3770K and 4770K parts.

AMD is doing their best to address the enthusiast market, but until Kaveri hits the streets we will not see any major upgrades beyond these parts.


We received some further info about this chip.  The TDP is up in the 220 watt region.  It utilizes Turbo Core 3.0 to help achieve those speeds, so it seems that some of the work that went into Richland has made it into these latest FX processors.  BIOS updates are probably a must.  These chips will only be going to system integrators (SIs) and will be bundled with a liquid cooling system.  We have no idea what the price will be since these will only be sold to SIs.  Systems should be available after July 16.

Source: AMD

June 11, 2013 | 11:31 AM - Posted by tbone8ty (not verified)

Price is critical!!!

June 11, 2013 | 01:32 PM - Posted by Bondinspace (not verified)

Jesus, even the i7-4770K only has a TDP of 83W. Don't see where you're going here, AMD...

June 11, 2013 | 02:48 PM - Posted by SPBHM

yes well, I agree that a high TDP like this is a bad thing for me or most people, but whatever, this will be sold in limited quantities I think... and it's not that absurd if you think a lot of people buy 700W+ PSUs and run more than one VGA... I mean, we have single VGAs with over 300W TDP right now... 220W for a CPU is pretty high, but people have been running 200W+ TDP CPUs for a long time when overclocking (I'm pretty sure a 4GHz i7 920 can use that, and it was a popular thing to have for many years). For some people who can afford not to care about heat with their high end coolers and power supplies, this should be fine.

June 11, 2013 | 02:05 PM - Posted by orvtrebor

I'd like to see where they price these at and what they do for cooling.

If the price is right I could see it working for certain things, but at the same time long term power consumption could offset the benefit. (depending on use)

Glad to see them staying active though :)

June 11, 2013 | 02:06 PM - Posted by John H (not verified)


Yes it's power hungry, but give them credit for at least getting this out the door in some kind of quantity. This is what 'extreme' should be about.

June 11, 2013 | 02:40 PM - Posted by Anonymous (not verified)

This is stupid. Reckless power consumption. AMD has to push 3 times the power for it to compare to intel's i7. Pathetic.

June 11, 2013 | 04:47 PM - Posted by Anonymous (not verified)

Sniff Sniff I smell an Intel Fanboi Troll.

June 11, 2013 | 05:28 PM - Posted by arbiter1

Well, even though AMD sells for cheaper, the extra cost on the electric bill makes Intel cheaper long term.

June 11, 2013 | 07:53 PM - Posted by Anonymous (not verified)

True. But if you are in a cold climate , you could use this to heat your room.

June 12, 2013 | 03:31 AM - Posted by razor512

Agreed, I use my overclocked Phenom II X6 system to heat my room in the winter. I keep the heat at around 55F, then use the PC to keep the room warm.

For really cold days, I start bitcoin mining and that keeps the room at around 75F even when it is 5F outside.

The bitcoins then help to offset some of the power cost.

July 11, 2013 | 06:25 PM - Posted by Anonymous (not verified)

bitcoin is dead for people using PCs, you have to buy those retarded Butterfly Labs machines to even see any gains

June 11, 2013 | 03:17 PM - Posted by Onion uk (not verified)

^^ Look we have an intel employee ! Ha! Ha!

June 11, 2013 | 04:08 PM - Posted by thomask (not verified)

for desktop no one cares about cpu power consumption

remember all games will be designed to run on AMD consoles
so future gaming benchmarks might be in AMD's favour

gamers are enthusiasts which tell their friends what to buy.

i might have to invest in AMD.

June 12, 2013 | 01:21 PM - Posted by Klimax (not verified)

Somebody doesn't understand porting. There will be no trace of AMD optimization, because they'll take high level source code and after changing API calls they'll use regular compiler (most likely Visual Studio or in rarer case Intel C compiler; there is no AMD compiler for Windows) and recompile. 99% of assembler code will be gone. At best case there'll be vendor neutral code, at worst case there'll be Intel-optimized code.

And remember, Jaguar cores are very weak, much weaker than Sandy Bridge cores.

As for GPUs, if current state of Gaming Evolved titles is of any indication, NVidia doesn't have to fear anything...

June 11, 2013 | 04:39 PM - Posted by ugly niglet (not verified)

Yeah right...why would anyone ever buy an amd over intel. Not impressed.

June 11, 2013 | 04:49 PM - Posted by Anonymous (not verified)

Because I can get an Asus motherboard, the chip, the cooler and all for the price of one overpriced i7?

June 11, 2013 | 05:47 PM - Posted by arbiter

Take an AMD 8350 and an i7 3770k, compare cost after say 3-4 years. Which CPU is saving you money long term? Most ppl are not ones that upgrade their computers every few years.

It's like a hybrid/electric car vs a gas-powered car. Yes they cost more short term, but the money you save on gas ends up saving you money.

October 11, 2013 | 10:45 AM - Posted by Anonymous (not verified)

Gah, that's the worst analogy ever. Maintenance on a gas car is much cheaper than that of a diesel, or a hybrid. Do you have any idea how much it costs to get a new hybrid battery, let alone the installation? Around 3-5 grand. You'll have to replace that more than any real costly part in a car (given you didn't get a lemon) to what, save 500 bucks a year in fuel costs to replace that monstrosity every 150k miles or something like that (for me that would be less than 2 yrs)? No way, I'll definitely pay 1/2 as much for a processor I probably will replace within 2-3 years - Moore's Law still holds true, Holmes. This will most likely be outdated within 2-4 years, tops.

Plus, it's got electrolytes, it's got what plants need!

June 11, 2013 | 05:10 PM - Posted by dR0 (not verified)

"Yeah right...why would anyone ever buy an amd over intel.
Not impressed"

No, you're just a stupid troll. The reason people buy AMD is bang for the buck. Simple as that. But then, you knew that.

I have bought 2 intel chips since i got into computers in 1993. A p-iii 450 (that ran hot as hell) & a p4 2.8ghz which is the machine i am typing on.

All my other machines use amd from my 6 core phenom ii @3.3 ghz 1100T to my Athlon 1500+ @1.3 ghz & the 4 machines in between. I am always riding the tail end of the leading edge, happily with AMD & for much less $$$, :D

Very glad to read about this monster even if it does use more power than intel's offering. By the time i'm ready to upg next year maybe they will have refined it & it'll be available to consumers.

June 11, 2013 | 05:35 PM - Posted by John_Gr (not verified)

A nice marketing chip.
Now AMD only has to send as many chips as possible to hardware sites so that future reviews are guaranteed to have a green line closer to the top.

June 11, 2013 | 06:27 PM - Posted by Anonymous (not verified)

I can't believe people are arguing over actual power cost here.

Electricity is sold in units of kilowatt (1000 watt) hours. Assume an average rate of 10 cents per kilowatt-hour.

Running a 60 watt light for 1 hour is 60 watt-hours.

So the calculation is 60/1000 * 10 cents = less than 1 penny.

Run your AMD every day for 1 hour for a year = around $3.60.

You're still better off than with any high end Intel i7 at double the price.
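The arithmetic above is easy to sanity-check in a couple of lines (a minimal sketch; the function name is mine, and the flat $0.10/kWh rate is the commenter's assumption — tiered rates discussed later in the thread change the totals). Note that at the chip's full 220 W, one hour a day for a year actually lands nearer $8 than $3.60:

```python
def energy_cost_dollars(watts, hours, rate_per_kwh=0.10):
    """Cost in dollars of running a constant load of `watts` for `hours`
    at a flat electricity rate of `rate_per_kwh` dollars per kWh."""
    kwh = watts / 1000 * hours
    return kwh * rate_per_kwh

# A 60 W light for one hour: about $0.006, i.e. less than a penny.
print(energy_cost_dollars(60, 1))

# A 220 W CPU at full load for 1 hour a day over a year (365 hours):
print(round(energy_cost_dollars(220, 365), 2))  # 8.03
```

Whether that $8/year matters is exactly the argument the rest of the thread has.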

June 11, 2013 | 07:59 PM - Posted by Anonymous (not verified)

Even if you run 24/7, it still only comes out to $75 a year. Currently, the most expensive AMD processor on Newegg is $200, and the most expensive Intel is $1070. Be serious, even if AMD charges $400, it'll be cheaper in the long run, assuming you're not going to replace it before it becomes ancient.

June 12, 2013 | 05:11 PM - Posted by arbiter

Yea that is a pretty BAD argument you just made there, as the top end Intel would stomp that AMD. AMD would probably need at least 4-8 more cores on that to come close to that top end Intel. The best comparison is top-of-the-line AMD vs what is performance-wise the same class CPU from Intel, which would be the 3770k/4770k, which is $100-150 more.

June 11, 2013 | 08:12 PM - Posted by orvtrebor

Not exactly...

1. 10 cents is low (at least where I live) Tier 1 rates are 13 cents and go up depending on use. @ Tier 4 it is over 33 cents per kwh. Or if you're on a TOU rate it can be impacted even more. (during the summer peak hour usage can be closer to 50 cents per kwh)

2. Who would use a beast like that for only 1 hour a day? Most people I know (who would use something like this) leave their rigs on 24 hours a day.

This isn't for your mom to check her Email in the morning before work.

A more realistic use would be 4 to 5 hours per weekday and 8 to 10 hours a day per weekend.

June 12, 2013 | 10:01 AM - Posted by Joe (not verified)

Using your computer vs. using at peak power draw are also two different things. The 220W TDP only applies when all 8 cores are running flat out. I doubt even a hardcore computer user would spend ~40 hours a week at 220W unless they were using the computer as part of a compute farm or mining bitcoins.

Even then, suppose you did use it at its full 220W potential for ~40 hours a week, or about 2000 hours a year. That's about 440kWh for the AMD part vs. 170kWh for the Intel part. Even at $0.50/kWh, that's only around a $135 difference per year.

Of course, you're not going to cram all 2000 hours of compute time into the expensive early summer evening rolling blackout times of day (when the $0.50/kWh rates are more likely to be), so your actual cost would be less than that.

And yeah, sue me for using rounded-off numbers above. The point is, even if you assume crazy loading factors and crazy rates, the electric bill still doesn't make the Intel chip cheaper for the average consumer.

Now, if you're talking data center... now you have an argument! But I highly doubt these crazy chips are aimed there.

June 12, 2013 | 05:14 PM - Posted by arbiter

That is true, it's only under max load, but how many people upgrade their computer every few years outside dedicated people? There are people that still use P4 machines because, well, they work fine for their tasks. So some ppl might have the machine for 6-7+ years.

June 14, 2013 | 10:41 AM - Posted by Cole Brodine (not verified)

If you ran your computer 24/7 and your electric rates are about 10 cents per kWh you get the following:

220 W / 1000 * 24 hours * 365 days = 1927.2 kWh

At $.10 per kWh (my local electric rate) that is $192.72

For something with 83 W TDP:

83 W / 1000 * 24 hours * 365 days = 727.08 kWh

At $.10 per kWh that is $72.71 giving you a difference of $120.01 per year. Don't forget that assumes you are doing folding or something to keep your CPU at 100%

A more realistic use case for myself is about 4 hours a day on a week day and 6-8 hours on a weekend day. That gives me about 36 hours of runtime a week or 1,872 hours a year.

So: 220 W / 1000 * 1872 hours = 411.84 kWh versus 83 W / 1000 * 1872 = 155.38 kWh

At $.10 per kWh that is $41.18 per year versus $15.54 giving you a difference of $25.64 PER YEAR. (Also assuming that it is running at 100% CPU when it is on)

Fill in your own electric rates if you want, but I'd pay an extra $26 per year to run a $400 CPU versus a $1000 CPU on my desktop.

Although it is all academic until we see some prices. It also probably doesn't really matter anyway since it seems like they will only put it in specialty systems and not sell OEM or retail packages.
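The comparison above generalizes to a short sketch you can rerun with your own rates and duty cycles (the function name and parameter choices are mine; the 220 W and 83 W TDPs and $0.10/kWh rate are the ones used in the comment):

```python
def annual_cost_difference(watts_a, watts_b, hours_per_year, rate_per_kwh):
    """Yearly electricity cost gap (in dollars) between two CPUs,
    both assumed to run at their full TDP for the given hours."""
    kwh_a = watts_a / 1000 * hours_per_year
    kwh_b = watts_b / 1000 * hours_per_year
    return (kwh_a - kwh_b) * rate_per_kwh

# 24/7 at 100% load, $0.10/kWh: matches the ~$120/year worst case above.
print(round(annual_cost_difference(220, 83, 24 * 365, 0.10), 2))  # 120.01

# ~36 hours/week (1872 hours/year): the gap shrinks to about $25.65
# (the comment rounds each term first and gets $25.64).
print(round(annual_cost_difference(220, 83, 1872, 0.10), 2))  # 25.65
```

As the comment notes, even these figures are upper bounds, since the CPU rarely sits at full TDP.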

June 11, 2013 | 07:49 PM - Posted by Anonymous (not verified)

Can I make French fries with it ?

June 11, 2013 | 07:55 PM - Posted by Anonymous (not verified)

A hot beverage ?

June 11, 2013 | 08:13 PM - Posted by Brett from Australia (not verified)

Thanks for posting this, Josh. I can see where these chips may appeal to enthusiast gamers, but that's a lot of heat. Assuming at this early stage that these chips will run on AM3+ boards only.

June 11, 2013 | 08:55 PM - Posted by Pigeon (not verified)

Amazing to see what a gloss you can get on a turd if you are prepared to spend over 30 years constantly polishing it.

Depressing to think how much further on we might be if IBM had chosen to build the original PC on the 68000... or indeed on anything that was not an obsolescing pile of crud before they even started.

June 12, 2013 | 08:21 AM - Posted by Poci

Wanna talk about polishing turds? Whats this the 4th gen core i7s? New sockets for barely any improvement in architecture?

At least AMD cares about its fan base and allows backwards compatibility

June 12, 2013 | 05:17 PM - Posted by arbiter

Backwards compatibility is not all it's cracked up to be on AMD. They do come out with newer sockets using the same pins, but newer CPUs use a faster bus interconnect which older boards don't support, so the CPU has to dumb down to it, and you lose performance because of that bottleneck.

June 12, 2013 | 08:32 PM - Posted by Poci

Bus interconnect? Are you talking about the HyperTransport bus? Yeah, and so what? There is very, and I mean very, little bandwidth difference between those lesser speeds of the HT bus. No bottleneck at all; CPU and NB are what matter. And on FX, from what I'm seeing, it's just the CPU; the NB increase doesn't affect things as much as with the Phenoms.

June 13, 2013 | 12:15 PM - Posted by Josh Walrath

Meh, that is not as true as it used to be.  All connections are moving to the CPU and PCI-E bus.  HT 3.0 is still plenty fast as well.  FM2+ will feature PCI-E 3.0, but will still work with older Trinity/Richland parts, as well as Kaveri.  So, this type of backwards compatibility is nice.  Intel... socket 1156, 1155, and 1150?  The differences?  Other than re-routing... not a whole lot.

June 13, 2013 | 10:46 PM - Posted by Poci

Yes, exactly what I mean Josh.
HT already has more bandwidth than is needed.

June 11, 2013 | 10:41 PM - Posted by semiconductor_dude (not verified)

If you are not rooting for AMD then you are a fool. Why? Because without someone pushing Intel, all you are going to get from them is little incremental updates while they have much better chips waiting in the wings. Do you think they are going to release those chips? No, not until they get the most money out of the market. Kind of like when AMD released the first 1 GHz chip and we found out Intel had several chips waiting in the wings that they didn't release so they could squeeze more money out of people. You might think your Intel chip rocks, but I can tell you it would be a hell of a lot faster if AMD was either leading or nipping at their heels. As it is, Intel is sitting back and resting, just counting their money. Without competition only big business wins.

June 13, 2013 | 06:06 AM - Posted by Anonymous (not verified)

Hear, hear.

June 12, 2013 | 08:19 AM - Posted by Poci

Why complain about TDP? Not like those that OC care about that. You run the CPU to its limit anyway. In any case that 5 GHz turbo I don't think is on all cores. Probably only 2.

June 12, 2013 | 10:04 AM - Posted by Vellinious (not verified)

I've always been a fan of AMD, but lately, they've been slipping. I've got a system running an 8350 and love it. A few months ago, I decided to go ahead and build a system around the 3770k. The AMD processor cost me $200. The i7 cost me $250. The performance difference is staggering.

I keep seeing the AMD guys screaming about $1000 Intel processors. What you should be talking about is the 3770k or the 3570k, which are priced very similarly to the high end AMD processors, and DO outperform them. Hands down...not even a contest, WITH less power consumption and less heat.

Compare apples to apples....

My next build will be an AMD A10. I want to see how they compare in the "eyeball test". I know they don't do well in benchmarks, but....I want to see it.

June 12, 2013 | 04:34 PM - Posted by Bill (not verified)

Exactly. Why they keep trying to compare AMD's top cpu against Intel's top cpu just because they are both top of the line cpu's is beyond me. It's like comparing the top of the line Honda to the top of the line Mercedes. There's no comparison.

Even worse is knowing that doesn't matter anyway as Intel's way lower in the line cpu's still bitch slap the crap out of AMD's space heater!

June 12, 2013 | 04:53 PM - Posted by Poci

No, their low line CPUs don't bitch slap anything. What you said is ludicrous. If there is a department besides gaming in which AMD can still hold its own, it is the budget low end market.

June 12, 2013 | 04:51 PM - Posted by Poci

Not even a contest? There are benchmarks where an 8350 will even match a $1000 i7. And in gaming benchmarks there is not much difference, especially in games that will use all 8 cores. In fact to get the 6 core i7s to run BF3 well u have to disable hyperthreading.....

June 12, 2013 | 05:25 PM - Posted by arbiter

Well, if a benchmark is optimized for 1 CPU over the other, that can happen. A certain fruity company used to do that crap years ago. That top-of-the-line Intel is a socket 2011 6 core/12 thread CPU that supports, I think, 4 channel RAM. Truthfully they are not even in the same league in terms of power. But you can get a 6 core part in the league of it for like $500. As for games that use all 8 cores, I don't think there are any that do. Unless it's an AMD gpu + physx game.

June 12, 2013 | 08:34 PM - Posted by Poci

Frostbite engine in battlefield can use up to 8 threads.....
So can valve source engine

June 12, 2013 | 09:12 PM - Posted by Poci

double post

June 12, 2013 | 08:39 PM - Posted by Poci


I'm trying to comprehend what you're saying, it's not making sense to me.....

Either that or you fail to realize my point...

6 core i7s need to disable hyperthreading because with 12 threads they are losing cycles in the game. Turn off HT and u have the 6 cores dedicated with 6 threads. 12 is too much if the engine uses up to 8, gabbish? ;)

As for the 4 channel RAM, I'm lost on your point, that is totally bios/os dependent and a game does not need to be coded for it or have more threads. It just affects memory bandwidth.

And yes there are game engines that use 8 cores or more. The Frostbite engine in BF3 and the upcoming BF4 uses 8. The Valve Source engine with a command line tweak I think can do even more than 8.

June 12, 2013 | 09:11 PM - Posted by Poci

double post

June 13, 2013 | 10:50 AM - Posted by Parn (not verified)

220W TDP to start off. Unbelievable.

While it's good to hear AMD is still in the game, it's shocking to see AMD using this kind of drastic measure to compete with Intel's i7s which have a TDP of 77W - 84W.

June 14, 2013 | 01:09 AM - Posted by doom_Oo7 (not verified)

And just one year ago...

June 14, 2013 | 11:21 AM - Posted by Josh Walrath

Market pressures? Just changing their tune? Seeing a potential soft spot in Intel's armor?  Hard to say, but I like that they aren't afraid to change their mind and move quickly to implement it.

June 14, 2013 | 08:03 AM - Posted by AlienAndy (not verified)

We approached a number of developers on and off the record - each of whom has helped to ship multi-million-selling, triple-A titles - asking them whether an Intel or AMD processor offers the best way to future-proof a games PC built in the here and now. Bearing in mind the historical dominance Intel has enjoyed, the results are intriguing - all of them opted for the FX-8350 over the current default enthusiast's choice, the Core i5 3570K.

*walks away whistling*

June 14, 2013 | 09:50 PM - Posted by Anonymous (not verified)

220W is fine by me. We all know there's 1kW, 1.5kW PSUs or more out there. It's options, that's about it. 8 cores at 5 GHz! yay!

July 11, 2013 | 06:57 PM - Posted by Anonymous (not verified)

i just wish they had shrunk the die to 28nm and then gave it a 5 GHz base clock :)
