AMD Bulldozer FX Processor Benchmarks Leaked

Subject: Processors | October 2, 2011 - 09:29 PM
Tagged: sandy bridge, Intel, i7 2600k, FX 8150, FX, cpu, bulldozer, amd

Intel has held the performance lead for several processor generations now, and while AMD is still technically in the game for home theater PC and budget builds, many enthusiasts have moved to Intel for gaming and high-performance computers. Many of those people have also held out hope that the chip manufacturer would eventually come back strong and maintain some level of competition in the industry. As we move closer to AMD's Bulldozer launch (which seems to have been confirmed for October 12th), enthusiasts and reviewers alike are clamoring to answer a long-awaited question: will Bulldozer give Intel a run for its money?

According to the website Donanim Haber, enthusiasts' high hopes may finally be realized. The site has posted several benchmark results that indicate Bulldozer is not only cheaper than Sandy Bridge, but performs on par with Intel's top-end Sandy Bridge chips. In many tests, the AMD FX 8150 CPU's eight-core performance matches the multi-threaded (8 threads, 4 cores) performance of Intel's high-end Core i7 2600k processor.


In the benchmarks that the site performed, the AMD FX 8150 was tested against the Intel Core i7 980X for 1080p gaming and the Core i7 2500k and 2600k for multi-threaded performance. In the graph shown above, the AMD Bulldozer CPU was roughly on par with the i7 980X, trading wins in some games and providing a similar level of performance in others. The AMD processor won in the Metro 2033 and Lost Planet benchmarks, but was slightly slower in Civilization V and F1 2010. In AVP and Batman (among others), the two competing processors saw equal results.


The site also ran several benchmarks using highly multi-threaded programs to take advantage of the many-core designs of the AMD and Intel processors, including WinRAR 4, Handbrake, 7-Zip, and wPrime 32M. The eight-core AMD FX 8150 Bulldozer processor was tested against both an Intel Core i5 2500k and a Core i7 2600k. The AMD CPU came out ahead in 7-Zip, wPrime 32M, and Bibble 5.0. It was slower than the Core i7 2600k in the WinRAR 4 tests and slower than both the 2500k and 2600k in the ABBYY OCR10 benchmarks. In the other tests, the AMD processor kept pace with or was only slightly behind the top-end Intel 2600k CPU.

From the leaked benchmarks (which you can read here), AMD’s new Bulldozer CPUs have made an admirable showing. Should these benchmarks hold true, Intel will have some serious competition on its hands, something that the company has not had to deal with in a long time. Whether Bulldozer will result in price cuts or ramped up production on the Intel side remains to be seen; however, the results are not going to be easy for Intel to ignore.

Stay tuned to PC Perspective for more Bulldozer news in the coming weeks.

Source: Donanimhaber
October 4, 2011 | 04:46 AM - Posted by 3dbomb (not verified)

I don't know whether to trust these benchmarks, and it all starts to get quite confusing working out whether you really need a 990FX motherboard (which are expensive) or a lesser one, as long as it's AM3+. I still don't know what you're missing if you get a cheap 890 AM3+ board and plop an 8150 into it. Anyone got more info on this?

I'm getting more into 3D and 3D animation. Games, encoding, and any other tasks CPUs get put to don't interest me. I guess all I'm really looking for is the fastest speed at the cheapest price for multi-threaded applications like Cinema 4D, where all cores are in use all the time. To make it really simple: I just want the CPU that will give the best Cinebench benchmarking score.

So with that said: if you wanted to buy a new system in the next few months for the cheapest price, what would you go for? $1000 CPUs are out of the question :)

October 9, 2011 | 12:25 PM - Posted by Jimbo Jones (not verified)

"I don't know whether to trust these benchmarks" -- These are presentation slides, not even from AMD, not benchmarks. They may be indicative, but I wouldn't trust them; wait till the 12th.

"I just want the CPU that will give the best Cinebench benchmarking score." -- No, you don't; your last sentence indicates you want the best bang for your buck for rendering. Currently that will be an i7 2500 or 2600 (one would have to see how much difference Hyper-Threading makes in Cinebench; it may be significant or negligible, and I'm sure tests have been done already). But I'd wait till Bulldozer launches; with 8 integer cores it may do well in this area.

"I still don't know what you're missing if you get a cheap 890 AM3+ ..." -- For your uses, you likely won't miss much. You basically want fast multi-threaded integer crunching, and there's not really much in the chipset that would help or hinder you there. My 2 cents.

October 4, 2011 | 12:02 PM - Posted by Anonymous (not verified)

Well, I'm glad to have had my 2500k since March. Tell me, how long have any of you had your Bulldozers? Hell, I might still get an 8150, but honestly, looking back to the beginning of the year, I am glad I made the choice I did.

October 4, 2011 | 03:40 PM - Posted by Anonymous (not verified)

Why all the arguments? I'm looking at replacing my 4-year-old Q6600 come November. If Intel has a new i7 2700 out by then and it's cheaper than an 8150 and performs better, I'll buy the i7. If the 8150 is cheaper and performs similarly, I'll build based on that. In 6 months another processor will surely beat it, but will I give a s*** then? Why do people get wound up just because a new AMD processor may beat their 6-month-old Intel? Maybe it's my age (35). Some comments are childish.

October 4, 2011 | 04:45 PM - Posted by Anonymous (not verified)

You guys are all idiots, talking like fanboys...

You guys are talking like it's expensive to build a computer. Who cares about frames per second? I mean, this whole thread is just pointless.

I'm maxing out games with a dated AMD 9500 2.2 GHz quad core and an old 9800GTX... yeah, DirectX 10. It plays silky smooth, and it's probably one of the slowest CPUs out of everyone here.

Am I crying about this? Hell no, I'm gaming without a problem.

I could afford Bulldozer, Sandy Bridge-E, and Ivy Bridge all at the same time. Who cares about what's better?

Buy a house, work on your car or your emotions.

(I'm building a Bulldozer rig, not to boast about how cool I am, and I won't be making a YouTube video about it. I'm building it for fun, and to have a comp for gaming at a good price till 2013, PCI Express 3.0, and perhaps by then a 30-inch Dell 2560-res screen.)

Or just use one of your paychecks and buy Ivy, then use part of your next check and buy Bulldozer, and have both. Really, who cares?

October 4, 2011 | 04:57 PM - Posted by Anonymous (not verified)

Personally: my close friend has talked 2 other people we both know into building 2500k systems, all done up with 6950s and new everything. This all happened in the past month. Those 2 "friends" we both know got convinced by a fanboi to build 2500k rigs because of benchmarks proving they're the state-of-the-art fastest beast comps.

What a friend, right? I would have told them to wait for Sandy Bridge-E or Bulldozer and save some money...

They all get hyped and worried about their current comps, and build 2500k rigs because of BF3? Or Diablo 3? I mean, come on... building rigs for 2 games?

Making people think their 955BEs won't run anything... is just fanboy talk.

October 4, 2011 | 09:15 PM - Posted by Anonymous (not verified)

Haha, first off... BF3 will DEFINITELY need better than a 955 BE Phenom II to be maxed out. Yeah, yeah, I know the video card is more important, but if Phenom IIs don't cut it for Metro 2033 or Crysis 2 at the highest settings, then that Phenom won't for BF3 either.

D3 may be another story... it's only in the beta stages at the moment and is only optimized for 2 cores, but when there's a lot of action on the screen, you may actually need the extra processing power to keep things running smooth. That's what you're paying for with the Sandy Bridge CPUs: not necessarily just the ability to play a game at all, but the ability to come close to playing it with optimal performance at (or close to) maximum settings.

Also... nobody knows how good Bulldozer is going to be, and for all we know, it could be a paper launch where they ship but aren't in stock. Maybe AMD could even push back the release date again, as they once did. One thing that is likely, but not certain, is that Bulldozer won't be as fast as the Sandy Bridge CPUs in applications that use 4 cores or less (because if it were, I'm thinking AMD would be bragging a bit more, and they seem to be quite tight-lipped while also cherry-picking their benchmarks).

As for your friends... sometimes someone wants a badass computer right NOW, and not everyone wants to wait for something that MIGHT be better in the future. Everyone knows the i5 2500k is a great CPU, especially for the price, and by the way, Sandy Bridge-E would be stupid to buy for gaming (extreme overkill). Anyway, I said fuck it a few weeks ago and got an i7 2600k, and I haven't regretted it since. AMD isn't pushing the bar high enough for Intel to feel a fire under their asses to pull off something great next year for the consumers!

I was really rooting for AMD, but even if these chips turn out to be better than most of us thought, it's probably going to be too little, too late. Most people have already bought their computers...

October 9, 2011 | 03:57 PM - Posted by Anonymous (not verified)

You all realize all those benchmark pictures were deemed fakes a long time ago. On top of that, they seem even more fake considering they were out a few months ago. So honestly, I'll wait for an official review from Tom's Hardware or someone like that.

October 12, 2011 | 02:54 AM - Posted by drbaltazar (not verified)

Unless I am wrong, AMD still uses 16-bit PCIe, so it halves the datapath's speed cap. So if you ungang, like a lot of us were or are doing, you will run with two 8-bit PCIe links instead of your expected 16-bit times two.

October 25, 2011 | 05:40 AM - Posted by Indiocj (not verified)

I love this stuff. AMD, Intel... for me it is just fun to see the tech and how far it has come (I've been into IT for 15 years). I love raw MHz and tech. Thank god that we have the two big boys to do battle, or else we all get the shaft.
