AMD A10-6800K and A10-6700 Review: Richland Finally Lands

Author: Josh Walrath
Subject: Processors
Manufacturer: AMD

Results: Skyrim, DiRT 3, and Battlefield 3


Skyrim

This title is CPU bound when paired with a relatively high-end GPU, but becomes more GPU bound when running on an APU.  A manual runthrough was captured with FRAPS.  Settings were set to 1280x960 with Ultra-level graphics, 8X AA, and 16X AF.  Ryan was unable to replicate this run with the 4670K, so it is again not represented here.


We see a small bump in both average and minimum frame rates, but it is not nearly as impressive as we had hoped.  Still, it is an improvement, though around the 7% mark rather than 10 to 15%.
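For readers curious how a FRAPS capture turns into the average and minimum figures quoted here: a frametime log reduces to both numbers with a few lines. A minimal sketch in Python, using hypothetical frametime values rather than the actual capture from this run:

```python
# Sketch: deriving average and minimum FPS from a FRAPS-style
# frametimes log. The millisecond values below are hypothetical,
# not taken from the review's benchmark runs.
frametimes_ms = [16.2, 17.0, 15.8, 21.4, 16.6, 18.1, 16.9, 25.0]

# Average FPS is total frames over total time, not the mean of
# per-frame FPS values (which would overweight fast frames).
avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

# Minimum FPS corresponds to the single slowest frame.
min_fps = 1000.0 / max(frametimes_ms)

print(f"average: {avg_fps:.1f} FPS, minimum: {min_fps:.1f} FPS")
```

The distinction in the first computation matters: averaging instantaneous FPS values inflates the result whenever frametimes vary.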


DiRT 3

DiRT 3 is a DX11-based racer from Codemasters.  It does not have the heavy lighting routines of DiRT: Showdown, but it still looks very nice.  The High preset was used at 1280x960 resolution.  The game also has a DX10 fallback, so it should run on the i3 processor.  The built-in benchmark is used for this test.


The 4670K was present for this test, and it mostly gets destroyed by the AMD parts.  This is a very solid win for AMD over the GT2-enabled 4670K.  The couple-of-frames difference between the 5800K and 6800K is again not impressive, but it does show an improvement.


Battlefield 3

This is another very popular title coded for DX11.  Settings were set to the Medium preset at 1280x960 resolution.  A manual runthrough of Operation Swordbreaker was captured with FRAPS.  Ryan was again unable to match my benchmarking run with the 4670K.


Here we see a somewhat larger difference in performance between the chips.  The 2133 memory on the 5800K helps it a ton!  If a user already owns 2133 memory, then it is an easy upgrade to flash the BIOS and utilize the higher memory speed.  If they do not have 2133 memory, it is likely not worth the money to upgrade.
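The uplift from faster memory tracks the theoretical bandwidth difference between the two speed grades. A quick sketch of the arithmetic, assuming the standard 64-bit (8-byte) bus per channel in a dual-channel DDR3 configuration:

```python
# Peak theoretical DDR3 bandwidth:
# transfers/s * 8 bytes per transfer per channel * number of channels.
def ddr3_bandwidth_gbs(mt_per_s, channels=2, bus_bytes=8):
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

bw_1866 = ddr3_bandwidth_gbs(1866)   # DDR3-1866
bw_2133 = ddr3_bandwidth_gbs(2133)   # DDR3-2133

print(f"DDR3-1866: {bw_1866:.1f} GB/s, DDR3-2133: {bw_2133:.1f} GB/s")
print(f"uplift: {100 * (bw_2133 / bw_1866 - 1):.1f}%")
```

Since the APU's graphics share that same memory pool, a roughly 14% bandwidth bump flows almost directly into bandwidth-limited gaming workloads.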

June 5, 2013 | 03:20 AM - Posted by Tim Verry

Really good review Josh! It looks like AMD has another solid APU on its hands, all things considered.

EDIT: Wow, those are some nice overclocking results, especially for the GPU!

June 5, 2013 | 05:10 AM - Posted by SPBHM

I think the CPU OC is a lot more impressive: 4.8 GHz while using 140 W under load? That's pretty good.

What software was used for the (stock) whole-system load test?
It's strange that the difference between the 100 W and 65 W TDP CPUs would only be 15 W (6700 vs 5800K), but still, performance is the same, so... not bad.

The ability to use the IGP plus any other card would be quite good.

June 5, 2013 | 07:51 AM - Posted by Tim Verry

Well that too of course :).

June 5, 2013 | 10:29 AM - Posted by Josh Walrath

I used Cinebench 11.5 to load up the CPU cores for power testing.

Real world TDPs are... interesting.  AMD sets the rating for the worst-case scenario for their parts, so there is going to be some variance chip to chip.  So while the 6800K is officially a 100 watt TDP part, it most likely consumes less power than that rating implies.
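One reason wall-socket measurements can understate TDP gaps: the PSU sits between the meter and the CPU, so a delta measured at the wall shrinks once PSU losses are factored out. A rough sketch of the conversion, with an assumed efficiency figure for illustration (not a measurement from this review):

```python
# Sketch: converting a wall-power delta to an estimated delta at the
# CPU. The 85% PSU efficiency below is an assumption for illustration,
# not a figure from this review's test setup.
psu_efficiency = 0.85

def cpu_power_delta(wall_delta_w, efficiency):
    """A wall-power difference includes PSU conversion losses, so the
    actual difference at the component is smaller by the efficiency."""
    return wall_delta_w * efficiency

delta = cpu_power_delta(15, psu_efficiency)
print(f"~{delta:.2f} W difference at the CPU for a 15 W wall delta")
```

So an already-small 15 W gap at the wall implies an even smaller gap at the chips themselves, which is consistent with TDP being a worst-case bin rather than a measured draw.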

There are certainly some interesting wrinkles to this release.  It is not a world beater, but the overall product is still pretty impressive.  AMD must be extremely happy that Intel decided to forgo putting GT3 and GT3e graphics on the desktop.  If Intel had done that, AMD would have an even tougher time selling chips.

June 5, 2013 | 05:04 AM - Posted by Anonymous (not verified)

Doesn't matter what AMD releases for the next month or two.

Like it or not, unless it comes with a supermodel serving you beer, it will be hard to drown out all the buzz around Haswell.

On the other hand, so far, I've been very unimpressed with Haswell for the desktop enthusiast.

June 5, 2013 | 08:10 AM - Posted by Anonymous (not verified)

The ability to use the IGP plus any other card is disallowed for higher-end cards! Is this in the hardware or the BIOS? Can Lucid's Virtu MVP software get around the restriction, AMD's obviously anti-HSA move of not allowing higher-end discrete cards alongside the IGPU? Why is AMD, the founder of the HSA Foundation, doing something so anti-HSA by not allowing the user access to the full computational ability of their APUs in conjunction with higher-end discrete GPU cards? AMD, do NOT let your marketing folks ruin you! This whole thing reeks of milking the user base, just like Intel does with its less-than-adequate, planned incremental performance/technological increases! Oh, let's give the user just enough advancement to keep them buying! I wish there were a third company out there with an x86 license and some GPU IP to keep both AMD and Intel on the straight and narrow!

June 5, 2013 | 09:36 AM - Posted by Anonymous (not verified)

It's not because AMD is trying to be anti-HSA; it's because if you have a higher-end card, it would actually not help you process data any faster. This is due to the fact that the current way an IGPU and a discrete card interact is through "Crossfire X". The methodology is that both GPUs "combine" to appear as one card. This eliminates a lot of the other issues that AMD saw with "Hybrid Crossfire". If you read around, there are plenty of people that have messed with trying to get an HD 7870 to work with a 5800K APU, for example, and come to the same conclusion as AMD. Why hobble the powerful card to get comparatively marginal performance from the IGPU? I understand wanting to use it since it's there, but the way the software and hardware are built right now, it's not worth the worry.

June 5, 2013 | 10:17 AM - Posted by Anonymous (not verified)

Yes, but do not shut the IGPU off for OpenCL workloads while the higher-end discrete GPU crunches graphics. The one thing AMD and NVIDIA have that can give them leverage against Intel's x86 CPU might is using the IGPU as an extra processing resource (GPGPU) for a gaming physics/ray tracing boost! Read up on Lucid MVP technology and tell me if you think it offers a solution to the problem!

AMD, like Intel, will slow CPU/APU technological improvements if there is insufficient competition in the marketplace to force them to move forward! Just look at AMD vs. NVIDIA in the discrete GPU market!

June 5, 2013 | 09:48 AM - Posted by Spigzone (not verified)

You're con-fusing AMD (desperately trying to stay in the processor race) with Intel (800 lb. gorilla cash cow milking its lead).

AMD's marketing does not drive this process; the step-by-step design and engineering learning process and system evolution leading up to a full HSA implementation does.

AMD would love nothing more than to CURRENTLY have a full array of HSA APU processors on the market, able to additively couple with its full array of current GPU boards.

In short, AMD flat out cannot afford to leave performance advantages on the table to 'milk the consumer'. It has never been in that position. That is what Intel routinely does.

June 5, 2013 | 10:31 AM - Posted by Anonymous (not verified)

AMD is not in the x86 race on the high end. Intel won, and AMD said forget the high end, but AMD can and does still compete with Intel in the low-power mobile x86 market, and AMD's integrated GPU IP EATS Intel's IGPU IP for lunch! AMD could easily use its better GPU IP to put Intel's Haswell/Broadwell GPU IP in the rear-view mirror for a few years, at least! Why do you think AMD's lack of progress with its IGPUs vs. Intel's IGPUs has a back-and-forth feel to it, when AMD could, if they so desired, blow Intel's IGPUs completely out of the water and be done with it?

June 5, 2013 | 11:10 AM - Posted by Anonymous (not verified)

Oh, and the real con here is anyone trying to paint AMD as the underdog and Intel as the overdog; the truth is AMD and Intel are both just plain dogs, and the consumer wins when they are fighting each other for survival! I hope Apple takes ARM technology, soups it up, and combines it with an equally souped-up IGPU to compete with AMD's and Intel's mobile offerings. Apple has a wad-o-cash and has already shown it can take the ARM instruction set, pair it with some GPU IP, and go to market with it! Apple is another DOG with deep pockets, and the more dogs in the competition, the more the consumer wins!

June 8, 2013 | 08:46 PM - Posted by deowll (not verified)

Right now there really isn't an AMD-Intel fight. Intel could finish crushing AMD and put it out of business, but it doesn't want to, because if it did, it might become a monopoly and get attacked as such, the way MS, Google, and others have been. Much cheaper to allow AMD just enough room to survive but not enough room to prosper.

June 5, 2013 | 02:01 PM - Posted by D1RTYD1Z619

So what would you do, Josh? I was going to upgrade the video cards (HD 4850 512MB in Crossfire) in my 12-year-old daughter's computer (she plays Minecraft, TF2, Sims 3, and L4D2) to an HD 7770, because they are under a hundred bucks. Or should I just dump the system and build her one around the A10-6800K with no discrete card?

Current System specs:
Antec 300 Case
Asus P5K-E WIFI-AP Motherboard
Intel Core 2 Quad Q6700 @ 3.4 GHz Processor
Corsair Dominator CM2X1024-8500C5D2GB 2GB (2 x 1GB) PC2 8500 1066MHZ DDR2 Memory X 2
Hitachi-LG HL-DT-ST DVDRAM GH20NS15 Optical Drive
Maxtor MaXLine 7H500F0 500GB Hard Drive
Asus EAH4850/TOP/HTDI/512M HD4850 512MB DDR3 in Crossfire
Antec TruePower New TP-750 750 Watt Power Supply

This system puts out way too much heat and, I bet, draws too much power for the performance it delivers.

June 5, 2013 | 02:29 PM - Posted by Josh Walrath

If you are worried about heat, power, and age then yes... upgrade and a 6800K will be a great replacement that will enable her to play those games without issue.

That being said, a Q6700 at 3.4 GHz is still a really fast processor. It is likely slower overall than the 6800K, but it is still fast. Getting a $100 card will get quite a few more years out of the system, plus cut down on heat, because those old 4800 series cards could really push it out. For the price, though, you might want to consider the 7790... you could get some really nice mileage out of that combination.

June 7, 2013 | 10:26 AM - Posted by Yamaeda (not verified)

I'd just buy one 7790 and be done with it. :)

It's a very good processor, especially overclocked, but I understand if your rig gets warm. Though that's more due to the graphics cards than the CPU.


June 5, 2013 | 02:59 PM - Posted by w_km (not verified)

QUESTION: Could I use the A10's GPU to run one 1440p monitor, with an additional 7xxx-series GPU running another 1440p monitor?

Or would the discrete 7xxx GPU take over both monitors with the APU's graphics disabled?

June 5, 2013 | 05:28 PM - Posted by Josh Walrath

I guess I will have to check.  I am not sure, but I believe that inserting a standalone card above the 6670 class disables the GPU portion of the APU.  If not, then I believe Win7 does support multiple non-identical graphics adapters, as long as they utilize the same drivers.

June 6, 2013 | 08:27 AM - Posted by pyjamarama (not verified)

For building a low-end system that does minor gaming: if we paired the low-end Intel Pentium G2020 with a $100 to $150 graphics card, would we get better gaming performance?

Or, putting the question another way: with a G2020 or a Core i3, what graphics card would we have to buy to get the same gaming performance?

June 7, 2013 | 10:25 AM - Posted by Josh Walrath

I believe a G2020 is actually going to be far slower than Richland in both CPU and GPU performance.  A person would have to buy around a $75 card to match the graphics performance of the A10-6800K in most instances.  If you were dead set on buying the cheapest decent processor and adding a $150 graphics card... that will still make a great gaming machine.  The FX-6350 is around $139 (you might be able to find the FX-6300 for cheaper) and seems like a nice combination of performance, thermals, and price.  There are also plenty of i3s in the $100 to $175 range that will perform well.

As for graphics, I wouldn't go with anything less than the HD 7790 at this point.  That and the GTX 650 Ti Boost seem pretty solid.

June 8, 2013 | 01:14 PM - Posted by Anonymous (not verified)

This is a very good review.

By the way, I am a fan of

June 11, 2013 | 04:44 AM - Posted by Anonymous (not verified)

Really nice CPU.

June 12, 2013 | 05:42 AM - Posted by Blair (not verified)

I'd build a new pc around it and give my old Q6600 to a friend.

June 12, 2013 | 08:53 PM - Posted by Poci

Wow, I'm impressed. These little APUs can score close to what my old 5970 did in 3DMark?

And people complain about AMD?

June 14, 2013 | 03:12 AM - Posted by Panta

I think, in light of recent Intel CPUs' heat and overclocking failings,
it's a good time for AMD to pick themselves up and start fighting Intel again.

June 14, 2013 | 11:16 AM - Posted by Josh Walrath

Yeah, it seems like Intel is so comfortable in their lead that they aren't pushing things in terms of enthusiast-level products.  How long has Sandy Bridge-E been out?  Still no sign of Ivy Bridge-E?  Also, Intel is really pushing into mobile, and they probably think power desktop computing is dead.  Hard to say where exactly that will go, but I hope power desktop computing isn't becoming a niche market (even though it sorta is...).

June 20, 2013 | 06:25 AM - Posted by Anonymous (not verified)

Can we have a Windows 7 Aero test screen please?
The A10-5800K scores 6.9 Aero and 6.9 3D in Windows 7, and I want to know if the A10-6800K has better performance (Win7 test) with 2133 memory.

September 10, 2013 | 08:43 AM - Posted by ezjohny

Good read, Josh. AMD needs to make a good solid APU for the desktop that utilizes AMD's latest 7000-series graphics cards. Maybe HSA is a better way to move data around faster; we will have to see when it comes out. Hope AMD does not disappoint! I'm thinking of getting an APU setup!

