
AMD A8-3850 Llano Desktop Processor Review - Can AMD compete with Sandy Bridge?

Author: Ryan Shrout
Manufacturer: AMD

AMD Dual Graphics Performance and Scaling

AMD's Dual Graphics is a new name for a technology that has been around, or at least discussed, almost as long as the recent rejuvenation of multi-GPU graphics.  The idea behind "Hybrid SLI" and "Hybrid CrossFire" was always the combination of a discrete GPU and integrated graphics, though at that time the integrated graphics lived on a chipset.  Both AMD and NVIDIA shipped products with this capability for quite some time, although uptake was never substantial.  The reason seemed simple enough: buyers interested enough in gaming to know what CrossFire or SLI was weren't the same people buying integrated-graphics computers for gaming.

Dual Graphics is a rebrand that pairs the Llano-based integrated graphics with certain discrete GPUs.  The list is limited, so don't plan on taking that spare Radeon HD 5830 you have sitting around and getting a boost from it.  Here is the list:

[Table: supported discrete GPUs paired with A8/A6 APUs and the resulting Dual Graphics model numbers]

The four discrete graphics cards currently supported are the Radeon HD 6670, HD 6570, HD 6450 and HD 6350.  You can see in the above table that when combined with either an A8- or A6-based APU, the pairing is actually assigned a new model number as well.  In our testing we used the A8-3850 and the HD 6670, a combination rebranded as the Radeon HD 6690D2.  The D2 indicates a Dual Graphics implementation.  This is mostly for system builders and retail differentiation - we just need to know that the combination of the HD 6670 and the HD 6550D integrated graphics is supposed to be faster than either individually.

I should note here that there is another requirement that wasn't blatantly obvious to me before starting testing: AMD says that Dual Graphics technology, which is enabled by simply hitting the "Enable CrossFire" button in the new AMD control panel, will only work in DX10 and DX11 games, NOT in DX9 titles.  I have two problems with this.  First, it goes against everything we have known about multi-GPU technology until today - we have never had to differentiate between DirectX versions before.  Maybe budget gamers won't carry that stigma, sure, I get it.  But my second issue is that many games, including current ones like Dirt 2 and Shogun 2, actually fall back to DX9 code paths at lower image quality settings.  And since we are talking about budget to *maybe* mainstream performance level graphics, I would bet that these games will be run at those lower settings anyway.  So until the DX11 conversion for games is complete (which is likely years away), Dual Graphics won't be fully utilized.

So, what does our testing look like then?

[Benchmark charts: 3DMark Vantage, Dirt 3, Left 4 Dead 2, Civilization V and StarCraft II results for the HD 6670 alone versus the Dual Graphics configuration]

So only 3DMark Vantage (which is DX10) and Dirt 3 (which is DX10/11) saw performance gains going from just the Radeon HD 6670 to the Dual Graphics configuration.  The other titles - Left 4 Dead 2, Civilization V (which crashed consistently with Dual Graphics enabled) and StarCraft II - saw no gains at all.

Obviously, if you limit the testing to DX10/11 games you will see some performance boosts, and it is also likely that had we used the HD 6570 instead of the HD 6670, the boost from Dual Graphics would have been more substantial.  As it is, the HD 6670 is so much more powerful than the HD 6550D graphics in Llano that the 11% gain in Dirt 3 at 1680x1050 is actually pretty good.
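For readers who want to reproduce that kind of math, here is a minimal sketch of the scaling calculation; the FPS figures below are hypothetical stand-ins for illustration, not our benchmark numbers:

```python
def dual_graphics_gain(fps_single: float, fps_dual: float) -> float:
    """Percent frame-rate gain from enabling Dual Graphics over the
    discrete card running alone."""
    return (fps_dual / fps_single - 1.0) * 100.0

# Hypothetical example: a card averaging 45 FPS alone and 50 FPS with
# the integrated GPU helping works out to roughly an 11% gain.
print(f"{dual_graphics_gain(45.0, 50.0):.1f}%")  # → 11.1%
```

The same formula makes it easy to see why a slower discrete card would show a bigger percentage: the smaller the denominator, the more the integrated GPU's contribution matters.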

We plan on taking some more time with Dual Graphics in July to see if we can't iron out some of these issues and get some more information on the DX9 vs DX10/11 issue and report back.  For now I would consider this feature a minor advantage at most for Llano.

June 30, 2011 | 02:48 AM - Posted by codedivine

That load power consumption doesn't look too good. Given that the GPU isn't active in a CPU-only test, one would have hoped that the power consumption wouldn't approach the 100W TDP. That raises questions about GlobalFoundries' 32nm process.

June 30, 2011 | 10:53 AM - Posted by Anonymous (not verified)

i digg on them big booty hoe's with them daisy dukes on

June 30, 2011 | 11:46 AM - Posted by Ryan Shrout

Interesting take...

June 30, 2011 | 01:43 PM - Posted by James (not verified)

Considering how little CPU power the average user actually uses these days, integrating a CPU into what is primarily a GPU was a good choice. I would really like to see what the retailers (Dell, HP, Apple?) do with this chip, especially if we see some single-chip media-powerhouse Micro-ATX configurations. Right now that field is dominated by Atom-type systems, which are really lacking in the ability to handle media; if we see price drops in the Llano line then it could push Atom out of the super-compact media lineup.

Alternatively, if it is adapted for mobile processors it could be a cheap way to integrate discrete-class graphics into a full-size laptop; however, that 150W under load would eat a battery in no time.

June 30, 2011 | 04:25 PM - Posted by Ryan Shrout

Actually Llano for the mobile market was launched a couple weeks ago and is already showing up for sale.

http://www.pcper.com/reviews/Processors/AMD-Series-Llano-APU-Sabine-Note...

November 7, 2011 | 10:24 PM - Posted by Anonymous (not verified)

Saying this over half a year later, I can confirm this is all true. The atom is currently being destroyed, and apple is rumored to be adopting APUs.

June 30, 2011 | 03:04 PM - Posted by dude (not verified)

AMD is using an outdated production process in its chipsets (65nm), which in part adds to power consumption.

The Bulldozer chipset is 65nm.

June 30, 2011 | 06:30 PM - Posted by Josh Walrath

Eh, that really is not an issue.  The question for these chipsets on a smaller process is really how many transistors they have and need.  Now that most of the northbridge functionality lives on the CPU, there is no real reason to shrink chipset logic.  The problem is that these are already small chips, and if we shrink them any more without adding a bunch of extra features (and therefore transistor count), we run into the problem of having enough pad space on the die for the substrate and pinouts.  These chips also do not eat a whole lot of power at this time - maybe 7 watts max across two chips?  It was different with the 790GX/890GX chips, as they were in the 15 watt range.  But with the 990FX and SB950 consuming between 5 and 7 watts combined... not a huge issue for a desktop application.

September 9, 2011 | 05:35 AM - Posted by Anonymous (not verified)

Read a little and then write!!!! Both Llano and Bulldozer are 32nm, like Sandy Bridge.

July 1, 2011 | 01:03 AM - Posted by Denzil (not verified)

I like the new APU and I want it in some new Dell laptops. Please keep an eye on this, thanks. I don't like to overspend on any PC that's doing a basic job: that's watching video, surfing and listening to audio, with NO encoding at all. Yet I want to watch YouTube and video content. I don't want any stutter, and I want good audio. Six hours of battery life is fine; eight would be awesome. It has to have USB 3 ports plus Thunderbolt. I think Thunderbolt is dead in the water if you have to buy the cable for $50 too.

July 1, 2011 | 01:27 AM - Posted by noiserr (not verified)

Why is every reviewer using DDR3-1333 RAM? This review is useless as a result. Llano is very memory-bandwidth capped.

July 1, 2011 | 04:23 AM - Posted by funkydmunky (not verified)

Hilarious. So many out of touch. Four cores to do menial tasks, with actual graphics performance and power consumption that equals the best. AMD will eat the competition despite reviewers being so removed from the reality of what big-box store sales are comprised of. Of course Intel will just use its uber billions to advertise your need for "Intel" in the box, and discount its low end to compete.
Reviewers need to stop focusing on CPU performance when it comes to certain market segments. I have 10+ immediate family members who could never figure out how to stress a modern CPU if they were offered $$$. But they get pissed in a second when they have crap multimedia. I have yet to hear anyone ever complain that their off-the-shelf PC just couldn't keep up with Office.
We are all past minimum performance being an issue. Get over it. This new AMD tech struggles in no area, yet the competition has constant issues with price or graphics capability in comparison.

July 1, 2011 | 02:06 PM - Posted by Ryan Shrout

It is an interesting point of view, though I disagree with the statement that they "could never figure out how to stress a modern CPU".

July 1, 2011 | 02:05 PM - Posted by Ryan Shrout

If you actually read the review you will see a lot of testing done on the graphics side of things with faster memory settings than 1333 MHz.

As for the CPU processing side of things, memory bandwidth makes a noticeable difference in very very few cases.

July 1, 2011 | 09:13 PM - Posted by Anonymous (not verified)

Quote by reviewer: AMD Dual Graphics is a technology that I think has potential but lacks in some areas that I thought it needed to excel in. The fact that it doesn't support DX9 games completely confounds me and the response from AMD was built around the idea of "time commitments and value propositions." I don't see how DX9 titles, which are still FAR AND AWAY the majority of games out there right now, could not be worth the investment for gaming on the APU.

This is laughable - could not be worth the investment for gaming on the APU? The i3, i5 or i7 with HD 3000 CANNOT play those DX9 games; the frame rates are too low. Not only that, the image quality is worse than video cards from 5 years ago. So which company is lacking an investment in DX9 games? Yes, it would be nice to see Dual Graphics work in DX9 games, but the fact is the AMD HD 6550D IGP can play them just fine without Dual Graphics. Maybe you should also note that the HD 3000/2000 do not support DX11.

July 5, 2011 | 05:36 PM - Posted by Ryan Shrout

Actually, we mention a few times that only the AMD APU can handle DX11 gaming. But the issue is that DX11 titles will most often be run at DX9/10 settings on GPUs at this performance level.

July 6, 2011 | 08:48 PM - Posted by Anonymous (not verified)

This review is lacking. Want to talk about gaming performance, how about a Pentium G620 + $60 discrete graphics?

July 6, 2011 | 10:30 PM - Posted by Ryan Shrout

That is indeed a good combination - we just didn't have a chance to test ALL the combinations we wanted to. Soon!

July 7, 2011 | 11:20 PM - Posted by Darren Coull (not verified)

In this video from AMD:

http://www.youtube.com/watch?v=mdPi4GPEI74

They were showing how their chip kept plugging away while the mobile i7 ground to a halt due to ineffective multitasking ability.

Does this make any difference in real-world situations? I don't know.

Ryan, have you tried this same test, i.e. comparing the Llano against the Intel Core i-series chips when asked to run multiple tasks? Just curious if this really is the case, or marketing hype!

July 11, 2011 | 03:22 PM - Posted by Ryan Shrout

So if you take out the GPU-based gaming from that workload demonstration I think you will find WILDLY different results comparing Sandy Bridge to Llano.

July 8, 2011 | 12:08 AM - Posted by mastrdrver

How well does this multi task though?

While I don't do it myself, I know several "non-gamer" types in my family who like listening to music while surfing the web. Others do some downloading in the background while playing a game in windowed mode so they can switch back and forth between the game and a chat window.

Weird (in my book), but it seems like a lot of people do these kinds of multitasking things with their systems today.

Love the podcasts by the way. Though, the last couple have had video and audio sync problems (in case you didn't know).

July 17, 2011 | 10:29 AM - Posted by John W (not verified)

Assuming good water cooling such as the Corsair H70, and plenty of fans, do you feel this chip has significant overclocking headroom?

July 18, 2011 | 06:27 AM - Posted by Anonymous (not verified)

In my MC guild, we rolled for gear because we the group was in fact an alliance of 2-3 little guilds operating collectively to create 40-mans achievable. Only merchandise I ever bought was my Pally bracers, a BOE... from the auction home.

October 31, 2011 | 08:13 PM - Posted by Bovinebill (not verified)

That is a totally random comment. What forum were you supposed to be posting that to?

July 19, 2011 | 07:04 PM - Posted by AParsh335i (not verified)

As usual, a very well written article, Ryan. I can tell you really spent a lot of time writing it; unfortunately, not all the people commenting seem to care about the time you spent both researching the product and actually understanding it.

My take on Llano is that they should have stuck to launching it only for "gaming notebooks." There was a Toshiba on sale last week with the A8-3850 for $500, and that was the first AMD notebook I had seen in a long time that actually seemed worth buying at a good price.

I also think a good market for Llano would be gaming nettops - a little LAN box that could play games decently, but nothing like a typical "gaming rig."

August 25, 2011 | 07:12 AM - Posted by Anonymous (not verified)

I think it's good and I'm buying it next week along with the HD 6670... I've seen a few videos on YouTube and I'm convinced it's pretty great considering the price... so... it's for people like me who need a PC for new games and can't afford to spend $2k on some beast of a PC :) In CrossFire with the HD 6670 it can play Crysis 2 at an average of 30-35 FPS, so I'm content with it :) Thanks for the review :)

September 21, 2011 | 07:08 AM - Posted by K.A. (not verified)

Hi there,
I recently got an A8-3850 APU + Radeon HD 6850. Any chance of CrossFiring this combination? I tried it following the video tips for the APU + 6670, but I can't see a CrossFire option anywhere in the AMD Vision Control Panel.

September 28, 2011 | 12:18 PM - Posted by ITXGamer (not verified)

AFAIK, the HD 6850 is not supported for Dual Graphics. It's far too powerful to benefit from the onboard HD 6550D. The HD 6670 is the most powerful supported GPU.

November 1, 2011 | 11:43 AM - Posted by Anonymous (not verified)

I'm having a similar problem, except I have the A8-3850 APU and an HD 6770 card... I have the latest drivers but do not see the "enable CrossFire" option in the Vision control panel. Any ideas?

October 15, 2011 | 05:16 AM - Posted by geo (not verified)

The APU trades memory bandwidth for more GPU performance. For a weak 64-bit DDR3 6450 that might mean +50% (unlikely), but for a full 256-bit GDDR5 6850 with lots of shaders it might even slow things down... parallel processing has a cost, and if you add another bandwidth penalty on top it can be counter-productive. So NO, you made a pretty poor choice if you skipped the 1866 DDR3 support.
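To put rough numbers on the bandwidth argument running through these comments, here is a minimal sketch of the peak theoretical DDR3 bandwidth calculation (transfer rates × bytes per transfer × channels; the speeds shown are the DDR3-1333 and DDR3-1866 rates discussed above):

```python
def ddr_bandwidth_gbs(mt_per_s: int, bus_width_bits: int, channels: int = 1) -> float:
    """Peak theoretical memory bandwidth in GB/s.

    mt_per_s: transfer rate in megatransfers per second (e.g. 1333 for DDR3-1333)
    bus_width_bits: width of one channel in bits (64 for standard DDR3 DIMMs)
    channels: number of populated memory channels
    """
    return mt_per_s * 1e6 * (bus_width_bits / 8) * channels / 1e9

# Dual-channel DDR3-1333 feeding both the CPU cores and the integrated GPU:
print(ddr_bandwidth_gbs(1333, 64, channels=2))   # ~21.3 GB/s
# Dual-channel DDR3-1866, the fastest speed Llano officially supports:
print(ddr_bandwidth_gbs(1866, 64, channels=2))   # ~29.9 GB/s
```

Since the integrated GPU shares that pool with the CPU cores, the roughly 40% jump from DDR3-1333 to DDR3-1866 is why reviewers testing only at 1333 understate Llano's graphics potential.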
