AMD A-Series Llano APU Sabine Notebook Platform Review

Author: Ryan Shrout
Subject: Processors, Mobile
Manufacturer: AMD

Product Position and Graphics Engine

AMD Llano APU Product Positioning

Now that we have an overview of the AMD Llano APU's architecture, let's look at the specifics of the product positioning and where AMD sees this APU going in the market.  

The AMD E-series covers the currently available Bobcat-based APUs for netbooks and nettops, while the new Llano parts will be known as the AMD A-series.  The A8, A6 and A4 series will compete against Intel's Core i7, i5 and i3 lines of processors, and they will surely be better; after all, they are '1' higher, right?

In reality these competing lines will match up pretty much head to head in terms of pricing and target markets.  We'll have more details on the specific models being launched in the AMD A-series on the following page, when we get into the review sample AMD sent us for this article.

Launching along with the A-series is a new marketing and branding campaign from AMD built around the Vision name and some simplified messaging.  The cooler logos are seen at the bottom here, where you'll notice the mention of "Dual Graphics", which is really a return of the Hybrid CrossFire idea.

Dual Graphics allows a vendor to add a discrete AMD graphics solution to a notebook and get increased performance by using the power of the Llano integrated graphics as well.  In fact, AMD is giving these combinations completely unique model numbers.  The graphics portion of the Llano A8 processors will be marketed as the Radeon HD 6620G.  If you also have a Radeon HD 6770M discrete GPU in your notebook, the combined product is the Radeon HD 6775G2.  The A8's 6620G paired with a Radeon HD 6630M GPU will be a Radeon HD 6690G2.  

Yes, I know this is incredibly confusing and likely won't be very useful for our crowd, so don't attempt to memorize it; I know I won't.  The goal here is obviously to create an easy-to-grasp model numbering system that can be presented at stores like Best Buy to consumers who only understand the "higher is better" PC buying mentality. 

This Dual Graphics capability will result in increased gaming and graphics performance across the board, though the amount of scaling you'll see will depend to a large degree on the combination of discrete and integrated GPUs.  In the above slide AMD paired an A8 Llano APU with a Radeon HD 6570 discrete desktop card and saw somewhere around 30-35% better gaming performance in a handful of applications over a Core i5-based system using the same discrete card.  

These aren't high-end gaming settings or resolutions, and thus won't really appeal to the gamer reading this, but for a budget system, or in particular for a mobile machine with light gaming capability, Dual Graphics could be a nice upsell.

The Llano Graphics Engine

We have looked at the x86 CPU core portion of the AMD A-series Llano APU; now let's take a look at the GPU portion and see what architecture AMD has implemented.  Based on the Redwood GPU design from the Evergreen generation (think Radeon HD 5600), the Llano APU possesses a lot of GPU horsepower for an integrated solution.

Each line in the A-series will have its own iteration of GPU specifications, with the A8 obviously taking the lead.  With 400 stream processors (now being called Radeon Cores) built as 5 SIMD arrays of 80 each, 20 texture units, 2 ROPs and a 444 MHz clock speed, the top option is capable of 355 GFLOPS of raw computing power.  The A6 offering drops to 4 SIMD arrays and the A4 has 3.  Obviously we expect performance to scale down with the decreasing shader count in the standard way, though I am curious how much the memory data rates of only 1.6 Gbps or 1.333 Gbps will modify the scaling. 
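If you're curious where that 355 GFLOPS figure comes from, here's a back-of-the-envelope sketch.  The 2-FLOPs-per-clock factor (one multiply-add per stream processor) and the dual-channel, 64-bit-per-channel DDR3 bandwidth assumptions are ours, not AMD's published method:

```python
def peak_gflops(stream_processors: int, clock_mhz: float) -> float:
    """Theoretical peak: SPs * clock * 2 FLOPs/clock (multiply-add)."""
    return stream_processors * clock_mhz * 2 / 1000.0

def memory_bandwidth_gbs(data_rate_gtps: float, channels: int = 2,
                         bus_bits: int = 64) -> float:
    """Peak DRAM bandwidth in GB/s, assuming dual-channel 64-bit DDR3."""
    return data_rate_gtps * channels * bus_bits / 8

print(peak_gflops(400, 444))         # A8: 400 SPs * 444 MHz * 2 = 355.2 GFLOPS
print(memory_bandwidth_gbs(1.6))     # DDR3-1600: 25.6 GB/s
print(memory_bandwidth_gbs(1.333))   # DDR3-1333: ~21.3 GB/s
```

That roughly 21-26 GB/s of system memory bandwidth is shared with the CPU cores, which is why the memory data rate could matter as much as the shader count.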

If you have read any of our AMD GPU stories in the last 18 months, the chances are good this diagram will be very familiar; it represents the "Sumo" architecture found in the Llano core.  The update to the Redwood core includes the newer UVD3 engine, additional power gating support for better energy efficiency, and integration with the new memory interface required for life on an APU.  Also interesting?  This is the first GPU to be built on GlobalFoundries' new 32nm process technology.

Thanks to the Radeon heritage, users that pick up a system based on the AMD A-series of APUs will get many added benefits like this one: much better anisotropic filtering when compared to the CPU/GPU competition from Intel (on the left).  If you don't know the significance of this diagram, just know that the condensed, uniform appearance on the right will result in better image quality in gaming.  I think this is a trend that cannot be easily overlooked - the graphics drivers from AMD have considerably outpaced Intel's in terms of features and performance, and will continue to do so.

There was at least one interesting addition to the control panel suite this time around, a feature called AMD Steady Video.  Using a heterogeneous computing model, AMD's driver will have the ability to stabilize the "bouncy" video usually associated with consumer cameras and unsteady hands.  We were just getting a chance to play with this option before we had to finish up this review, so expect more on this later.

AMD will definitely be pushing the benefits of the Llano A-series APU to users and OEMs both in standalone configurations as well as dual-graphics systems.  

Obviously another big factor for AMD with Llano and the A-series of APUs is DirectX 11 support.  Here you can see a very blunt demonstration of that fact: not only huge performance benefits for the A-series APU over the Intel Core i5 Sandy Bridge system, but also that Intel's processor graphics simply can't run DX11 applications.  Many of you will simply shrug this off, saying that you can play games with at least similar image quality on the DX9 code path, but DX11 offers much more than that, including multi-threading support, tessellation and Shader Model 5.0 support.  

One area where the GPU combatants used to thrive is the world of transcoding applications.  First there were CUDA-accelerated transcoding programs, then ones supporting both NVIDIA and AMD, and now OpenCL development is heading down the same road.  This slide uses software from ArcSoft to demonstrate the performance advantages seen on the AMD Fusion A-series of APUs, where the red bar represents CPU-only transcoding and the green bar shows the advantage of enabling GPU acceleration.  Using the DX11-class GPU on the A-series parts, you get some very noticeable performance boosts as well as much lower CPU utilization during the transcode.

The problem of course with this argument now is that Intel has addressed the transcoding problem on Sandy Bridge with some dedicated logic on the chip, not using the shaders on the GPU.  It turns out this results in a much faster transcoding application than even those accelerated by discrete or integrated GPUs from AMD or NVIDIA.  The GPU-based designs used on AMD's APUs are definitely more flexible and extensible than the hardwired unit on Sandy Bridge, but we need to see some more software fall into our hands that really utilizes the GPU.  Maybe the AMD Fusion Developer Summit will do just that...

June 13, 2011 | 10:32 PM - Posted by C-Dub (not verified)

Thank you for posting this - it's a great read.

A typo - you have "The graphics portion of the Llano A8 processors is going to be marketed as the Radeon HD 6620G. If you also have a Radeon HD 6770M discrete GPU in your notebook the combined product is the Radeon HD 6775G2. The A8 6620G paired with a Radeon HD 6630M GPU will be a Radeon HD 6990G2." I believe that you mean "6690G2" rather than "6990G2". :-)

June 14, 2011 | 02:56 AM - Posted by Ryan Shrout

Thanks, fixed!

June 14, 2011 | 03:18 AM - Posted by bro (not verified)

Youa are poorAMDfag. Dodo destroyed in every test. How can it be Gold award?

June 14, 2011 | 11:30 AM - Posted by Ryan Shrout

In the mobile form factor, CPU performance is less important than overall usability and battery life.

Desktop processors will have a MUCH tougher go at it.

June 14, 2011 | 04:34 PM - Posted by whodi (not verified)

You're a troll calling people fags behind a keyboard, Grow a set and get out of your parent's basement.

June 14, 2011 | 07:38 AM - Posted by Anonymous (not verified)

stupid question: what's up with a $2339 system compared to a $600 system?
to be fair, test a sandy bridge system in the price range and not some overpriced gaming system that is NOT in the price range

you guys should also compare it to a amd based mobile phenom II and mobile athlon II laptop too

$2339 = Maingear eX-EL15

$1739 = MSI GT680R-008US

anyway, good review, very informative

June 14, 2011 | 11:31 AM - Posted by Ryan Shrout

We wanted to give a general overview of how Llano compared to EVERYTHING. We did include the K53/N53 for that purpose as well.

This is the same reason we threw the Core i7-990X CPU in with our $200 CPU tests. People want to see those comparisons.

June 14, 2011 | 12:13 PM - Posted by Matt Smith

From the MSI GT680R onward I've started using the new PCMark 7 and 3DMark 11 benchmarks where possible instead of their older siblings. In the long run I think this is the better idea, but I've only tested the MSI and Maingear since making that switch, so they were the only laptops against which to compare.

I did try to make clear that the MSI and Maingear are MUCH more powerful machines, and the A-Series laptop performed well considering the competition.

June 14, 2011 | 09:54 AM - Posted by Tom (not verified)

Yeah, I didn't understand that either. You're pitting i7's w/ high end discrete cards against a system w/ no discrete cards on a hybrid xfire that's still being fixed up?

I'd rather see comparisons that make sense, like ya know, a stock Sandy Bridge w/ its trash IGP vs Llano in gaming. Anyway, I can't wait to pick one of these up if the performance is there. I hope they can fix the kinks w/ the hybrid xfire so performance will be decent, or if it can allow graphics switching to its IGP and then back to the discrete for gaming.

June 14, 2011 | 09:55 AM - Posted by Tom (not verified)

Sorry for any grammar and mechanical errors in my last post, but I think you get it.

June 14, 2011 | 10:35 AM - Posted by bjv2370

nice review!

June 14, 2011 | 11:09 AM - Posted by Anoyingmouse (not verified)

The inexpensive gpu heavy platform looks like it could suit my mobile needs nicely, however I can't help but fear that its price point will mean initial design wins will feature glossy plastic and glossy 1366-768 screens and other such cheapness. Please Lenovo, Sony, Samsung, or any one else, PROVE ME WRONG...

June 14, 2011 | 04:36 PM - Posted by whodi (not verified)

nice review, I'm looking for a 600 laptop to tide me over for a couple of years.

June 17, 2011 | 11:28 AM - Posted by Ryan Shrout

I imagine these machines will be able to do just that.

June 14, 2011 | 07:08 PM - Posted by Anonymous (not verified)

Well we were promised a Giraffe and got a tapir .. Not a bad swap but it simply wont impress the Emperor.

June 15, 2011 | 12:44 PM - Posted by Anonymous (not verified)

anyone know if this will be able to Xfire with any Radeon? to make a Bigger and better Xfire then we saw today? I mean it was only a 6630. What if it was a 6830M with the 6620M on the A8? Will this work?

June 17, 2011 | 11:29 AM - Posted by Ryan Shrout

Yep, it can essentially do that with "Dual Graphics" technology to speed up gaming. The feature is listed and described on page 2 I think.

June 15, 2011 | 01:14 PM - Posted by Anoyingmouse (not verified)

I believe it will only work with specific (mid range) cards. Xfire works through alternate frame rendering, so if the mismatch is too big, the faster card would be slowed down by having to wait for the slower card.

June 16, 2011 | 10:25 PM - Posted by Anonymous (not verified)

Your review fails to give the average consumer--not geeks--some perspective on what this chip's performance means. I will do it for you: people that can only afford one new computer for around $500 and want the portability and power efficiency of a laptop will now also have the option for entry-level gaming, something that was simply out of reach before. So parents, you can watch HD movies, surf the internet and run MS Office for 4hrs and beyond on one charge while your kids can borrow the laptop to play some--not all--PC games. I think AMD has a winning product for the mid to low end laptop/netbook segment.

June 17, 2011 | 11:30 AM - Posted by Ryan Shrout

I agree with this - as new models based on this technology are released we will definitely be making these points.

June 17, 2011 | 11:29 AM - Posted by vik (not verified)

Most buyers are so uninformed that they only know Intel as a processor brand. They haven't heard of AMD in their life. So computers with AMD will increase in sales, but slowly.
People consult their friends when they need to buy a PC. If a person buying an AMD PC likes its performance very much, he will tell his friends about it, and only this can increase the popularity of AMD.

Also in many homes there are children who likes to play games and they will like PC with AMD.

June 17, 2011 | 11:34 AM - Posted by vik (not verified)

The benchmarks here are not that good. Please compare AMD laptops with equally priced Intel laptops. Intel laptops with an i7 & NVIDIA GTX 480 are very costly compared to AMD laptops. How can AMD match the performance of such costly Intel laptops?

June 20, 2011 | 06:31 AM - Posted by Anonymous (not verified)

ASUS K53(2Core + IGP) is the biggest power eater in the Battery Eater Standard? It's strange. I think it's fitting that ASUS K53's battery life is at least longer than ASUS N53(4Core + GT 540M).

June 22, 2011 | 07:59 PM - Posted by Matt Smith

Strange, but true. That was tested back in the ASUS K53E review. Battery Eater Standard + Intel HD 3000 = A quickly drained battery. I'm seeing a similar situation from another Intel HD 3000 powered laptop I'm currently testing.

July 18, 2011 | 12:14 AM - Posted by Anonymous (not verified)

Do not let the past repeat itself!

Intel engaged in unfair competition by offering very large rebates to worldwide PC manufacturers and oem sellers who agreed to eliminate or limit purchases of microprocessors made by AMD.
In the 90s intel used cash over product performance to keep its lead

You can read about this in Wikipedia AMD v. Intel

October 5, 2011 | 12:11 AM - Posted by John Christmas (not verified)

The lack of battery life kills this laptop. The poor processor performance also means many will turn to other laptops for better experiences. It's a shame because the rest of what this laptop offers is just fine.

October 16, 2011 | 01:39 AM - Posted by Vish (not verified)

Great write-up!
Maybe you guys can help me with this question. Suppose my laptop comes with just the AMD-A8-3510MX and the Radeon 6620G discrete-class graphics, would it be possible to add on, say, a 1GB DDR5 Radeon card to this later and still have it function as a Dual graphics? Any pointers on how I would know if the motherboard will support this.


October 28, 2011 | 02:17 AM - Posted by kathyink

This review of the AMD A-Series Llano APU Sabine Notebook Platform is good. Those who like notebooks would like the review. The photos of the notebooks look eye-catching. The charts giving details of the processor, RAM and hard drive are useful for us.

February 28, 2012 | 08:05 PM - Posted by Anonymous (not verified)

If I buy a notebook, let's say the Samsung Series 3 305V4A-S01 with a 6640G2 (because it's a 6620G with a 6450), can I buy an HD 6770 and put it in my notebook? Would it work?
