Mobile GPU Comparison: GeForce GTX 580M and Radeon HD 6990M

Author: Ryan Shrout
Manufacturer: Various

3DMark11, Quick Battery Life Test and Final Thoughts

Futuremark 3DMark11


Alienware M17x + AMD Radeon HD 6990M


Alienware M17x + NVIDIA GeForce GTX 580M

Since so many people can compare their own systems directly using 3DMark11, I thought I should include it here. The top result is for the Radeon HD 6990M system; since the overall score includes the processor/physics tests as well, you may want to focus on its Graphics Score of 3176. On the M17x with the GeForce GTX 580M, the Graphics Score is 3288, a slight win of about 3.5% for the NVIDIA-based rig. The physics score gave the advantage to the faster processor in the NVIDIA system, with about a 9% gain.
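If you want to run the same comparison against your own 3DMark11 numbers, the percentage deltas quoted above work out as a simple relative difference. A quick illustrative sketch (the scores are the ones from the two M17x configurations above; the function name is just for this example):

```python
def percent_gain(baseline, score):
    """Relative gain of `score` over `baseline`, in percent."""
    return (score - baseline) / baseline * 100

# 3DMark11 Graphics Scores from the two M17x configurations tested above
radeon_6990m_gfx = 3176
geforce_580m_gfx = 3288

gain = percent_gain(radeon_6990m_gfx, geforce_580m_gfx)
print(f"GTX 580M graphics advantage: {gain:.1f}%")  # ~3.5%
```

Swap in your own Graphics Score (rather than the overall score) to keep the CPU/physics tests out of the comparison.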

Power Consumption and Switchable Graphics - It Matters on these GPUs

One thing a mobile GPU review has to take into consideration - something you don't really spend much time focusing on elsewhere, except in extreme cases - is power consumption and, consequently, battery life. Since the two systems are based on the same M17x chassis and general platform, this is a great test to see how the two GPUs really stack up against each other. Both systems include Intel Core i7 processors that integrate HD Graphics courtesy of the Sandy Bridge architecture. What differs between them, though, is how NVIDIA and AMD implement switchable graphics - moving between the integrated Intel GPU and the discrete, high-performance GPU.

We have long known about NVIDIA Optimus, how it works, and the advantages it provides. On the Alienware M17x, the NVIDIA driver that integrates that technology automatically switches between the Intel and NVIDIA graphics solutions depending on the task at hand. If you are running a browser session or simply taking notes during class, for example, the GeForce GTX 580M is completely disabled and the customer doesn't have to worry about which mode the system is in. When you start up your game of Battlefield 3, the discrete GPU is enabled and takes over processing for the best performance.

In fact, there is a button on the keyboard dedicated to switching between graphics modes. When you press it on the GTX 580M configured rig, this is what you see:


That is the best case for an end user - nothing to configure or worry about.

The AMD Radeon HD 6990M does have a switchable graphics solution as well, though you have to manually switch from discrete to integrated graphics. That same function key on the M17x keyboard I mentioned above is one of two ways the user can switch modes. When you initiate it, the screen flashes once or twice over a few seconds and then you are brought back to the desktop. The downside with this method is that you don't know whether you are then using the Intel or AMD graphics - there appears to be no on-screen indicator.

The second method requires right clicking on the desktop and selecting Switchable Graphics to get the following menu option:


The Catalyst Control Center allows the user to switch between the High Performance GPU and the Power-saving GPU and will also tell you which is currently enabled. While this is better and communicates more to the user, it is an extra step that NVIDIA GPU users with Optimus simply don't have to deal with.

The question is, does this matter? Both systems can run on only the Intel integrated graphics for the best battery life, and both can run on the discrete GPU when situations warrant it. NVIDIA's solution is simply more elegant and more user friendly. If a user turns on their M17x with the Radeon HD 6990M (or another AMD GPU solution) and doesn't realize the discrete GPU is enabled, even at the desktop or while browsing the web, the battery life difference is pretty drastic.


This is of course a worst case for AMD's GPU - more than likely a user would recognize that the system was running hotter, the fans were spinning faster and the battery was draining quicker - but how soon they notice is exactly the issue.

AMD is either not interested in creating a software solution like Optimus, or it thinks the feature doesn't matter enough to OEMs to shift the purchasing decision. To be honest, even a simple desktop application or widget that constantly informs the user of which mode they are in would alleviate a lot of our concerns, but until then, NVIDIA's Optimus is easily the best power management solution for mobile gamers.

Final Thoughts

After gaming on both of these systems for a solid 10 days or so, I have come to the conclusion that I really want an M17x around the office for mobile testing purposes.  Yes, testing purposes... 

But seriously, the M17x is a hell of a gaming machine. But since our focus today is the GeForce GTX 580M and the Radeon HD 6990M, we'll stick to that discussion. Both mobile GPUs give a gamer enough horsepower to push the latest gaming titles at the native 1920x1080 resolution of the 17-in display, including Battlefield 3. Picking a performance leader is a tough call - the Radeon HD 6990M wins a few and the GTX 580M wins a few - and neither really lives up to the claim of "world's fastest" in any kind of definitive fashion.


I do think the GTX 580M provides the best overall user experience, with support for Optimus switchable graphics as well as the potential future-looking graphics performance seen in our Ultra quality settings test of Battlefield 3. The NVIDIA Verde driver program is also a benefit over what AMD has traditionally done in the mobile market. The NVIDIA driver team has been releasing mobile-compatible driver updates on a much more regular basis than AMD has, and in most cases is running version-for-version with the desktop parts. A quick check as of this writing shows the 285.64 driver released for desktop GPUs is also available for notebooks - if you are a serious mobile gamer who cares about day-of releases (like getting BF3 running at its best as soon as it's out), that is a serious consideration.

But issues arise when we look at the pricing:


NVIDIA is asking $300 more for the GTX 580M than AMD is asking for the HD 6990M. That is a significant difference in a gaming notebook that you can get for as little as $1799. NVIDIA claims that features like Optimus, the improved driver support, 3D Vision (optional) and PhysX (returning to the scene with Batman: Arkham City) justify that kind of price difference. I am not sure most of our readers would agree, but if you are a high-end mobile gamer it is possible that those kinds of added benefits really do matter to you.

START MAJOR UPDATE (11/7/11):  It looks like Alienware and NVIDIA have listened to our feedback and decided to drop the price on the GeForce GTX 580M on the M17x, M18x and others - and by quite a bit!  As of this writing you can go to the website and now upgrade from the HD 6990M to the GTX 580M for only $75 - that is a $225 price drop compared to last week.  
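To put the update's arithmetic in one place, here is a quick sanity check using the upgrade prices quoted above (an illustrative sketch only - the variable names are ours, not Alienware's configurator fields):

```python
old_upgrade_price = 300  # original GTX 580M premium over the HD 6990M
new_upgrade_price = 75   # post-update GTX 580M upgrade price on the M17x

price_drop = old_upgrade_price - new_upgrade_price
print(f"Upgrade price drop: ${price_drop}")  # $225
```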


What does this do for our opinions and thoughts on the battle between the HD 6990M and the GTX 580M?  I think it makes the added benefits of the NVIDIA ecosystem (Optimus, 3D, Verde driver updates, PhysX) much more attainable and in my book well worth the additional cost.  With this price change, Alienware has really shifted my view on the mobile GPU of choice.

I am going to update my award from the Gold to the Editor's Choice for NVIDIA's GTX 580M for this specific reason.  Happy gaming!


When it really comes down to it, I feel that most gamers will pick the Radeon HD 6990M, which offers similar performance levels at a much lower price (at least on the M17x). For all we know, AMD might be losing money on this part and undermining the market as a whole, but most consumers don't really care about that kind of stuff. To them it is about performance and value. For those gamers who are NVIDIA fans or really do put a lot of weight on Optimus technology, day-of-release driver updates and the specific hardcore features mentioned above, the GTX 580M will provide the best overall experience - just be prepared to pay for it.


Gold Award - AMD Radeon HD 6990M - Gaming Performance Value


Editor's Choice Award - NVIDIA GeForce GTX 580M - Added/Hardcore Features

November 2, 2011 | 10:58 PM - Posted by Flayed Banana (not verified)

If this test is about the Nvidia GTX 580M and the Radeon 6990M then why do the screenshots from Lost Planet say Nvidia GTX 460? Wrong pictures or problems with the game?

November 3, 2011 | 01:10 PM - Posted by Ryan Shrout

No, those are just older screenshots that we used to demonstrate the game we were using. We don't take new ones each time we test the game.


November 3, 2011 | 08:17 AM - Posted by Anonymous (not verified)

Its unfair the testing specs for the AMD Radeon HD 6990M setup, 500Mhz makes a difference. Test with even specs or forget it.

November 3, 2011 | 01:10 PM - Posted by Ryan Shrout

Not really true, though I did comment in our first page that the difference may skew things just slightly. The fact is with frame rates typically 40 or less, the CPU bottleneck is minimal on all of these titles.

December 3, 2011 | 02:12 PM - Posted by Anonymous (not verified)

wrong, Deus Ex, for example, is CPU bound and that's why the GTX580M is able to keep up with the 6990M. A CPU bound game will see big performance gains from a higher CPU clock.

November 3, 2011 | 11:43 AM - Posted by Alterac (not verified)

What Driver versions were you using for the testing?

November 3, 2011 | 01:12 PM - Posted by Ryan Shrout

Ah, sorry about that! We were using the new 285.64 driver on the GTX 580M and the 11.10 drivers that shipped with the M17x from AMD on the HD 6990M.

November 3, 2011 | 01:34 PM - Posted by Anonymous (not verified)

how could you use nvidia drivers when they dont work on alienware.. and dell only have 269.03. even doing an nvidia update refers you back to dell for drivers.if you manage to get the nvidia drivers to work, tell me how.

November 3, 2011 | 03:52 PM - Posted by Anonymous (not verified)

The Verde drivers on install on the Alienware system. Download and install, not hard buddy!

November 3, 2011 | 05:39 PM - Posted by Ryan Shrout

^ this.

November 3, 2011 | 04:47 PM - Posted by Anonymous (not verified)

Um, crappy equivalent card comparison using a 560 Ti. A 460 GTX would be the best case, but the GTX580M is really closer to an 8800GT. And as for the 6990M, compare it to a 6850 at best.

Just going by Tex fill rate and Mem Bandwidth. I'm currently running an old 275GTX with better specs than the 580M.

Now, get out of the graphic card companies pockets and start telling it like it really is. They want us to believe we are paying $400+ for something better than a $50 graphics card.

Otherwise, nice article.

November 3, 2011 | 05:40 PM - Posted by Ryan Shrout

Umm...R U SERIOUS? An 8800 GT? You so crraazzzyy!

November 4, 2011 | 02:42 PM - Posted by Eric (not verified)

I just got done listening to the podcast and this portion was of interest to me as I have had my M17X with a 6990M for about a month or so now. I have a primary system and pretty much use this laptop for my "sitting in the living room with my wife but still gaming" system =D.

The 6990M makes gaming on a laptop an actual possibility without having to always find the lowest possible setting to play games with. While I can't play BF3 on ultra settings in multiplayer, I can play on High and still be 50 FPS+ and have an enjoyable experience. I personally don't see the value in a 300+ nvidia tax for the same performance. Good Article and I'm sure there are others out there who have been on the fence about this choice that might have a clearer decision to make.

November 6, 2011 | 03:36 AM - Posted by w chris (not verified)

Just an FYI just because you have the 580m or 6990m does not mean you have optimus or amd's equivelent. Example clevo resellers such as sager and others do not have optimus with these cards, and possibly other high end cards.

November 9, 2011 | 04:36 PM - Posted by sotoa

I just heard the podcast this morning too and a couple of reasons why I ordered the 580m on my Sager NP8170(P170HM) yesterday.

1) 3D only possible with Nvidia (or at least on these current ones)
2) Drivers are usually better from Nvidia (although AMD is getting real close lately)
3) PhysX
4) Optimus > AMD's Switchable Graphics solution (although I can't do Optimus on this 120Hz screen)
5) Bought it because I didn't hear your podcast in time >:(

Is it really, really worth the premium? Maybe not, but I have price guarantee till end of this year and I'm hoping Nvidia will do the right thing.

November 9, 2011 | 05:53 PM - Posted by w chris (not verified)

No optimus or amd switchable graphics on any sager/clevos from my understanding.

November 9, 2011 | 07:30 PM - Posted by sotoa

You might want to check the ram on the 580m. It should be 2GB, not 1.5GB.

November 10, 2011 | 12:44 PM - Posted by Anonymous (not verified)

Battlefield 3 is a processor friendly game.

The GTX 580m machine had a more powerful processor than the 6990m machine.

Perhaps that could have been a deciding factor on the BF3 outcome. I don't think the 6990m would be that much worse on Ultra than the 580m.

November 13, 2011 | 05:16 PM - Posted by Anonymous (not verified)

What would you guys prefer? I have been searching for a while now, and trying to find out whats the best graphic card to buy.

Nvidia GTX 580M
Intel Core I7 2760qm


ATI HD 6990M
Intel Core I7 2860qm

It's one of those two that i'm planning to buy, but which one is best at running BF3 on ultra..? The price for the two setups is exactly the same..

November 14, 2011 | 09:42 PM - Posted by Anonymous (not verified)

I just bought one of these over in Australia

Our models are slightly different to the US spec, but I have to say..... what a beast !!

It runs Battlefield 3 with no issues - HDMI out looks great on a plasma tv. Massive battery life.. Good on ya dell !

I also got the 6990, 256 SSD - I will get an external SSD later if the internal gets full and plug in through ESATA port. Alienware backpack.

November 16, 2011 | 12:14 PM - Posted by Anonymous (not verified)

So you would prefer the HD 6990m?

And is is on ultra your 6990M is running without issues

What cpu do you have ? :)

November 22, 2011 | 12:47 AM - Posted by Anonymous (not verified)

Yep - For the price

I think the settings are all on high and it runs great

I have the 2.4Ghz CPU

I also play it using my HTC smart phone as a WIFI hotspot, then connect the 17x to it wirelessly , it runs well- even on the 32 vs 32 maps.

sometimes laggy but sometimes not - read sniper shots accross the map.!

November 22, 2011 | 08:01 PM - Posted by Anonymous (not verified)

I take that back, I have the graphics set on Custom, all ULTRA and on, except for PRE alaising which is off.

November 23, 2011 | 02:28 PM - Posted by sotoa

The 580m's memory is still incorrect on the chart.


January 16, 2012 | 03:38 PM - Posted by Anonymous (not verified)

you're lying check the dell site yourself at least in Canada its still a $300 difference, and we all know that with identical systems, the 6990m would beat the 580m in most catigories, for a far cheaper price, and the added features of the 580m are really not that desirable especially for $300+

February 1, 2012 | 12:50 PM - Posted by Goldberry (not verified)

Hey guys i only play 1 game and that is World of Warcraft Cataclysm: Pls tell which GC would be suitable for online gaming the GTX 580M or the HD 6990M as im planning to run WOW in Ultra Settings Tyvm in advance.

February 4, 2012 | 12:20 PM - Posted by Anonymous (not verified)

the GeForce GTX 580m is now $300 more then the 6990 in the site

March 9, 2012 | 08:37 PM - Posted by Jwest (not verified)

Hmmm that's strange my Alienware m17x with a 580m runs Battlefield 3 on a average of 40 to 50...

