AMD Radeon HD 7970 3GB Graphics Card Review - Tahiti at 28nm

Author: Ryan Shrout
Manufacturer: AMD

Power Management, Eyefinity and Video Codec Engine

PowerTune technology was introduced by AMD with the 6000-series cards to address the issue of "problem applications" that push a graphics card's utilization and thermals far beyond anything real-world software produces.  Programs like Furmark and OCCT SC8, built essentially as stress tests for graphics cards and GPUs, had a bad habit of frying components when pushed to their limits.  AMD and NVIDIA both saw this as a negative for the industry - it forced them to be more conservative with GPU clock speeds in order to reserve thermal headroom for those rogue programs.  

PowerTune is a solution that monitors and calculates the power signature of running software and keeps it from exceeding the maximum threshold set by the GPU vendor.
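As a rough mental model only (this is not AMD's actual firmware algorithm - the cap, clock values, and step size below are all illustrative assumptions), a PowerTune-style controller can be sketched as a simple feedback loop:

```python
# Hypothetical sketch of a PowerTune-style power-cap loop. The numbers
# (250 W cap, 925 MHz base clock, 25 MHz steps) are placeholders, not
# AMD's real parameters, and the real controller runs in hardware.

def powertune_step(estimated_power_w, clock_mhz,
                   tdp_cap_w=250.0, base_clock_mhz=925.0,
                   min_clock_mhz=500.0, step_mhz=25.0):
    """Return the next engine clock given an inferred power draw.

    The controller throttles when the power estimate exceeds the cap
    and restores clocks toward the default when headroom returns.
    """
    if estimated_power_w > tdp_cap_w:
        # Over budget: step the clock down, but never below the floor.
        return max(min_clock_mhz, clock_mhz - step_mhz)
    # Under budget: step back up toward the advertised base clock.
    return min(base_clock_mhz, clock_mhz + step_mhz)
```

The key point the sketch captures is that the cap is enforced continuously in hardware, so a stress test like Furmark simply settles at a lower clock instead of cooking the board.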

In a sense, this allowed both GPU companies to set higher default clock speeds for games on current-generation GPUs without worrying about a user running Furmark, frying their card, and asking for a refund.  AMD's PowerTune is a hardware-based solution that keeps things in check, adjusting clock speeds as necessary to stay within bounds.  

The initial worry from users and the media was that this would mean variable performance from card to card - but that fear never came to pass. 

A new technology AMD is introducing with the Radeon HD 7970 is ZeroCore Power, which puts the GPU in a nearly powered-off state when your display goes to sleep.

During a long idle state, defined as the point at which the Windows power settings put your screen to sleep, the GPU enters a new power state that borrows technology from AMD's mobility line of graphics chips.  In this mode the GPU itself consumes less than a watt while the entire HD 7970 card draws under 3 watts!  The fan turns off completely (it doesn't just slow down), leaving a silent PC that is still powered on and ready to wake the moment you move the mouse.  

Maybe even more interesting, this technology allows the non-primary GPUs in a CrossFireX configuration to power down completely whenever you are at the Windows desktop.  If you are running a set of three HD 7970s, then during basic web browsing the second and third GPUs will be running at a sub-watt level and won't be contributing to your system's noise levels either.  This will even apply to multi-GPU cards when they are released later in the year.

This is the kind of innovation we want to see to push the further adoption of multi-GPU gaming rigs!

Eyefinity Improvements

AMD's Eyefinity was the first to bring multi-display gaming to the masses, and it continues to be the leading detriment to gamers' wallets everywhere.  With the Southern Islands architecture the team at AMD is bringing a few new changes to the ecosystem.

Even though we saw it released with the 11.12 and 12.1 drivers this month, the added ability to run Eyefinity and AMD HD3D gaming at the same time is a welcome addition to the Radeon HD 7970.  Yes, NVIDIA has had this option for a while with 3D Vision Surround, but only the AMD offerings can do it from a single graphics card - NVIDIA still requires an SLI setup.  

And, as we'll see later in the benchmark portion of our review, with the performance the HD 7970 provides, running Eyefinity and even HD3D is a viable option for those of you with the funds.

Another really awesome update is the inclusion of Discrete Digital Multi-point Audio.  DDM Audio allows for a lot of cool functionality, including the ability to output an independent audio stream to each display connected via HDMI or DisplayPort.  As indicated in the above slide, this allows for conferencing where the audio actually comes from the display showing the person speaking.  Of course, the complication here is that video conferencing software like ooVoo will need to specifically target this functionality.  
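To illustrate the idea (every name here is hypothetical - this is not an AMD or operating-system API), an application targeting DDM Audio would essentially route each stream to the audio sink belonging to the display its window occupies:

```python
# Hypothetical sketch of per-display audio routing in the DDM Audio
# style. Display names and sink naming are invented for illustration.

class Display:
    def __init__(self, name):
        self.name = name
        # Each HDMI/DP display exposes its own audio endpoint.
        self.audio_sink = f"{name}-hdmi-audio"

def sink_for_window(window_display, displays):
    """Pick the sink of the display that is showing the window."""
    for d in displays:
        if d.name == window_display:
            return d.audio_sink
    raise LookupError(f"no display named {window_display}")

displays = [Display("DP-1"), Display("DP-2"), Display("HDMI-1")]
print(sink_for_window("DP-2", displays))  # DP-2-hdmi-audio
```

The hardware makes the per-display sinks available; the remaining work, as noted above, is for applications to actually target them.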

Side note - AMD and ooVoo tried to demonstrate this at the technology day in Austin, TX and while it was functional, it obviously needed some more tweaking.  Be sure you try a demo of any software out before shelling out the money for it. 

My absolute favorite capability of DDMA is that you can now have a central computer act as your entertainment hub while getting different audio streams to different parts of your house.  If you have ever wanted to share a computer with another user (one watching Netflix while you surf the web) then you already know how important this could be. 

With the Radeon HD 7970 it is now possible to play a game and watch a video (you and another person, or just you), each with its own audio, at the same time!  Who says ADHD is an outdated trait?

Another much-requested feature is the ability to set custom resolutions in the control panel, along with the option to move the taskbar and default desktop location to the center monitor.  AMD will be addressing both of these Eyefinity quirks with the 12.2 driver release due in February!

The oft-discussed, never-released MST Hub, which splits a single DisplayPort 1.2 connection into three DP ports, will finally be available in the summer of 2012 according to AMD.  These devices won't be cheap though - expect to pay ~$150 each for the ability to run six monitors off of a standard 6000 or 7000-series graphics card. 

Finally, AMD's HD 7970 will be the first graphics card to support the new 3 GHz HDMI standard, which allows for higher resolutions across the connection that has become the de facto video standard.  3 GHz HDMI enables true 60 Hz per eye in a 1080p 3D configuration as well as driving a 4K display off of a single input (as opposed to the two such displays require today).  While this is a great feature, no consumer display that I know of supports the 3 GHz standard yet, but at least you will have the option with a Radeon HD 7000-series card in the future.  
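A quick back-of-the-envelope calculation shows where the "3 GHz" figure comes from, assuming standard 1080p60 CEA-861 timing (2200x1125 total including blanking) and HDMI frame packing, which stacks both eyes into one double-height frame:

```python
# Why 1080p at 60 Hz per eye needs a ~3 GHz TMDS link.
h_total, v_total = 2200, 1125      # 1920x1080 active plus blanking
refresh_hz = 60

# Frame packing doubles the vertical total (one frame per eye).
pixel_clock = h_total * (v_total * 2) * refresh_hz
tmds_rate = pixel_clock * 10       # TMDS sends 10 bits per pixel clock

print(pixel_clock / 1e6)   # 297.0  -> 297 MHz pixel clock
print(tmds_rate / 1e9)     # 2.97   -> ~3 Gbps per channel, i.e. "3 GHz"
```

Older HDMI silicon topped out at half that rate (a 165 MHz pixel clock per link), which is why 60 Hz per eye and single-cable 4K were out of reach before.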

Video Codec Engine

Another new addition to the Southern Islands GPU is the AMD Video Codec Engine, which changes the way video encoding is done on the GPU.  At its core the VCE offers a fixed-function H.264 encoder that is both power efficient and faster than real time when working on 1080p 60 Hz content.  This technology can be seen as an answer to Intel's QuickSync technology.

There are two modes this technology can run in: full and hybrid.  "Full mode" runs entirely on the fixed-function hardware and is the most power-efficient HD encoding process on the AMD platform.  

In "hybrid mode" the VCE takes advantage of the compute power of the GPU, improving performance through the scalable pipeline that Southern Islands provides. 
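The two modes suggest a simple selection policy. The sketch below is purely illustrative: the mode names come from AMD's slides, but the decision logic, threshold, and function names are our assumptions, not anything exposed by AMD's drivers:

```python
# Hypothetical policy for picking a VCE mode, per the trade-off above:
# full mode minimizes power, hybrid mode spends spare shader compute
# on extra encode throughput. The 0.25 idle threshold is invented.

def pick_vce_mode(prefer_low_power, gpu_idle_fraction):
    """Return 'full' or 'hybrid' for an encode job."""
    if prefer_low_power or gpu_idle_fraction < 0.25:
        return "full"    # fixed-function only; leave the 3D engine alone
    return "hybrid"      # GPU is mostly idle; use shaders to go faster

print(pick_vce_mode(prefer_low_power=True, gpu_idle_fraction=0.9))   # full
print(pick_vce_mode(prefer_low_power=False, gpu_idle_fraction=0.9))  # hybrid
```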

We didn't have any applications that implement this technology available quite yet, but we are curious to see how the image quality of each implementation compares and how it stacks up against the QuickSync technology offered on Sandy Bridge. 

December 22, 2011 | 12:15 AM - Posted by wargames12

If it's faster than the 580, it kind of makes sense that it costs a bit more. It's pretty disappointing to see the prices of current gen cards stay so high for so long though. Hopefully when we see Nvidia's new card we'll see some price drops all around on the current generation.

December 22, 2011 | 12:40 AM - Posted by Mr_Tea (not verified)

My thoughts exactly. I was hoping to see AMD drive the performance/dollar up with this release. At this rate single GPUs will be launching at $1000 in 2 years :(

December 22, 2011 | 12:47 AM - Posted by wargames12

I want to blame the rich guys who will actually pay the extra 2-300 dollars for the extra 10-20 percent performance, but I can't. I would do the same if I had the extra cash haha.

December 26, 2011 | 10:49 AM - Posted by Anonymous (not verified)

That is pretty funny! You obviously don't remember how much the 8800GTX and Ultra cost when they were launched. I paid $950.00 for my first 8800GTX. With the price drop just go Crossfire it keeps getting better and better, as well as cheaper.

January 3, 2014 | 11:21 AM - Posted by Anonymous (not verified)

Ha, turns out you were right.

December 22, 2011 | 12:57 AM - Posted by Ryan Shrout

I'm glad that point came across well in my review. I love the performance out of this, but I guess I just expected/wanted AMD to undercut NVIDIA to put pressure on them and start the price wars again.

There is still a chance that NVIDIA cuts the GTX 580 down to $425 or something - they have a lot of room with the GTX 570 priced at $340. If they do that, then AMD will have to drop the 7970 price.

December 22, 2011 | 12:51 AM - Posted by Buyers

Good solid review. I think i was hoping for a little more of a performance increase over the 580. I look forward to seeing eyefinity benchmarks with this card with crossfire setup.

Couple of edits:
Page 3, talking about DDMA:Multi-tasking under image of triple screen gaming with soccer on background tv:"With the Radeon HD 7970 it is not possible to both play a game and..." should that be now, given the way the rest of the sentence and paragraph read?

Metro 2033 @ 1920x1080 line graph X-axis labels are the default Series1/2/3/4 instead of the gpu name labels.

December 22, 2011 | 12:56 AM - Posted by Tim Verry

Yes, it is now possible.

December 22, 2011 | 12:58 AM - Posted by Ryan Shrout

Yeah, thanks for catching that typo - kind of a big difference. :)

And also, yes, we are looking forward to doing both Eyefinity and CrossFire testing very soon!

Let me know if there are particular titles you want to see tested!

December 23, 2011 | 06:01 PM - Posted by Nacelle (not verified)

I'm sure it goes without saying BF3 in Eyefinity is what everyone wants. Not much else brings two 6970's to a crawl.


December 22, 2011 | 12:57 AM - Posted by Slash3 (not verified)

Would it be possible to disable vsync on Skyrim and re-run your tests? There are several methods which successfully disable it, including editing the .cfg file (iPresentInterval=0) or using a utility like Radpro to force vsync disable on the process. As it stands, that particular game's set of benchmarks is totally useless. Nice card, though.

December 22, 2011 | 01:00 AM - Posted by Ryan Shrout

Based on my research and testing, I wasn't able to find a way to disable Vsync with AMD cards without modding the game, which seems less than ideal.

If you have a direct link to a solution though, I'll gladly try it!

December 22, 2011 | 01:10 AM - Posted by Kennneth (not verified)

Who are you going to believe?

December 22, 2011 | 01:13 AM - Posted by Mr_Tea (not verified)

I saw a slide that showed this card doing separate audio out to each display and intelligently switching if a video was moved to another display. Any truth to that? Testing? That would be pretty awesome. Thanks.

December 22, 2011 | 01:19 AM - Posted by Mr_Tea (not verified)

Whoops, must have skipped that page.

December 22, 2011 | 01:25 AM - Posted by RashyNuke (not verified)

Ryan what is with porn music...Oh found my 7970 wetspot.

December 22, 2011 | 01:31 AM - Posted by jstnomega (not verified)

given the current state of the art re Vid games, isn't it all still a matter of pushing pixels? if that's the case, then clearly something aint right here - look at the 40nm vs 28nm Pixel Fillrate figures in Ryan's video - barely any gain at all

December 22, 2011 | 01:46 AM - Posted by Ryan Shrout

Pixel fill rate is not really the defining factor right now. It's not how many pixels you can push, it's what you calculate on those pixels in real time. Shading power! Oh, and geometry is picking up again in importance.

December 22, 2011 | 02:25 AM - Posted by Anonymous (not verified)

Great review, its nice to see someone actually moving the technology along, and innovating. Seamlessly it seems, this time.

December 22, 2011 | 10:26 AM - Posted by Ryan Shrout

Agreed. More than a year with the HD 6970 and GTX 580 is enough for me.

December 22, 2011 | 03:05 AM - Posted by bjv2370

good review

December 22, 2011 | 04:52 AM - Posted by Irishgamer01

This card is way over priced.
I for one will be sticking to my current setups. For now.
The sweet spot for this level of card is 399.
While performance is better its not enough for me.
I want to see Nvidia's offering. If they hold their current pricing structure, match or better performance, then AMD will be punished big time.

Will give me a certain amount of pleasure, as I hate price milking, just because they can.

December 22, 2011 | 10:27 AM - Posted by Ryan Shrout

Yes, what NVIDIA has to say about GPUs in the next few months will be very important here. Curious to see if Kepler will hold up to its promises.

December 22, 2011 | 06:36 AM - Posted by Metwinge (not verified)

I shall be replacing my 2 5870's with one of these as im very impressed by these benchmark scores, especially in BF3 as thats the game im playing atm. I have 2 1080p monitors sitting under our tree so i can see these cards taking a bashing with the few high end games i play

Thanks for the excellent review Ryan

December 22, 2011 | 10:27 AM - Posted by Ryan Shrout


And have fun with Eyefinity!

December 22, 2011 | 08:37 AM - Posted by Kevin (not verified)

Great review, but I cant see spending upwards of $600 for it =/

December 22, 2011 | 08:44 AM - Posted by Apostrophe (not verified)

It is a lovely video card - shame the price is a bit high. I do hope Nvidia responds with something equally impressive. 2012 is going to be interesting.

By the way, have you guys considered adding Star Wars: The Old Republic to your battery of tests? It's the biggest MMO to launch in years and I would expect that a large portion of your readers will be interested.

December 22, 2011 | 10:28 AM - Posted by Ryan Shrout

We did consider it, but didn't have time to validate it before this article. We still might, depending on how GPU-bound the game is, if at all.

December 22, 2011 | 08:56 AM - Posted by nabokovfan87

Extremely glad the GCN architecture Isn't as terrible as bulldozer ended up. I went to the skyrim page first and was waiting for a huge dissapointment, but yeah.

I am intrigued to see how my box will handle the new card. I have an MSI 4850 512mb and am upgrading to the next card made by the 48xx series team. Sort of amazing to think of the differences in performance is going to be 5-10x better when comparing raw specifications alone.

If anyone is interested I will be doing some benchmarks and testing things out, if PCPER wants to use those for a writeup or discuss it, I would be more then happy to provide it and listen to your thoughts.


ALSO: There are a lot of people waiting to upgrade, waiting for the "new stuff" from either to decide on what to do, that is why the pricing is so high right away. Like I said above and I am sure many others will be grabbing the 4950, it is more about upgrading then it is price. These cards last 3-4 years, and the price initially is worth it for the upgrade possibilities and expanded feature set then what is currently available in terms of power usage, temperature, performance, and so on.

December 22, 2011 | 10:29 AM - Posted by Ryan Shrout

Sure man, we'd love to have you write a guest story about the upgrade experience. I am sure quite a few readers will be in the same boat.

December 22, 2011 | 08:56 AM - Posted by Richard (not verified)

I personally think that graphics cards are way overpriced. I will keep my 570 for a good while I suppose. I have a hard time spending more than 350$ especially when you see how games for PC are having problems to run well. Major issues with Battlefield 3, even in the menus, like it is corrupted, image and sound. The beta ran fine which is weird. Will try one more install and if not fixed will have to throw it away, 60$ badly invested. Other games run just fine. Wrote to the company, no reply. By the way, game prices are also way too high in my opinion. Saw Rage for 20$ free delivery at, same game at = 60$ +++ 10$ delivery. That is not right. Waiting for a newegg response...they will not sell to a Canadian on their US site of course.

December 22, 2011 | 10:30 AM - Posted by Ryan Shrout

Not sure about your BF3 issues, sorry to hear.

But yes, I tend to agree - if you have a GTX 570/580 or a Radeon HD 6900 card the difference in performance doesn't warrant the price right now.

December 23, 2011 | 08:36 AM - Posted by Richard (not verified)

Good news, the new drivers seem to have fixed Battlefield.
I gave it a try last night and I got stucked on it for a few hours. Good game by the way, it looks really good on my system. I get a few weird horizontal lines here and there but only in the menu section, strange. Once in the game it runs fine. I play it at 1080p and it is smooth. Happy!!!!!
Thanks for your excellent reviews by the way.

December 23, 2011 | 09:46 AM - Posted by Ryan Shrout

Thanks for reading them Richard!

December 22, 2011 | 09:33 AM - Posted by Shadow (not verified)

This card is not worth the $600 price tag, clearly overpriced in my opinion. And here I thought AMD was the best price to performace PC hardware company, but I guess if you want the first 28nm graphic card you have to pay up. I hope Nvidia comes out with their 600 series soon so the price wars can start on these new graphic cards, but I will be holding onto my EVGA Classified 590 for awhile though. I can play BF3, WOW, and Diablo 3 beta all at high/ultra -1920/1080 settings with no lag or hicups perfectly fine.

December 22, 2011 | 10:31 AM - Posted by Ryan Shrout

True, this is not a replacement for a GTX 590 at all.

December 22, 2011 | 11:57 AM - Posted by Imperfectlink

Looks like a solid showing from AMD. As long as the drivers match the drapes it should do pretty well.

We have to remember that TSMC is having a rough time meeting demand at the 28nm fabrication process so price will reflect that. Signs on the intarwebs point to Kepler lagging behind so why not take advantage of the lead? Bleeding edge has a price and right now it's $550. No doubt AMD will cut the price when yields are good, competition arrives and they can afford to.

December 22, 2011 | 12:51 PM - Posted by Swoosh (not verified)

@ Shadow

Not so fast, this is just a review and AMD still has full authority for the last-minute say as far as retail pricing is concerned. For all we know, and for what AMD is known for, the way they price their video card products always makes it hard for nVidia to cope and compete - not just on video card performance alone but on price-to-performance ratio, which nVidia has been losing on for years, ever since the release of AMD's 5900 series video cards. They may never regain major market share unless nVidia drops their high-end video card prices below AMD's pricing on counterpart models, WHICH will never happen.

Im not saying nVidia video cards are not good, but what AMD did and is currently doing (as we all know) seems like a one-sided race on price-to-performance ratio, and chances are even if nVidia were to release their upcoming "not sure when" 600 series video cards the same day as AMD's 7900 series launch, their prices would be more expensive and wouldnt favor the taste of many gamers now. Another thing that makes nVidia suffer from low sales is the issue of hypocrisy, bad publicity, and a lot of bad karma from their unfair doings.

AMD for sure will try to make their prices reasonable. Talking about latest technology and high-performance offerings, particularly the 7970 model, it is like the release of their 6900 series video cards, where gamers have had no qualms or issues since day one.

- Very bad for nvidia and its biased bloggers and reviewers.

December 22, 2011 | 01:42 PM - Posted by Anonymous (not verified)

How can I tell if my motherboard supports PCI Express 3.0? It's about 1.5 years old.


December 22, 2011 | 01:59 PM - Posted by Ryan Shrout

The card doesn't require a PCIe 3.0 motherboard - PCIe 3.0 cards are still backward compatible with 1.0 and 2.0 slots.

December 22, 2011 | 02:43 PM - Posted by Mechromancer (not verified)

We will finally have an AMD GPU worthy for FOLDING@HOME!!!!!!!!!!!!!!!!!!!!!!!!!

I've waited since the 2900XT for this day. Maybe Stanford will get around to making a good client for it in less than a year this time >:P

December 22, 2011 | 10:23 PM - Posted by nabokovfan87

Heh, and I thought waiting with the 4850 was a long time. Patience is a virtue, well done!

December 23, 2011 | 09:46 AM - Posted by Ryan Shrout

Wow, yah, you are going to see a HUGE performance difference here.

December 22, 2011 | 02:49 PM - Posted by Luzer (not verified)

I have two 5870's in Crossfire connected to three Dell 24"(something like 6080x1200 with bezel compensation on). If I sell both of my cards for ~$300 is it worth the $250 to get a 7970 or should I pick up another 5870 ($150) for triple crossfire or even two 6970's ($300 each)? I can’t really play BF3 with crossfire enabled and using all three displays, av. FPS is around 20 on high, but I can play on a single display, 1900x1200, on high, av. FPS 55, with crossfire.

In the end I probably wait for the 7970 to get down to $375 and put my 5870's in an old pc and use it as a space heater.

December 22, 2011 | 02:57 PM - Posted by JohnE (not verified)

I want to purchase the 6970 MSI Lightning today. I'll be happy with this card as it overclocks to 1ghz stable. When do you think the "older" cards will drop in price? I'm afraid i'll have to wait a month, I WANT MY TOYS NOW! (jk)

December 22, 2011 | 05:36 PM - Posted by Wolvenmoon (not verified)

Will hold on to my 9800GTX+ until prices come down or performance across the board doubles again.

December 23, 2011 | 09:47 AM - Posted by Ryan Shrout


December 24, 2011 | 07:04 AM - Posted by rrr (not verified)

If you have a small monitor, by all means do it.

February 4, 2012 | 07:19 AM - Posted by SiliconDoc (not verified)

I'm holding on to the 9500GT because it rocks out in BF3 at 12-15 fps and that's on a lower end pentium D with 256x4 of 533ddr2 and a 40g ide drive.
Whoo hoo, we iz gamin' !
It doesn't crash, at all, either, unlike the red cards.

December 22, 2011 | 09:04 PM - Posted by Anonymous (not verified)

omg did you test those benchmarks with 0% power control at the CATALYST control center ?. You shoulda test it with +20 power setting, and then try to OC it... you might even get a higher oc since other reviewers get more then 1100mhz out of the core.

December 23, 2011 | 09:47 AM - Posted by Ryan Shrout

We will give that a shot after the holiday.

December 22, 2011 | 10:22 PM - Posted by nabokovfan87

I wonder how the 3 gb Frame buffer affects 3D gaming w/ eyefinity.

December 23, 2011 | 09:48 AM - Posted by Ryan Shrout

It might not help THAT much for 3D, as the amount of memory required for 3D isn't much more than standard - it mostly takes extra processing power. The reasoning: the textures and data needed for the left eye are usually a pretty close match to what is needed for the right eye.

December 26, 2011 | 08:56 PM - Posted by nabokovfan87

But doesn't doing 3d require twice as much processing then traditional graphics? For instance, the metro 2033 normal AVG FPS is double what is in 3d. I wonder if the frame buffer would allow more things to be in VRAM or something would happen to give a boost, slight if any. It doesn't seem likely from what you are saying, but it would be something interesting to see.

Especially considering how the 69xx series has had 2 and 1 gb variants and the slight hits in perf. on those cards.

December 23, 2011 | 11:12 AM - Posted by soldierguy

Hey Ryan

Thanks for the review..luv your shows. I've got a pair of 5870 2Gb's in Crossfire powering 5760x1080 and an i7930 oc to 3.8ghz. I'm thinking of a pair of these for my next build or maybe just buying a pair of these and clunking them in the existing unit. Of course money and value are an issue. And would I really see any multi player benefit from just upgrading the gpu's? I guess you'll be doing a comparison against my setup or similar and that would help for sure. But in the real world would there be any "real" improvement in multi player for me. I mostly need the performance for the hi twitch games--for me COD. I play other games too but don't need more computing power for them. Goodluck with the new fiber optics....I'm green with envy..that's where I have problems and limitations.

December 23, 2011 | 01:41 PM - Posted by Mark (not verified)

What I'm getting out of the 7970 is that they realized Nvidia had a superior architecture and mimicked it. Now how this will affect their drivers will be interesting. If anything this equalizes the 2 companies.

They're charging a high premium for cards because they can. Prices will go down SIGNIFICANTLY when Kepler rears its turtle head......TURTLE!!!

Pretty much all this says is that AMD was early this round but Nvidia again will take the performance crown because if they don't surpase a product released 6-9 months prior to theirs then it will not bode well for the green machine.

God I sound like a Nvidia shill....

December 23, 2011 | 02:23 PM - Posted by Davo (not verified)

The prices will go down and quite quickly. AMDs top end cards are cheaper than Nvidia's at present and this series will follow suit.

These cards have a premium at the moment (and we are only speculating at present since they arent yet on sale) but competition amongst AMD resellers will see it go down anyway.

I cant blame them for making hay while the sun shiones at the moment. This card is the top end and represents a small percentage of the overall market and they will sell like hotcakes no matter where they are priced to people who want the top end.

Im happy AMD has at least one division thats actually making great product and competing strongly.

December 24, 2011 | 02:19 PM - Posted by Ryan Shrout

I hope you are right Davo.

February 4, 2012 | 08:45 AM - Posted by SiliconDoc (not verified)

Cheaper by what ? $10 per three hundred, or 3 percent...
Yeah, cheaper by three percent.
$10 bucks is not enough to suffer through the driver issues, the crash issues, the lack of new game support issues, the no PhysX issue...
To all those who've "never had a problem with their radeon cards" congratulations, miracles still do happen.

February 4, 2012 | 08:28 AM - Posted by SiliconDoc (not verified)

Gee Mark, telling the truth, while calling Nvidia "turtle!" a couple of times is shilling for the eeevil "them" ?
Is this place that much pop culture red amd fanboy that you have to excuse yourself even after calling Nvidia names?
It's a sad biased world when what is patently obvious to a truthful mind must be parsed and dissed just to try to get along and not be attacked.
I'm frankly sick of it.
It's okay ot say AMDATI is a copycat and came in late to superior Nvidia styled architecture and they will get beaten by Nvidia in the next release - it's OK to say it.
In fact, we've all probably heard the latter part already in leaked rumor - at which point, for those who keep up, the AMD fanboys went angry wild and screamed it was a group of Nvidia marketing team spammers who leaked $299 and faster... even though Charlie claimed he saw the benchmarks...
Well, in this case time will tell, and thanks for speaking your mind, and no thanks for calling Nvidia turtle twice, and don't apologize just because red fanboys will attack you, that is, if you can help yourself. Is it really that bad, the attacking of anyone who says a single bad word against AMD and has a good prediction for Nvidia that just makes sense based upon endless years of patterns...that you have to caveat some simple truths and cover the bases with some armor ?

December 23, 2011 | 03:04 PM - Posted by Stas (not verified)

From what I see, AMD takes over GTX580 by a ridiculous margin, being absurdly priced. Ryan, raise GTX580 gpu and memory clocks to meet HD 7970 ones, and GTX580 will show same or better performance. In the result, HD 7970 can be buried straight away.

February 4, 2012 | 08:36 AM - Posted by SiliconDoc (not verified)

That's an interesting thought. If that pans out, we shouldn't be hearing about "superior architecture" from those who promote AMD, endlessly, it seems.
However, they will just claim superior tech anyway and blame the "release" drivers and claim future releases will unleash the "true power". The next breath they will claim AMDATI no longer has driver issues and is now equivalent to Nvidia...
YES I say, let's see an apples clocks comparison.

December 24, 2011 | 11:13 AM - Posted by gabriel (not verified)

that makes my life even harder...

Now I don't know what to do, buy a 6990 or one 7970 and upgrade later by getting a second one, or even if I should wait for the 7990.

what do you guys think?

December 24, 2011 | 02:20 PM - Posted by Ryan Shrout

Hard to say - as I noted in my review, I almost always pick a single GPU over a dual-GPU solution if performance is close the same. Less hassle to deal with and fewer potential problems.

December 26, 2011 | 01:30 AM - Posted by Swoosh (not verified)

Since the release of AMD's 7900 series video cards is just around the corner, its best to wait for the release of the mighty 7990, so your options as far as future-proof gaming goes will last much longer. In fact, if you are using a single full HD monitor, you may never have to upgrade your video card again, because that 7990 for sure is one helluva long, bad-ass, intimidating, very very fast video card that will give you more gaming performance than you expected. And if you're into movie editing, its performance (as far as OpenCL GPU acceleration support is concerned) will give you faster renderings than ever before, including 3D renderings that support OpenCL. :-)

December 24, 2011 | 01:43 PM - Posted by Swede (not verified)

Great review!

Man, is there an easy way to get your hand on gfx cards in general and get them shipped overseas? As of right now the 580's are averaging at $680 over here (in Sweden) and I'd like to upgrade from my 5850.

December 26, 2011 | 01:34 AM - Posted by Swoosh (not verified)

Upgrading from 5850 to 7970?

Well, sounds like a very good upgrade decision for me!
go for it! :-)

December 26, 2011 | 09:05 PM - Posted by nabokovfan87

try amazon .uk/.de or somewhere

They have shipped stuff to me in the US, they might do it to your location as well. especially .uk to Sweden.

December 27, 2011 | 04:14 PM - Posted by Z (not verified)

HOW DO PEOPLE HAVE REVIEWS OUT ? the PROPER drivers for PCI-E Gen3 are not even out !-?

December 29, 2011 | 08:38 AM - Posted by Anonymous (not verified)

Just wondering how much WATTS does my power supply have to be to support it?

January 2, 2012 | 04:02 AM - Posted by crunzaty (not verified)

i think minimum is 450 Not quite sure ^^

January 2, 2012 | 04:01 AM - Posted by crunzaty (not verified)

I think nVIDIA will come with a new card that will beat this and be cheaper :)

January 9, 2012 | 09:02 AM - Posted by soldierguy

Hey mention a future review of Eyefinity/Crossfire...For sure I'd like to see that.
Nice review thanks.

January 18, 2012 | 01:04 PM - Posted by Anonymous (not verified)

This is a great card. I was a little hesitant to fire sale my 2 MSI 6970 Lightnings (which are amazing) and buying a single 7970, but it was unwarranted. The performance, once easily overclocked (by ATI/AMD's stock overdrive), is on par with my previous setup on games that supported crossfire. The games that didn't support it, well this card just blows my previous setup out of the water in that scenario.

Another point that ALMOST made me sit on my hands for a while and wait was going from top of the line custom cards (msi lightnings)to launch day OEM cards. I've only purchased cards with after market coolers for the past 3 generations but I just couldn't wait to try out the 7970. I'm happy to say that while gaming, I don't notice a difference in fan noise. If anything, it runs cooler than my previous setup (obviously crossfire had a lot to do with it) and it may be actually quieter. While the Lightnings had an excellent HSF along with all sorts of other custom parts, the 7970's 28nm process just really nullifies the need for a monster HSF setup.

Plenty of great reviews out there on the interwebz, go read those. Just wanted to 5 star this because of the fool with the 3 star review spreading FUD.

February 24, 2012 | 07:48 PM - Posted by Danny (not verified)

just a question..

i've bought one of this cards... but i have a problem. When i start gaming the Graphic card heats so much, over 70 degrees Celcios.

Any idea of what the problem?

October 8, 2012 | 10:20 AM - Posted by Anonymous (not verified)

but will it play Wolfenstein 3d in ultra settings. Just kidding this card rocks add a second card in xfire unstopable

July 12, 2013 | 09:52 PM - Posted by fake name forced me (not verified)

I get 10 or 20 fps of choppy, erratic performance in TF2 on w8000 in linux. A software renderer would probably be faster. The linux drivers are garbage.
