Review Index:

MSI R7970 Lightning Review: AMD's HD 7970 Gets the Treatment

Author: Josh Walrath
Manufacturer: MSI

Results: DiRT 3

Codemasters makes arguably the best-looking racing games on the planet, and they pulled out all the stops with DiRT 3. This particular title features plenty of DX11 effects and tessellation (though I think they took a step back with the water effects as compared to DiRT 2). The game was run at the Ultra preset with 4X AA and 16X AF enabled, using the built-in Aspen benchmark.


DiRT 3 is another title that is fairly CPU-dependent. This is also a game that is more NVIDIA-friendly than the previous iteration, DiRT 2. The R7970 takes the lead in every test, but the differences are not as big as we have seen in previous tests. Even the R6970 is not all that far behind.


March 28, 2012 | 08:40 AM - Posted by kris livengood (not verified)

Wow, you didn't even test it against another 7970? Really, guys, time to get on the ball.

March 28, 2012 | 09:49 AM - Posted by Anonymous (not verified)

If you're looking at this card and the premium it carries, it's likely you're buying it for the overclocking headroom, not the base specs.

What I would have thought was more important is this vs. a 680. I can understand not putting it in, though. It's unlikely there was one available for review at the time this was being put together.

March 28, 2012 | 09:50 AM - Posted by Josh Walrath

Gotta work with what I have. The performance of a standard HD 7970 is not exactly a secret, so I decided to test it against the two previous Lightning cards to really detail what a user gets when upgrading to this particular overclocked monster. In hindsight, I guess it would have behooved me to lower the clocks on this card to standard settings and go from there. I will certainly keep that in mind next time I test an overclocked product like this. Also, Ryan is in Kentucky with the standard HD 7970s, and I live in Wyoming. Swapping parts between the two areas is a bit troublesome.

March 28, 2012 | 11:00 AM - Posted by kris livengood (not verified)

So do you think it really is worth the mark up in price?

March 28, 2012 | 12:08 PM - Posted by Josh Walrath

For the $50 increase in price over a stock card? Yes, absolutely. But you must remember that this is a brand new product, and the GTX 680 is still not out in force. Once that happens, I am sure the pricing dynamics of these cards will change drastically. I am judging this card by what is available today. So yes, at $599 it is a good card. Two months from now, when there are many different examples of not just HD 7970 cards but also GTX 680 cards... it might not look like such a nice product at that price. I am pretty sure, though, that prices will drop dramatically during that time to keep it competitive with other offerings.

March 28, 2012 | 02:45 PM - Posted by kris livengood (not verified)

were you able to take the heatsink off and find out if it is on a reference pcb?

March 28, 2012 | 03:19 PM - Posted by Josh Walrath

It most certainly is not a reference PCB. A reference board has a 5+1 power phase setup (IIRC), while this one has 17 total phases. If you look at the pictures of the boards from behind, you can see that the PCB is smaller at the front of the card (display outputs), then gets taller after the CrossFire connectors. It is also longer than the reference design. This is a much larger PCB to accommodate the additional power phases, as well as to give the necessary room to optimize trace pathways to the different components.

March 28, 2012 | 12:03 PM - Posted by SoneeOO7

Who cares I just bought a GTX680 Son!

March 28, 2012 | 12:19 PM - Posted by Josh Walrath

A good buy! I just hope we get to see more available GTX 680 products soon!!!

April 1, 2012 | 12:46 AM - Posted by Anonymous (not verified)

BUT although the 680 outperforms on a SINGLE monitor setup, it would take two 680s to drive an Eyefinity-style multi-monitor setup.

Advantage AMD

April 2, 2012 | 11:32 AM - Posted by Josh Walrath

Nope, the new GTX 680 can output to 4 monitors in total with one card. It will only do 3 monitors in NVIDIA Surround with the 4th being an "accessory monitor" when using 3D applications. So, users no longer require 2 NVIDIA video cards in SLI for more than 2 monitors.

March 28, 2012 | 12:15 PM - Posted by Nilbog

Great article, thanks! I am positive that there will be a 680 version; they would be foolish not to make one. I look forward to reading about it, too.

I also have to agree, the $50 premium for better quality parts on the card is well worth it.
I bought an R6970 Lightning just for the quality parts.

March 28, 2012 | 04:56 PM - Posted by Josh Walrath

Obviously I haven't been given a timeline for the eventual GTX 680 Lightning card, but I would expect it to be around 3 months away due to the shortage of chips and the design time for the product as a whole.

March 28, 2012 | 05:09 PM - Posted by Nilbog

You don't think they got some chips already?
Or are you expecting it will be on the market by then?

March 28, 2012 | 05:30 PM - Posted by Josh Walrath

GTX 680 chips are scarce. TSMC shut down the 28 nm line in mid-February, and I am unsure if they have started it back up again. NVIDIA got a couple of complete chip shipments from them, but I think that until manufacturing starts up again, supply is going to be super tight. So tight that guys like Asus, MSI, and others will not have the amount of product on hand to create a second line of non-reference cards.

March 28, 2012 | 09:37 PM - Posted by Anonymous (not verified)

"GTX 680 chips are scarce. TSMC shut down the 28 nm line in mid-February, and I am unsure if they have started it back up again."

I doubt the GTX 680 could have been released worldwide, albeit in short supply, if it was.

March 29, 2012 | 12:45 AM - Posted by Josh Walrath

Well, the long and short of it is... NVIDIA set a date for release assuming that TSMC would be able to continue to process wafers at a certain rate until that date. TSMC dropped all production after NVIDIA had set the release date. NVIDIA had enough product out to release the card and have some decent numbers in retail, but after that it would be touch and go. I have heard that the beginning of April will have more cards available than at launch, but the big question is availability after that. I guess time will tell, but from what I am hearing availability might be scarce for a while.

March 28, 2012 | 03:27 PM - Posted by Anonymous (not verified)

Thanks, I wanted someone to face off BF: Bad Company 2 and Battlefield 3 so I could compare them side by side.

March 28, 2012 | 05:22 PM - Posted by Anonymous (not verified)

lololololo my gtx 570 scores a 7135 in 3dmark11 ATI is heading downnnnnnn hilllllllllllllllllllll

March 29, 2012 | 11:10 AM - Posted by Finger (not verified)

Are you mentally impaired? His test bed consisted of an AMD Phenom CPU.
Of course the overall score is going to take a major hit on all cards tested.

March 28, 2012 | 08:01 PM - Posted by nabokovfan87

Just got my 7970 Lightning in. It BAAARELY fits in my Antec 1200 with three HDDs (had to move those).

It's a thing of beauty, this card. Kind of sucks that Josh had such a horrible experience.

Is there any way to tell if the retail ones that I and others will receive have the updated bios?

March 29, 2012 | 12:47 AM - Posted by Josh Walrath

I'll have to check and see. But if your card is working without issue, there is no real reason to flash the BIOS. A video BIOS is typically much simpler than a motherboard BIOS.

March 29, 2012 | 07:40 PM - Posted by nabokovfan87

Last question,

You said you had some OC issues, what was the ASIC quality of your lightning card?

To find it, you go to GPU-Z, click on the upper left corner, and near the bottom it will have ASIC quality.

I know MSI has said in the past that they don't bin their cards, but it would appear they may for the Lightning. Mine was 82.5%, or right around there. The lower the number, the better the OC possibility.

Thanks again man.

So far, benchmarks are:

Alan Wake: 12.9 -> 81 avg FPS with the upgrade
RUSE went from 30 up to 160.

Insanity. I'll try to OC it tonight and see how I do. It's only the second thing I've owned that I've overclocked.

March 29, 2012 | 11:58 PM - Posted by Josh Walrath


BIOS is (113-AD40900-X01)

March 30, 2012 | 07:44 AM - Posted by nabokovfan87

BIOS: (113-AD40900-X01)

So yeah, same BIOS. Here is a portion of the ASIC and OC results.

EDIT: After reading through the thread a bit again, there appears to be some sort of drop-off between OC and ASIC quality.

A lower number means higher voltage and higher potential for OC, but past a certain point (say, below 75 or above 95) the relationship between voltage and OC drops off heavily.

It will take some more looking into, but that is what I have seen for a bit now.

March 30, 2012 | 10:45 AM - Posted by Josh Walrath

Remember those special Phenom IIs that were aimed at the LN2 crowd and only like 1000 of them were made? This seems to be along the same line of thought. Leakier, hotter running chips that take super cooling really well. On air cooling, not an impressive overclock... on LN2, the sky is the limit. So yeah, I would imagine my sample might do well under LN2.

March 30, 2012 | 03:01 PM - Posted by nabokovfan87

It's weird. I was looking at it this morning. Stuff at around 65%, with 1175 and 1250 voltages. Got up to around 1250-1300, some on air, some on WC, but everything varied drastically.

It just seems odd that these chips with the same "quality" don't have the same characteristics. But then again, when you have 40+% leakage, that is a lot of heat in such a small area. I know for me, I have to re-wire my case and move the HDD up to the top to free up the bottom two 1200 intakes for the GPU. Working on that this weekend, but yeah.

Thanks for the insight, man. I'll throw up something and try to OC for sure.

March 29, 2012 | 07:58 PM - Posted by nabokovfan87

I was thinking about the BIOS thing. I can compare the BIOS hex to mine and get the version based on that. Can you post or add a GPU-Z screenshot of the MSI 7970 Lightning?

Appreciate it.
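For what it's worth, the byte-level comparison described above takes only a few lines of Python. This is a sketch; the `.rom` file names are hypothetical placeholders for dumps saved with GPU-Z's save-BIOS function or a similar tool:

```python
# Minimal sketch of comparing two VBIOS dumps byte by byte.
# File names below are hypothetical placeholders for your own dumps.
def diff_bios(path_a, path_b, limit=10):
    """Return offsets where the two dumps differ; print the first few."""
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        a, b = fa.read(), fb.read()
    if len(a) != len(b):
        print(f"size mismatch: {len(a)} vs {len(b)} bytes")
    diffs = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
    for off in diffs[:limit]:
        print(f"offset 0x{off:06X}: {a[off]:02X} != {b[off]:02X}")
    print(f"{len(diffs)} differing bytes in the overlapping region")
    return diffs

# Example usage (paths are made up):
# diff_bios("lightning_review.rom", "lightning_retail.rom")
```

If the dumps are identical, the list comes back empty; any differing offsets point at where the versions diverge.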

March 31, 2012 | 03:58 AM - Posted by nabokovfan87

Right now I'm at 1180 Core Clock and 1440 on the memory.

If I could add voltage to the memory, then I would be able to up that as well, but right now it's locked. I read a post that said to set PowerTune to 20%, but the memory voltage didn't change. Perhaps I need to switch to the LN2 BIOS selection?

EDIT: Had to back things off a bit, way too much voltage I'm guessing, but ended with 1175/1435.

March 31, 2012 | 01:00 PM - Posted by Josh Walrath

There is a CFG setting you have to manually put in with Afterburner to get the memory and PCI-E bus to overvolt.

I'm surprised the memory doesn't go any higher than that for you. Also, set PowerTune to +20%. That essentially increases the amount of power available to the GPU. PowerTune was put in place so that in corner cases like FurMark, the board/chip would not exceed the rated TDP (and shut down).
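As a toy model of the idea in the reply above (an illustration of the concept only, not AMD's actual control loop; all wattage figures are invented), a PowerTune-style limiter can be thought of as clamping the clock whenever estimated board power exceeds an adjustable TDP budget:

```python
# Toy model of PowerTune-style throttling -- illustrative only,
# not AMD's real algorithm. All wattages here are invented numbers.
def effective_clock(base_mhz, est_power_w, tdp_w, powertune_pct=0):
    """Scale the clock down when estimated power exceeds the budget."""
    budget = tdp_w * (1 + powertune_pct / 100)  # +20% raises the cap
    if est_power_w <= budget:
        return base_mhz
    return base_mhz * budget / est_power_w

print(effective_clock(925, 300, 250))      # over budget: throttles below 925
print(effective_clock(925, 300, 250, 20))  # +20% budget covers it: 925.0 MHz
```

The +20% slider simply raises the power cap, which is why a FurMark-style corner case stops throttling once the budget covers the load.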

March 31, 2012 | 02:12 PM - Posted by nabokovfan87

Never mind. I can OC the memory now. In the MSI utility, to the right of the core voltage there is a small arrow; hit that and you can get the other two voltage settings.

March 31, 2012 | 03:48 PM - Posted by nabokovfan87

I got it up to 1220/1520. The memory is about done; even 5 MHz more and I get corruption. I don't know whether to add a ton more voltage (I have slider space for 75 mV). I tried 1240 on the core, but got some severe image corruption.

I think I will try some stability testing, just letting it run for an hour or so instead of 5-10 loops of the Metro benches, and see if anything ends up happening in terms of corruption. As far as voltage goes, I'm not sure if there is a "less is more" type approach, or if you simply add more when you reach corruption.

From my computer engineering background I know a bit about how ripple and such affects everything, and I'm not sure just how much ripple is being introduced, but, needless to say...

925 -> 1220 = 31.89% OC
1375 -> 1520 = 10.55% OC
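Those percentages check out; as a quick sketch, the gain-over-stock calculation can be written as follows (the clock values are the ones quoted above):

```python
# Quick sanity check of the overclock percentages quoted above.
def oc_percent(stock_mhz, oc_mhz):
    """Overclock expressed as a percentage gain over the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(f"core:   {oc_percent(925, 1220):.2f}%")   # 925 -> 1220 MHz: 31.89%
print(f"memory: {oc_percent(1375, 1520):.2f}%")  # 1375 -> 1520 MHz: 10.55%
```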

April 2, 2012 | 11:33 AM - Posted by Josh Walrath

Wow, not a bad overclock at all. I had overclocked the one I have to 1100 MHz... but I was using Oblivion to play, so I don't know if the vid card was causing problems or that rather unstable game was...

I'm betting the game.
