MSI R7970 Lightning Review: AMD's HD 7970 Gets the Treatment

Author: Josh Walrath
Manufacturer: MSI

Twin Frozr IV, Power, and Display Options for the R7970

MSI is calling the new cooling solution the Twin Frozr IV. A few weeks back we interviewed Alex Chang of MSI about what changes we could expect with TF4, and now we finally get to see the finished product. The biggest change is the use of 2 x 10 cm fans instead of the previous 8 cm units. These are still the “propeller” style fans introduced with Twin Frozr III. They remain PWM controlled and idle at about 30% speed. They are very quiet, even at 70%+. They do get pretty audible at 100% fan rate, but unless a user is running this card overclocked in 110F weather, it is unlikely to reach that speed on its own. The heatsink is still the large, aluminum-finned unit with multiple heatpipes coming out. The 8 mm “super pipe” is still in attendance, but the other smaller heatpipes have grown from 5 mm to 6 mm. At startup the fans spin in reverse for about 30 seconds to blow out dust that has built up in the aluminum fins, and it appears to remove dust fairly effectively.


The bundle itself is pretty good: a CrossFire cable, voltage monitor cables, PCI-E power adapters, manuals, CDs, the Lightning certificate of quality, a DVI to HDMI adapter, and a mini-DP to DP adapter.

The design includes the “dual form-in-one” heatplates, which cover the memory and other power components throughout the design. These also act as a stiffener to keep the PCB from warping when installed in a case. MSI has also included a rear plate which likewise acts as a stiffener and protects the components on the back of the card.

Component choice is a significant part of what makes this a Lightning card. These fall under the “Military Class III” designation, meaning the components are all very rugged and can handle far more wear and tear than is likely to be encountered in a PC case. There is a mix of new and old components in this design. As in the previous Lightning series, it features CopperMOS chips for power delivery, which provide a lower overall working temperature and a higher current capacity. Also carried over from previous boards are the Hi-C Caps (aka tantalum capacitors), which feature a longer lifespan and less leakage. The next two components are new for this generation. Golden Solid State Chokes make their first appearance with this card. These are gold-plated SSCs, supposedly for better heat dispersion and corrosion resistance; in theory solid ferrite chokes can oxidize over time and potentially lose performance as a result. The plating also supposedly increases current handling by about 10%. The second new component is the Dark Solid Capacitor. These are aluminum-core solid caps with a nickel-plated cover, promising a longer lifespan as well as lower ESR (equivalent series resistance, which means lower temperature and higher efficiency). With these parts the card promises to outlast the rest of the components in a system. Of interest is the absence of the previously featured “Proadlizer” chips; these apparently were EOL’d and are no longer available.


A good view of the outputs for this particular card.  Dual link DVI is so 2011...

The card is fed by 2 x 8 pin PCI-E power connections, providing up to 300 watts for the card. Divvying up this power are a total of 14 phases for the GPU alone, 2 phases for the memory, and one phase for the PCI-E/VDDCI connection. The memory phases draw from the 2 x 8 pin connections, which apparently can be a lot more stable than the reference design, which gets that power from the PCI-E/PEG slot. Each of these subsets of components can be adjusted in the Afterburner utility. The Lightning includes a digital PWM controller which allows for faster and more accurate switching and voltage control. There are two V-BIOS chips that can be selected via a toggle at the top of the card. The normal BIOS allows overclocking but does not unlock overcurrent protection or disable active phase switching. The special “LN2” BIOS does unlock OCP and disables active power phase switching. It also allows for a higher CCC range, as well as a higher PowerTune range. This *should* allow for higher overclocks, but the user runs the risk of burning up the chip; it should be used very carefully, and really it is provided for those who want to use LN2 cooling to achieve extreme overclocks. The design also features the “Lightning Power Layer,” an additional independent board layer that improves the circuit design and lowers noise and interference.

The board is set up for some extreme monitoring. Not only are things like temperatures, load, and fan speed handled by the Afterburner utility, but there are hardware points for more precise voltage monitoring. There are three connections built into the board that allow external meters to be plugged directly into it. This is for the LN2 overclocking group who need to see exactly what voltages the different components are running at.


The view from the top is not nearly as exciting, but it gives an idea of the heatpipe placement. 2 x 8 pin power plugs are also nice to see.

Display support is an area that is both good and bad. MSI includes two DVI ports as well as four DisplayPorts, allowing a single card to drive six monitors at once. There is a price to pay, though. Unlike the Asus DirectCU II cards, which feature a similar setup, MSI does not allow a switchable configuration. With the Asus boards a dual link DVI port shares resources with one of the DisplayPorts, so when the user needs to drive a dual link capable display over DVI, they can flip a switch on the board to enable that functionality. That option is notably absent on the MSI board. Both DVI ports are limited to single link resolutions, which top out at 1920x1200. Those with a 2560x1600 monitor that only accepts DVI will need to buy a separate powered DisplayPort to dual link DVI adapter (which ranges in cost from $75 to $200); the more readily available “active” DP to DVI adapters are only good for single link resolutions. Users with 2560x1600 displays that have native DisplayPort connectivity will be able to game at full resolution without a separate adapter.
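A quick back-of-the-envelope check makes the single link limitation concrete. Single link DVI tops out at a 165 MHz pixel clock; the sketch below (a hypothetical helper, not anything from MSI, using a rough ~12% blanking overhead in the spirit of CVT reduced blanking) estimates whether a given mode fits on one link:

```python
# Single link DVI is capped at a 165 MHz TMDS pixel clock.
SINGLE_LINK_MAX_MHZ = 165.0

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.12):
    """Estimate a mode's pixel clock: active pixel rate padded by a rough
    blanking overhead (~12%, in the spirit of CVT reduced blanking)."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

def needs_dual_link(width, height, refresh_hz=60):
    """True if the estimated pixel clock exceeds what single link DVI carries."""
    return approx_pixel_clock_mhz(width, height, refresh_hz) > SINGLE_LINK_MAX_MHZ

print(needs_dual_link(1920, 1200))  # False: fits on a single link
print(needs_dual_link(2560, 1600))  # True: needs dual link (or DisplayPort)
```

This is why 1920x1200 is the ceiling for the card's DVI ports, while 2560x1600 requires either native DisplayPort or a powered dual link adapter.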

The final new feature is the GPU Reactor. This is a separate PCB that plugs into the back of the card. It features an extra eight tantalum capacitors, which almost double the current capacity while lowering ripple. It also features a few blue LEDs, so with the GPU Reactor cap in place the card has a nice blue glow emanating from it. This does free up some space on the board, and it should provide the advantages stated above. Whether or not it really will send the GPU to the next level is debatable.

March 28, 2012 | 08:40 AM - Posted by kris livengood (not verified)

wow, didn't even test it against another 7970 really guys, time to get on the ball.

March 28, 2012 | 09:49 AM - Posted by Anonymous (not verified)

If you're looking at this card and the premium it carries, it's likely you're buying it for the overclocking headroom, not the base specs.

What I would have thought was more important is this vs. a 680. I can understand not putting it in, though. It's unlikely there was one available for review at the time this was being put together.

March 28, 2012 | 09:50 AM - Posted by Josh Walrath

Gotta work with what I have. The performance of a standard HD 7970 is not exactly a secret, so I decided to test it against the two previous Lightning cards to really detail what a user gets when upgrading to this particular overclocked monster. In hindsight it probably would have behooved me to lower the clocks on this card to standard settings and go from there. I will certainly keep that in mind next time I test an overclocked product like this. Also, Ryan is in Kentucky with the standard HD 7970s, and I live in Wyoming. Swapping parts between the two areas is a bit troublesome.

March 28, 2012 | 11:00 AM - Posted by kris livengood (not verified)

So do you think it really is worth the mark up in price?

March 28, 2012 | 12:08 PM - Posted by Josh Walrath

For the $50 increase in price over a stock card? Yes, absolutely. But you must remember that this is a brand new product, and the GTX 680 is still not out in force. Once that happens, I am sure the dynamics of the pricing of these cards will change drastically. I am judging this card by what is available today. So yes, at $599 it is a good card. Two months from now, when there are many different examples of not just HD 7970 cards but also GTX 680s... it might not look like such a nice product at that price. I am pretty sure, though, that prices will drop dramatically during that time to keep it competitive with other offerings.

March 28, 2012 | 02:45 PM - Posted by kris livengood (not verified)

were you able to take the heatsink off and find out if it is on a reference pcb?

March 28, 2012 | 03:19 PM - Posted by Josh Walrath

It most certainly is not a reference PCB. A reference board has a 5+1 power phase setup (iirc), while this one has 17 total phases. If you look at the pictures of the boards from behind, you can see that the PCB is smaller at the front of the card (display outputs), then gets taller after the CrossFire connectors. It is also longer than the reference design. This is a much larger PCB to accommodate the additional power phases, as well as give the necessary room to optimize trace pathways to the different components.

March 28, 2012 | 12:03 PM - Posted by SoneeOO7

Who cares I just bought a GTX680 Son!

March 28, 2012 | 12:19 PM - Posted by Josh Walrath

A good buy! I just hope we get to see more available GTX 680 products soon!!!

April 1, 2012 | 12:46 AM - Posted by Anonymous (not verified)

BUT although the 680 outperforms on a SINGLE monitor application, it would take 2 680s to do an Eyefinity setup.

Advantage AMD

April 2, 2012 | 11:32 AM - Posted by Josh Walrath

Nope, the new GTX 680 can output to 4 monitors in total with one card. It will only do 3 monitors in NVIDIA Surround with the 4th being an "accessory monitor" when using 3D applications. So, users no longer require 2 NVIDIA video cards in SLI for more than 2 monitors.

March 28, 2012 | 12:15 PM - Posted by Nilbog

Great article, thanks! I am positive that there will be a 680 version; they would be foolish not to make one. Looking forward to reading about it too.

I also have to agree, the $50 premium for better quality parts on the card is well worth it.
I bought a R6970 Lightning just for the quality parts.

March 28, 2012 | 04:56 PM - Posted by Josh Walrath

Obviously I haven't been given a timeline for the eventual GTX 680 Lightning card, but I would expect it to be around 3 months away due to the shortage of chips and the design time for the product as a whole.

March 28, 2012 | 05:09 PM - Posted by Nilbog

You don't think they got some chips already?
Or are you expecting it will be on the market by then?

March 28, 2012 | 05:30 PM - Posted by Josh Walrath

GTX 680 chips are scarce. TSMC shut down the 28 nm line in mid-February, and I am unsure if they have started it back up again. NVIDIA got a couple of complete chip shipments from them, but I think that until manufacturing starts up again, supply is going to be super tight. So tight that guys like Asus, MSI, and others will not have the amount of product on hand to create a second line of non-reference cards.

March 28, 2012 | 09:37 PM - Posted by Anonymous (not verified)

"GTX 680 chips are scarce. TSMC shut down the 28 nm line in mid-February, and I am unsure if they have started it back up again."

I doubt the GTX 680 could have been released worldwide, albeit in short supply, if it was.

March 29, 2012 | 12:45 AM - Posted by Josh Walrath

Well, the long and short of it is... NVIDIA set a date for release assuming that TSMC would be able to continue to process wafers at a certain rate until that date. TSMC dropped all production after NVIDIA had set the release date. NVIDIA had enough product out to release the card and have some decent numbers in retail, but after that it would be touch and go. I have heard that the beginning of April will have more cards available than at launch, but the big question is availability after that. I guess time will tell, but from what I am hearing availability might be scarce for a while.


March 28, 2012 | 03:27 PM - Posted by Anonymous (not verified)

Thanks, I wanted someone to face off BF: Bad Company 2 and Battlefield 3 so I could compare them side by side.

March 28, 2012 | 05:22 PM - Posted by Anonymous (not verified)

lololololo my gtx 570 scores a 7135 in 3dmark11 ATI is heading downnnnnnn hilllllllllllllllllllll

March 29, 2012 | 11:10 AM - Posted by Finger (not verified)

Are u mentally impaired? His test bed consisted of an AMD Phenom CPU.
Of course the overall score is going to take a major hit on all cards tested.

March 28, 2012 | 08:01 PM - Posted by nabokovfan87

Just got my 7970 Lightning in, BAAARELY fits in my Antec 1200 with 3 HDDs (had to move those).

It's a thing of beauty this is. Kind of sucks Josh had such a horrible experience.

Is there any way to tell if the retail ones that I and others will receive have the updated bios?

March 29, 2012 | 12:47 AM - Posted by Josh Walrath

I'll have to check and see. But if your card is working without issue, there is no real reason to flash the BIOS. A VBIOS is typically much simpler than a motherboard BIOS.

March 29, 2012 | 07:40 PM - Posted by nabokovfan87

Last question,

You said you had some OC issues, what was the ASIC quality of your lightning card?

To find it, you go to GPU-Z, click on the upper left corner, and near the bottom it will have ASIC quality.

I know MSI has said in the past they don't bin their cards, but it would appear they may for the lightning. Mine was 82.5% or right around there. The lower the number, the better the OC possibility.

Thanks again man.

So far, benchmarks are:

Alan Wake went from 12.9 -> 81 AVG FPS with the upgrade.
RUSE went from 30 up to 160.

Insanity. I'll try to OC it tonight, see how I do. Only the 2nd thing I have owned that I've OC'd.

March 29, 2012 | 11:58 PM - Posted by Josh Walrath


Bios is (113-AD40900-X01)

March 30, 2012 | 07:44 AM - Posted by nabokovfan87

Bios: (113-AD40900-X01)

So yeah, same bios. Here is a portion of ASIC and OC results.

EDIT: After reading through the thread a bit again, there appears to be some sort of drop-off between OC and ASIC quality.

A lower number means higher voltage and higher potential for OC, but past a certain point (say below 75 or above 95) the correlation between OC and voltage drops off heavily.

It will take some more looking into, but that is what I have seen for a bit now.

March 30, 2012 | 10:45 AM - Posted by Josh Walrath

Remember those special Phenom IIs that were aimed at the LN2 crowd and only like 1000 of them were made? This seems to be along the same line of thought. Leakier, hotter running chips that take super cooling really well. On air cooling, not an impressive overclock... on LN2, the sky is the limit. So yeah, I would imagine my sample might do well under LN2.

March 30, 2012 | 03:01 PM - Posted by nabokovfan87

It's weird. I was looking at it this morning. Stuff at around 65%, with 1175 and 1250 voltages. Got up to around 1250-1300, some on air, some on WC, but everything was drastically varied.

It just seems odd that these things with the same "quality" don't have the same characteristics. But then again, when you have 40+% leakage, that is a lot of heat in such a small area. I know for me I have to re-wire my case and move the HDD up to the top to free up the bottom two 1200 intakes for the GPU. Working on that this weekend, but yeah.

Thanks for the insight, man. I'll throw up something and try to OC for sure.

March 29, 2012 | 07:58 PM - Posted by nabokovfan87

I was thinking about the bios thing. I can compare the bios HEX to mine and get the version based on that. Can you post or add a GPU-Z screenshot of the MSI 7970 Lightning?

Appreciate it.

March 31, 2012 | 03:58 AM - Posted by nabokovfan87

Right now I'm at 1180 Core Clock and 1440 on the memory.

If I could add voltage to the memory then I would be able to up that as well, but right now it's locked. I read a post that said to set the powertune setting to 20%, but the memory voltage didn't change. Perhaps I need to switch to the LN2 bios selection?

EDIT: Had to back things off a bit, way too much voltage I'm guessing, but ended with 1175/1435.

March 31, 2012 | 01:00 PM - Posted by Josh Walrath

There is a CFG setting you have to manually put in with Afterburner to get the memory and PCI-E bus to overvolt.

Surprised the memory doesn't go any higher than that for you. Also, set PowerTune to +20%. That essentially increases the amount of available power to the GPU. PowerTune was put in place so that in corner cases like Furmark, the board/chip would not exceed the rated TDP (and shut down).

March 31, 2012 | 02:12 PM - Posted by nabokovfan87

Nevermind. I can OC the memory now. In the msi utility, to the right of the core voltage there is a small arrow, hit that and you can get the other 2 voltage settings.

March 31, 2012 | 03:48 PM - Posted by nabokovfan87

I got it up to 1220/1520. The memory is about done; even 5 MHz more and I get corruption. I don't know whether to add a ton more voltage (I have slider space for 75 mV). I tried 1240 on the core, but got some severe image corruption.

I think I will try some stability testing, just let it run for an hour or so instead of 5-10 loops of the Metro benches, and see if anything ends up happening in terms of corruption. As far as voltage goes, I'm not sure if there is a "less is more" type approach, or if you simply add more when you reach corruption.

From my Computer Engineering background I know a bit about how ripple and such affects everything, and I'm not sure just how much ripple is being introduced and so forth, but, needless to say...

925 -> 1220 = 31.89% OC
1375 -> 1520 = 10.55% OC
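For reference, those percentages follow from the new clock divided by the stock clock; a minimal sketch (hypothetical helper name) shows the math:

```python
def oc_percent(stock_mhz, oc_mhz):
    """Overclock expressed as a percentage gain over the stock clock."""
    return (oc_mhz / stock_mhz - 1) * 100

print(f"Core:   {oc_percent(925, 1220):.2f}% OC")   # 925 MHz -> 1220 MHz: 31.89%
print(f"Memory: {oc_percent(1375, 1520):.2f}% OC")  # 1375 MHz -> 1520 MHz: 10.55%
```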

April 2, 2012 | 11:33 AM - Posted by Josh Walrath

Wow, not a bad overclock at all. I had overclocked the one I have to 1100 MHz... but I was using Oblivion to play, so I don't know if the vid card was causing problems or that rather unstable game was...

I'm betting the game.
