MSI GeForce GTX 780 3GB Lightning Graphics Card Review

Author: Ryan Shrout
Manufacturer: MSI

Power Design and Overclocking

After a great cooler design, the next step in building a great custom graphics card is to customize the PCB and power delivery in interesting and useful ways. MSI has stepped up in both of these areas on the GTX 780 Lightning.

All of the critical power delivery and regulation hardware uses what MSI calls "Military Class 4" components: high quality parts over-designed for the stress of overclocking. Included on the GTX 780 Lightning are dark solid capacitors, DrMOS4 MOSFETs, updated super ferrite chokes and Hi-C capacitors rated at 93% efficiency. If these components seem familiar, you have probably read about MSI's Z87 M-Power motherboard, which uses the same parts.

The Lightning is built with a triple PWM controller design: one for the GPU, one for the memory and one for an auxiliary rail. By routing incoming power so that both the GPU and memory are fed exclusively through the power supply's PCIe connectors, MSI claims to reduce noise and ripple on the memory and gain overclocking headroom. It also lowers the stress on the motherboard's power delivery through the PEG slot.

This triple PWM design also multiplexes the signal with double feedback loops, creating a more stable delivery circuit. The result is smaller voltage ripple that fluctuates less under load. In theory, this should mean better stability and higher overclocking limits.

Not enough for you? MSI has included a switch on top of the GTX 780 Lightning that toggles between the standard operating mode and an LN2 mode that removes the limits on core power, amperage and TDP. I am interested to see NVIDIA's response to this, as they have been very tight lipped about voltage changes on Kepler graphics cards, and I have seen more than one previous instance of card functionality getting shut down.

On the back of the GTX 780 Lightning, as we saw with several previous Lightning designs, MSI has included its GPU Reactor. This concentrated capacitor module provides extra, cleaner power directly behind the GPU and offers 300% more power volume (capacitance) than the reference design.
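Read literally, "300% more" capacitance works out to four times the reference amount, though marketing figures like this are often used loosely to mean three times:

C_Lightning = (1 + 3.00) × C_reference = 4 × C_reference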

For overclockers, whether LN2 users or the more common air-cooling weekend warriors, MSI has provided a solid set of tools and options to take advantage of. First up are the V-Check points, which allow you to directly measure the voltages of the GPU, memory and auxiliary lines with a common voltmeter.

Next, MSI has updated its Afterburner software to support the GTX 780 Lightning and provide three different voltage tweaking settings without having to modify any hardware. Again, you can turn dials for the GPU, memory and auxiliary voltage lines, though they are somewhat limited (likely due to the same NVIDIA restrictions mentioned before). Still, adding 35 mV to the GPU can help take an overclocked setting from on the edge to stable.
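For scale, that 35 mV bump is small relative to the roughly 1.212 V stock Kepler voltage ceiling that commenters mention further down (the exact stock value varies by card and bin):

1.212 V + 0.035 V ≈ 1.247 V, an increase of just under 3%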

Finally, you can monitor the temperatures of specific voltage locations as well - memory, GPU and VRM - all right in the Afterburner software. 

August 27, 2013 | 10:28 PM - Posted by Johnny Rook (not verified)

Cool card, but "only" a 1173 MHz core boost overclock? C'mon, my reference PNY GTX 780 overclocks stable to 1173 MHz without any overvoltage... I was expecting a lot more from this Lightning...

August 27, 2013 | 11:04 PM - Posted by Ryan Shrout

Check the update on the last page - actually running at 1241 MHz.

August 30, 2013 | 08:04 AM - Posted by Wendigo (not verified)

ALL of the Kepler cards have a max boost different from the official boost. It isn't only the MSI Lightning. Every card has a unique max boost, different from the official boost of the model and from other units of the same model.

ALL of the Kepler cards reach higher clocks than the official boost that GPU-Z shows.

August 27, 2013 | 10:30 PM - Posted by Mr Marauder (not verified)

The GPU reactor can be taken off without any impact on performance, as it only starts to shine when watercooled or LN2 cooled.

August 27, 2013 | 11:04 PM - Posted by Ryan Shrout

Ah cool, thanks for that!  I'll try removing it tomorrow.

August 27, 2013 | 11:47 PM - Posted by tabuburn (not verified)

You should give EVGA a call and get a sample of their GTX 780 Classified and compare it to this since both tend to trade blows.

August 28, 2013 | 03:20 AM - Posted by capawesome9870

I just got an idea on how to improve the graphs you give out with these Frame Rating results (although they are very interesting to me, being an AMD user).

The idea is to give a 'highlight' of the performance and actually print the observed FPS number 2-5 times per benchmark, just the observed, because that is the one that matters.

Another thing: could you stretch the graphs a little by cutting out the bottom portion? For example, if all the FPS are above 20, why show the part of the graph under 15? This would give a little more clarity when just looking at the graphs in the article without clicking on a particular graph to get the bigger picture.

August 28, 2013 | 08:33 AM - Posted by Ryan Shrout

Thanks for the ideas.  I'll play around with trying to make them more readable in the coming weeks!

August 29, 2013 | 03:07 AM - Posted by capawesome9870

cool: thanks Ryan. Keep up the good work.

September 5, 2013 | 10:19 AM - Posted by Mountainlifter

Hi Ryan, I was wondering if you could include an average line on the observed FPS graph or state the average. Like on this graph.

Or is there a reason why it is left out?
It makes it easier to compare with what other review sites report.

August 28, 2013 | 03:33 AM - Posted by Anonymous (not verified)

This card is designed for overclocking. Like the Classified, you should be able to overvolt the core to 1.35 V safely and probably hit 1300 to 1400 on the core. It is a bit pointless to test the card without any overvolting, because NVIDIA's 1.21 V limitation will let the card perform like any other...

August 28, 2013 | 03:50 AM - Posted by not impressed (not verified)

Overclock the damn thing and then people might care about this pansy review.

Max OC vs Classified max OC vs GTX Titan max OC at 1.35 V. You won't, because that's way above your skillset as usual, Ryan.

August 28, 2013 | 07:56 AM - Posted by Ian (not verified)

And he overclocked it to a level the typical gamer will use on this card.

Not everyone wants to massively clock their card just to see high numbers pop up on screen once a graphics test is done, or is so lacking in any real sense of achievement in life that they feel great about dialing some numbers into an electronic device to make it run faster. Most people don't particularly care about that; they just want a little bit of extra power and safe temps while gaming.

Yes, this card is for overclocking and he overclocked it, but the option is there should you wish to push it further. That doesn't mean a standard consumer with no interest in number chasing cannot purchase this card to play games at a Titan level for less money.

Is it possible you are concerned as to what the new Lightning can achieve?

My Galaxy HOF is completely stable at 1293 MHz on the core while gaming and can push 1330 MHz for synthetic benchmarks. Let's hope the Lightning can at least exceed these. I myself am just a normal gamer, however, and not big into overclocking, because number chasing is not my main focus. I have more interesting things to do with my time.

August 28, 2013 | 08:35 AM - Posted by Ryan Shrout

First, thanks for stopping by.  :)

Second, max OC vs. max OC is probably one of the worst indicators on which to base a conclusion about a graphics card. It's important, but it can't be the only or even primary concern. Why? VARIABILITY.

Each card sample is going to be different on the Lightnings, each is going to be different on the Classified, and the difference between the lines is likely going to vary even more. I can overclock this card further (and I'm sure I will once I get the updated BIOS from MSI), but even if I get an additional 50 MHz, that doesn't really mean YOU or any other buyer will.

Now if MSI were to send me a dozen cards for each review we did, you'd have a better argument.

Also, go fuck yourself.

August 28, 2013 | 11:56 AM - Posted by Thedarklord

Great response, lol, you rock dude.

August 30, 2013 | 11:58 AM - Posted by snook

way to poop on him.

August 28, 2013 | 11:49 AM - Posted by mLocke

This may help you.

August 28, 2013 | 08:41 AM - Posted by Anonymous (not verified)

On the first comparative graphs, you reference the 680 and the 680 SLI, but not the 780?

August 28, 2013 | 08:45 AM - Posted by Ryan Shrout

If you are looking at THIS page then you are just seeing our explanation of the TYPE of graphs you are going to see, not the specific results for this review.

August 28, 2013 | 09:20 AM - Posted by !Shocked! (not verified)

Did you JUST NOW figure out that GPU-Z doesn't show the ACTUAL boost clock? Wow guys... I love your reviews, but holy cow, pull it together!

August 28, 2013 | 09:24 AM - Posted by !Shocked! (not verified)

So does this mean all your past reviews with boost clocks will need adjusting to show the actual clocks reached?

Another question: did it actually HOLD that clock with the 109% limit or was it all over the place? I ask because I am not sure if PEAK values are what should be reported. I am personally torn.

I know that 109% limit was hit VERY easily.

August 28, 2013 | 11:14 AM - Posted by Ryan Shrout

It held that based on the logs, yes.  We are awaiting another BIOS from MSI that will move the 109% slider a bit further...

August 28, 2013 | 09:40 AM - Posted by ProxYO (not verified)

EVGA's Classi vs Galaxy's HOF vs MSI's Lightning

plz&ty ;D

August 29, 2013 | 12:05 AM - Posted by tabuburn (not verified)

I second this.

August 28, 2013 | 10:36 AM - Posted by !Shocked! (not verified)

Last message!

The extra cooler does NOT go on top of the other one. It is used when you go Dry Ice or LN2 as the full cover bracket that comes with it can get in the way of mounting plates.

August 28, 2013 | 11:20 AM - Posted by Ryan Shrout


August 28, 2013 | 11:58 AM - Posted by Thedarklord

Something I noticed, and maybe I missed it in the review, but I've been seeing a lot of problems with benchmarking on Windows 8 and it not being accurate due to Real Time Clock issues?

Is that something that was noted in the article or was the hotfix applied?

August 28, 2013 | 12:44 PM - Posted by Anonymous (not verified)

Why, for the DiRT and Skyrim config page screenshots, do you show 1080p but show results in 1440p?

August 28, 2013 | 01:15 PM - Posted by Ryan Shrout

Settings screenshots show the SETTINGS, but the resolution is variable. Many times we test at 19x10 (1920x1080), 25x14 (2560x1440) and 57x10 (5760x1080).

August 28, 2013 | 03:30 PM - Posted by Anonymous (not verified)

What is the max the voltage can be set to?

August 29, 2013 | 12:04 AM - Posted by tabuburn (not verified)

1.35v I believe.

December 14, 2013 | 02:07 PM - Posted by Anonymous (not verified)

1.212 V in the BIOS on P00.
+100 mV in Afterburner (SE version) with voltage control enabled in the settings.

You can use Rbby258's ABVoltmod to raise the overvoltage in Afterburner above 100 mV, but you won't be able to save it.

I made my own BIOS, which allows me to get better clocks than the MSI BIOS, but I still need to find a way to raise the 1.212 V limit in the BIOS itself.

Actually, I am busy with it right now :) I was doing some reading and research on where it overrides the BIOS chip in the registry.

Still can't find it :( but I will.

August 28, 2013 | 05:30 PM - Posted by Anonymous (not verified)

GPU-Z does not misreport the boost clock, but the boost clock is not the actual maximum speed of any Kepler part either.

The discrepancy is because of GPU Boost. Effectively, Kepler has an additional boost on top of that boost clock, which comes into play if the card is still within certain thermal and power thresholds.

I'm sure you guys noticed well before now that just about every GTX 670 runs way higher than its rated 980 MHz boost clock.
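As a rough illustration of that behavior, here is a minimal sketch in Python of bin-stepping under power and temperature limits. It is a simplified, hypothetical model, not NVIDIA's actual GPU Boost implementation; the bin size and headroom are only assumptions, and the 915/980 MHz figures simply match the GTX 670 mentioned above.

# Simplified, hypothetical sketch of boost-style clock selection.
# All values are illustrative; the real GPU Boost logic is proprietary.

BASE_CLOCK = 915            # MHz, example rated base clock (GTX 670)
BOOST_CLOCK = 980           # MHz, example rated (advertised) boost clock
BIN_SIZE = 13               # MHz per boost bin (assumed)
MAX_BINS_ABOVE_BOOST = 15   # assumed headroom above the advertised boost clock


def select_clock(power_pct, temp_c, power_limit_pct=100, temp_target_c=80):
    """Step the clock up one bin at a time while power and temperature stay under their limits."""
    clock = BASE_CLOCK
    max_clock = BOOST_CLOCK + MAX_BINS_ABOVE_BOOST * BIN_SIZE
    while clock < max_clock and power_pct < power_limit_pct and temp_c < temp_target_c:
        # In real hardware, power draw and temperature would be re-sampled each step;
        # with fixed inputs this sketch simply runs to the ceiling or stays at base.
        clock += BIN_SIZE
    return clock


# A cool, lightly loaded card settles well above its advertised boost clock,
# which is why the "boost clock" GPU-Z reports understates the observed frequency.
print(select_clock(power_pct=85, temp_c=65))    # 1175 MHz in this toy model
print(select_clock(power_pct=109, temp_c=85))   # pinned at the 915 MHz base clock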

August 28, 2013 | 07:00 PM - Posted by Brett from Australia (not verified)

Nice review, Ryan. I really like the design of this card, and those are some very nice performance numbers. I'd also just like to say you guys do a great job with your hardware reviews.

August 28, 2013 | 07:48 PM - Posted by Anonymous (not verified)

Surely at that price point they could have gone 4GB?

Seems needlessly gimped on paper.

Otherwise, solid review.

August 29, 2013 | 12:02 AM - Posted by tabuburn (not verified)

The GK110 chip on the GTX 780 and GTX Titan has a 384-bit memory interface which means it'll only support VRAM of 1.5GB, 3GB, 4GB etc. As far as I know, NVIDIA has restricted board partners to only have 3GB on the GTX 780's to better differentiate them from the GTX Titans.

August 29, 2013 | 12:03 AM - Posted by tabuburn (not verified)

Sorry. I meant 1.5GB, 3GB, 6GB.
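The sizes follow from the bus width: a 384-bit interface is built from twelve 32-bit memory channels, so capacities come in multiples of twelve GDDR5 chips (standard memory-layout arithmetic, not anything specific to this card):

384 bit / 32 bit per channel = 12 chips
12 × 1 Gb (128 MB) chips = 1.5 GB
12 × 2 Gb (256 MB) chips = 3 GB
12 × 4 Gb (512 MB) chips = 6 GB

A flat 4 GB would require a mixed-density or asymmetric arrangement, which is why it isn't offered on these boards.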

August 31, 2013 | 04:56 PM - Posted by Edge86 (not verified)

At first, thank you for this review, Ryan. :)

For reading out the actual GPU Boost maximum you can use Orbmu2k's NVIDIA Inspector (linked on his blog, with a direct download also available).

This multi-functional tool, by the way, is also kind of a holy grail for image quality lovers, allowing you to set SGSSAA with custom AA bits in the profiles of DX9 games or to enhance in-game MSAA to SGSSAA (not as good in DX10+; driver-based downsampling combined with in-game AA looks better in those cases). But there are a lot more options. Another cool thing is that you can use custom SLI bits if the scaling isn't good with the default NV profiles and no official updates are available from Santa Clara. It also comes with some nice monitoring graphs and MDPS (multi display power saver). With MDPS you can force the GPUs to clock down to the P8 or P12 state on Fermi, and here MDPS is very useful even with one display, allowing you, for example, to watch 720p material running smoothly at P12.

I made some screenshots of the tool's options. :)

Note that for overclocking you need to run Inspector as an administrator since NV driver branch R325.

P.S.: You don't need to use a manual negative LOD bias adjustment for SGSSAA in most cases any more, because NV sets it by default for all possible levels when setting SGSSAA.

Another cool feature is that you can export and import all customized driver profiles as .nip files - just check the top bar in Profile Settings, add new .exes to a profile, delete custom driver profiles once NV has added them officially, and so on... :)

I recommend the following forum threads if you are interested in sparse grid anti-aliasing or custom SLI bits:

german 3D Center forums

Guru3D Forums


August 31, 2013 | 05:04 PM - Posted by Edge86 (not verified)

I forgot to mention that if you want to overclock and don't want to accept the warnings over and over again, you should make a shortcut to the Inspector .exe, then right click -> Properties -> Target -> add " -disableWarning ", as shown below.
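For example, with a hypothetical install path (use wherever your Inspector executable actually lives), the shortcut target ends up looking something like:

"C:\Tools\NVIDIA Inspector\nvidiaInspector.exe" -disableWarning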


September 3, 2013 | 04:23 PM - Posted by Zzzsleep (not verified)

It's too early to tell how good this card is going to end up being. The 680 Lightnings were strong cards, but when they were released they did not even have the voltage lock.

Air cooling tends to top out at about 1.4 to 1.425 V, so judging by Ryan's results, a mid-1300s OC is achievable, with perhaps 1400 for benchmarking. That will represent the upper limit of what air cooling will do.

Out of curiosity, Ryan, how was the cooler? Is this new three-fan design better than the 2x 100 mm fans of the previous generation?

September 13, 2013 | 01:08 AM - Posted by owie

Any input will be helpful, please.
Okay, I need some help. I have just recently upgraded my rig:
• Asus ROG Maximus VI Hero Z87 ATX LGA Socket 1150
• Intel Core i7-4770K Processor
• Corsair Hydro Series H110 Liquid Cooler
• Kingston Hyperx 120gb ssd
• Seagate 1TB Serial ATA HD 7200/64MB/SATA-6G (Back up and Games)
• Kingston HyperX Beast 32GB (4 x 8GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800)
• XFX Radeon HD 7970 Double D 3GB DDR5 X2 In crossfire
• CORSAIR Professional Series Gold AX1200 (CMPSU-1200AX)
• Win 7 Ultimate
So here is the issue: with my graphics cards running in CrossFire, I just don't seem to get the performance I should be getting; I see low frame rates in several games.
1. Current drivers for the graphics cards: I have used both, with not much improvement from either of them.
The 13.4 Catalyst driver (5/29/2013)
And the 13.10 Catalyst beta driver (09/05/2013)

• Assassin's Creed III: less than 60 fps while playing
• Batman: Arkham City: less than 60 fps
• Crysis 3: same thing
• Battlefield 3: runs well, over 70 fps
• BioShock Infinite: runs well, over 70 fps
• Total War: Rome II: on average about 30 to 40 fps
So are they just s*&t cards or what? I have been kicking around the idea of just buying an MSI N780 Lightning GeForce GTX 780 and selling my AMD cards.
So my question is: will I see a noticeable performance improvement with this card over my AMD cards? And is it worth the money? First, I need you to understand that I work a lot of hours, and when I am home I just want to turn on my computer and game; I'm not really into the overclocking thing. I will OC my chip, but other than that I just want to game. If there is a simple fix for my current graphics cards, that's fine, but I really just want performance I can sit down and go with. HELP???????

September 14, 2013 | 07:43 PM - Posted by N3n0 (not verified)

You're using AMD cards. That's the problem.
