MSI GTX 670 Power Edition: When Reference is Not Reference

Author: Josh Walrath
Manufacturer: MSI

Power, Temperature, and Overclocking

Power, Temperature, and Noise

Unlike previous generations of high end NVIDIA cards, the GTX 670 is downright miserly when it comes to power consumption and heat generation. The AMD 7000 series cards all draw noticeably more power than NVIDIA's offering, and the top end cards often pull upwards of 50 watts more from the wall at full load.


At idle all of these cards perform very much the same. The GPU and memory speeds on all cards are lowered and the card goes into a low power state. Nothing much to see here unless something is horrifically wrong.


When we get to load, things look a bit different. I am not sure why, but the R7950 that I received must have a really impressive GPU in terms of fabrication quality. It sips power and overclocks like the dickens, and it draws less at load than either of the other cards. What is quite interesting is that the R7870 sits right next to the GTX 670 in load power, yet is significantly slower than it. Then again, the R7870 destroys the GTX 670 when it comes to GPGPU. Jack of all trades? I guess so.

Temperatures are going to be very similar between the cards, primarily because they all run an iteration of MSI’s Twin Frozr cooling solution. All of these cards are manufactured on the latest 28 nm HKMG process from TSMC.


At idle, the GTX 670 is downright frigid compared to the rest. The R7950 is a bit warmer and the R7870 sits in between. At load, things do change up, with the R7950 being the coolest and quietest solution. The R7870's fan starts to really scream at around 65C, and the card maxes out at 69C. The GTX 670 stays at a very pleasant 64C, but most importantly it is not loud at all. The R7950 and GTX 670 are both very, very quiet at load, while the R7870 gets surprisingly loud. Still, all of the cooling solutions keep these products under 70C, which is very impressive.



The GTX 600 series of cards introduced a new way of overclocking NVIDIA's graphics cards. Basically, the user pushes the power limit on the card to its maximum (114% in MSI's Afterburner). They can then leave everything else alone, and the card will most likely stay at its maximum boost frequency in any application. The user also has the option of adding MHz to the core clock; for example, adding 50 MHz to a card that typically boosts to 1056 MHz takes it to 1106 MHz. How far this goes is very dependent on the silicon a user gets when they purchase a card, but NVIDIA and its partners essentially guarantee that the card will run at the max boost speed at least part of the time.
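The offset arithmetic described above can be sketched in a few lines. This is only an illustration of how the user-applied offset stacks on the card's typical boost clock, not NVIDIA's actual boost algorithm, which also factors in power and thermal headroom in real time:

```python
# Illustrative sketch only: how a user's MHz offset stacks on the
# card's typical boost clock. The real GPU Boost behavior also
# depends on power limit and temperature headroom.

def boosted_clock(typical_boost_mhz: int, offset_mhz: int = 0) -> int:
    """Return the target boost clock after applying a user offset."""
    return typical_boost_mhz + offset_mhz

# A card that typically boosts to 1056 MHz, with a +50 MHz offset:
print(boosted_clock(1056, 50))  # 1106
```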

This particular sample had a very interesting quirk. With all settings left at default, the card was showing a max boost speed of 1179 MHz. This is well above the 1079 MHz stated in the documentation. I am digging further into this, but I am assuming that it is correct. In that case MSI has given the user a nice little gift when it comes to stock performance. At no time was this card unstable during benchmarking and torture testing. And no, torture testing did not involve hours of Farmville [Josh has a smartphone for that now, heh].


Just by pushing the power limit to 114% I was able to take the boost clock up to 1250 MHz during benchmarking. It stayed at that setting no matter what was run, and would only drop down to lower speeds when the application closed. This is all before adjusting voltages. The memory was set at a static 1600 MHz (6.4 GHz effective).
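The "effective" memory figure comes from GDDR5 transferring data four times per memory clock cycle, which is why specs quote 4x the actual clock. A quick check of the numbers above:

```python
# GDDR5 transfers data four times per memory clock cycle, so the
# "effective" rate quoted in specs is 4x the actual clock.

def gddr5_effective_mhz(memory_clock_mhz: int) -> int:
    """Convert a GDDR5 memory clock to its quoted effective rate."""
    return memory_clock_mhz * 4

# The 1600 MHz setting used in this review:
print(gddr5_effective_mhz(1600))  # 6400, i.e. 6.4 GHz effective
```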

Increasing the voltage allowed me to squeeze another 25 MHz out of this card pretty comfortably; 1275 MHz was the max with the extra 0.2 V applied. The addition of voltage unlocking in the Afterburner 2.3 software is a nice touch, but it will not let a GPU go much beyond what stock voltage is able to afford.
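For context, the headroom described in this section can be quantified using the review's own numbers (1079 MHz documented boost up to 1275 MHz with extra voltage):

```python
# Relative overclocking gain, using the figures from this review.

def percent_gain(stock_mhz: int, oc_mhz: int) -> float:
    """Percentage increase of an overclocked speed over stock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# Documented 1079 MHz boost vs. the 1275 MHz voltage-assisted max:
print(round(percent_gain(1079, 1275), 1))  # 18.2
```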

August 24, 2012 | 08:29 AM - Posted by mdevos (not verified)

I was wondering how much of a performance gain is seen versus a reference GTX 670. Unfortunately, this is not included in the graphs of the game testing. Too bad.

I don't understand the last part of the article, where you mention that MSI's default overclock is actually causing crashes when put into a benchmark / torture test. This would mean MSI's overclock is unstable?

August 27, 2012 | 10:34 AM - Posted by Josh Walrath

You are looking at about a 4% increase at most over a stock-clocked GTX 670.  The differences are not all that great.  You are primarily getting this card for the cooling and unique design/voltage control rather than it being a much, much faster product than the stock GTX 670.

The last part actually reads, "At no time was this card unstable during benchmarking and torture tests..." So basically it was perfectly stable even with the boost speed up to 1170+ right out of the box.  The card performed without issue.

August 29, 2012 | 10:49 AM - Posted by Davin (not verified)

Hey Josh, I wanted to say thanks for the article. I have been on the fence about upgrading from my old XFX Radeon 5870. I had been hunting the various 670/680 cards and the AMD 7970 and so forth. After all is said and done I got this card yesterday for $395 with Borderlands 2 and Mafia 2. I have overclocked it and have the core at 1170 MHz, and my boost has been anywhere from 1250-1350 MHz. This is of course with a voltage bump and such. Anyway, thank you again for a quality article that has resulted in a massive performance bump for me. Temperatures are still low even under load, and even at 50% fan the card is still nice and quiet. :D

September 1, 2012 | 05:59 PM - Posted by Gambler (not verified)

Josh, I was comparing the BF3 at 2560x1600 Ultra results in your article to those from the Galaxy GTX 670 GC 4GB article. Your results for the MSI GTX 670 PE card are 48fps min and 63.8fps avg. The results for the Galaxy card at the same settings are 26fps min, 41fps avg and 66fps max.

Can the BF3 results for the MSI and Galaxy cards from the two articles be compared? If so, why pay more for the extra 2GB of memory on Galaxy card? Any chance the results for the MSI card are avg and max?
