NVIDIA GeForce GTX 780 Ti 3GB Review - Full GK110 Crashes into Hawaii

Manufacturer: NVIDIA

GK110 in all its glory

I bet you didn't realize that October and November would turn into the onslaught of graphics card releases they have become.  I know I did not, and I tend to have a better background on these things than most of our readers.  Starting with the release of the AMD Radeon R9 280X, R9 270X and R7 260X in the first week of October, it has been a nearly non-stop battle between NVIDIA and AMD for the hearts, minds, and wallets of PC gamers. 

Shortly after the Tahiti refresh came NVIDIA's move into display technology with G-Sync, a variable refresh rate feature that will work with upcoming monitors from ASUS and others as long as you have a GeForce Kepler GPU.  The technology was damned impressive, but I am still waiting for NVIDIA to send over some panels for extended testing. 

Later in October we were hit with the R9 290X, the Hawaii GPU that brought AMD back into the world of ultra-class single GPU card performance.  It produced stellar benchmarks and undercut the prices (then at least) of the GTX 780 and GTX TITAN.  We tested it in both single and multi-GPU configurations and found that AMD had made some impressive progress in fixing its frame pacing issues, even with Eyefinity and 4K tiled displays. 

NVIDIA dropped a driver release with ShadowPlay, which allows gamers to record gameplay locally without a hit to performance.  I posted a roundup of R9 280X cards which showed alternative coolers and performance ranges.  We investigated the R9 290X Hawaii GPU and the claims that performance is variable and configurable based on fan speeds.  Finally, the R9 290 (non-X model) was released this week to more fanfare than the 290X thanks to its nearly identical performance and $399 price tag. 

And today, yet another release.  NVIDIA's GeForce GTX 780 Ti takes the performance of the GK110 and fully unlocks it.  The GTX TITAN uses one fewer SMX and the GTX 780 has three fewer SMX units so you can expect the GTX 780 Ti to, at the very least, become the fastest NVIDIA GPU available.  But can it hold its lead over the R9 290X and validate its $699 price tag?

Continue reading our review of the NVIDIA GeForce GTX 780 Ti 3GB GK110 Graphics Card!!

GeForce GTX 780 Ti - GK110 Full Implementation

GK110 isn't new to us, but this implementation of it is.  When NVIDIA launched the GTX TITAN it was with an incomplete GPU - one of the 15 SMX units on the die was disabled, bringing its CUDA core count to 2,688.

But now, with the new GeForce GTX 780 Ti, NVIDIA is enabling all 15 SMX units for a grand total of 2,880 CUDA cores, all running at a base clock of 875 MHz and a typical Boost clock of 928 MHz.  These clock speeds are both increases over the GTX TITAN as well (39 MHz on base, 52 MHz on boost), which tells us that overall compute performance of the 780 Ti will outstrip the TITAN for gaming.  Texture units also increase from 224 to 240, giving the card a bit more texture fill rate.
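As a quick sanity check on those core counts: each Kepler GK110 SMX carries 192 CUDA cores, so the three configurations line up exactly. A minimal sketch (mine, not NVIDIA's):

```python
CORES_PER_SMX = 192  # Kepler GK110 packs 192 CUDA cores into each SMX

# (card, enabled SMX units) for the three GK110-based boards
for name, smx in [("GTX 780 Ti", 15), ("GTX TITAN", 14), ("GTX 780", 12)]:
    print(f"{name}: {smx * CORES_PER_SMX} CUDA cores")
# GTX 780 Ti: 2880 CUDA cores
# GTX TITAN: 2688 CUDA cores
# GTX 780: 2304 CUDA cores
```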

The 384-bit memory interface remains the same, though the GTX 780 Ti has 3GB of GDDR5 memory running at 7.0 Gbps, a full 1.0 Gbps increase over the TITAN specifications.  Memory bandwidth improves to 336 GB/s, which just passes the 320 GB/s of the R9 290X and its 512-bit memory interface.  The 6GB frame buffer on the TITAN is still obviously king for users that need it, but in my testing there isn't a big gaming advantage compared to the 3GB buffer on the GTX 780 or GTX 780 Ti.
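The bandwidth math is easy to verify yourself: bus width in bytes times the effective per-pin data rate. A quick sketch of the arithmetic (not from NVIDIA's materials):

```python
def gddr5_bandwidth(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (effective Gbps per pin)."""
    return bus_width_bits / 8 * data_rate_gbps

print(gddr5_bandwidth(384, 7.0))  # GTX 780 Ti -> 336.0 GB/s
print(gddr5_bandwidth(512, 5.0))  # R9 290X   -> 320.0 GB/s
```

The R9 290X's wider 512-bit bus runs its memory at a slower 5.0 Gbps effective rate, which is how the narrower 384-bit card ends up with more total bandwidth.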

If we compare the GTX 780 to the GTX 780 Ti, obviously the newcomer has the advantage in all areas.

|               | GTX 780 Ti  | GTX TITAN   | GTX 780     | R9 290X        | R9 290        | R9 280X     |
|---------------|-------------|-------------|-------------|----------------|---------------|-------------|
| Process       | 28nm        | 28nm        | 28nm        | 28nm           | 28nm          | 28nm        |
| Transistors   | 7.1 billion | 7.1 billion | 7.1 billion | 6.2 billion    | 6.2 billion   | 4.3 billion |
| Shaders       | 2880        | 2688        | 2304        | 2816           | 2560          | 2048        |
| Clock Speed   | 875 MHz     | 836 MHz     | 863 MHz     | up to 1000 MHz | up to 947 MHz | 1000 MHz    |
| Memory Width  | 384-bit     | 384-bit     | 384-bit     | 512-bit        | 512-bit       | 384-bit     |
| Memory Clock  | 1750 MHz    | 1500 MHz    | 1500 MHz    | 1250 MHz       | 1250 MHz      | 1500 MHz    |
| Compute Perf  | 5.04 TFLOPS | 4.49 TFLOPS | 3.97 TFLOPS | 5.6 TFLOPS     | 4.9 TFLOPS    | 4.1 TFLOPS  |
| Texture Units | 240         | 224         | 192         | 176            | 160           | 128         |
| ROPs          | 48          | 48          | 48          | 64             | 64            | 32          |
| Frame Buffer  | 3GB         | 6GB         | 3GB         | 4GB            | 4GB           | 3GB         |

*Please note that these TFLOP ratings are using NVIDIA's base clock while AMD's results are using their "up to" peak clock.  While interesting to see, you should really be comparing only NV to NV on the table above.
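For reference, those TFLOP figures come from the standard peak-rate formula: two FLOPs (one fused multiply-add) per shader per clock. A quick sketch of the arithmetic:

```python
def peak_tflops(shaders, clock_mhz):
    """Peak single-precision throughput: 2 FLOPs (one FMA) per shader per cycle."""
    return 2 * shaders * clock_mhz * 1e6 / 1e12

print(round(peak_tflops(2880, 875), 2))   # GTX 780 Ti at base clock -> 5.04
print(round(peak_tflops(2816, 1000), 2))  # R9 290X at "up to" clock -> 5.63
```

This is exactly why the footnote matters: NVIDIA's 5.04 TFLOPS uses the guaranteed base clock, while AMD's 5.6 TFLOPS assumes the card sustains its peak 1000 MHz, which the throttling discussion later calls into question.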

With 25% more CUDA cores, while also running at slightly higher clock speeds, the GTX 780 Ti should be a sizeable step ahead of the original GTX 780 in performance across the board.  Add in the higher memory bandwidth, and the gap should only widen.

The GTX 780 Ti card itself is rather unremarkable in its deviation from the standard NVIDIA has set for itself with high end GPU offerings in the last year plus.  The 780 Ti uses the same style of cooler we saw first with the GTX 690 and it continues to be a stylish, high performance solution for cooling massive, hot GPUs.  What will be interesting to evaluate this time is how much louder and hotter NVIDIA was willing to go to compete with AMD's new standards on the R9 290X and R9 290.

The reference card still looks sexy, and the addition of the darker lettering on the "GTX 780 Ti" routing adds some aggressiveness to the color scheme.  Card length remains the same as the GTX 780 and GTX TITAN at 10.5 inches.

NVIDIA has continued to include a pair of dual-link DVI outputs, a full-size HDMI and a full-size DisplayPort for monitor connectivity.

Somewhat surprisingly, the GTX 780 Ti still uses an 8+6 pin power configuration despite the higher core count and clock speeds, and it maintains the same 250 watt TDP as the GTX 780 and TITAN.

November 7, 2013 | 09:51 AM - Posted by Prodeous

Is there a possibility to see some GPU computing benchmarks?

Like Blender or Civ or others. Just to see the differences between the 780, 780 Ti and TITAN, and whether double precision in fact makes that much of a difference in these applications?

November 8, 2013 | 10:12 AM - Posted by Anonymous (not verified)

Tom's Hardware has a review that goes over a lot of that.

November 7, 2013 | 09:54 AM - Posted by EndTimeKeeper

Hey Ryan, what software do you use to track GPU clock frequencies and temps? I ask because I would like to monitor my new 290X and see what level my card is running at while playing BF4. Also, can I trust the AMD control center app when it says that my card is running at 92C with my GPU clock speed at 1GHz throughout a 40 minute match of BF4? Just to let you know, my settings are set to Ultra @ 1440p.

November 7, 2013 | 10:02 AM - Posted by Ryan Shrout

I'm just using GPU-Z.


November 8, 2013 | 03:40 AM - Posted by arbiter

Hrm, my 780 running BF4 on ultra @ 1080p only tops out at 67C with a clock of 1175MHz on the GPU and 6.5GHz on the memory.

November 7, 2013 | 10:09 AM - Posted by Anonymous (not verified)

Um, why is the Titan still about a thousand bucks? Surely 3gig more ram isn't worth the difference.

November 7, 2013 | 10:24 AM - Posted by Titan FP64 FTW (not verified)

Titan retains its 1/3 FP64 rate while the 780 Ti only gets 1/24, which is really bad. Even the R9 290X gets a 1/8 FP64 rate. Keep in mind these are only useful to those that need GPU computing cards. Gamers need not apply.

November 7, 2013 | 04:42 PM - Posted by Scott Michaud

While high precision is not necessary for games, there are a few areas where it can get chicken/eggy. Could they improve in some areas if the performance was ubiquitous? Maybe.

November 7, 2013 | 10:27 AM - Posted by jon (not verified)

seriously ryan where are the REAL measured clockspeeds?

Are we looking at a ~960mhz 780Ti beating the 290X @~800mhz?

Without the clock speeds this comparison is broken..

November 7, 2013 | 10:45 AM - Posted by Ryan Shrout

If you really want to see the clock speeds, check out this story:

It is interesting to discuss and showcase, but performance is performance, regardless of the clock rates.  So looking at gaming benchmarks is really where the decision should lie.

November 7, 2013 | 10:31 AM - Posted by Anonymous (not verified)

So the Titan is the poor man's Quadro! I too would like to see some Blender benchmarks; maybe someone at pcper who is not as busy with the gaming benchmarks as Ryan could do some benchmarking of open source graphics software using these GPUs! As low cost as the AMD R9 290 series is, and as powerful as Blender is becoming, it would be very helpful if someone at pcper would test more open source graphics software, as most of the Blender websites lack the technical hardware knowledge that pcper has!

November 7, 2013 | 10:45 AM - Posted by Ryan Shrout

We'll see what we can dig up!

November 7, 2013 | 10:33 AM - Posted by The Common Cold (not verified)

Is Skyrim in these benchmarks just to have a DX9 title in the mix? If that is the case, why not use a more demanding title, like The Witcher 2?

November 7, 2013 | 10:46 AM - Posted by Ryan Shrout

Hmm, that might be a good idea.  I just have a lot of legacy comparison results with Skyrim.  I'll try to install The Witcher 2 this weekend and see if I can find a good benchmark segment.

November 7, 2013 | 11:27 AM - Posted by The Common Cold (not verified)

Maybe we finally have a single GPU that can handle that crazy UberSampling...
Good luck!

November 7, 2013 | 12:48 PM - Posted by KevTheGuy (not verified)

Yeah that ubersampling is so demanding but it makes the game look so much better *_*

November 7, 2013 | 10:55 AM - Posted by TechZombie

I love competition! Personally I'm waiting for some benchmarks from Hawaii with 3rd party cooling before I make a buying decision. But you have to admit that with the games bundled Nvidia is doing an amazing job.

November 7, 2013 | 12:05 PM - Posted by azuza001 (not verified)

Sorry if this has already been brought up, but I do have to ask: from the reviews I have read of the R9 290 vs the 290X, it seems the general consensus is the regular 290 often gives the same results due to the way AMD is handling boost speeds now. If that's the case, then could you not further say that you could buy almost the same performance as the 780 Ti for much less with a 290?

November 7, 2013 | 12:22 PM - Posted by Dave (not verified)

Ryan, when will you be implementing BF4 to the benchmark lineup? Please.

November 7, 2013 | 01:14 PM - Posted by Ryan Shrout

Soon hopefully!  I haven't even had a chance to install it yet honestly though...

November 7, 2013 | 12:42 PM - Posted by Mac (not verified)

Ryan, when you say this:
" Even though AMD's 290X cards have a better average frame rate and lower frame times, the variance in those frames is noticeably higher again. Take a look at the width of the blue line (780 Ti SLI) and the orange line (290X CrossFire) and you'll see that NVIDIA tends to be more consistent, making the average frame rate less of a victory for AMD than it might otherwise be."
Are you calling it on the data or is it something you actually see on screen? I think a good subjective opinion would be much more useful than any data when the variance is this low, what do you reckon?

November 7, 2013 | 01:15 PM - Posted by Ryan Shrout

The answer is actually both.  Sometimes, the data shows a problem that I then go back and verify and sometimes I see the problem first, then verify it with data.  At this point though, I have tested these games and these configurations enough that the surprise is gone.

In that particular case, I would tell you that you can definitely see and feel the difference while playing in real time.

November 7, 2013 | 01:32 PM - Posted by Mac (not verified)

I see, thank you, all this data hurts my head so I go by "feel" now :-)

November 7, 2013 | 01:12 PM - Posted by snook

the first R9 290 or R9 290X with aftermarket cooling is going to beat the 780Ti.

this card had to destroy the AMD offerings. it didn't.
how much crow is being served over the noise and power draw
of the 290's now? lots.

ryan, you seemed physically saddened by the underwhelming performance, loudness and power hungry traits of this card that Nv HAD to make. and that price...yeow.

this is as much a loss as it is a win for Nv. they were not ready for what AMD did. hell, now I'm sad. :/

November 7, 2013 | 01:17 PM - Posted by Ryan Shrout

I agree that NVIDIA likely wanted more performance out of the 780 Ti, but they didn't find it.  Still, they are getting 10-15% better perf in the most demanding games at less power, heat and noise than the 290X.  But it does cost more, which is its primary drawback.

November 7, 2013 | 01:23 PM - Posted by snook

ryan. any idea if AMD has a 300X hiding some place?
the fight could keep going next week if they did. lol

November 7, 2013 | 04:03 PM - Posted by Ryan Shrout

I don't think so, not for a while.

November 8, 2013 | 04:13 PM - Posted by BlackDove (not verified)

How were they not ready? GK110 is ancient now, and it manages to beat the newest AMD GPU. The R9 290 and R9 290X use a brand new GPU with a 512-bit memory bus and this thing still beats it, consumes less power, and doesn't get nearly as hot.

It's $150 more. That's about the only negative.


Do the 780 and 780Ti have the double precision capability that Titan does? If so, this and 780 are even better values.

November 9, 2013 | 11:45 PM - Posted by snook

so ancient it's their premier card. the "new" 290 is a modified GCN, not new. and it was pointed out correctly that these cards throttle (290) and the 780Ti wasn't doing that. it won on that strength alone.

and Nv had to allow higher core temp and fan speed to achieve this "win"...stunningly familiar.

the first partner produced R9 290X is gonna beat the 780Ti and I don't want to hear "it was running at a higher clock". ryan excused that away in this comment section.

and Nv wasn't ready or they wouldn't have made the mad scramble to reduce prices to save face. accept it.

November 7, 2013 | 01:29 PM - Posted by snook

forgot to say, great article. you again proved yourself an unbiased reviewer.
let your critics be damned.

November 7, 2013 | 03:51 PM - Posted by BA (not verified)

Ryan I have to say that your review was very well done. Thank you for this.

However, when you make a comment like:
"...if you are gaming on a graphics card like this on a standard 1080p panel, you are doing it wrong quite frankly. Sorry.",
you are completely out to lunch.

Those of us with 120 Hz and higher 1080 panels would beg to differ, and as you can clearly see from the benchmarks at techspot, these high-end cards are really necessary to get consistent >60 FPS in Battlefield 4 at the highest settings- not to mention the new games coming down the pipe in the near future.

November 7, 2013 | 05:11 PM - Posted by Ryan Shrout

Very true, I apologize for that.  120 Hz panels change that quite a bit.  I'll update!

November 7, 2013 | 06:42 PM - Posted by Anonymous (not verified)

Can we see a throttling comparison ?

780 Ti

Be nice to see which one is able to keep their performance the best over a longer period of gaming.

November 7, 2013 | 11:29 PM - Posted by arbiter

That's an easy one: if you look at the 290X review, when running a benchmark twice, one right after the other, the AMD card had about a 10% drop in the 2nd run because the cooler is soaked with heat and not cooling as well. From what I see, AMD used 95C because they wanted to beat NVIDIA, but because the GPU is running near its max, it has no room past that set max. As for which will keep its performance, it would be the 780/780 Ti, as they run cooler so they don't have to throttle as much.

November 8, 2013 | 05:59 AM - Posted by Anonymous (not verified)

This site indicates the 780 Ti still throttles unless you increase the fan speed to 80%.

November 9, 2013 | 01:03 AM - Posted by arbiter

The card is running at default settings, which set an 80C throttle point, so sure, the card hits 80C and throttles. Running the fan at 80% pushes more air through, so the card stays cooler, hence the higher clock.

November 9, 2013 | 11:47 PM - Posted by snook

and likewise with the 290's

November 10, 2013 | 01:57 AM - Posted by arbiter

well, the problem with the 290 is that at 95C it has almost no room past that, whereas the nvidia card has plenty. i could set my 780 to 95C before clocking back if i wanted, but the card will never get that high.

November 10, 2013 | 02:48 PM - Posted by snook

at 80% fixed fan speed i'm willing to bet the 290 doesn't hit 95c either. Nv is saving face with this loud, hot card. you can't down AMD for it and then give Nv a pass.

they have the same issues to varying degrees

November 8, 2013 | 04:11 AM - Posted by Anonymous (not verified)

Quote "Titan retains its 1/3 FP64 while the 780Ti only gets 1/24 which is really bad".
Is there any truth in this? The chart on the first page clearly shows the 780 Ti has 2880 single precision cores and 960 double precision; 2880 divided by 960 is 3. According to that chart, the 780 Ti is not 1/24 FP64 but the same as the Titan, 1/3. Is this not correct?

November 8, 2013 | 10:01 AM - Posted by BBMan (not verified)

After reading about all these cards and their growing power consumption, is there any way you might be able to do a review on "green" cards that still deliver decent numbers but are much more concerned about our power $ockets?

November 8, 2013 | 07:11 PM - Posted by arbiter

These are the highest end cards; power consumption is not really a consideration when buying them.

November 8, 2013 | 02:06 PM - Posted by Anonymous (not verified)

Nvidia won !

November 8, 2013 | 02:09 PM - Posted by Anonymous (not verified)

Nvidia Launches GTX 780 Ti -- Faster Than Titan For $300 Less...FORBES

November 8, 2013 | 05:15 PM - Posted by Ahlan (not verified)

This was the review at techpowerup...

what a joke reviewers have become...

November 8, 2013 | 08:55 PM - Posted by arbiter

Funny part is that is data from 2 different sites. If you used techpowerup's review, they only say 39dB for fan noise on the 780 Ti, not 50+. So? It also neglects that there's an 84C vs 95C temp difference. On noise level, given the levels they show, it seems like they got the mic right next to the GPU, so compared to numbers elsewhere they are higher.


here is fixed version of the image.

November 8, 2013 | 09:36 PM - Posted by Airbrushkid (not verified)

I'd still rather buy the Titan. I use 3 monitors, and with the Titan you only need 1 card. It seems with the 780 Ti you would need 2 cards. So the Titan is cheaper. Am I wrong?

November 8, 2013 | 09:54 PM - Posted by Oubadah (not verified)

Why do you guys insist on using that Skyrim sequence to benchmark GPUs? You could hardly have chosen a more ill-suited location. All that effort wasted because you run a GPU benchmark in a thoroughly CPU-limited area, when there are actually plenty of GPU-limited areas to choose from in the game.

November 10, 2013 | 11:49 AM - Posted by Jodiuh (not verified)

What game/test is used to determine total power usage of the entire system? How is this number identified?

November 10, 2013 | 08:29 PM - Posted by drbaltazar (not verified)

The 780 Ti is capable of way more, but the cooler doesn't like it. So now all those engineers need to be sent to Cooling a Computer 101 if they hope to gain more performance in the future, because clearly both NVIDIA and AMD have reached, or are very close to reaching, a plateau.

November 10, 2013 | 09:21 PM - Posted by raghunathan (not verified)

Why are the exact settings for Crysis 3 not shown as they are for BF3 or Metro: LL? This is an inconsistency you should eliminate.

Also, did you run the R9 290X with the uber fan speed? Because otherwise the throttling is quite severe for R9 290X CF; there is a 10-20% perf loss at the lower fan speed.

People who spend USD 550 are definitely going to try to get the best performance possible. Since AMD supports uber mode with a BIOS switch, there is every reason to test it, since it's guaranteed by AMD to work and they back it with their warranty.

November 11, 2013 | 02:11 PM - Posted by Anonymous (not verified)

I am getting pretty tired of benchmarks like these, not comparing a new flagship card with the previous title holder at the resolutions where they matter. Sure, you give me the most common, such as 1920x1080 and 2560x1440, but nothing for multi-monitor setups, and even your 4K resolution review excludes the previous title holder, the GTX Titan.
