
NVIDIA GeForce GTX 780 Ti 3GB Review - Full GK110 Crashes into Hawaii

Manufacturer: NVIDIA

GK110 in all its glory

I bet you didn't realize that October and November would turn into the onslaught of graphics card releases they have.  I know I didn't, and I tend to have a better background on these things than most of our readers.  Starting with the release of the AMD Radeon R9 280X, R9 270X, and R7 260X in the first week of October, it has been a nearly non-stop battle between NVIDIA and AMD for the hearts, minds, and wallets of PC gamers. 

Shortly after the Tahiti refresh came NVIDIA's move into display technology with G-Sync, a variable refresh rate feature that will work with upcoming monitors from ASUS and others as long as you have a GeForce Kepler GPU.  The technology was damned impressive, but I am still waiting for NVIDIA to send over some panels for extended testing. 

Later in October we were hit with the R9 290X, the Hawaii GPU that brought AMD back into the world of ultra-class single GPU card performance.  It produced stellar benchmarks and undercut the prices (then, at least) of the GTX 780 and GTX TITAN.  We tested it in both single and multi-GPU configurations and found that AMD had made some impressive progress in fixing its frame pacing issues, even with Eyefinity and 4K tiled displays. 

NVIDIA dropped a driver release with ShadowPlay, which allows gamers to record gameplay locally without a performance hit.  I posted a roundup of R9 280X cards which showed alternative coolers and performance ranges.  We investigated the R9 290X Hawaii GPU and the claims that performance is variable and configurable based on fan speeds.  Finally, the R9 290 (non-X model) was released this week to more fanfare than the 290X thanks to its nearly identical performance and $399 price tag. 


And today, yet another release.  NVIDIA's GeForce GTX 780 Ti takes the performance of the GK110 and fully unlocks it.  The GTX TITAN uses one fewer SMX and the GTX 780 has three fewer SMX units so you can expect the GTX 780 Ti to, at the very least, become the fastest NVIDIA GPU available.  But can it hold its lead over the R9 290X and validate its $699 price tag?

Continue reading our review of the NVIDIA GeForce GTX 780 Ti 3GB GK110 Graphics Card!!

GeForce GTX 780 Ti - GK110 Full Implementation

GK110 isn't new to us, but this implementation of it is.  When NVIDIA launched the GTX TITAN it was with an incomplete GPU: one of the 15 SMX units on the die was disabled, bringing its CUDA core count to 2,688.


But now, with the new GeForce GTX 780 Ti, NVIDIA is enabling all 15 SMX units for a grand total of 2,880 cores, all running at a base clock of 875 MHz and a typical Boost clock of 928 MHz.  These clock speeds are both increases over the GTX TITAN as well (39 MHz on base, 52 MHz on boost), which tells us that overall compute performance of the 780 Ti will outstrip the TITAN for gaming.  Texture units also increase from 224 to 240, giving the card a bit more texture fill rate.

The 384-bit memory interface remains the same, though the GTX 780 Ti has 3GB of GDDR5 memory running at 7.0 Gbps, a full 1.0 Gbps increase over the TITAN specifications.  Memory bandwidth improves to 336 GB/s, which just passes the 320 GB/s of the R9 290X and its 512-bit memory interface.  The 6GB frame buffer on TITAN is still obviously king for users that need it, but in my testing there isn't a big gaming advantage over the 3GB buffer on the GTX 780 or GTX 780 Ti.
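Those bandwidth figures fall straight out of the per-pin data rate and bus width; a quick back-of-the-envelope sketch (the 5.0 Gbps figure for the 290X is an assumption on my part, its 1250 MHz GDDR5 clock times four for the quad-pumped data rate):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte
def mem_bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

print(mem_bandwidth_gb_s(7.0, 384))  # GTX 780 Ti: 336.0 GB/s
print(mem_bandwidth_gb_s(5.0, 512))  # R9 290X:    320.0 GB/s
```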


If we compare the GTX 780 to the GTX 780 Ti, obviously the newcomer has the advantage in all areas.

               GTX 780 Ti    GTX TITAN     GTX 780      R9 290X         R9 290         R9 280X
Process        28nm          28nm          28nm         28nm            28nm           28nm
Transistors    7.1 billion   7.1 billion   7.1 billion  6.2 billion     6.2 billion    4.3 billion
Shaders        2880          2688          2304         2816            2560           2048
Clock Speed    875 MHz       836 MHz       863 MHz      up to 1000 MHz  up to 947 MHz  1000 MHz
Memory Width   384-bit       384-bit       384-bit      512-bit         512-bit        384-bit
Memory Clock   1750 MHz      1500 MHz      1500 MHz     1250 MHz        1250 MHz       1500 MHz
Compute Perf   5.04 TFLOPS   4.49 TFLOPS   3.97 TFLOPS  5.6 TFLOPS     4.9 TFLOPS     4.1 TFLOPS
Texture Units  240           224           192          176             160            128
ROPs           48            48            48           64              64             32
Frame Buffer   3GB           6GB           3GB          4GB             4GB            3GB

*Please note that these TFLOP ratings are using NVIDIA's base clock while AMD's results are using their "up to" peak clock.  While interesting to see, you should really be comparing only NV to NV on the table above.
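For reference, the compute figures in the table come from the standard peak-FLOPS formula: shader count times two FLOPs per clock (one fused multiply-add) times clock rate. A minimal sketch:

```python
# Peak single-precision throughput in TFLOPS:
# shaders * 2 FLOPs per clock (one FMA) * clock in MHz, scaled to tera-FLOPs
def peak_tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1e6

print(round(peak_tflops(2880, 875), 2))   # GTX 780 Ti at base clock: 5.04
print(round(peak_tflops(2816, 1000), 2))  # R9 290X at "up to" clock: 5.63
```

This is also why the footnote matters: plug in the 780 Ti's typical boost clock instead of its base clock and the NVIDIA numbers move noticeably.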

With 25% more CUDA cores than the GTX 780, while also running at slightly higher clock speeds, the GTX 780 Ti should be a sizeable step ahead of the original GTX 780 in performance in all areas.  Add in the higher memory bandwidth and that gap should only grow at higher resolutions.

The GTX 780 Ti card itself is rather unremarkable in its deviation from the standard NVIDIA has set for itself with high end GPU offerings in the last year plus.  The 780 Ti uses the same style of cooler we saw first with the GTX 690 and it continues to be a stylish, high performance solution for cooling massive, hot GPUs.  What will be interesting to evaluate this time is how much louder and hotter NVIDIA was willing to go to compete with AMD's new standards on the R9 290X and R9 290.


The reference card still looks sexy, and the darker lettering in the routed "GTX 780 Ti" logo adds some aggressiveness to the color scheme.  Card length remains the same as the GTX 780 and GTX TITAN at 10.5 inches.


NVIDIA has continued to include a pair of dual-link DVI outputs, a full-size HDMI and a full-size DisplayPort for monitor connectivity.


Somewhat surprisingly, the GTX 780 Ti still uses only an 8+6 pin power configuration despite the higher core count and clock speeds, and it maintains the same 250 watt TDP as the GTX 780 and TITAN.

November 7, 2013 | 06:51 AM - Posted by Prodeous

Is there a possibility to see some GPU computing benchmarks?

Like Blender or CIV or others. Just to see the differences between the 780, 780 Ti, and TITAN, and to see if double precision in fact makes that much of a difference in these applications?

November 8, 2013 | 07:12 AM - Posted by Anonymous (not verified)

Tom's Hardware has a review that goes over a lot of that. http://www.tomshardware.com/reviews/geforce-gtx-780-ti-review-benchmarks...

November 7, 2013 | 06:54 AM - Posted by EndTimeKeeper

Hey Ryan, what software do you use to track GPU clock frequencies and temps? I ask because I would like to monitor my new 290X and see what level my card is running at while playing BF4. Also, can I trust the AMD control center app when it says that my card is running at 92C with my GPU clock speed at 1GHz throughout a 40 minute match of BF4? Just to let you know, my settings are set to Ultra @1440p.

November 7, 2013 | 07:02 AM - Posted by Ryan Shrout

I'm just using GPU-Z.  http://www.techpowerup.com/gpuz/

 

November 8, 2013 | 12:40 AM - Posted by arbiter

hrm, my 780 running BF4 on Ultra @1080p only tops out at 67C with a clock of 1175 MHz on the GPU and 6.5 GHz on the memory

November 7, 2013 | 07:09 AM - Posted by Anonymous (not verified)

Um, why is the Titan still about a thousand bucks? Surely 3gig more ram isn't worth the difference.

November 7, 2013 | 07:24 AM - Posted by Titan FP64 FTW (not verified)

Titan retains its 1/3 rate FP64 while the 780 Ti only gets 1/24, which is really bad. Even the R9 290X gets 1/8 rate FP64. Keep in mind these are only useful to those that need GPU computing cards. Gamers need not apply.

November 7, 2013 | 01:42 PM - Posted by Scott Michaud

While high precision is not necessary for games, there are a few areas where it can get chicken/eggy. Could they improve in some areas if the performance was ubiquitous? Maybe.

November 7, 2013 | 07:27 AM - Posted by jon (not verified)

seriously ryan where are the REAL measured clockspeeds?

Are we looking at a ~960mhz 780Ti beating the 290X @~800mhz?

Without the clock speeds this comparison is broken..

November 7, 2013 | 07:45 AM - Posted by Ryan Shrout

If you really want to see the clock speeds, check out this story: http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-290X-Hawaii-Configurable-GPU

It is interesting to discuss and showcase, but performance is performance, regardless of the clock rates.  So looking at gaming benchmarks is really where the decision should lie.

November 7, 2013 | 07:31 AM - Posted by Anonymous (not verified)

So the Titan is the poor man's Quadro! I too would like to see some Blender benchmarks; maybe someone at pcper who is not as busy with the gaming benchmarks as Ryan could do some benchmarking of open source graphics software using these GPUs! As low cost as the AMD R9 290 series is, and as powerful as Blender is becoming, it would be very helpful if someone at pcper would test more open source graphics software, as most of the Blender websites lack the technical hardware knowledge that pcper has!

November 7, 2013 | 07:45 AM - Posted by Ryan Shrout

We'll see what we can dig up!

November 7, 2013 | 07:33 AM - Posted by The Common Cold (not verified)

Is Skyrim in these benchmarks just to have a DX9 title in the mix? If that is the case, why not use a more demanding title, like The Witcher 2?

November 7, 2013 | 07:46 AM - Posted by Ryan Shrout

Hmm, that might be a good idea.  I just have a lot of legacy comparison results with Skyrim.  I'll try to install The Witcher 2 this weekend and see if I can find a good benchmark segment.

November 7, 2013 | 08:27 AM - Posted by The Common Cold (not verified)

Maybe we finally have a single GPU that can handle that crazy UberSampling...
Good luck!

November 7, 2013 | 09:48 AM - Posted by KevTheGuy (not verified)

Yeah that ubersampling is so demanding but it makes the game look so much better *_*

November 7, 2013 | 07:55 AM - Posted by TechZombie

I love competition! Personally I'm waiting for some benchmarks from Hawaii with 3rd party cooling before I make a buying decision. But you have to admit that with the games bundled Nvidia is doing an amazing job.

November 7, 2013 | 09:05 AM - Posted by azuza001 (not verified)

Sorry if this has already been brought up, but I do have to ask: from the reviews I have read of the R9 290 vs the 290X, it seems the general consensus is the regular 290 often gives the same results due to the way AMD is handling boost speeds now. If that's the case, then could you not further say that you could buy almost the same performance as the 780 Ti for nearly half the cost with a 290?

November 7, 2013 | 09:22 AM - Posted by Dave (not verified)

Ryan, when will you be implementing BF4 to the benchmark lineup? Please.

November 7, 2013 | 10:14 AM - Posted by Ryan Shrout

Soon hopefully!  I haven't even had a chance to install it yet honestly though...

November 7, 2013 | 09:42 AM - Posted by Mac (not verified)

Ryan, when you say this:
" Even though AMD's 290X cards have a better average frame rate and lower frame times, the variance in those frames is noticeably higher again. Take a look at the width of the blue line (780 Ti SLI) and the orange line (290X CrossFire) and you'll see that NVIDIA tends to be more consistent, making the average frame rate less of a victory for AMD than it might otherwise be."
Are you calling it on the data or is it something you actually see on screen? I think a good subjective opinion would be much more useful than any data when the variance is this low, what do you reckon?

November 7, 2013 | 10:15 AM - Posted by Ryan Shrout

The answer is actually both.  Sometimes, the data shows a problem that I then go back and verify and sometimes I see the problem first, then verify it with data.  At this point though, I have tested these games and these configurations enough that the surprise is gone.

In that particular case, I would tell you that you can definitely see and feel the difference while playing in real time.

November 7, 2013 | 10:32 AM - Posted by Mac (not verified)

I see, thank you, all this data hurts my head so I go by "feel" now :-)

November 7, 2013 | 10:12 AM - Posted by snook

the first R9 290 or R9 290X with aftermarket cooling is going to beat the 780 Ti.

this card had to destroy the AMD offerings. it didn't.
how much crow is being served over the noise and power draw
of the 290s now? lots.

ryan, you seemed physically saddened by the underwhelming performance, loudness and power hungry traits of this card that Nv HAD to make. and that price...yeow.

this is as much a loss as it is a win for Nv. they were not ready for what AMD did. hell, now I'm sad. :/

November 7, 2013 | 10:17 AM - Posted by Ryan Shrout

I agree that NVIDIA likely wanted more performance out of the 780 Ti, but they didn't find it.  Still, they are getting 10-15% better performance in the most demanding games at less power, heat, and noise than the 290X.  But it does cost more, which is its primary drawback.

November 7, 2013 | 10:23 AM - Posted by snook

ryan. any idea if AMD has a 300X hiding some place?
the fight could keep going next week if they did. lol

November 7, 2013 | 01:03 PM - Posted by Ryan Shrout

I don't think so, not for a while.

November 8, 2013 | 01:13 PM - Posted by BlackDove (not verified)

How were they not ready? GK110 is ancient now, and it manages to beat the newest AMD GPU. The R9 290 and R9 290X use a brand new GPU with a 512-bit memory bus, and this thing still beats it, consumes less power, and doesn't get nearly as hot.

It's $150 more. That's about the only negative.

Ryan,

Do the 780 and 780Ti have the double precision capability that Titan does? If so, this and 780 are even better values.

November 9, 2013 | 08:45 PM - Posted by snook

so ancient it's their premier card. the "new" 290 is a modified GCN, not new. and it was pointed out correctly that these cards throttle (290) and the 780 Ti wasn't doing that. it won on that strength alone.

and Nv had to allow higher core temp and fan speed to achieve this "win"...stunningly familiar.

the first partner produced R9 290X is gonna beat the 780Ti and I don't want to hear "it was running at a higher clock". ryan excused that away in this comment section.

and Nv wasn't ready or they wouldn't have made the mad scramble to reduce prices to save face. accept it.

November 7, 2013 | 10:29 AM - Posted by snook

forgot to say, great article. you again proved yourself an unbiased reviewer.
let your critics be damned.
