
NVIDIA GeForce GTX 780 Ti 3GB Review - Full GK110 Crashes into Hawaii

Author: Ryan Shrout
Manufacturer: NVIDIA

Skyrim

The Elder Scrolls V: Skyrim (DirectX 9)


 

The Empire of Tamriel is on the edge. The High King of Skyrim has been murdered.

Alliances form as claims to the throne are made. In the midst of this conflict, a far more dangerous, ancient evil is awakened. Dragons, long lost to the passages of the Elder Scrolls, have returned to Tamriel.

The future of Skyrim, even the Empire itself, hangs in the balance as they wait for the prophesized Dragonborn to come; a hero born with the power of The Voice, and the only one who can stand amongst the dragons.

Our settings for Skyrim

Here is a video of our testing run-through, for your reference:


At 1920x1080, there is really no point in having ANY of these cards for Skyrim...

 


2560x1440 is a bit more stressful on the GPUs but we still have a tie at the top with the R9 290X and the new GTX 780 Ti.

 


Even at 4K these cards struggle to separate, but they are able to maintain more than 70 FPS on average at Skyrim's maximum settings - impressive!

November 7, 2013 | 09:51 AM - Posted by Prodeous

Is there a possibility to see some GPU computing benchmarks?

like Blender, Civilization, or others, just to see the differences between the 780, 780 Ti, and Titan, and whether double precision in fact makes that much of a difference in these applications?

November 8, 2013 | 10:12 AM - Posted by Anonymous (not verified)

Tom's Hardware has a review that goes over a lot of that. http://www.tomshardware.com/reviews/geforce-gtx-780-ti-review-benchmarks...

November 7, 2013 | 09:54 AM - Posted by EndTimeKeeper

Hey Ryan, what software do you use to track GPU clock frequencies and temps? I ask because I would like to monitor my new 290X and see what level my card is running at while playing BF4. Also, can I trust the AMD control center app when it says that my card is running at 92C with my GPU clock speed at 1GHz throughout a 40 minute match of BF4? Just to let you know, my settings are set to Ultra @ 1440p.

November 7, 2013 | 10:02 AM - Posted by Ryan Shrout

I'm just using GPU-Z.  http://www.techpowerup.com/gpuz/
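For anyone wanting to go a step further than watching the GPU-Z window, most monitoring tools can log their sensor readings to a CSV file. Here is a minimal sketch in Python of crunching such a log to answer exactly the question above (was the card throttling, how hot did it get?). The "clock MHz, temperature" column layout is an assumption for illustration - check your own tool's log header and adapt the parsing.

```python
def parse_sample(line):
    """Turn a logged line like '1006 MHz, 92' into (clock_mhz, temp_c)."""
    clock_part, temp_part = line.split(",")
    clock_mhz = int(clock_part.strip().split()[0])
    temp_c = int(temp_part.strip())
    return clock_mhz, temp_c

def summarize(lines, rated_clock_mhz):
    """Min clock, max temperature, and whether the card ever dropped
    below its rated clock (i.e. throttled) during the capture."""
    samples = [parse_sample(l) for l in lines]
    clocks = [c for c, _ in samples]
    temps = [t for _, t in samples]
    return {
        "min_clock": min(clocks),
        "max_temp": max(temps),
        "throttled": min(clocks) < rated_clock_mhz,
    }

log = ["1000 MHz, 90", "1000 MHz, 93", "840 MHz, 94"]
print(summarize(log, rated_clock_mhz=1000))
# {'min_clock': 840, 'max_temp': 94, 'throttled': True}
```

A steady 1GHz clock throughout a match, as described above, would show up here as `throttled: False` even at 92C.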

 

November 8, 2013 | 03:40 AM - Posted by arbiter

hrm, my 780 running BF4 on Ultra @ 1080p only tops out at 67C with a clock of 1175MHz on the GPU and 6.5GHz on the memory.

November 7, 2013 | 10:09 AM - Posted by Anonymous (not verified)

Um, why is the Titan still about a thousand bucks? Surely 3GB more RAM isn't worth the difference.

November 7, 2013 | 10:24 AM - Posted by Titan FP64 FTW (not verified)

Titan retains its 1/3-rate FP64 while the 780 Ti only gets 1/24, which is really bad. Even the R9 290X gets 1/8-rate FP64. Keep in mind these rates only matter to those that need GPU computing cards. Gamers need not apply.

November 7, 2013 | 04:42 PM - Posted by Scott Michaud

While high precision is not necessary for games, there are a few areas where it can get chicken/eggy. Could they improve in some areas if the performance was ubiquitous? Maybe.
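To put the ratios quoted above in perspective, here is a back-of-envelope sketch of theoretical double-precision throughput. The peak FP32 figures are approximate published numbers, not measurements, and the FP64 ratios are the ones cited in the comment above.

```python
# name: (approx. peak FP32 TFLOPS, FP64:FP32 ratio)
cards = {
    "GTX Titan":  (4.5, 1 / 3),
    "GTX 780 Ti": (5.0, 1 / 24),
    "R9 290X":    (5.6, 1 / 8),
}

for name, (fp32_tflops, ratio) in cards.items():
    fp64 = fp32_tflops * ratio
    print(f"{name}: ~{fp64:.2f} TFLOPS FP64")
# GTX Titan: ~1.50 TFLOPS FP64
# GTX 780 Ti: ~0.21 TFLOPS FP64
# R9 290X: ~0.70 TFLOPS FP64
```

Even on these rough numbers, the Titan's compute advantage over the 780 Ti is roughly 7x in double precision, which is why it can still command the higher price for non-gaming workloads.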

November 7, 2013 | 10:27 AM - Posted by jon (not verified)

Seriously Ryan, where are the REAL measured clock speeds?

Are we looking at a ~960MHz 780 Ti beating the 290X @ ~800MHz?

Without the clock speeds this comparison is broken...

November 7, 2013 | 10:45 AM - Posted by Ryan Shrout

If you really want to see the clock speeds, check out this story: http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-290X-Hawaii-Configurable-GPU

It is interesting to discuss and showcase, but performance is performance, regardless of the clock rates.  So looking at gaming benchmarks is really where the decision should lie.

November 7, 2013 | 10:31 AM - Posted by Anonymous (not verified)

So the Titan is the poor man's Quadro! I too would like to see some Blender benchmarks; maybe someone at pcper who is not as busy with the gaming benchmarks as Ryan could do some benchmarking of open source graphics software using these GPUs! As low cost as the AMD R9 290 series is, and as powerful as Blender is becoming, it would be very helpful if someone at pcper would test more open source graphics software, as most of the Blender websites lack the technical hardware knowledge that pcper has!

November 7, 2013 | 10:45 AM - Posted by Ryan Shrout

We'll see what we can dig up!

November 7, 2013 | 10:33 AM - Posted by The Common Cold (not verified)

Is Skyrim in these benchmarks just to have a DX9 title in the mix? If that is the case, why not use a more demanding title, like The Witcher 2?

November 7, 2013 | 10:46 AM - Posted by Ryan Shrout

Hmm, that might be a good idea.  I just have a lot of legacy comparison results with Skyrim.  I'll try to install The Witcher 2 this weekend and see if I can find a good benchmark segment.

November 7, 2013 | 11:27 AM - Posted by The Common Cold (not verified)

Maybe we finally have a single GPU that can handle that crazy UberSampling...
Good luck!

November 7, 2013 | 12:48 PM - Posted by KevTheGuy (not verified)

Yeah that ubersampling is so demanding but it makes the game look so much better *_*

November 7, 2013 | 10:55 AM - Posted by TechZombie

I love competition! Personally I'm waiting for some benchmarks from Hawaii with 3rd party cooling before I make a buying decision. But you have to admit that with the games bundled Nvidia is doing an amazing job.

November 7, 2013 | 12:05 PM - Posted by azuza001 (not verified)

Sorry if this has already been brought up, but I have to ask: from the reviews I have read of the R9 290 vs the 290X, the general consensus seems to be that the regular 290 often gives the same results due to the way AMD is handling boost speeds now. If that's the case, then couldn't you further say that a 290 buys you almost the same performance as the 780 Ti for half the cost?

November 7, 2013 | 12:22 PM - Posted by Dave (not verified)

Ryan, when will you be implementing BF4 to the benchmark lineup? Please.

November 7, 2013 | 01:14 PM - Posted by Ryan Shrout

Soon hopefully!  I haven't even had a chance to install it yet honestly though...

November 7, 2013 | 12:42 PM - Posted by Mac (not verified)

Ryan, when you say this:
" Even though AMD's 290X cards have a better average frame rate and lower frame times, the variance in those frames is noticeably higher again. Take a look at the width of the blue line (780 Ti SLI) and the orange line (290X CrossFire) and you'll see that NVIDIA tends to be more consistent, making the average frame rate less of a victory for AMD than it might otherwise be."
Are you calling it on the data, or is it something you actually see on screen? I think a good subjective opinion would be much more useful than any data when the variance is this low - what do you reckon?

November 7, 2013 | 01:15 PM - Posted by Ryan Shrout

The answer is actually both.  Sometimes, the data shows a problem that I then go back and verify and sometimes I see the problem first, then verify it with data.  At this point though, I have tested these games and these configurations enough that the surprise is gone.

In that particular case, I would tell you that you can definitely see and feel the difference while playing in real time.

November 7, 2013 | 01:32 PM - Posted by Mac (not verified)

I see, thank you, all this data hurts my head so I go by "feel" now :-)
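For readers following this exchange, the distinction between average frame rate and frame-time variance is easy to show numerically. Here is a minimal sketch with made-up frame times: two captures with the same average FPS, where one would plot as a tight line and the other as a wide band.

```python
from statistics import mean, pstdev

def frame_stats(frame_times_ms):
    """Average FPS plus frame-time standard deviation for a capture."""
    avg_ms = mean(frame_times_ms)
    return 1000.0 / avg_ms, pstdev(frame_times_ms)

# Hypothetical captures: identical mean frame time (16.7 ms),
# very different consistency.
steady  = [16.7] * 6                 # tight, even line on a frame-time plot
jittery = [10.0, 23.4] * 3           # same mean, alternating fast/slow frames

print(frame_stats(steady))   # ~59.9 FPS, 0.0 ms deviation
print(frame_stats(jittery))  # ~59.9 FPS, ~6.7 ms deviation
```

Both captures report the same average frame rate, but the second alternates between 10 ms and 23.4 ms frames - that oscillation is what shows up as a wide band on the plots and as perceptible stutter in play, which is the point Ryan is making above.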

November 7, 2013 | 01:12 PM - Posted by snook

the first R9 290 or R9 290X with aftermarket cooling is going to beat the 780 Ti.

this card had to destroy the AMD offerings. it didn't.
how much crow is being served over the noise and power draw
of the 290s now? lots.

ryan, you seemed physically saddened by the underwhelming performance and the loud, power-hungry traits of this card that NV HAD to make. and that price... yeow.

this is as much a loss as it is a win for Nv. they were not ready for what AMD did. hell, now I'm sad. :/

November 7, 2013 | 01:17 PM - Posted by Ryan Shrout

I agree that NVIDIA likely should have wanted more performance out of the 780 Ti, but they didn't find it.  Still, they are getting 10-15% better performance in the most demanding games at less power, heat, and noise than the 290X.  But it does cost more, which is its primary drawback.

November 7, 2013 | 01:23 PM - Posted by snook

ryan. any idea if AMD has a 300X hiding some place?
the fight could keep going next week if they did. lol

November 7, 2013 | 04:03 PM - Posted by Ryan Shrout

I don't think so, not for a while.

November 8, 2013 | 04:13 PM - Posted by BlackDove (not verified)

How were they not ready? GK110 is ancient now, and it manages to beat the newest AMD GPU. R290 and R290X use a brand new GPU with a 512bit memory bus and this thing still beats it, consumes less power, and doesn't get nearly as hot.

It's $150 more. That's about the only negative.

Ryan,

Do the 780 and 780 Ti have the double precision capability that the Titan does? If so, they are even better values.

November 9, 2013 | 11:45 PM - Posted by snook

so ancient it's their premier card; the "new" 290 is a modified GCN, not new. and it was pointed out correctly that these cards throttle (the 290) while the 780 Ti wasn't doing that. it won on that strength alone.

and Nv had to allow higher core temp and fan speed to achieve this "win"...stunningly familiar.

the first partner produced R9 290X is gonna beat the 780Ti and I don't want to hear "it was running at a higher clock". ryan excused that away in this comment section.

and Nv wasn't ready or they wouldn't have made the mad scramble to reduce prices to save face. accept it.

November 7, 2013 | 01:29 PM - Posted by snook

forgot to say, great article. you again proved yourself an unbiased reviewer.
let your critics be damned.
