
NVIDIA GeForce GTX TITAN Performance Review and Frame Rating Update

Author: Ryan Shrout
Manufacturer: NVIDIA

DiRT 3 - SLI

DiRT 3 (DirectX 11)


 

A continuation of the Colin McRae series, albeit without his name, DiRT 3 is one of the top racing games in the world, offering stunning imagery along with support for DirectX 11 features.

Our settings for DiRT 3


At 1080p, our SLI configurations are seeing the same high frametime variance that we saw with the GTX 690 on the previous page. 


The GeForce GTX TITAN in SLI scales by 48% at 2560x1440, while the GTX 680 in SLI improves on a single GTX 680 by 86%. Clearly we are more CPU limited in DiRT 3, and you can see that the frametime variance in the TITAN SLI results is much higher as well.
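For reference, the scaling percentages quoted here are simple ratios of average frame rates. A quick sketch with illustrative FPS values (not our measured data):

    # Illustrative numbers only; the real figures come from the
    # Frame Rating captures, not from these inputs.
    def sli_scaling(single_fps, sli_fps):
        """Percentage gain of the SLI pair over a single card."""
        return (sli_fps / single_fps - 1.0) * 100.0

    print(f"{sli_scaling(50.0, 74.0):.0f}% scaling")   # -> 48% with these inputs
    print(f"{sli_scaling(50.0, 93.0):.0f}% scaling")   # -> 86% with these inputs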

February 21, 2013 | 09:21 AM - Posted by DeadOfKnight (not verified)

I see Dirt 3 Graphs on the Crysis 3 page.

February 21, 2013 | 10:00 AM - Posted by Ryan Shrout

Fixed!

February 21, 2013 | 09:22 AM - Posted by j0hndoe

Man, that card is a BEAST. Look forward to reading more thoughts on it as you have more time with it! Great write up, Ryan. Thanks

February 21, 2013 | 10:03 AM - Posted by YTech2 (not verified)

TITAN is back for more! Clarification: from the photo provided, the card offers one dual-link DVI-I and one dual-link DVI-D. Those using a DVI-to-VGA (analogue) adapter would use the DVI-I port.

Yes, various analogue displays are still in use.

I do wonder about the performance of the TITAN in Stereoscopic Systems.

February 21, 2013 | 10:21 AM - Posted by Anonymous (not verified)

Well, from what I'm seeing, three GTX 680 SCs are better (faster) than two TITANs, and I'm running 6000×1200. I could only afford two, and with the limited quality of TITAN cards I think I'll pass...

February 21, 2013 | 10:22 AM - Posted by Anonymous (not verified)

Meant to say limited quantity. I'll be passing...

February 26, 2013 | 02:18 PM - Posted by John Doe (not verified)

Indeed, while the fuckers at eVGA are eating their hamburgers over those shitty "SuperClocked" cards, they ripped you off $20 each.

"Superclocking" is flashing the card's BIOS 50 MHz over the stock clock and HOSING BLIND FUCKERS.

sigh...

April 9, 2013 | 09:22 AM - Posted by Cito (not verified)

OMG!!! $20, WOW!!! They are robbing people blind...

Dude, chill, it's just 20 bucks. I did it; might as well.

But if you were talking about the Signature edition, then I would agree with you.

February 21, 2013 | 10:28 AM - Posted by D1RTYD1Z619

Finally, a BF3 2560x1440 GTX 680 2GB SLI benchmark. Do you think you'll bench the TITAN against the 680 4GB versions?

February 21, 2013 | 11:05 AM - Posted by tackle70 (not verified)

BTW, the temporary fix for CF stuttering is to use either RadeonPro or MSI Afterburner to limit the framerate. If you are in a setting where you can get fairly consistent framerates at or above your monitor's refresh rate, you just limit the framerate to your refresh rate (60 in most cases). Otherwise, you limit it to around your average FPS.

It's not a perfect solution, but it does reliably deal with stutter on crossfire 7970s.
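Conceptually, the cap just meters out frame presents so the two GPUs can't bunch frames together. A rough sketch of the idea (not RadeonPro's or Afterburner's actual implementation, which lives in the driver/present path):

    import time

    def frame_capped_loop(render_frame, cap_fps=60):
        """Sketch of a frame limiter: never present sooner than
        1/cap_fps after the previous frame, which evens out AFR
        (CrossFire/SLI) frame pacing."""
        interval = 1.0 / cap_fps
        deadline = time.perf_counter()
        while True:
            render_frame()                       # game + driver work
            deadline += interval
            wait = deadline - time.perf_counter()
            if wait > 0:
                time.sleep(wait)                 # hold the frame back
            else:
                deadline = time.perf_counter()   # fell behind; resync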

March 28, 2013 | 04:24 PM - Posted by Anonymous (not verified)

I'm so vindicated, and so disappointed...

" We aren't ready to show our full sets of results yet (soon!) but the problems lie in that AMD's CrossFire technology shows severe performance degradations when viewed under the Frame Rating microscope that do not show up nearly as dramatically under FRAPS. As such, I decided that it was simply irresponsible of me to present data to readers that I would then immediately refute on the final pages of this review - it would be a waste of time for the reader and people that skip only to the performance graphs wouldn't know our theory on why the results displayed were invalid. "

AMD sucks so badly that the standard apologist mantra is issued.
This is the subpar life of AMD video cards.

They really suck, but our test, which is more accurate than FRAPS, shows it, so we will refute our own test and say they don't really suck, and we won't show you the pathetic, cheaty runt and missed frame data and PROVE AMD HAS BEEN CHEATING LIKE HECK FOR YEARS ON END!

Some of us always knew it, and always said it, and we were attacked relentlessly.
Well, let the attacks continue, as the cover-up is still ongoing.

Maybe by now AMD has fixed their YEARS-LONG ISSUES so a cleaned-up, totally new result can be shown soon --- FORGETTING THE YEARS AMD USERS SUFFERED WITH THE AMD CRAP THAT WON'T BE SHOWN UNTIL IT IS "FIXED".

Then of course the bank robbers get off scot-free.
Good job AMD, the Mob wishes it had that kind of pull, as does every politician in the entire world.

February 21, 2013 | 11:54 AM - Posted by Dan (not verified)

So, if you have $1k to power a 5760x1080 gaming setup, do you personally go with the Titan, 2x7970, 2x680 or a 690?

February 22, 2013 | 01:35 AM - Posted by Eric B (not verified)

3x7970

February 21, 2013 | 11:58 AM - Posted by grommet

Where are the promised "TITAN up the graphics" quotes?
I am disappointed.

February 21, 2013 | 12:18 PM - Posted by luciano (not verified)

Your frame rating method looks like the best.
Could you guys review, as another user's comment here suggests, the frame cap "fix" results?
Thanks for the great review.

February 21, 2013 | 12:28 PM - Posted by Tommy (not verified)

No 2560x1600?

No 690 sli in the sli results?

February 23, 2013 | 11:12 AM - Posted by btdog

No answer to your first question, but to your second question, a 690 is technically already in SLI (it's two 680s SLI'd on one card)

February 24, 2013 | 08:03 PM - Posted by Tommy (not verified)

I should have been more specific. By 690 SLI I mean quad-sli 690s.

February 21, 2013 | 01:19 PM - Posted by Anonymous (not verified)

I take it when they bring out a dual-GPU TITAN to compare against the 690s, it will be far better? Then you can test SLI for both dual-GPU TITAN and 690s. Will watch this space

February 21, 2013 | 01:36 PM - Posted by MarkT (not verified)

Feels like the drivers for TITAN are premature. Ryan, is this the kind of performance NVIDIA is expecting, or are there upcoming driver enhancements for this card?

February 21, 2013 | 01:53 PM - Posted by Zorkwiz

I'm not sure why this review justified a Gold Award at the $1000 price point. Sure, it's the best single GPU card you can get and the power/thermals look impressive, but the conclusion basically says that it's beaten in single screen setups by both the 690 and 680 SLI, which are the same price or cheaper.

I'm not convinced that the promise of better multi-monitor gaming performance gives it the Gold, especially since you guys have such a limited set of benchmarks for those setups thus far.

I'd still love to own one, but I feel like $899 would have made it a viable option, not $999.

February 21, 2013 | 06:31 PM - Posted by Ryan Shrout

How could $100 really matter in a card this pricey?

Your other point is fair, but I HAVE seen the 5760x1080 numbers and the potential perf advantage there is realized.

February 25, 2013 | 03:59 PM - Posted by HeavyG (not verified)

The people willing to pay for this card are going to buy it regardless of whether it is $999, $899, or $1099. There is a specific market for it, and it surely isn't the "best bang for your buck" target.

Heck, I bought a 690 just for the cool factor. I have the space on my board, and the 680s would have squeaked by a bit more in terms of performance. The 690 looked way cooler and I liked the idea of one day adding a 2nd 690. This card is really no different.

And by all means, you can still buy 680 SLIs if you want. Nobody is stopping you and nobody is saying that it isn't the right choice for you.

February 21, 2013 | 02:32 PM - Posted by Kitten Masher (not verified)

It's great to see some concrete frame rating data, but I feel like your presentation and analysis of it is a bit off.

Specifically, when you're looking at the distribution of rates, you're using a bar graph to try to help us infer the variance. Why don't you just show us a graph of the distribution of the frame ratings instead? I think visually it would be much easier to compare looking for a "skinnier" distribution with a smaller variance, and you can still mark up the distribution percentiles on it. I'm really curious to see how that might look with the crossfire data you showed, I would almost expect to see two 'mounds' on either side of the mean (although I could be wrong).

Also, your analysis feels a bit hollow, sometimes just reiterating what we saw in the graph without telling us what truly matters: do we care? Was there 'stuttering'? This is especially true since you can't just base it off of a % difference in the times across the percentiles. The actual time taken is also important, and it is relative to the recorded FPS. All in all, what frame times actually mean to the end user isn't as black and white as "lower is better".

OR I'm completely clueless as to what's going on.

Anyways, it's still awesome to see some frame rating data and I look forward to seeing more in the future!

February 21, 2013 | 07:05 PM - Posted by Ryan Shrout

Appreciate the feedback, I'll look into that presentation option for data this week.

As for the analysis, I was purposefully a bit vague, as I have a lot more data to compile to present the "whole" picture of CrossFire versus SLI.
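In the meantime, for anyone who wants to experiment: a minimal sketch of that kind of distribution plot, assuming a plain text file with one frame time in milliseconds per line (a FRAPS-style frametimes dump):

    import numpy as np
    import matplotlib.pyplot as plt

    times_ms = np.loadtxt("frametimes.txt")   # hypothetical input file

    # Histogram of the frame time distribution: a narrow single peak
    # means consistent pacing; two mounds would show the alternating-GPU
    # pattern the commenter expects from CrossFire.
    plt.hist(times_ms, bins=100)
    for p in (50, 75, 90, 95, 99):
        plt.axvline(np.percentile(times_ms, p), linestyle="--", linewidth=0.8)
    plt.xlabel("Frame time (ms)")
    plt.ylabel("Frames")
    plt.show()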

February 22, 2013 | 11:30 AM - Posted by Lord Binky (not verified)

Some games like Mechwarrior Online do not support SLI, so single GPU performance is still relevant.

February 25, 2013 | 04:07 PM - Posted by HeavyG (not verified)

It is relevant, but MOST games support SLI. Games that don't support it on release typically don't do very well. Take Rage, for example.

As for Mech Warrior... it is still in beta, right? It is using CryENGINE 3, right? I don't see it being a problem for too long, but I have been wrong before. After all, I pre-ordered Aliens Colonial Marines.

February 21, 2013 | 03:29 PM - Posted by serpico (not verified)

Thank you, I appreciate the frametime graphs.

February 21, 2013 | 06:21 PM - Posted by 6GB Guy (not verified)

Could you test the TITAN against the GTX 680 4GB versions and the Sapphire HD 7970 TOXIC Edition? It would be interesting to see how well a 6GB 7970 does against the TITAN.

February 21, 2013 | 08:40 PM - Posted by ocre (not verified)

Man, you really went the extra mile on your frame time method. I second that it could be the best method so far. Great work. Can't wait till you're finally at the point where you can reveal all your findings. I know you put so much work into it. It seems very promising.


February 21, 2013 | 11:02 PM - Posted by mateo (not verified)

Can't wait for you guys to set up your new benching machine.

No freebies anymore for GPUs, nor for game developers ;)

February 21, 2013 | 11:31 PM - Posted by arbiter

As you said on TWiCH, AMD wasn't too happy with these results. I would expect that's a bit of an understatement. They claimed the 7970 GHz was the fastest GPU on the market for a while. This new testing method casts a major shadow on that claim if they are using so many "runt frames" to boost FPS scores.

February 28, 2013 | 10:19 AM - Posted by Anonymous (not verified)

As I understand it, the runt frames were a crossfire characteristic. The single GPU setup was fine.
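To put the mechanics in concrete terms: the capture-based method measures how many scan lines each frame actually occupies on screen, and frames below some minimum height count as runts. A toy sketch with made-up heights and an assumed threshold (not necessarily PCPer's exact cutoff):

    # Heights, in scan lines, of consecutive frame slices pulled from a
    # captured overlay video -- made-up numbers for illustration.
    frame_heights = [540, 8, 531, 12, 538, 5, 544, 10]

    RUNT_THRESHOLD = 21   # assumed cutoff, in scan lines

    runts = sum(1 for h in frame_heights if h < RUNT_THRESHOLD)
    print(f"{runts} of {len(frame_heights)} frames are runts: FRAPS counts "
          f"all {len(frame_heights)}, but only {len(frame_heights) - runts} "
          f"contribute anything visible on screen.")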

February 22, 2013 | 04:28 AM - Posted by techno (not verified)

This is all very interesting stuff about the frame rate analysis.

It seems to be the case that frame time variance reduces as you test at more GPU-intensive settings.

This suggests to me that the variance is being caused by CPU bottlenecking. AMD's setup would appear to be more sensitive to these CPU bottlenecks than NVIDIA's.

This also explains how the temporary fix of limiting the max frame rate on AMD setups works: it prevents rapid frame rates from causing CPU bottlenecks.

I was wondering what clock speed the test platform's 3960X was running at, and whether you have tested how this frame time variance is affected by different CPU clock speeds?

February 28, 2013 | 10:27 AM - Posted by Identity Unknown (not verified)

This makes a lot of sense, as (IIRC) a while back AMD actively pursued higher CPU utilization to lighten the GPU load and better balance system resource utilization. Being more sensitive to CPU hiccups would be an undesirable side effect of such a pursuit.

February 22, 2013 | 06:34 AM - Posted by BiggieShady

The zoomed FRAPS frame time graph is not labeled properly.

Great review btw, specifically the page before the last.

February 22, 2013 | 10:01 AM - Posted by yeki (not verified)

Seriously? You're still doing this? Can you please give me one reason, just ONE reason, as to why I would want to play a game without Vsync and Triple Buffering? Because that is the only, and I repeat, the ONLY situation in which your "novel" and "state-of-the-art" method of benchmarking VGAs would be relevant to an actual real-life scenario. I'd nominate you for a Nobel prize if you do.

February 22, 2013 | 11:31 AM - Posted by Atomic Walrus (not verified)

A little aggressive considering the topic, maybe?

Competitive multi-player games. Triple buffered vsync introduces significant amounts of input latency. Enough to detect in a blind test, and certainly way more than any display or input device a gamer would be likely to use.

This might not matter to you, but it does to plenty of other PC gamers. I found this article to be extremely valuable.

February 22, 2013 | 01:50 PM - Posted by Ryan Shrout

I kind of agree on Vsync, but on triple buffering, probably not:

"1. If it is not properly supported by the game in question, it can cause visual glitches. Just as tearing is a visual glitch caused by information being transferred too fast in the buffers for the monitor to keep up, so too in theory, can triple buffering cause visual anomalies, due to game timing issues for example.

2. It uses additional Video RAM, and hence can result in problems for those with less VRAM onboard their graphics card. This is particularly true for people who also want to use very high resolutions with high quality textures and additional effects like Antialiasing and Anisotropic Filtering, since this takes up even more VRAM for each frame. Enabling Triple Buffering on a card without sufficient VRAM results in things like additional hitching (slight pauses) when new textures are being swapped into and out of VRAM as you move into new areas of a game. You may even get an overall performance drop due to the extra processing on the graphics card for the extra Tertiary buffer.

3. It can introduce control lag. This manifests itself as a noticeable lag between when you issue a command to your PC and the effects of it being shown on screen. This may be primarily due to the nature of VSync itself and/or some systems being low on Video RAM due to the extra memory overhead of Triple Buffering."

http://www.tweakguides.com/Graphics_10.html
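Point 2 is easy to put rough numbers on. A back-of-envelope sketch for the swap chain alone (32-bit color, no AA):

    width, height, bytes_per_pixel = 2560, 1600, 4

    for buffers in (2, 3):                       # double vs. triple buffering
        mb = width * height * bytes_per_pixel * buffers / 1024**2
        print(f"{buffers} buffers at {width}x{height}: {mb:.1f} MB")
    # ~31 MB vs ~47 MB: trivial on a 6 GB TITAN, but the gap widens once
    # AA samples and extra render targets multiply the per-frame cost.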

February 22, 2013 | 02:42 PM - Posted by mateo (not verified)

All valid points, but the question remains:

Does turning Vsync ON fix the CF presenting issues?
If it does, is the performance of Vsynced CF in line with the Vsync OFF results?

Also, does the system exhibit frame skipping like with RadeonPro "smoothing"?

And I'm not asking about CF alone, but about SLI and single GPUs also.

February 24, 2013 | 01:49 AM - Posted by nabokovfan87

Testing with VSync would "optimize" the video output on either vendor. So maybe you could do an article with it on and frame it as a "quality" type of test.

Both cards would be outputting as best they could with the monitor output being as optimized as it can be in terms of timing the frames. Compare the high end cards on 10 or so games, and it gives you an idea of which vendor has the best "quality" of game.

As far as GPU testing goes, I wouldn't want to see it. I would want the game to run and output as many frames as possible with everything turned on, check the times, and see how bad the timing gets.

February 22, 2013 | 11:15 AM - Posted by Anonymous (not verified)

Why is nobody testing game performance with Vsync ON?

February 22, 2013 | 11:46 AM - Posted by Anonymous (not verified)

Because vsync sucks, limits framerate and increases latency. Triple buffering even more so.

February 22, 2013 | 12:32 PM - Posted by Lord Binky (not verified)

Vsync is the easy way to avoid getting set up correctly.

February 22, 2013 | 05:41 PM - Posted by David (not verified)

"For 1920x1080 gaming there is no reason to own a $999 GPU and definitely not this one."

That line is debatable, and it all depends on how you game. I spend a lot of time with heavily modified Skyrim, and that game gives my 7970 GHz a run for its money at 1080p. One particular setup I use (mainly for screenshots) kills that card, frequently averaging 30fps.

This is shown in the Crysis 3 graph as well. Even the TITAN struggles to max that game out at 1080p.

There certainly are reasons to own a Titan for 1080p gaming, provided you have the coin.

February 22, 2013 | 09:20 PM - Posted by icebug

I am really liking the new charts, except it is a little hard to read the names of the cards on the frametime charts. It would be nice to have a slightly larger font on those.

February 23, 2013 | 10:13 AM - Posted by Anonymous (not verified)

Well done. That's what I always say: there is a problem with graphics cards not delivering true FPS, especially on larger screens (46").

February 23, 2013 | 04:08 PM - Posted by Qiplayer (not verified)

If you test a card such as this without using 3 HD screens, you (in my opinion) missed the point.

February 23, 2013 | 07:27 PM - Posted by elel (not verified)

What about GPGPU benchmarks? Are those still coming, or did I miss the article with them? In any case, thanks for the review!

February 24, 2013 | 12:25 AM - Posted by ME (not verified)

With this test you become the number 1 website!

February 24, 2013 | 01:52 AM - Posted by nabokovfan87

I really don't understand the point of this card. I get it, it has lots of cores, but that isn't really reflected in any normal use. It would make sense for CAD-type applications, rendering video with GPU acceleration, folding, etc. As far as usability, it just seems like $1k of graphics card made for having a LOT of VRAM for high resolutions, but the price sort of makes it not "worth" that.

If I was reading the specs, comparing them to the 680/7970 and looking at those prices, I would find it really hard to justify the difference if you weren't looking at VRAM. Architecturally it has more "stuff", but isn't really a major shift.

680: 1536 cores @ 1000 MHz
Titan: 2688 cores @ 836-876 MHz

That gives you the same, less, or 10-20 more average FPS based on the charts (Sleeping Dogs being the example of "more"; DiRT 3 isn't worth discussing because the framerate is so high on these cards).

I don't really know where I'm going with this, but hopefully some amount of a point has come across. What on earth is this card for?
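To put rough numbers on those specs: peak FP32 throughput is cores x clock x 2 FLOPs per cycle. A back-of-envelope sketch using the base clocks quoted above (GPU Boost will shift the real figures):

    cards = {"GTX 680": (1536, 1.000), "TITAN": (2688, 0.836)}

    for name, (cores, clock_ghz) in cards.items():
        tflops = cores * clock_ghz * 2 / 1000    # peak FP32, in TFLOPS
        print(f"{name}: {tflops:.2f} TFLOPS")
    # TITAN lands ~46% ahead on paper, roughly in line with the 10-20 FPS
    # gap seen in the GPU-bound charts.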

Ryan: Side note, is it possible to add a Blu-ray or HD video render test with GPU acceleration to your benchmarks? It seems like something worthwhile to demonstrate when you have something like this, where the product may be meant for uses other than gaming.

February 25, 2013 | 04:15 PM - Posted by HeavyG (not verified)

I just didn't see this the same at all. The 680 vs the Titan isn't even close. Other than being "single GPU", they aren't even in the same class.

This card brings a LOT of different stuff to the table, such as the new GPU boost, focus on acoustics (if that is your thing), and temp control. I thought the 690 was a big breakthrough, but the Titan probably impressed me more than the 690 did on release.

February 27, 2013 | 11:39 PM - Posted by nabokovfan87

Game: Avg FPS (680 / TITAN / 7970)
----------------------------------
FC3: 40.6 / 39.5 / 32.7
Crysis 3: 30.1 / 41.6 / 30.6
Sleeping Dogs: 42.0 / 63.7 / 53.2

It depends on the game, but like I said earlier, it isn't about gaming performance at all. From what Ryan said on Tekzilla, the main point of TITAN is the actual physical challenge behind its fabrication and how THAT will be a big thing for the future of hardware, or gives some insight, or something.

I think it would be really interesting to see some BD decode testing.

February 24, 2013 | 05:41 AM - Posted by Pete (not verified)

Definitely the best TITAN review out there, PCPer! Excellent job, guys! Frame time analysis is invaluable!!!

Question:

What did you have your 3960X clocked at for benchmarks?

Wondering about Titan SLI scaling.....

February 24, 2013 | 07:49 AM - Posted by Pete (not verified)

Ryan, can you please review frame times in BF3 @ 5760x1080 with a single TITAN, SLI, and 3-way SLI?

My current 670s stutter with MSAA turned on even though VRAM is not close to max.

Your frame time review will be the determining factor in whether I get the TITANs or not.

February 25, 2013 | 06:22 AM - Posted by Anonymous (not verified)

Wow, seems this site got big money from NVIDIA!

I did the same test they did and had no differences in framerate whatsoever for AMD CF.
If you want to call me biased, go right ahead. I have PCs with NVIDIA cards and PCs with AMD cards; I don't care about the manufacturer, I only care about bang for the buck, and TITAN is a huge bust. Two 7970s ($680) outperform a TITAN ($1000), so for me there is no question about what to get.

February 25, 2013 | 03:28 PM - Posted by Jeremy Hellstrom

Which DVI capture cards were you using? 

... and where is my cheque NVIDIA!

February 26, 2013 | 11:03 AM - Posted by Anonymous (not verified)

Deltacast Delta-dvi

February 25, 2013 | 10:21 AM - Posted by Elvis (not verified)

Great article. Nice to see original thought and work in a tech blog (instead of more useless FPS numbers).
The TITAN has impressive performance; too bad the price is outta my reach :'(
BTW, anyone else notice that the 7970 GE is starting to kick 680 butt? The AMD driver team is on a roll! However, CF looks bad and they need to correct it, seeing as they have no single-chip competitor to TITAN.
@Ryan: To make things interesting, why not benchmark some games which are not as "popular", i.e. driver-optimized?
Also, in my (humble?) opinion, your articles would be more professional (better) if you avoided superlatives and words like "beast" (so clichéd). They make refutation of bias harder.
Once again, great work.

February 25, 2013 | 10:24 AM - Posted by Elvis (not verified)

@Ryan: could you please block the IP of the rabid fan-atic? It really spoils the whole comments section...
@Anonymous: obvious troll. Not gonna bother replying. Sod off!

February 25, 2013 | 10:44 AM - Posted by Trey Long (not verified)

This business of runt frames significantly padding CrossFire's FPS numbers is a huge story in the GPU world. There needs to be a major effort to expose the truth of this, whatever it is.

February 25, 2013 | 05:56 PM - Posted by Epoq (not verified)

I agree wholeheartedly. If this is true in its entirety, it would destroy AMD's credibility in the multi-GPU realm. Before this, most people were in agreement that for high-res and multi-display configurations, AMD was the way to go. This would change everything.

March 2, 2013 | 06:19 AM - Posted by Max Klar (not verified)

I doubt that the new 3DMark is reliable. Maybe it was just created to boost TITAN.
I ran the benchmarks with GTX 680 SLI / i7-3930K / X79. In 3DMark 11 I get nearly P16000 at stock and over P18000 overclocked.
But in Fire Strike:
GTX 680 single: 6300
GTX 680 SLI: 4200 (!)

And it's not only my system; you can find these biased or faulty results easily.

So 3DMark in its current state is a joke and should not be used in a professional environment.

March 3, 2013 | 01:06 AM - Posted by Anonymous (not verified)

A little bummed with your test. This is a 2D Surround card; I am running three EVGA 680 SCs at 6000×1200. Please, let's see the real meat and potatoes that people buying this card (me) want to see: say, three 680s at 6000×1200 and two TITANs in SLI at the same resolution. That, I think, is all that really matters here, right? The 680, having only 2GB of memory, must fail hard against two TITANs with 6GB each. The TITAN is very specifically a hi-res Surround gaming card. I know you need to test everything, but I think you should have started the other way around, IMO.

March 3, 2013 | 03:02 PM - Posted by KansasCityTom (not verified)

I just spent a little over $1100 on 3x 7970s, and against my friend's new TITAN, I basically walk all over him in all benchmarks.

March 4, 2013 | 07:55 PM - Posted by Trey Long (not verified)

Except that CrossFire is a sham; you get no better performance than one card. Read the article before spouting, and Tech Report and HardOCP too. Runt frames are a real disaster for CrossFire, as they can be seen, measured, and exposed, unlike the FRAPS number you rely on, which includes totally degraded frames in its FPS count. Latencies have long been an issue with AMD cards, and this makes it clear in CrossFire.

April 3, 2013 | 01:49 AM - Posted by Anonymous (not verified)

Great work, Ryan, keep it up... By the way, how do we support your site, make donations?! P.S. I'm loving my tri-SLI TITANs; this is the card I have been waiting for my whole life... I've been building systems starting back in the Voodoo days, and finally 6000×1200 plays like butter. NVIDIA is really something, eh!!
