NVIDIA GeForce GTX TITAN Performance Review and Frame Rating Update

Author: Ryan Shrout
Manufacturer: NVIDIA

Frame Rating Update - Why CrossFire was Omitted

Why are you not testing CrossFire??

One question I know will be asked about this review: why do today's benchmarks include no results from AMD CrossFire configurations in 2-Way or 3-Way combinations?  AMD is represented only by a single Radeon HD 7970 GHz Edition card, while we are using the GTX 690 dual-GPU card, GTX 680s in SLI, and GTX TITANs in SLI.

Why the bias?

If you haven't been following our series of stories investigating a completely new testing methodology we are calling "Frame Rating", then you are really missing out.  (Part 1 is here, part 2 is here.)  The basic premise of Frame Rating is that the performance metrics the industry gathers using FRAPS are inaccurate in many cases and do not properly reflect the real-world gaming experience the user has.

Because of that, we are working on another method that uses high-end dual-link DVI capture equipment to directly record the raw output from the graphics card with an overlay technology that allows us to measure frame rates as they are presented on the screen, not as they are presented to the FRAPS software sub-system.  With these tools we can measure average frame rates, frame times and stutter, all in a way that reflects exactly what the viewer sees from the game.

We aren't ready to show our full sets of results yet (soon!), but the problem is that AMD's CrossFire technology shows severe performance degradation when viewed under the Frame Rating microscope that does not show up nearly as dramatically under FRAPS.  As such, I decided it would simply be irresponsible of me to present data to readers that I would then immediately refute on the final pages of this review - it would be a waste of time for the reader, and people who skip straight to the performance graphs wouldn't know our theory on why the displayed results were invalid.

Many other sites will use FRAPS, will use CrossFire, and there is nothing wrong with that at all.  They are simply presenting data that they believe to be true based on the tools at their disposal.  More data is always better. 

Here are those results and our discussion.  I decided to use the most popular game out today, Battlefield 3, and please keep in mind this is NOT the worst-case scenario for AMD CrossFire in any way.  I tested the Radeon HD 7970 GHz Edition in single and CrossFire configurations as well as the GeForce GTX 680 singly and in SLI.  To gather results I used two processes:

  1. Run FRAPS while running through a repeatable section and record frame rates and frame times for 60 seconds
  2. Run our Frame Rating capture system with a special overlay that allows us to measure frame rates and frame times with post processing.
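
To make the FRAPS half of that concrete (the capture half is sketched further down), here is a minimal Python sketch of how a FRAPS frametimes log - one cumulative millisecond timestamp per frame - becomes per-frame times and an FPS-over-time curve.  This is not our internal tooling; the log path and the one-second window are illustrative assumptions.

```python
import csv

def load_fraps_frametimes(path="frametimes.csv"):  # hypothetical log path
    """Read a FRAPS frametimes log: one cumulative ms timestamp per frame."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    return [float(r[1]) for r in rows[1:]]  # skip the header row

def per_frame_times(stamps):
    """Cumulative timestamps -> individual frame times in ms."""
    return [b - a for a, b in zip(stamps, stamps[1:])]

def fps_over_time(stamps, window_ms=1000.0):
    """Frames counted per time window, as in a typical FPS-over-time graph."""
    buckets = {}
    for t in stamps:
        key = int(t // window_ms)
        buckets[key] = buckets.get(key, 0) + 1
    return [(k * window_ms / 1000.0, n * 1000.0 / window_ms)
            for k, n in sorted(buckets.items())]
```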

Here is an example of what the overlay looks like in Battlefield 3.

Frame Rating capture on GeForce GTX 680s in SLI

The column on the left shows an overlay that is applied to each and every frame of the game early in the rendering process.  A solid color is added at the PRESENT call (more details to come later) for each individual frame.  As you know, when you are playing a game, multiple frames can make it to the screen during any single 60 Hz refresh cycle of your monitor, and because of that you get a succession of colors on the left-hand side.

By measuring the pixel height of those colored columns, and knowing the order in which they should appear beforehand, we can gather the same data that FRAPS does but our results are seen AFTER any driver optimizations and DX changes the game might make.
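
To make the band-measurement idea concrete, here is a rough Python sketch - not our actual pipeline.  It assumes each captured image from the capture card is an RGB numpy array, that the overlay occupies the leftmost pixel column, and that colors match exactly (a real implementation needs tolerance for capture noise).  Band heights in scanlines convert to screen time via the ~16.7 ms a 60 Hz refresh takes to scan out.

```python
import numpy as np

REFRESH_MS = 1000.0 / 60.0  # time for one full 60 Hz scanout

def overlay_bands(captured_rgb, col=0):
    """Run-length encode the leftmost overlay column of one captured refresh
    into (color, scanlines) runs, top to bottom."""
    column = captured_rgb[:, col, :]
    bands, start = [], 0
    for y in range(1, len(column) + 1):
        if y == len(column) or not np.array_equal(column[y], column[start]):
            bands.append((tuple(int(c) for c in column[start]), y - start))
            start = y
    return bands

def band_times(bands, screen_height):
    """Convert each band's scanline count to on-screen time in ms; a band
    spanning the whole capture was displayed for one full refresh."""
    return [(color, rows / screen_height * REFRESH_MS) for color, rows in bands]
```

Summing a single color's slices across consecutive captured refreshes then gives that game frame's total screen time - the "perceived" frame time the graphs below are built from.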

Frame Rating capture on Radeon HD 7970 CrossFire

Here you see a very similar screenshot running on CrossFire.  Notice the thin silver band between the maroon and purple?  That is a complete frame according to FRAPS and most reviews.  Not to us - we think that rendered frame is almost useless.

 

Data gathered from FRAPS

Here is a typical frame rate over time graph as generated by FRAPS.  Looks good, right?  CrossFire and SLI are competitive, with the advantage going to the HD 7970s.

Data gathered from Frame Rating Capture

This is the same graph with data gathered from our method, which omits RUNT frames that occupy fewer pixels than a certain threshold (to be discussed later).  Removing the tiny slivers gives us a "perceived frame rate" that differs quite a bit - CrossFire doesn't look faster than a single card.
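
As a sketch of that filtering step, assuming per-frame scanline counts from the capture analysis above and an illustrative 20-scanline cutoff (the real threshold is discussed later):

```python
RUNT_SCANLINES = 20  # illustrative cutoff only, not our published threshold

def drop_runts(frame_heights):
    """frame_heights: scanlines of screen coverage for each rendered frame,
    as measured from the overlay bands.  Runts cover only a thin sliver."""
    return [h for h in frame_heights if h >= RUNT_SCANLINES]

def perceived_fps(frame_heights, run_seconds):
    """FPS counting only frames that contributed visibly to the output."""
    return len(drop_runts(frame_heights)) / run_seconds
```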

 

Data gathered from FRAPS

These are the raw frame times as captured by FRAPS - again, we are looking for a narrow band of frame times to represent a smooth experience.  Both single cards do pretty well, but SLI sees a bit more variance and CrossFire sees a bigger one.  Quite a bit bigger.
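
For those who want to put a number on "narrow band", one simple option - a hypothetical metric, not one we are formalizing here - is the gap between the 95th-percentile and median frame times:

```python
import statistics

def frametime_spread(frame_times_ms):
    """Crude stutter score: how far the slow frames (95th percentile) sit
    above the typical (median) frame time.  Smaller means a narrower band."""
    if not frame_times_ms:
        return 0.0
    ordered = sorted(frame_times_ms)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]
    return p95 - statistics.median(ordered)
```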

Data gathered from Frame Rating Capture

Here is the same data gathered by our new capture system - the CrossFire configuration looks MUCH worse, with many frames getting nearly 0 ms of screen time.  That would be great if they were ALL like that, but unfortunately they also scale up to 20 ms and higher quite often.  Also notice that NVIDIA's output is actually MORE uniform, indicating that there is some kind of smoothing going on after the frame leaves the game engine's hands.

 

Data gathered from FRAPS

Let's zoom in a bit - here are 100 frames of the FRAPS frame times from above.  Notice the see-saw effect that the CrossFire output has...
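
That see-saw can be quantified, too.  One common approach, sketched below under the assumption of a plain list of frame times in milliseconds, is the mean absolute change between consecutive frames: near zero for a steady cadence, large when short and long frames alternate.

```python
def seesaw_score(frame_times_ms):
    """Mean absolute frame-to-frame change in ms: ~0 for a steady cadence,
    large when short and long frames alternate as in the CrossFire plot."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas) if deltas else 0.0
```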

Data gathered from Frame Rating Capture

And see how much worse it is here in our Frame Rating Capture configuration.  The pattern is actually exaggerated on the CrossFire solution while the SLI configuration is smoother. 

 

Here are a couple more screenshots from our captures.

Frame Rating capture on Radeon HD 7970 CrossFire

 

Frame Rating capture on GeForce GTX 680s in SLI

 

Frame Rating capture on Radeon HD 7970 CrossFire

 

Frame Rating capture on GeForce GTX 680s in SLI

 

This is just a preview of what we have planned for our new Frame Rating Capture performance testing method.  We have run many games through it, and the results can vary from looking better than FRAPS suggests to looking much, much worse.

I am eager to get your feedback - please feel free to leave comments below, and then follow on to the conclusion of our GeForce GTX TITAN review!

February 21, 2013 | 11:02 PM - Posted by mateo (not verified)

Can't wait for you guys to set up your new benching machine.

No freebies anymore for GPUs, nor for game developers ;)

February 21, 2013 | 11:31 PM - Posted by arbiter

As you said on TWiCH, AMD wasn't too happy with these results. I would expect that's a bit of an understatement. They claimed the 7970 GHz was the fastest GPU on the market for a while; this new testing method kinda casts a major shadow on that claim if they are using so many "runt frames" to boost FPS scores.

February 28, 2013 | 10:19 AM - Posted by Anonymous (not verified)

As I understand it, the runt frames were a CrossFire characteristic. The single-GPU setup was fine.

February 22, 2013 | 04:28 AM - Posted by techno (not verified)

This is all very interesting stuff about the frame rate analysis.

It seems to be the case that frame rate variance reduces as you test at more GPU-intensive settings.

This suggests to me that the variance is being caused by CPU bottlenecking. AMD's setup would appear to be more sensitive to these CPU bottlenecks than NVIDIA's.

This also explains how the temporary fix of limiting the max frame rate on AMD setups works: it prevents rapid frame rates from causing CPU bottlenecks.

I was wondering what clock speed the test platform's 3960X was running at, and whether you have tested how this frame time variance is affected by different CPU clock speeds?

February 28, 2013 | 10:27 AM - Posted by Identity Unknown (not verified)

This makes a lot of sense, as (IIRC) a while back AMD actively pursued higher CPU utilization to lighten the GPU load and better balance system resource utilization. Being more sensitive to CPU hiccups would be an undesirable side effect of such a pursuit.

February 22, 2013 | 06:34 AM - Posted by BiggieShady

Zoomed FRAPS frame time graph is not labeled properly.

Great review btw, specifically the page before the last.

February 22, 2013 | 10:01 AM - Posted by yeki (not verified)

Seriously? You're still doing this? Can you please give me one reason, just ONE reason, as to why I would want to play a game without Vsync and Triple Buffering? Because that is the only, and I repeat, the ONLY situation in which your "novel" and "state-of-the-art" method of benchmarking VGAs would be relevant to an actual real-life scenario. I'd nominate you for a Nobel prize if you do.

February 22, 2013 | 11:31 AM - Posted by Atomic Walrus (not verified)

A little aggressive considering the topic, maybe?

Competitive multi-player games. Triple-buffered vsync introduces significant amounts of input latency - enough to detect in a blind test, and certainly way more than the latency of any display or input device a gamer would be likely to use.

This might not matter to you, but it does to plenty of other PC gamers. I found this article to be extremely valuable.

February 22, 2013 | 01:50 PM - Posted by Ryan Shrout

I kind of agree on Vsync, but on triple buffering, probably not:

"1. If it is not properly supported by the game in question, it can cause visual glitches. Just as tearing is a visual glitch caused by information being transferred too fast in the buffers for the monitor to keep up, so too in theory, can triple buffering cause visual anomalies, due to game timing issues for example.

2. It uses additional Video RAM, and hence can result in problems for those with less VRAM onboard their graphics card. This is particularly true for people who also want to use very high resolutions with high quality textures and additional effects like Antialiasing and Anisotropic Filtering, since this takes up even more VRAM for each frame. Enabling Triple Buffering on a card without sufficient VRAM results in things like additional hitching (slight pauses) when new textures are being swapped into and out of VRAM as you move into new areas of a game. You may even get an overall performance drop due to the extra processing on the graphics card for the extra Tertiary buffer.

3. It can introduce control lag. This manifests itself as a noticeable lag between when you issue a command to your PC and the effects of it being shown on screen. This may be primarily due to the nature of VSync itself and/or some systems being low on Video RAM due to the extra memory overhead of Triple Buffering."

http://www.tweakguides.com/Graphics_10.html

February 22, 2013 | 02:42 PM - Posted by mateo (not verified)

All valid points, but the question remains.

Does turning Vsync ON fix CF presentation issues?
If it does, is the performance of Vsynced CF in line with the Vsync OFF results?

Also, does the system exhibit frame skipping like with RadeonPro "smoothing"?

And I'm not asking specifically about CF alone, but SLI and single GPUs also.

February 24, 2013 | 01:49 AM - Posted by nabokovfan87

Testing with VSync would "optimize" the video output for either vendor. So maybe you could do an article with it on, framed as a "quality" type of test.

Both cards would be outputting as best they could with the monitor output being as optimized as it can be in terms of timing the frames. Compare the high end cards on 10 or so games, and it gives you an idea of which vendor has the best "quality" of game.

As far as GPU testing, I wouldn't want to see it. I would want the game to run and output as many frames as possible with everything turned on, check the times, and see how bad the timing gets.

February 22, 2013 | 11:15 AM - Posted by Anonymous (not verified)

Why is nobody testing game performance with vsync ON?

February 22, 2013 | 11:46 AM - Posted by Anonymous (not verified)

Because vsync sucks, limits framerate and increases latency. Triple buffering even more so.

February 22, 2013 | 12:32 PM - Posted by Lord Binky (not verified)

Vsync is the easy way to avoid getting set up correctly.

February 22, 2013 | 05:41 PM - Posted by David (not verified)

"For 1920x1080 gaming there is no reason to own a $999 GPU and definitely not this one."

That line is debatable, and it all depends on how you game. I spend a lot of time with heavily modified Skyrim, and that game gives my 7970 GHz a run for its money at 1080p. One particular setup I use (mainly for screenshots) kills that card, frequently averaging 30 fps.

This is reflected in the Crysis 3 graph. Even the Titan struggles to max that game out at 1080p.

There certainly are reasons to own a Titan for 1080p gaming, provided you have the coin.

February 22, 2013 | 09:20 PM - Posted by icebug

I am really liking the new charts and all, except that it is a little hard to read the names of the cards on the frame time charts. It would be nice to have a slightly larger font on those.

February 23, 2013 | 10:13 AM - Posted by Anonymous (not verified)

Well done. That's what I always say: there is a problem with graphics cards not delivering true FPS, especially on larger screens (46").

February 23, 2013 | 04:08 PM - Posted by Qiplayer (not verified)

If you test a card such as this without using 3 HD screens, you (in my opinion) missed the point.

February 23, 2013 | 07:27 PM - Posted by elel (not verified)

What about GPGPU benchmarks? Are those still coming, or did I miss the article with them? But thanks for the review!

February 24, 2013 | 12:25 AM - Posted by ME (not verified)

With this test you become the number 1 website!

February 24, 2013 | 01:52 AM - Posted by nabokovfan87

I really don't understand the point of this card. I get it, it has lots of cores, but that isn't really reflected in any normal use. It would make sense for CAD-type applications, rendering video with GPU acceleration, folding, etc. As far as usability, it just seems like $1k of graphics card made for having a LOT of VRAM for high resolutions, but the price sort of makes it not "worth" that.

If I were reading the specs, comparing them to the 680/7970 and looking at those prices, I would find it really hard to justify the difference if you weren't looking at VRAM. Architecturally it has more "stuff", but isn't really a major shift.

680: 1536 cores @ 1000 MHz
Titan: 2688 cores @ 836-876 MHz

That gives you the same, less, or 10-20 average FPS more based on the charts (Sleeping Dogs being the example of the last case; DiRT 3 isn't worth discussing because the frame rate is so high on these cards).

I don't really know where I'm going with this, but hopefully some amount of a point has come across. What on earth is this card for?

Ryan: Side note, is it possible to add a Blu-ray or HD video render test with GPU acceleration to your benchmarks? It seems like something worthwhile to demonstrate when you have something like this, where the product may be meant for uses other than gaming.

February 25, 2013 | 04:15 PM - Posted by HeavyG (not verified)

I just didn't see this the same at all. The 680 vs the Titan isn't even close. Other than being "single GPU", they aren't even in the same class.

This card brings a LOT of different stuff to the table, such as the new GPU boost, focus on acoustics (if that is your thing), and temp control. I thought the 690 was a big breakthrough, but the Titan probably impressed me more than the 690 did on release.

February 27, 2013 | 11:39 PM - Posted by nabokovfan87

Game: AVG FPS(680 / Titan / 7970)
----------------------------------
FC3: 40.6/39.5/32.7
Crysis3: 30.1/41.6/30.6
Sleeping Dogs: 42/63.7/53.2

It depends on the game, but like I said earlier, it isn't about gaming performance at all. From what Ryan said on Tekzilla, the main point of Titan is the actual physical challenge behind its fabrication and how THAT will be a big thing for the future of hardware, or gives some insight or something.

I think it would be really interesting to see some BD decode testing.

February 24, 2013 | 05:41 AM - Posted by Pete (not verified)

Definitely the best Titan review out the PCPER! Excellent job guys! Frame Time analysis is invaluable!!!

Question:

What did you have your 3960X clocked at for benchmarks?

Wondering about Titan SLI scaling.....

February 24, 2013 | 07:49 AM - Posted by Pete (not verified)

Ryan can you please review frame-times in BF3 @ 5760x1080 with Titan, SLI, & 3-way SLI.

My current 670s stutter with MSAA turned on even though VRAM is not close to max.

Your frame-time review will be the determining factor if i get the Titans or not.

February 25, 2013 | 06:22 AM - Posted by Anonymous (not verified)

Wow - seems this site got big money from NVIDIA!

I did the same test they did and had no differences in framerate whatsoever for AMD CF.
If you want to call me biased - go right ahead. I have PCs with NVIDIA cards and PCs with AMD cards; I don't care about the manufacturer, I only care about bang for the buck, and Titan is a huge bust - two 7970s ($680) outperform a Titan ($1000), so for me there is no question about what to get.

February 25, 2013 | 03:28 PM - Posted by Jeremy Hellstrom

Which DVI capture cards were you using? 

... and where is my cheque NVIDIA!

February 26, 2013 | 11:03 AM - Posted by Anonymous (not verified)

Deltacast Delta-dvi
