NVIDIA GeForce GTX TITAN Preview - GK110, GPU Boost 2.0, Overclocking and GPGPU

Author: Ryan Shrout
Manufacturer: NVIDIA

Big and Little Systems, Conclusions

NVIDIA is very proud of the GeForce GTX TITAN; that much is obvious.  They are even claiming that the TITAN will find its way into small form factor and mini-ITX systems from the likes of MAINGEAR, Falcon Northwest, Digital Storm and more.  With its 10.5-inch card length and the focus on temperature and sound levels, the GK110 could make a surprisingly good gaming graphics card for an SFF box.  System builders could change the default settings and lower the temperature target from 80C to 75C, for example, limiting clock rates and performance somewhat but creating a quieter, more confined-space-friendly design.


The problem with this idea is that the GeForce GTX TITAN is a $999 graphics card and most of these SFF gaming rigs are in the $1000-2000 range completely configured.  Doubling the cost of your system with a TITAN that isn't fully utilized from a performance perspective seems like a bit of an eccentric choice.  It is a great talking point and slide in a presentation but I question the usefulness of putting such a powerful card in such a small space, even if the hardware today can keep up with it.


I think a much more likely scenario is a full-size ATX gaming system with one, two or three TITANs humming along with some moderate overclocks.  The Geekbox Ego Maniacal rig we were sent is the epitome of gaming perfection: 3-Way GTX TITAN SLI, a 4.6 GHz Sandy Bridge-E, 32GB of memory, and RAID-0 SSDs!  But you're going to have to pay for that kind of power!


Normally in this spot we would summarize our findings on the performance of the NVIDIA GeForce GTX TITAN graphics card, talk about the pricing and availability and then finalize a conclusion for our readers to base an informed buying decision on.  Not today.  NVIDIA has decided to hold back on the benchmarks on TITAN until Thursday morning, so you'll just have to stop back here for the results and our final analysis then.  I can tell you though that you'll like what you see.

I am allowed to talk about pricing though and as I have mentioned several times, the MSRP for TITAN is $999, the same price as the GTX 690.  The GTX 690 has a pair of GK104 GPUs and 2GB of memory each while TITAN has a single larger GPU with 6GB of memory.  Being as non-specific as possible, the GTX 690 is still faster in most cases but as I have said since SLI was reintroduced in 2004, if it is possible to buy a single GPU that can perform on the same level as a pair of GPUs, that is going to present the better gaming solution.  Doing away with the need to balance the performance of two cards, even identical ones, will always benefit the gamer in the long run.


In addition to our benchmarks of the GeForce GTX TITAN on Thursday, I would highly recommend you join us on our PC Perspective Live! page at 2pm ET / 11am PT on Thursday, February 21st, as I will be joined by NVIDIA's Tom Petersen for a discussion and demonstration of the TITAN's performance and features.

Our NVIDIA GeForce GTX TITAN Coverage Schedule:

For now, watch our video on the GeForce GTX TITAN graphics card and prepare for Thursday!!



Watch this same video on our YouTube channel

February 19, 2013 | 09:57 AM - Posted by Kyle (not verified)

Yeah, but will it run Crysis?

February 19, 2013 | 10:01 AM - Posted by Ryan Shrout

We'll find out on Thursday.  :)

February 22, 2013 | 07:07 AM - Posted by Anonymous (not verified)

Actually, the better question is: will it blend?

February 19, 2013 | 10:03 AM - Posted by I don't have a name. (not verified)

Crysis 3, more like. :) Can't wait to see the performance results!

February 19, 2013 | 10:05 AM - Posted by I don't have a name. (not verified)

Might the GTX Titan review be the first to use your new frame-rating performance metric? Can't wait! :)

February 19, 2013 | 10:06 AM - Posted by Ryan Shrout

I can tell you right now that it won't be the full version of our Frame Rating system, but we'll have some new inclusions in it for sure.

February 19, 2013 | 10:21 AM - Posted by stephenC (not verified)

I am really looking forward to seeing how this card will perform with Crysis 3.

February 19, 2013 | 10:40 AM - Posted by Anonymous (not verified)

I think I'll get one. I have no need for it, but it looks cool. So, from someone who knows very little about computers or gaming, I take it this would be "overkill" to have my buddy build me a new system around this?

February 19, 2013 | 10:43 AM - Posted by Prodeous (not verified)

Hope to see some GPGPU testing and comparison to the 680. I can assist with Blender3D, as it supports GPGPU rendering on CUDA cards, if you guys want.. Need to find out if I need to start putting coins away for another purchase ;)

February 19, 2013 | 11:31 AM - Posted by Ryan Shrout

Point me in the right direction!

February 19, 2013 | 10:16 PM - Posted by Riley (not verified)

+1 for blender render times! Download Blender from the official site (it looks like it is down as I post this, probably for the v2.66 release). Publicly shared .blend files could be used to find stuff to render. That way the files would be available to anyone who wanted to compare your times to their own. Just make sure the files use the Cycles rendering engine (not Blender Internal).

The instructions there can be followed to swap between CPU and GPU rendering.

February 20, 2013 | 05:22 PM - Posted by Gadgety

I'm looking to replace three GTX 580 3GB cards, mainly for rendering. With 6GB of memory and the massive number of CUDA cores I'm really looking forward to Blender Cycles speeds, but also Octane and V-Ray RT. This must be one of the usage scenarios for the TITAN, rather than pure gaming.

February 21, 2013 | 05:49 PM - Posted by Prodeous (not verified)

Download blender from:

Windows 64bit install is the simplest.

The most common test for GPGPU rendering is to use the link below. "The unofficial Blender Cycles benchmark"

And then use the information provided by Riley:
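The CPU-to-GPU switch described in these comments can be scripted instead of clicked through. Below is a minimal sketch, assuming Blender's 2.6x-era Python API (`bpy`), where the compute device type lives under user preferences; attribute names changed in later Blender versions, so treat this as illustrative rather than definitive:

```python
def configure_cycles(scene, prefs, use_gpu):
    """Point a scene at the Cycles engine and select CPU or CUDA GPU rendering."""
    scene.render.engine = 'CYCLES'
    if use_gpu:
        # In the 2.6x API the compute device type is a user-preferences setting.
        prefs.system.compute_device_type = 'CUDA'
        scene.cycles.device = 'GPU'
    else:
        scene.cycles.device = 'CPU'
    return scene.cycles.device


if __name__ == "__main__":
    try:
        import bpy  # only available inside Blender's bundled Python
    except ImportError:
        print("Run this from inside Blender, e.g. blender -b scene.blend --python this_script.py")
    else:
        device = configure_cycles(bpy.context.scene,
                                  bpy.context.user_preferences, use_gpu=True)
        print("Cycles will render on:", device)
```

For timed benchmark runs it can be invoked headless, e.g. `blender -b scene.blend --python this_script.py -f 1` to render frame 1 with the chosen device.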

February 19, 2013 | 11:12 AM - Posted by Jumangi (not verified)

*shrug* I guess it's fun for the reviewer to get something new in the office, but this might as well still be a myth, as 99.99% of users will never even think about getting something this expensive.

Give us something new that regular people can actually afford...

February 19, 2013 | 11:18 AM - Posted by Rick (not verified)

I agree with the above. One of these or two 670s? I hope we see that comparison in your review.

February 19, 2013 | 11:32 AM - Posted by Ryan Shrout

670s are too cheap - you'll see two GTX 680s though.

February 19, 2013 | 01:12 PM - Posted by mAxius

Ah, the nvidiots will gladly put out $1000.00 for a graphics card that is nothing more than a 680 on roid rage... I know I'm not touching this card unless I win the lotto and get hit in the head with a brick, making me mentally incapacitated.

February 19, 2013 | 01:38 PM - Posted by Oskars Apša (not verified)

If PC Per has the time and the blessing of NVIDIA's higher-ups and the Geekbox gods, make a review with a system comprised of these cards in SLI and dual Intel Xeons and/or AMD Opterons.

February 19, 2013 | 01:52 PM - Posted by Anonymous (not verified)

Sweet mother of GOD!

I want! I can't wait to see the benches. They ought to make the Titan version the norm for future generations to bring down the x80 series lol.

February 19, 2013 | 02:23 PM - Posted by Anonymous (not verified)

It would be nice if someone would test this card with Blender 3D to see how it rates for rendering against the Quadro GPUs that cost way more! Does anyone know of websites that benchmark open source software using GPUs? Most of the benchmarking sites are using Maya and other high-priced software, and little open source free software for their tests.

February 19, 2013 | 02:30 PM - Posted by PapaDragon

ALL that POWER from one card, shame about the price. This is my Adobe Premiere Pro and After Effects dream card, plus gaming! Too bad about the price.

Anyhow, a lot of great features, can't wait for the review. Great informative video, Ryan!

February 19, 2013 | 03:23 PM - Posted by David (not verified)

I wish NVIDIA had been a little more aggressive on the price but I still couldn't have afforded it. Just upgraded both my and my wife's system to 7970 GHzs because the bundle deal was too good to pass up.

That said, for all the people who say cards like this are overkill, my current Skyrim install would give this card a run for its money even at 1080p60.

February 19, 2013 | 03:32 PM - Posted by Anonymous (not verified)

Can I mount a 4-way Titan SLI?

February 19, 2013 | 06:44 PM - Posted by Josh Walrath

I don't know, that sounds uncomfortable.

February 19, 2013 | 05:53 PM - Posted by ezjohny

AMD, here is your competition. What do you have in mind?
This graphics card is going to pack a punch, people!!!

February 19, 2013 | 05:59 PM - Posted by Evan (not verified)

What is this? Lucky i got a ps3 last week! it looks better and pre ordered crysis 3 and it has blu ray so its better graphics!

February 19, 2013 | 07:32 PM - Posted by Scott Michaud

Lol... the sad part is there are some people who actually believe that.

February 19, 2013 | 08:20 PM - Posted by Anonymous (not verified)

The sad part is people who go from SD to HD and see no difference.

February 19, 2013 | 06:18 PM - Posted by papajohn

i will take out a loan to buy this card! :)

February 19, 2013 | 07:46 PM - Posted by Wolvenmoon (not verified)

Damn, I wish I could afford it. :'(

February 19, 2013 | 08:31 PM - Posted by razor512

Can you try running Doom 1 on that system when you get the chance? :)

February 20, 2013 | 12:41 AM - Posted by razor512

(Weird, why did it double post)

February 20, 2013 | 12:06 AM - Posted by Anonymous (not verified)

Man, I hope they are readily available; I would like to get at least two of these for SLI. I'm kind of worried about the limited-run comments..?? I am running three 680 SCs on three 27" monitors for 6000×1200, hmmmm.... guess we'll find out what you guys think.

February 20, 2013 | 12:14 AM - Posted by ftimster

By the way, love the PC Per podcast, you guys are hilarious :-)

February 20, 2013 | 02:25 AM - Posted by Mig (not verified)

+1 on the GPGPU benchmarks as well; I'd like to see Octane Render and Blender test results. Also, if you guys can get your hands on a 4K display, see how far you can push 690 SLI and TITAN SLI running Crysis or 3DMark at the higher resolution.

February 20, 2013 | 03:57 AM - Posted by Mangix

Now I get why Nvidia locked the maximum Pixel Clock frequency starting around the ~304 series of drivers. It was all in preparation for this card.

For anyone who wishes to "overclock" their displays, see here:

February 20, 2013 | 08:01 AM - Posted by Skidmarks (not verified)

While many people desire it, few people will buy it. At its projected price it will be redundant.

February 21, 2013 | 07:06 PM - Posted by TinkerToyTech

What are the names of some of the ray-tracing demos that are out there? Like the mirrored orbs that you move around and things like that. One of the graphics card vendors shows them all the time, but I wasn't able to write the names down.

February 26, 2013 | 09:37 AM - Posted by Bpositive (not verified)

I would like to see some CUDA performance tests as well, in particular V-Ray RT.

May 14, 2013 | 07:53 PM - Posted by Mr. wtf (not verified)

Christ.. why wouldn't they enable this feature on the 600 series?!? At least raise the throttle temperature to 80 from its current 70 and call it GPU Boost 2.0.. The stock cooler may get fried, but.. people who have a reference card could easily handle 80 degrees.

August 27, 2013 | 03:13 PM - Posted by mohamed (not verified)

I want to ask a question.
I am about to buy a computer for 3ds Max rendering and design,
and I am very confused between the ASUS GeForce GTX 780 6GB / TITAN and the GTX 690 4GB.
Please help me.
