
The NVIDIA GeForce GTX TITAN Z Review

Author: Ryan Shrout
Manufacturer: NVIDIA


Bioshock Infinite (DirectX 11)


 

BioShock Infinite is a first-person shooter like you’ve never seen. Just ask the judges from E3 2011, where the Irrational Games title won over 85 editorial awards, including the Game Critics Awards’ Best of Show. Set in 1912, players assume the role of former Pinkerton agent Booker DeWitt, sent to the flying city of Columbia on a rescue mission. His target? Elizabeth, imprisoned since childhood. During their daring escape, Booker and Elizabeth form a powerful bond -- one that lets Booker augment his own abilities with her world-altering control over the environment. Together, they fight from high-speed Sky-Lines, in the streets and houses of Columbia, on giant zeppelins, and in the clouds, all while learning to harness an expanding arsenal of weapons and abilities, and immersing players in a story that is not only steeped in profound thrills and surprises, but also invests its characters with what Game Informer called “An amazing experience from beginning to end.”

Our Settings for Bioshock Infinite


Man oh man, Bioshock Infinite. This game still has plenty of stuttering issues on all hardware, but particularly on the NVIDIA solutions. At 2560x1440 the R9 295X2 hits nearly 180 FPS, 25% faster than the GeForce GTX Titan Z. To be fair, the Titan Z still pushes out 140 FPS, more than enough for single-screen gaming. The GTX 780 Ti SLI setup averages more than 155 FPS, though, coming in about 10% faster than the Titan Z.

 


At 4K we see even more frame rate variance and hitching, which is clearly not a good thing. Performance results are significantly closer here as well, with the GTX 780 Ti SLI actually taking the lead at just under 90 FPS on average. The GTX Titan Z and R9 295X2 both land in the 81-83 FPS range, essentially a tie, though the AMD multi-GPU experience is a bit smoother.
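For readers who want to sanity-check the percentages quoted above, here is a minimal sketch in plain Python. The FPS inputs are rounded figures taken from the text rather than the exact chart data, so the computed deltas are approximate:

# Approximate per-card averages quoted in the text (FPS), by resolution.
avg_fps = {
    "2560x1440": {"R9 295X2": 175, "GTX 780 Ti SLI": 155, "GTX Titan Z": 140},
    "3840x2160": {"R9 295X2": 82, "GTX 780 Ti SLI": 89, "GTX Titan Z": 82},
}

def pct_vs_titan_z(fps, titan_z_fps):
    # Relative speed versus the Titan Z, in percent (+ means faster).
    return (fps / titan_z_fps - 1) * 100

for res, cards in avg_fps.items():
    base = cards["GTX Titan Z"]
    for name, fps in cards.items():
        if name != "GTX Titan Z":
            print(f"{res}: {name} is {pct_vs_titan_z(fps, base):+.0f}% vs the Titan Z")

# Prints roughly +25% / +11% at 2560x1440 and 0% / +9% at 3840x2160,
# matching the relative standings described above.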

June 11, 2014 | 11:34 AM - Posted by snook

Google this: 780 Ti SLI vs 295X2.

Simple fact is, 780Ti SLI doesn't beat it.

I'm always at a loss over the "overclock them and they will win" statements. No NV card runs in-game at its base clock; they all boost, all of them. So this isn't 780Ti SLI @ 876MHz vs 295X2 @ 1018MHz; it's more likely 780Ti SLI @ ~1000MHz vs 295X2 @ 1018MHz.

And for all the "295X2 is hot as hell" folks: the 780Ti SLI will be 10C hotter at those clocks.

TL;DR: the cards (290X/780Ti) are clock-for-clock equals, and the 295X2 is the fastest single graphics card...

June 11, 2014 | 05:15 PM - Posted by Anonymous (not verified)

Nope, clock for clock the 780Ti is BETTER than the 290X. In fact, the 780Ti at stock is better than the 290X at stock, with lower clocks.

June 12, 2014 | 10:26 AM - Posted by ThorAxe

Agreed. Increasing clocks on the 780Ti yields a massive increase in performance compared to a similarly overclocked 290X.

June 13, 2014 | 01:22 AM - Posted by RE4PRR (not verified)

Most benchmarks I've seen show the 780Ti SLI beating the 295X2 at every resolution in most games except 4K.

Boost clock is only 926MHz; add another 100MHz, or get two of the GHz editions to make it a fairer comparison, and 780Ti SLI is a guaranteed win.

And let's not ignore the fact that you can overclock the 780Tis even further, to 1200MHz stable gaming on air, while you can barely overclock the 295X2, as has been proven in all other reviews.

June 11, 2014 | 03:43 AM - Posted by Relayer (not verified)

I'd like to see the benchmarks run after the cards are warmed up and the clocks have settled. NVIDIA cards can maintain pretty high boosts for a minute or two, but tend to drop off pretty substantially after that until the temps settle.

June 11, 2014 | 10:12 AM - Posted by Anonymous (not verified)

I'm hoping Nvidia releases a GTX 790.

The vast majority of gamers aren't interested in double precision.

June 11, 2014 | 11:47 AM - Posted by snook

Now this would be sweet, but at what price? $1500 and it's awesomeness for days, and it would be THE card for mITX builds, assuming two-slot compatibility.

June 11, 2014 | 12:21 PM - Posted by Airbrushkid (not verified)

Thanks for no help. I found out that 2 Titan Blacks in SLI are faster than 1 Titan Z.

June 11, 2014 | 03:23 PM - Posted by simieon (not verified)

I wish we could all stop arguing about the Titan line. It is true that there are people who would benefit from this in-betweener, and that it would be better suited under, for instance, the Tesla line.
But NVIDIA made the choice to put this sub-line under its gaming line and actively market it at gamers, which means that is how it should be judged.

June 11, 2014 | 08:47 PM - Posted by Peter_M (not verified)

A couple of Z's in SLI and you basically kiss goodbye to the use of any of your other slots, or am I missing something?

June 12, 2014 | 08:03 AM - Posted by Anonymous (not verified)

Two Titan Blacks for the price of three is the only way to describe the Titan Z.
Actually, two Titan Blacks in SLI will OC better and run cooler.

You would have to be an idiot to get this.



June 11, 2014 | 11:03 PM - Posted by Anonymous (not verified)

While the DirectX 11 performance comparison is appreciated, I think we would also like to see more OpenGL comparisons for balance's sake. AMD has been known not to support OpenGL standards, and a consumer would want to know that a card will run the games they wish to play. This is also important for DirectX 9, which many of the most popular games still use.

When testing games like Crysis 3 and others, could you also post the settings so we know if any NVIDIA-specific features such as tessellation or PhysX are enabled?

I do not favor one brand over another, but I currently choose NVIDIA cards because they support features in the games I want to play, while AMD does not at this time.

On the other hand, if I want a card for mining purposes I would definitely choose an AMD card, so maybe you should cover that kind of performance too. I am sure it is VERY important to some consumers to know that AMD is the CLEAR winner in this category of use in consumer grade cards.

I appreciate your testing methods, but they could still use some improvements by covering a wider range of uses and features each manufacturer provides.

When you test games like Skyrim, you should perhaps also mod the game (with the top graphical mods) the way most users still playing it would, so you can ascertain performance in that scenario rather than in the vanilla game, which few if any are still playing in that form.

June 12, 2014 | 01:43 PM - Posted by Anonymous (not verified)

Well, that's still showing bias regardless. Microsoft stopped giving DX9 library support about 3 years ago, and fewer than 10% of games released still use DX9, because by the time you push DX9 to its limits you will be far more CPU-bound than GPU-bound thanks to the terrible overhead this API has. There are a couple of games with DX9 fallback code, but it's only useful if you don't have DX11/DX10-capable hardware or don't have the performance to run it. So there is no point caring much about DX9, because today's hardware is fast enough to run any DX9 game; heck, I was even able to max STALKER with the Complete Mod, which uses DX9 and is one of the most demanding DX9 applications ever, and it ran over 65fps most of the time on my GTX 560M.

AMD with the latest driver supports OpenGL 4.4, which is no problem; they are slower to adopt OpenGL than NVIDIA, but just as capable. Crysis 3 has tessellation on by default and you can even turn it off manually if you want, but there is no point disabling it, especially when the game is playable on most high-end hardware with no issues. PhysX is a gimmick which will eventually die, as only one or two games that use it are released every year or two. I love the technology, but seeing how underutilized it went, I ended up selling the secondary GTX 650 that I had for PhysX processing.


June 12, 2014 | 08:56 AM - Posted by Anonymous (not verified)

I think they should have gone with a Palit or EVGA 780 Ti, and the Titan Z's board partners will OC it soon, because the 295X2 is overclocked.

June 12, 2014 | 08:59 AM - Posted by Anonymous (not verified)

Results would be more accurate with a 1020MHz Palit 780 Ti, or a GHz edition, because the 295X2 is already factory overclocked.

June 12, 2014 | 10:27 AM - Posted by Trey Long (not verified)

Two OC'd Titan Blacks if you need compute. Two OC'd 780 Tis for gaming. Check out the Inno3D iChill 780 Ti, which runs overclocked in the high 50s (C) on AIR!!! AMD can only dream with their power-hungry space heaters. The Titan Z is for the 'I don't care what it costs' crowd, and NVIDIA has been pretty good at holding that segment.

June 12, 2014 | 03:26 PM - Posted by Airbrushkid (not verified)

2 Titan Blacks at stock are faster than 1 Titan Z at stock. And the Titan Z is basically 2 Titan Blacks in one package.

June 12, 2014 | 06:23 PM - Posted by Razvedka (not verified)

Any info on usable RAM? It's advertised as 12GB, but is all of that usable, or is it merely 6GB mirrored?

June 15, 2014 | 02:22 AM - Posted by Anonymous (not verified)

I don't like AMD going outside the power specifications, but that is the only thing that really makes the product make any sense. Nvidia stayed within the 375 W limit, so they have lower performance than the 780Ti in SLI. Given the high price, there isn't much of a reason for this card to exist.

These dual-GPU cards seem to be mostly for marketing rather than real products. AMD made its dual-GPU card closer to a real product by actually enabling full speed through water cooling. It still doesn't make much sense (higher price for the same performance) unless you are building a space-constrained system. The Titan Z fails even as a marketing/publicity stunt, since it obviously cannot compete with the 295X2 in performance. AMD and NVIDIA want to stay in the public consciousness even when they do not have a (real) new product to release; these cards will be produced in super low volume, but they obviously generate a lot of publicity. Since AMD beat them to market with the 295X2, NVIDIA probably no longer wants this publicity, because it has turned negative.

These dual-GPU cards will not really make sense until they actually connect the GPUs together to let them share memory and act more like a single GPU. To share memory, they need a really high-bandwidth interconnect (~100 GB/s), which is probably doable between chips so close together. I have been wondering if they could just shrink the memory controller and use the pins for interconnect instead, or even make them programmable so the same chip could be used in a single-GPU card with a full-width memory interface or in a multi-GPU card with a narrower memory interface. This would also allow the use of less memory. Designing and producing such a chip may not be feasible right now due to the low-volume nature of the super high-end GPU market, though. I don't think it is that useful for the compute market either, due to the way compute GPUs are being used (parallel, independent tasks).
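The ~100 GB/s figure in the comment above can be sanity-checked with a rough, back-of-envelope estimate. The sketch below (plain Python) starts from the 336 GB/s per-GPU memory bandwidth of a GK110 Titan Black; the 30% remote-access fraction is purely an illustrative assumption, not a measured value:

# Rough estimate of the interconnect bandwidth a shared-memory dual-GPU card
# would need. The remote-access fraction is an assumed, illustrative number.
memory_data_rate_gbps = 7.0   # GDDR5 data rate on a GK110 Titan Black (7 Gbps per pin)
bus_width_bits = 384          # per-GPU memory bus width
local_bw_gb_s = memory_data_rate_gbps * bus_width_bits / 8   # = 336 GB/s per GPU

remote_fraction = 0.30        # assumed share of traffic that targets the other GPU's memory
required_link_gb_s = remote_fraction * local_bw_gb_s

print(f"Local memory bandwidth per GPU: {local_bw_gb_s:.0f} GB/s")
print(f"Estimated interconnect needed : {required_link_gb_s:.0f} GB/s")   # ~101 GB/s

With those assumptions the estimate lands right around the 100 GB/s the commenter suggests; a smaller remote-access share would relax the requirement proportionally.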

June 19, 2014 | 05:50 AM - Posted by vag (not verified)

So I am glad I built my dream machine with three GTX Titan Blacks that cost me as much as one Titan Z. I know they are overpriced compared to the 780 Ti, which is also overpriced, but running them under water gave me huge potential for OC and smooth 4K. It is a dream machine that I don't plan to swap out for the coming years. In my opinion, graphics cards should not be priced above $400; I believe there would still be room for profit, even for the Z, if that were the price target. Anyway, just an opinion.

June 30, 2014 | 06:17 AM - Posted by Gwendolyn1121

good test!

June 29, 2014 | 12:13 PM - Posted by Paul Deemer (not verified)

Actually, after Watch Dogs came out, it has been shown that 3GB at Ultra settings just doesn't cut it anymore when developers push the limits.

I'm ready to build my rig soon but tired of waiting for the GTX 880 with 8GB. If the 880s are not due out until Christmas, I may just go ahead with ASUS Strix GTX 780 6GB cards in SLI.

August 28, 2014 | 07:43 PM - Posted by Broken_Heart

I don't know who made the decision to price this card at $3000. It can't even outperform AMD's $1500 card. This card should be sold for $1500 if not less.

As much as I hate to admit it, I think NVIDIA would be unstoppable if it capitalizes on the Maxwell GPUs. The 750 Ti is an impressive GPU, and that was just the tip of the iceberg.

September 5, 2014 | 12:32 PM - Posted by rikaj (not verified)

They are just about equal while the 780Tis are running at 876MHz. Too bad you can easily get cards that run 1300 or even 1400MHz on water. Just imagine the performance gap they would have with +50% clock speed on the NVIDIA cards :D What a joke.

September 17, 2014 | 05:01 AM - Posted by Nex (not verified)

Note: FLOPS are primarily important to people mining for Bitcoins or folding proteins. They mean very little in the way of gaming.

With that being said, I think the only way the TITAN-Z would pull ahead of other SLI/CrossFire setups is at 8K resolutions, because of its immense 12GB of VRAM and 768-bit bus. Even then, I'm not sure the ROPs, shaders and texture units on the TITAN-Z would be able to keep up with that resolution.

I honestly think these cards' target audience is people planning on mining, and that's how they justify the exorbitant price tag: lackluster gaming performance, but a notable increase in teraFLOPS.
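To put the teraFLOPS talk in perspective, here is a minimal sketch of where the theoretical numbers come from, assuming the publicly listed Titan Z specifications (2 x 2880 CUDA cores, 705 MHz base clock, double precision at one third the single-precision rate):

# Theoretical peak throughput of the Titan Z from its published specifications.
cuda_cores = 2 * 2880            # total CUDA cores across both GK110 GPUs
base_clock_ghz = 0.705           # base clock; boost clocks raise these numbers
flops_per_core_per_clock = 2     # one fused multiply-add counts as 2 FLOPs

fp32_tflops = cuda_cores * base_clock_ghz * flops_per_core_per_clock / 1000
fp64_tflops = fp32_tflops / 3    # GK110 Titan cards run FP64 at 1/3 the FP32 rate

print(f"FP32: ~{fp32_tflops:.1f} TFLOPS")   # ~8.1 TFLOPS
print(f"FP64: ~{fp64_tflops:.1f} TFLOPS")   # ~2.7 TFLOPS

That roughly 2.7 TFLOPS of double precision is the part compute and folding users pay for; as the comment notes, it does little for game frame rates.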

November 7, 2014 | 01:07 PM - Posted by Anonymous (not verified)

I've seen a bunch of vendors dropping the price significantly on the Titan Z; Alienware had a Titan Z half-off promotion and now CyberPower is giving $500 off the price. Has this made this card a better value? I'm trying to price a new system and have about $4,000 to spend. I do not want to put it together myself or support it, I am laaaaaazy.
