The NVIDIA GeForce GTX TITAN X 12GB GM200 Review

Manufacturer: NVIDIA

GM200 Specifications

With the release of the GeForce GTX 980 back in September of 2014, NVIDIA took the performance lead among single GPU graphics cards. The GTX 980 and GTX 970 were both impressive options: the GTX 970 offered better performance than the R9 290, as did the GTX 980 compared to the R9 290X, and both did so while drawing less power and adding new features like DX12 feature level support, HDMI 2.0 and MFAA (multi-frame antialiasing). Because of those factors, the GTX 980 and GTX 970 were fantastic sellers, helping to push NVIDIA’s market share over 75% as of the 4th quarter of 2014.

But in the back of our minds, and in the minds of many NVIDIA fans, we knew that the company had another GPU it was holding on to: the bigger, badder version of Maxwell. The only question was WHEN the company would release it and sell us a new flagship GeForce card. In most instances, that decision is based on the competitive landscape, such as when AMD might finally update its Radeon R9 290X Hawaii family of products with the rumored R9 390X. Perhaps NVIDIA is tired of waiting, or maybe the strategy is to launch before the Fiji GPUs make their debut. Either way, NVIDIA officially took the wraps off of the new GeForce GTX TITAN X at the Game Developers Conference two weeks ago.

At the session hosted by Epic Games’ Tim Sweeney, NVIDIA CEO Jen-Hsun Huang arrived just as Tim lamented the need for more GPU horsepower for Epic's UE4 content. In his hands was the first TITAN X card, and he revealed only a couple of specifications: it would have 12GB of memory and would be based on a GPU with 8 billion transistors.

Since that day, you have likely seen picture after picture, rumor after rumor, about specifications, pricing and performance. Wait no longer: the GeForce GTX TITAN X is here. With a $999 price tag and a GPU with 3072 CUDA cores, we clearly have a new king of the court.

Continue reading our review of the NVIDIA GeForce GTX Titan X 12GB Graphics Card!!

GM200 GPU Specifications

The basis for the new GeForce GTX TITAN X is NVIDIA’s GM200 GPU. This large, beastly part is based on the same Maxwell architecture that we know and love from the GTX 980, GTX 970 and GTX 960, and it is built on the same 28nm process technology as those GPUs. NVIDIA is not moving to another process node quite yet, despite rumors that AMD will migrate to 20nm for its next flagship GPU.

The GM200 includes 3072 CUDA cores, 192 texture units and a 384-bit memory bus. Clearly this GPU isn’t bluffing; it has some power.

                   TITAN X       GTX 980       TITAN Black   R9 290X
GPU                GM200         GM204         GK110         Hawaii XT
GPU Cores          3072          2048          2880          2816
Rated Clock        1000 MHz      1126 MHz      889 MHz       1000 MHz
Texture Units      192           128           240           176
ROP Units          96            64            48            64
Memory             12GB          4GB           6GB           4GB
Memory Clock       7000 MHz      7000 MHz      7000 MHz      5000 MHz
Memory Interface   384-bit       256-bit       384-bit       512-bit
Memory Bandwidth   336 GB/s      224 GB/s      336 GB/s      320 GB/s
TDP                250 watts     165 watts     250 watts     290 watts
Peak Compute       6.14 TFLOPS   4.61 TFLOPS   5.1 TFLOPS    5.63 TFLOPS
Transistor Count   8.0B          5.2B          7.1B          6.2B
Process Tech       28nm          28nm          28nm          28nm
MSRP (current)     $999          $549          $999          $359

Essentially, the GTX TITAN X’s compute structure is exactly a 50% boost over the GeForce GTX 980, with just a slight reduction in clock rates at stock settings. You get 50% more processing cores, 50% more texture units, 50% more ROP units, a 50% larger L2 cache and a 50% wider memory bus. Dang – that’s going to provide some impressive computing power, resulting in a peak theoretical throughput of 6.14 TFLOPS single precision.

During the keynote at GTC, NVIDIA's CEO quoted the single precision compute performance as 7.0 TFLOPS, which differs from the table above. The 6.14 TFLOPS rating is based on the base clock of the GPU, while the 7.0 TFLOPS number is based on a "peak" clock rate. Also, just for reference, at the rated Boost clock the TITAN X is rated at 6.60 TFLOPS.
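If you want to check the math yourself, the single precision rating is just cores x clock x 2, since each CUDA core can retire one fused multiply-add (two floating point operations) per clock. A quick back-of-the-envelope sketch in Python:

    # Theoretical single precision throughput: cores x clock x 2 FLOPs (one FMA per clock).
    CUDA_CORES = 3072

    def sp_tflops(clock_mhz):
        return CUDA_CORES * clock_mhz * 1e6 * 2 / 1e12

    print(sp_tflops(1000))   # base clock  -> 6.144 TFLOPS (the 6.14 in the table)
    print(sp_tflops(1075))   # Boost clock -> 6.605 TFLOPS (the 6.60 quoted above)
    print(7.0e12 / (CUDA_CORES * 2 * 1e6))  # keynote's 7.0 TFLOPS implies a ~1139 MHz peak clock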

A unique characteristic of the TITAN X is that it does not have an accelerated configuration for double precision compute, something that both the TITAN and the TITAN Black had before it. Double precision performance remains at a 1/32nd ratio (relative to single precision), which gives the TITAN X a DP compute capability of just 192 GFLOPS. For reference, the TITAN Black has DP performance rated at 1707 GFLOPS thanks to a 1/3rd ratio of the GPU’s 5.12 TFLOPS single precision capability. It appears that NVIDIA is not simply disabling the double precision compute capability on the GM200 GPU, hiding it and saving it for another implementation; based on the die size, shader count and transistor count, it looks like GM200 just doesn't have it. NVIDIA must have another solution up its sleeve for Maxwell double precision compute.
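The same per-clock arithmetic covers double precision; only the single precision rating and the DP ratio change:

    # DP throughput = SP rating x DP ratio.
    def dp_gflops(sp_tflops, dp_ratio):
        return sp_tflops * 1e12 * dp_ratio / 1e9

    print(dp_gflops(6.14, 1 / 32))  # TITAN X:     ~192 GFLOPS
    print(dp_gflops(5.12, 1 / 3))   # TITAN Black: ~1707 GFLOPS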

Oh, and yes, before you ask me in the comments below: I directly asked NVIDIA about memory configuration concerns, in light of the GTX 970. The TITAN X does not have any divided memory pools, and the 384-bit memory bus is not sectioned off into sub-groups that may or may not run at slower throughput. So, there’s that.

The GM200 is built from 24 SMM modules, and this is the full GPU implementation, so you should not expect another variant to show up down the road with anything more than 3072 CUDA cores unless the company spins a completely new GPU revision. Because the GPU is still built on TSMC's 28nm process and packs 8 billion transistors, this is not a small component. Measured die size was 25mm x 25mm, or 625mm2. The previous largest GPU we had seen, the GK110, used 7.1 billion transistors and had a die size of around 561mm2. (Note that NVIDIA sets the die size at 601mm2 based on measurements of 24.66mm x 24.38mm.)

Clock speeds on the TITAN X are lower than the GTX 980's at stock, as you would expect, but the differences aren’t drastic: a base clock of 1000 MHz and a rated Boost clock of 1075 MHz, against GTX 980 reference clocks of 1126 MHz and 1216 MHz respectively, a decrease of roughly 11-12%. The memory clock still starts at 7.0 GHz, the same speed as the GTX 980 and even the previous GTX TITAN Black. The memory configuration is improved with 50% higher peak bandwidth, hitting 336.5 GB/s, which is higher than the 320 GB/s rating of the AMD Radeon R9 290X flagship.
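Those bandwidth figures are simple arithmetic as well: effective memory data rate times bus width in bytes. A quick sketch using the table's numbers:

    # Peak bandwidth = effective data rate x bus width in bytes.
    def bandwidth_gbs(data_rate_mhz, bus_width_bits):
        return data_rate_mhz * 1e6 * (bus_width_bits / 8) / 1e9

    print(bandwidth_gbs(7000, 384))  # TITAN X / TITAN Black -> 336 GB/s
    print(bandwidth_gbs(7000, 256))  # GTX 980               -> 224 GB/s
    print(bandwidth_gbs(5000, 512))  # R9 290X               -> 320 GB/s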

Speaking of memory, the TITAN X will ship with 12GB. Gulp. That is 3x the memory found on the GTX 980 and 2x that of the GTX TITAN Black, which seemed crazy (at 6GB) when it shipped in February of 2014. I can already hear the debate, and no, 12GB of memory is not going to be beneficial for gamers for quite some time – likely a time span even further out than the life of this GPU, to be fair. With the recent debate around the 3.5GB and 4GB frame buffers, we saw that most games available today, even when pushed with settings intended to increase memory usage, rarely extend beyond 4GB. If you want to think really crazy, let’s assume you are planning on 4K Surround gaming with three displays; you might be able to stretch that to 7-8GB if you try really hard. Of course there is a secondary audience for this card that focuses on GPGPU compute rather than gaming, where that 12GB of memory could become useful much more quickly.

The only negative change on the GM200 (compared to GM204) is its rated TDP. At 250 watts, the GTX TITAN X is clearly going to draw more power and run hotter than the GTX 980 with its 165 watt TDP. (Of note, this 51% increase in TDP is in line with the other specification changes.) Keep in mind that the Radeon R9 290X has a TDP of 290 watts, though we have measured it higher than that in our power testing several times. Both the new TITAN X and the R9 290X require a 6+8 pin power connection, so it will be interesting to see how real-world power consumption varies between these two GPUs.

The new GTX TITAN X shares the same feature set and capabilities as the GTX 980. That includes support for MFAA, VXGI acceleration, Dynamic Super Resolution, VR Direct and more. For more information on those features and what they bring to the table, check out the links below.

Now, let’s dive into the design of the GTX TITAN X graphics card itself!

March 17, 2015 | 03:07 PM - Posted by Anonymous (not verified)

Not too bad. I really like the look of it. 4k performance is pretty good, but still not capable enough, not for $1000.

March 17, 2015 | 03:23 PM - Posted by CrisisHawk

Seems 4K is still very much the domain of SLI/Crossfire configurations, at least if you want high settings at 60 fps.

March 17, 2015 | 05:07 PM - Posted by Polycrastinator (not verified)

All depends on managing expectations. I ran 4K on a single 760 for a while. Fell on its face if you enabled MSAA, but on low-medium settings without, you could mostly maintain 60fps (to be clear, I bought the monitor for the clarity while running multiple VMs in my day-job, not for gaming, and anyone doing a lot of work on the desktop, I'd recommend it). With 4K, lines are sharp enough that losing MSAA isn't the annoyance it is at lower resolutions. I'm now on a single 780, and I can run FXAA/MSAA with High-Ultra on most games. It looks pretty damn good. You just need to understand that 2xMSAA or 4xMSAA is going to make things unplayable fast.

March 23, 2015 | 09:07 AM - Posted by obababoy

You make good points that 4K is playable with lower cards. The only issue I have is that I value Ultra settings over 4K. I would rather see all of the visual eye candy that was developed with the game engine at 1440p or 1080p than run at 4K, and that can be had nowadays with any single card at $350 and below. I just don't think 4K is worth it now.

March 17, 2015 | 03:12 PM - Posted by jharderson

What's the probability of a GTX 980 Ti that uses the GM200?

March 17, 2015 | 03:53 PM - Posted by Zorkwiz

I'd put money on a 6GB 980Ti based on this chip right around when AMD comes out with the 390X later this spring/early summer. I've read rumors of a ~$700 price for that. That would be in line with the 780Ti launch. I hope we see $500-550 980Tis by the holidays.

March 17, 2015 | 03:19 PM - Posted by pdjblum

I guess anybody that can afford this will not mind that Pascal, which Huang says will significantly outpace Maxwell, will be released in 2016.

March 21, 2015 | 05:24 AM - Posted by BlackDove (not verified)

I was thinking about a year ago that GM200 would be 20nm and would be a similar improvement over GK110 as GK110 was over GF100.

Instead, Nvidia kept the GM200 on 28nm and decided to completely omit DP performance.

GM200 is a compromise from the start. Its basically Nvidias Haswell refresh.

The fact that GK210 and the K80 exist is further evidence of this.

I guess they need to release something between now and 2016 though. In reality this GPU seems to be the successor to the 780ti 6GB and not the Titan at all.

Im usually called an Nvidia fanboy and i can see these things selling well but i dont think its nearly as impressive as the Titan Black. Im actually glad i got such a good deal on my 780 when i did. This GM generation of GPUs isnt very impressive by comparison even though theyre faster.

GP100 should be the next GK110.

March 17, 2015 | 03:23 PM - Posted by boot318

Great card! Great review! AMD could learn a thing or two on power consumption from Nvidia. I'm going to hate seeing AMD's 600mm2(ish) card. :(

March 17, 2015 | 03:48 PM - Posted by Ophelos

Not really at all. This has been explained so many times over years and years of GPU reviews.

The higher the VRAM bit rate, the more power it needs. And this is also why AMD does so much better in 4K gaming benchmarks than Nvidia.

Google VRAM to get a lot more tech details on this. So stop complaining.

March 17, 2015 | 05:06 PM - Posted by trenter (not verified)

According to leaks the 390X will outperform Titan X and pull only 20 watts more power. These are the same sites that leaked accurate Titan X specs many months ago, so the 390X leaks are most likely legit. Now what were you saying?

March 22, 2015 | 04:41 AM - Posted by Gammett (not verified)

WE are saying that you are a lying bitch, because you said "months ago" when we didn't know the dang thing existed until recently. Plus the specs on new high end GPUs are easy to guess. Also, one correct estimate does not mean a site is credible at all, and the claim concerns different card vendors; what if they only have an Nvidia source but not an AMD source? That would make anything AMD-related on their site look like it was pulled out of their asses.

March 17, 2015 | 03:24 PM - Posted by Airbrushkid (not verified)

Comparing the Titan X to the 295X is like comparing apples and oranges! No matter what, the 295X is a dual GPU card and the Titan X is ONE GPU!! Now take 2 Titan X cards against the 295X and there will be a big difference!!

March 17, 2015 | 03:29 PM - Posted by boot318

There will be a big difference in price also.

The X almost matches the 295x2 @4k when overclocked. Pretty impressive if you ask me.

March 17, 2015 | 05:18 PM - Posted by Airbrushkid (not verified)

If price is a problem then get out of pc gaming and buy a console.

March 17, 2015 | 07:21 PM - Posted by Daniel Nielsen (not verified)

Oh please, you should probably get off the internet with that lame comment.

March 18, 2015 | 12:27 AM - Posted by Airbrushkid (not verified)

It looks like you're another one that needs a console and should dump your PC.

March 18, 2015 | 03:47 PM - Posted by Anonymous (not verified)

You're an idiot. The conscientious buyer has always ruled the PC, not the brain dead moron who thinks bigger price = better. Don't bring the console vs PC crap into this shitstorm of idiocracy you're pushing.

Doesn't matter if Titan X is a single slot single GPU design if it costs 30% more than a single card dual GPU solution that matches its performance. Charging +80% over the previous card for a product that gets +30-40% performance definitely forces this into an Apples to Apples argument.

March 18, 2015 | 05:11 PM - Posted by Anonymous (not verified)

It has always been the case that the price/performance gets significantly higher at the high end. Look at Intel's CPUs. For the highest end consumer part it is over $1000. Is it twice the performance of a $500 part?


March 27, 2015 | 02:02 AM - Posted by SiliconDoc (not verified)

i AGREE with you.

The little whining gasbags are wailing about the most expensive toppest end consumer gaming gpu in the entire world...

None of the whiners are ever going to buy one, for any reason.

It's like taking a t-shirt and worn jeans 99%er out of their flop tent in protest park because they demand to go to the car show with you paying for it all and driving, then they get there and wail and moan the Lambo is overpriced...

This is what it's like, they do it every time, every time they wail and moan on it.

March 17, 2015 | 03:30 PM - Posted by Zach Hafen (not verified)

yeah... but the 295x2 costs less and still performs better

March 17, 2015 | 05:17 PM - Posted by Airbrushkid (not verified)

It only performs better because it has 2 GPUs; if it had 1 it would not perform worth crap.

March 18, 2015 | 07:22 AM - Posted by Someguy (not verified)

Then it would be a 290X then, wouldn't it? Great job informing us though; AMD might have slipped that one past us if you weren't on the job.

March 18, 2015 | 07:19 AM - Posted by Someguy (not verified)

Yeah compare $2k to $699 great idea. The downsides of the 295x2 due to it being a dual GPU have been fully covered in the article.

April 26, 2015 | 12:13 PM - Posted by Anonymous (not verified)

When the 295x2 came out it was a lot more than $699. Get over it.

March 17, 2015 | 03:46 PM - Posted by godrilla (not verified)

Any word on the compute performance vs titan black?

March 22, 2015 | 04:45 AM - Posted by Gammett (not verified)

It wouldn't be good, because it does not have the double precision the old Titans had. But what do I know? They came out with 4-way Titan X compute boxes built by Nvidia.

March 17, 2015 | 03:51 PM - Posted by X-mass (not verified)

Confusion - Typo or badly signaled joke?

is this suppose to be a joke?

"So much in fact that I am going to going this data the PCPER ISU"

did you mean

"So much in fact that I am going to call this data the PCPER ISU"

Or did you mean it to be a joke by stuttering in the sentance

and just didn't signal that it was a visual joke?

I'm not certain hence confusion and this question?

March 17, 2015 | 04:04 PM - Posted by X-mass (not verified)

another small typo in the conclusion

"The Titan X is built of gamers"


"The Titan X is built for gamers"

sorry to be a pendant - my spelling and gramme are often far, far worse

March 19, 2015 | 01:07 AM - Posted by Anonymous (not verified)

sorry to be a

      - my spelling and gramme are often far, far worse.

Now there is what I call a giant typo that left the writer hanging.

March 17, 2015 | 04:26 PM - Posted by trenter (not verified)

Gtx 780 ti also pulled higher than 300 watts in some situations and power consumption was very near the 290x. Why not compare titan x to 780ti/titan black instead of getting your jabs in on amd? Change your name to Nvidia PcPR and it'll make more sense. You've already got the green in your logo and everything.

March 17, 2015 | 05:02 PM - Posted by trenter (not verified)

"Of course there is a secondary audience that focuses on gpgpu compute", this card is hardly capable of gpgpu without the double least any more so than what we already have on the market. Most of those are far cheaper than titan x. Why does this get a gold award? Because it looks pretty and costs a lot so it must be worth it I guess. I would buy this at $799, there is no reason for this to cost $1000.

March 23, 2015 | 09:14 AM - Posted by obababoy

Agreed. $750-800 tops.

March 17, 2015 | 05:12 PM - Posted by Anonymous (not verified)

"GM200 based GTX Titan Z... with 12 Gigs of memroy."
That's what the front page cover says. Must be pretty tired of playing with awesome hardware, Ryan.

March 18, 2015 | 12:43 PM - Posted by Ryan Shrout

Whoops, thanks!

March 19, 2015 | 10:31 AM - Posted by Irishgamer01

My compliments on getting a dig in about the price in your live video about the Titan. Didn't think you would mention it. Very diplomatic...

March 17, 2015 | 05:21 PM - Posted by X-mass

I left a note in another place about the competition but before we had been given the secret word I was flagging the typos here

have i stopped myself from being included?

If so that's life but could you flag that one shouldn't leave comments until you have the secret word, in any future competion
and yes I know what the secret word is and it isn't red cause that would be far too AMD

March 17, 2015 | 05:22 PM - Posted by xxMR.IVOxx

hey ryan i saw ur live stream for the titan x giveaway, came here from youtube since chat was disabled, so im new to ur website

March 17, 2015 | 05:40 PM - Posted by Master Chen (not verified)

Dead-on-arrival. Pointless. Useless. Dead.
Especially for that money.
You'll be a completely retarded idiot to buy this now, even if you're a heavy pro-nGreedia brainwashed fanboy.

4GB R9 390X already beats both it and the upcoming 980 Ti, not even mentioning the upcoming 8GB 390 X. 4GB R9 390X beat both the Titanic X and the 980 Ti even without new optimized drivers and other bonuses (like Vulkan/DirectX 12/Mantle, HSA, or stacked memory), while also obviously costing less money.

Titanic X, as well as 980 Ti, are a complete and absolute failure. It's very apparent and obvious now that TSMC screwed nGreedia truly hard. Much harder than we've thought initially. Even though nGreedia has more dough/better stock than AMD, it's very clear that they're seriously at least one whole year behind AMD now, both node- and technology-wise. And it's very easily understandable that AMD will milk each and every last drop out of this opportunity that has been presented to it. 2015~2016 is completely the Radeon time now. You'll be a total moron if you actually in all seriousness try to deny this fact in any way whatsoever, and you'll also be a complete idiot if you actually, in all seriousness, buy this dead-on-arrival Titan X crap.

Also, please remember this:
FOUR 980s BARELY beat one 4GB 390X. That's how much more powerful 390X is. And it costs less money. There's literally no reason for Titan X and 980 Ti existence. Absolutely. Completely. Totally. Undoubtedly.

March 17, 2015 | 08:56 PM - Posted by wolfkabal (not verified)

What about for non-gaming uses, like GPU computing, or pure rendering? I'd say this is a great card for those needs. So not exactly "DOA"

March 17, 2015 | 09:55 PM - Posted by Master Chen (not verified)

There's one slight HUEG problem with that though. Go look up how the absolute majority of noVideots treat Huang's Titanics and how nGreedia ITSELF promotes and advertises this crap.
It would've been absolutely fine if these were purely workstation offerings. But they're seriously pushing them as gaming cards for gamers. Do you see the problem? Do you see it, huh? I surely do. And it's a HUEG one.

March 18, 2015 | 12:18 AM - Posted by Sonic4Spuds

This is actually a big win for nVidia, seeing as they will sell some to gamers as well as the compute oriented people who would be buying them anyways.

March 18, 2015 | 04:05 AM - Posted by Master Chen (not verified)

Um...I'm not afraid to stomp on your balls, so I'm telling you this right now: it turned out this piece of trash is not even good for work. Because 0.2TF of Double Precision. This shit's not good for ANYTHING, lel. Not good for games, not good for computing. This is literal DEAD.

March 19, 2015 | 01:51 PM - Posted by Anonymous (not verified)

Why don't you use correct spelling and grammar in an argument instead of looking like a 10 year old who just discovered the internet on his dad's computer.

March 21, 2015 | 02:27 AM - Posted by Master Chen (not verified)

STFU. lamer.

March 27, 2015 | 02:08 AM - Posted by SiliconDoc (not verified)

Who would want a crap 4 gig piece of junk amd oldcore ?


Now there's no 3.5 gig whines !

12 gigs and no slant there !

BWHAHAHHAAHHA amd is dead.

March 17, 2015 | 09:45 PM - Posted by Mandrake

Nice try, you paid Nvidia shill! Trying to claim that these "GTX 980" and "Titan X" products are somehow real! Charlie Demerjian has stated that Nvidia will actually never be able to launch these so called 'products' because they'll have no money and will therefore be finished for good:

AMD 390X has ALREADY launched with 4TB of ram and 5000x faster than your so called 'titan x' video card that certainly does not exist. This will also use less than 3W of power obviously.

Now how does it feel, being called out for the paid Nvidia shill that you are?!

March 17, 2015 | 10:03 PM - Posted by Master Chen (not verified)

Thanks for the laughs, d00d. No, seriously, that was a pretty good one.

BTW FYI: I never used any offerings from nGreedia in my personal PC builds (which I have six of, in my house at this right moment) pretty much all the way since the GODLIKE GTX 285, the currently last GTX card I've bought without feeling like I'm gonna vomit. GTX 285 (the very first Twin Frozr) was the last truly great, absolutely worthwhile Nvidia card. Anything and everything else nGreedia shat out after that, was not decent enough for me to be interested in it highly enough. Yes, I've been using Radeons for a long time now, but that doesn't make me "shill" or a "fanboy" as you might think there, because I've had enough practice with using products from both of the sides. Before the GODLIKE GTX 285, I've been using GS 7100, and before that - GeForce 2 MX 400 (and before that - some VooDoos). After the GODLIKE GTX 285 my route went HD 4730 (TUL) -> HD 6850 (Gigabyte Windforce) -> HD 7870 (MSi HAWK) -> R9 270X (TUL Devil) -> R9 290X (Sapphire Tri-X Vapor-X 8GB). And I feel absolutely great about it. Because nGreedia really let me down after GTX 285, while Radeons worked absolutely fine for so many years since the day-one I've started using them. I'd gladly try any new Huang card IF THEY WERE TRULY WORTH OF MY ATTENTION, but, so far, there was nothing there but just one massive letdown after the other from the green side.

March 17, 2015 | 10:21 PM - Posted by ThorAxe

That 4870X2 was a real winner and 6870 Crossfire was even better! Glad I bought them. AMD/ATI has never let me down.

March 17, 2015 | 10:31 PM - Posted by Master Chen (not verified)

I really don't know if I'm supposed to take this as a sarcastic remark or not.

March 17, 2015 | 11:05 PM - Posted by ThorAxe

It was absolutely awesome paying twice the price for the same performance of a single GPU. I would sit wondering why frame rates looked good but performance was a stuttering mess.

March 17, 2015 | 11:16 PM - Posted by Master Chen (not verified)

So you ARE a troll after all, huh...good I've clarified. Now I know how to treat you.

March 18, 2015 | 12:50 AM - Posted by ThorAxe

Yes, if telling you about a real experience is trolling.

I dare say I have owned far more ATI/AMD cards than you.

Perhaps I am just trolling a troll?

March 18, 2015 | 04:09 AM - Posted by Master Chen (not verified)

My deepest condolences to your family/relatives, kiddo.

March 18, 2015 | 10:30 PM - Posted by ThorAxe

If I am trolling the question is who am I trolling for given the cards I have owned?

Voodoo 2 12mb SLI
Riva TNT
GeForce Pro
GeForce 2 Ultra
GeForce Ti 200
GeForce 4 4400
Radeon 9700 Pro
Radeon 9800 Pro
Radeon X850 XT
Radeon 1800XT
Radeon 1900XT
GeForce 8800GTX SLI
Radeon 4870X2 + 4870
GeForce GTX 570 SLI
GeForce GTX 680 SLI
GeForce GTX 980

Feeling old..

March 23, 2015 | 09:21 AM - Posted by obababoy

Master Chen....You are a turd. You must be paid by NVIDIA to sound like a total idiot AMD fan. As an AMD supporter myself I am ashamed to be on the same "side" as you.

March 23, 2015 | 09:27 PM - Posted by Master Chen (not verified)

I'm on neither side, you lame stupid schmuck.

March 17, 2015 | 10:06 PM - Posted by ThorAxe

Cool! Where can I buy a 390X?

March 17, 2015 | 10:29 PM - Posted by Master Chen (not verified)

May/June. Across the world. Pricing? ~800$ depending on region and model.

March 17, 2015 | 10:58 PM - Posted by pdjblum

It would be awesome if amd can one up these greedy motherfuckers. But $800 for a flagship is just as obscene to me. Of course I will say that because I cannot afford them and because, unlike great audio equipment that always sounds great, these things become middle of the pack in a few years as games become more demanding and monitors become more dense. But I do appreciate and agree with your sentiments.

March 17, 2015 | 11:17 PM - Posted by Master Chen (not verified)

That "~800$" was actually meant to point out price for 8GB version, though.

March 17, 2015 | 11:10 PM - Posted by ThorAxe

That's too long to wait.

The R9 390X will be DOA since Pascal will be out not long after that and have at least 5 times the performance.

March 17, 2015 | 11:18 PM - Posted by Master Chen (not verified)

Hold your panties, Crysisboy.

March 18, 2015 | 07:53 AM - Posted by AMDfanboi (not verified)

LOL, i love it, and I thought i was an AMD Fan Boi.

March 18, 2015 | 04:11 PM - Posted by Master Chen (not verified)

Did you read what I've said previously?

March 18, 2015 | 12:27 PM - Posted by MegalodoN (not verified)

+1 For your knowledge about nVidia Propaganda (its not a Marketing after nV 970 Lie/fiasco :D )
Still good Fermi Rev.6 !! And really nothing more.
I know cuz im a engineer (i know whats inside of this, so called new tech: Fermi/Keppler/Maxwell etc. all is based on same OLD Architecture -> tesselators, cache and Block build is the same (of course with major Tweaks ;-)
I think AMD is better with the Superb GCN -> and Next-Gen R390X (i will have it 100% :D will be real game changer -> read Trully for 4k and beyoun Gaming, on 1 R3xx with HBM.
So the nGreedia Trols can go to cave now ;-)

March 18, 2015 | 04:14 PM - Posted by Master Chen (not verified)

I find it VERY laughable that only NOW, when 970 3.5GB scam opened up, noVideots were "shocked" and "confused" out of their sorry marketing victim asses, even though nGreedia did EXACTLY same crap in the past with 6xx.

March 18, 2015 | 05:45 PM - Posted by MegalodoN (not verified)

Yeah the in-Famous 660Ti Lie/Fiasco lol
But i have Great Good nV in my shelf -> GTX 285 1.2GB 512Bit Monster DX10 !! that was a "mess" :-) And before that i have GF 6600 for NFS Underground 2 with latest DX9.c SM 3.0 ! ATI was a mess those times Only 9800 with old 9.b :-(
But in the end AMD/ATI is my favorite GPU Maker tis dayz.
The 3850 ! the 7970Ghz now 280X 1075/1650 1.2v Whooa.
And now im waiting for True Power -> R390X HBM WC 8GB The King o't'Hill. the 4k & 5k Mania will Begin shortly (maby 1 or 2 months)
VooDoo was in'da'house with Virge 4MB :-)
Yeah my bro, We are Old Dogs :D We know what is created for Gaming and what is not meant to be Played(nGreedia) but Meant to be milked to tha last drop....
Yeah mby you tell another gaming brothers why Green Games are MESS ;-) The game Works DLL's Fiasco !
BatMan and heavy tesselated coat (64MB in DLL For shader only to look good in benches compared to ATI :-(
The tesselated Blocks (Cubes lol) in Crysis2 and of course invisible tesselation in that game also (disabled on nV cards lol)
And many more Bad deeds from nGreedia.... Makes me PUKE
Only $$$$$
G-sync (fiasco)
GameWorks (fiasco) even on nV cards lol
CUDA, PhysX (death to AGEIA, no respect for creators)

March 18, 2015 | 06:13 PM - Posted by MegalodoN (not verified)

And another interesting feature from nG
You know what nV970 really is?
He He the Old Good 780Ti 3GB GDDR5 + 1GB DDR3 ! (not 0.5GB my bro)
Cuz in some scenarios problem was seen when VRAM exceed 3GB !
Some sooner, some later, depends on scenario ;-)
So we have "Maxwell" 970 -> but real name is 780Ti with added 1GB DDR3 but on the BOX and in tha sop you have LIE !
it is 4GB GDDR5 :-(
But Fermi is too old for me ( -_- ) whatever the pseudo-marketing nG tell You...
Now New Brain wash with Titan-X
But is from not usable Quadro chips lol
New Box, New Propaganda and here we go. "Best" GPU (only $1k ;-)
With cut down Power needs to maintain P/Wat new Propaganda.
But electricity is cheap...
So the 290X Sapphire or XFX with 8GB is for the Gamers NOW !
Wanna play 4k? Go buy 295X2 or wait for R390X HBM...
That's ool folks

March 19, 2015 | 02:23 AM - Posted by Master Chen (not verified)

Um...since when there's 8GB XFX? I don't remember that. If my memory is right, there's only Sapphire out there who makes 8GB 290Xs right now.

March 19, 2015 | 06:28 PM - Posted by MegalodoN (not verified)

XFX R290X Black 8GB
Maby is on Sale already, dunno
but it Exist 4 100%

March 21, 2015 | 02:29 AM - Posted by Master Chen (not verified)

Found it on Newegg. Wow, that's a first. I seriously didn't know about this one up until now.

March 21, 2015 | 05:31 AM - Posted by BlackDove (not verified)

You mean like AMD fanboys who got worse performance with Crossfire than with a single GPU for years until AMD's lies were exposed?

March 21, 2015 | 04:13 PM - Posted by Master Chen (not verified)

I was always highly against the multi-card approach because I know everything about the problems this method causes, regardless of whether it's CFX or SLI. You really shouldn't particularize - they're both crap. I've always used only single-card solutions in my personal PC builds, so even if the HD 6xxx line had some problems with CFX scaling or frame pacing (I think that's what you're referring to?), I wouldn't know. The only 6xxx series card I've ever had was the HD 6850 Windforce OC from Gigabyte, and it was THE best 6850 out there, out of all others present on the market; it performed like an absolute BEAST, so I wasn't in any need of going HD 6870 or even HD 69xx, all the way until the HD 7870 HAWK was released.
I always try to get the best one out there, you see. I thoroughly examine offerings from all manufacturers out there and inside each particular manufacturer I always compare different offerings of different models (if any) of theirs, before I decide to buy any particular one. And, usually, I tend to get the best out of them all. For the HD 6850 line that was Gigabyte Windforce OC, for the HD 7870 line that was MSi's HAWK, for the R9 270X line it's TUL's Devil and for the R9 290X line it's Sapphire's Tri-X Vapor-X 8GB. That's what I've been getting lately. And never got mistaken even once, so far. Got the best single-card performance out there, all the time, so was in no need of going CFX. And now I plan to get the best 8GB R9 390X available out there, after it gets released and quality nonreference third-party solutions start flowing in (unless the reference offering turns out to be just that much better, I'll always get nonreference), so that I'll be getting the best single-card performance out there once again. CFX became pretty good recently, at least that's what I've heard from people utilizing it, but I'm personally not ready yet to be using it in my personal rigs. I can build it for others, and I did that in the past already, but for me personally, in my own machine, that won't be anytime soon. I'm just against the approach itself.

March 21, 2015 | 09:22 PM - Posted by BlackDove (not verified)

Im talking about the issues covered on both pcper and tech report for the last few years.

SLI has hardware level frame time metering and actually works.

Even XDMA Crossfire doesnt work with DX9 and Crossfire now has software frame pacing after pcper and tech report demonstrated that Crossfire gave worse performance than a single card in almost every game tested.

I dont like multi GPU either but AMD was told for years that Crossfire didnt work and it wasnt until customers complained because pcper and tech report exposed their lies that they decided to fix it.

March 23, 2015 | 12:30 AM - Posted by Master Chen (not verified)

I hope that you DO know that several of the lately released batches of Radeon drivers (mainly Omega and different WHQLs) improved CFX's quality significantly?

March 23, 2015 | 01:01 AM - Posted by BlackDove (not verified)

Not nearly as well since its just software implemented frame metering. And DX9 games still dont work with Crossfire.

You knew that right?

March 23, 2015 | 07:06 AM - Posted by Master Chen (not verified)

Keep telling yourself that, I guess. Nobody would take that away from you. Just remember: DefectX 9 (just as the 11-th is) is a piece of shit itself by default. CFX/SLI has little to do with it.

March 23, 2015 | 08:56 PM - Posted by BlackDove (not verified)

Keep telling myself what?

This website and Tech Report, as well as pretty much every other tech website that followed their lead has proven that Crossfire is inferior to SLI in every way.

Crossfire was only partially patched with software implemented frametime metering, which doesn't work nearly as well as Nvidia's hardware level frame metering.

It still doesn't work as well as SLI even after all the patches, because it's SOFTWARE FRAME METERING, which was only added to Crossfire after YEARS of people telling AMD that Crossfire gives WORSE PERFORMANCE than a single card.

So you can say whatever you want about how horrible Nvidia is, but it doesn't change the fact that AMD lied about their multi-GPU being functional at all for years, and even now they still haven't patched DX9 games.

That means you have to disable Crossfire and only use one card just to play a DX9 game, not because it won't give you 2x the performance in DX9, but because you'll be getting dropped and runt frames constantly, and get WORSE performance than a single card.

Yeah, AMD is wonderful with their two core per module CPUs that use 3x the power and are easily outperformed by Intel's low end i5's and their 600W water cooled GPU that gets worse performance than one of their 290x's.

Although it was kind of hard to tell how well the 290x performed initially since AMD sent cards with "special" BIOS to review sites, and then got caught lying again when review sites purchased retail versions of their cards and it was proven that the press samples performed better.

Yeah, keep telling us how wonderful AMD is please.

March 23, 2015 | 09:29 PM - Posted by Master Chen (not verified)

>DAT damage control.

Just stop. No, really. Don't embarrass yourself any more than you already did.

March 23, 2015 | 09:44 PM - Posted by BlackDove (not verified)

LOL what?

You're the one here telling us how dishonest Nvidia is and how wonderful AMD's products are, when their shit doesn't even work and hasn't for years.

Why don't you acknowledge the company that you shill for doesn't make good products anymore?

March 17, 2015 | 05:53 PM - Posted by Malisha (not verified)

Am I wrong, or is there ANOTHER EIGHT pin connector that hasn't been used, soldered onto the PCB... Maybe this card could draw many more watts but has been cut down by some measure... Maybe they are returning to older style power hungry GPUs...

March 17, 2015 | 06:09 PM - Posted by larsoncc

Any news on SLI results from any sources?

Also- the 390x hypetrain needs to take a breather. It's a bit deep in here with no actual, verifiable results to be found anywhere, don't you think?

March 18, 2015 | 05:12 AM - Posted by Anonymous (not verified)

The Titan X is more of a marketing tactic rather than a real product. The fan boys and marketing people will be out in force, but very few people will actually buy this product.

March 18, 2015 | 08:24 AM - Posted by larsoncc

That didn't answer my question you toolbag.

March 18, 2015 | 05:29 PM - Posted by BBMan (not verified)

Actually, there is a leak.

In short, it looks to be equal to a Titan X.

Of course, considering the history of such AMD leaks ....

March 17, 2015 | 07:41 PM - Posted by Topjet (John) (not verified)

I did notice today, when updating my Nvidia driver, that they call the Titan X the world's fastest gaming GPU.

March 17, 2015 | 07:52 PM - Posted by Brett from Australia (not verified)

Great write up Ryan. This GPU has some serious chops and all sorts of other goodness included. This should be whetting the appetites of enthusiasts and gamers everywhere. Great set of connection options on the card. The 12GB of memory does seem like overkill though.

March 17, 2015 | 08:53 PM - Posted by wolfkabal (not verified)

From a GPGPU computing standpoint, would this be better than, say, the 295X or a dual 980 configuration? Looking into doing some high-end CUDA processing and not sure how things scale from that end with dual-GPU, and how the memory usage would compare. Obviously this would depend on the processing needs, but looking at it in a 'general' sense.

March 17, 2015 | 10:39 PM - Posted by BillDStrong (not verified)

This will largely depend on how "parallel" your problem is.

First, if your problem domain requires Double Precision Floating Point, then none of these options are for you.

Second, which compute language do you want to use? Cuda is Nvidia only, so that limits your choices. OpenCL is supported by both, but not the latest versions. Nvidia support of OpenCL has been lacking in updates. I know their drivers support OpenCL 1.1, but can't find reliable information about any of the later versions.

So, in terms of compute, if SP FP's are fine, then currently the Titan X will be the way to go. Unlike games, that 12GB of memory can and should be very useful in keeping most of your data set into the GPU, which is one of the big slow downs in GPGPU computing. It also has the largest single GPU numbers, and it can support multiple cards in one system.

Keep in mind, GPGPU does not use SLI. But it can use multiple GPUs to perform calculations. It is up to you to use them effectively, but it can scale much better than SLI.

If you are performing the same calculations on multiple datasets, then it can be as simple as giving each GPU its own data set and letting it go. If the data is in one set, you can break the calculations apart, but this is generally going to be slower than separate data sets.
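A minimal sketch of that one-data-set-per-GPU pattern, assuming Python with Numba's CUDA bindings (an illustrative choice; any CUDA binding follows the same shape, and the scale kernel is just a stand-in workload):

    # Sketch: one data set per GPU, each device addressed explicitly (no SLI).
    import numpy as np
    from numba import cuda

    @cuda.jit
    def scale(data, factor):
        i = cuda.grid(1)          # global thread index
        if i < data.size:
            data[i] *= factor

    chunks = [np.random.rand(1_000_000) for _ in range(len(cuda.gpus))]

    results = []
    for gpu_id, chunk in enumerate(chunks):
        cuda.select_device(gpu_id)       # bind this host thread to one GPU
        d_chunk = cuda.to_device(chunk)  # keep the data set resident on-card
        threads = 256
        blocks = (chunk.size + threads - 1) // threads
        scale[blocks, threads](d_chunk, 2.0)
        results.append(d_chunk.copy_to_host())
    # For real overlap, drive each device from its own host thread or process.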

The 295X2 should just function like two 290Xs, but may not, since they are preconfigured in a Crossfire solution. I can't tell you without trying it.

You might wait until the R9 390X comes out with 8GB of memory. My expectation is this will either cause Nvidia to push a new Titan dual GPU solution and/or lower prices across the board.

Exciting times.

March 17, 2015 | 10:22 PM - Posted by Mandrake

Nice card for those who don't mind spending that amount of cash!

I have a 3GB GTX 780. The 980 and related cards aren't fast enough to warrant an upgrade. This Titan X probably is fast enough, but I'd never pay $1000 for a graphics card. My GTX 780 was $500 USD back in October 2013. Maybe we need some new AMD offerings and a price war? Perhaps a 6GB cutdown GM200 card?

It's about the only way they'd get me to purchase a new card. I had thought of finding another 780 on eBay, but even in SLI I'm still limited to 3GB per card, which can be a bottleneck in future games at 2560x1440.

Might just be holding on to this card until the 2016 cards launch. It'll be winter here before long and I can pull out the 'max OC' without worrying about the card getting too warm!

March 17, 2015 | 10:46 PM - Posted by BillDStrong (not verified)

The 12GB might be needed more than you think.

First, this card directly supports 5K displays, which will require more ram for games.

Secondly, the UHD standard has not been ratified yet, and one of the big pushes relates to higher bit rate displays. These so called "High Dynamic Range" displays may be capable of showing as high as 16bit per pixel color.

For games to take advantage of that, a large amount of the current 4bit compressed images will need to be higher bit rate. (What, you thought the compressed textures were 8bit?)

This is actually good news for gamers: as TVs are pushed out with these higher bit depths, the tech should become available in monitors at cheap prices.

March 18, 2015 | 12:18 AM - Posted by Anonymous (not verified)

Very overpriced for what you get. I was hoping they'd incorporate a 512-bit memory interface and stick with double precision.

March 18, 2015 | 01:23 AM - Posted by Mandrake

They're already pushing over 600mm2 on the GPU. They seem limited by what they can do on the 28nm process.

March 18, 2015 | 04:08 AM - Posted by Master Chen (not verified)

Oh...oh god...MOTHER OF GOD...

March 18, 2015 | 05:50 AM - Posted by Anonymous (not verified)

It seems like most newer games work quite well with multiple gpu set-ups. At some point there is going to be little difference between 2 gpus and one gpu at twice the size except price. Two smaller gpus will probably be cheaper since the yield on a ~600 square mm part is not going to be good. For most of the games tested, a multi-gpu system is cheaper and better performing. Power is more, but with multi-gpu systems you are running multiple, independent high speed memory systems instead of a single memory system. Stacked HBM is supposed to be much more power efficient, so multiple smaller gpus with HBM may be the best option eventually. Both the Titan X and the 390X are going to be priced out of the mainstream market. They are interesting to read about, but I doubt many people here are considering actually buying one.

March 18, 2015 | 07:54 AM - Posted by MarkT (not verified)

Performance per dollar is HORRIBLE; this is the card for people who have so much money they honestly don't know what to do with it.

March 18, 2015 | 08:03 AM - Posted by Anonymous (not verified)

"As a hardware enthusiast, it's impossible to not fall in love with the GeForce GTX Titan X."

Yes it is. Because despite what your "funny nvidia infomercial" (which you dare call a test) says, this card is a freaking rip-off.

It's nothing more than this generation's GTX 580 3GB. Except that one was $550. Its gains over the previous gen, be it performance or power efficiency, are nothing more than the 780 over the 580 or the 580 over the 280.

The fact that the GTX 560 Ti was 250€ and the 580 3GB 550€, and now the 960 Ti (oops, 980) is 550€ and the 980 (oops, Titan X) 1000€, should make any hardware enthusiast think twice before writing piles of advertising shit.

March 27, 2015 | 01:50 AM - Posted by SiliconDoc (not verified)

Wow, angry much ?

AMD overclocked one of their stupid bull cpu's, then sold it on NewEgg for freaking $800.

I was LOL'ing so hard - a bunch of amd fanboys bought it, then it was found to be flaky and a supreme power hog, then the price plummeted... to like $300, slightly above the best doggy cpu amd had

I mean how could amd rebrand then raise the price 300% ?


At least nVidia makes a new MONSTER product with MONSTER RAM, instead of overclocking a failed dog then scalping fanboys.

March 18, 2015 | 08:39 AM - Posted by Gamer (not verified)

The 295X2 just killed another weak Titan.
The 295X2 is still top of the food chain.
The weak Titan X is too weak for 295X2.

March 18, 2015 | 08:11 PM - Posted by svnowviwvn

AMD continues to bleed market share and profits on discrete GPUs.

Nvidia gains market share in discrete and now holds a 76% to 24% advantage over AMD.

AMD cuts the prices over and over again yet more gamers are choosing Nvidia over AMD.

Nvidia has high margins on the Titans (and they do sell out) whereas AMD has low margins on the 295X2 (2x GPUs, expensive cooling, more layers on PCB, etc) and still can't get people to buy it. So sad (for AMD).

March 18, 2015 | 11:34 AM - Posted by jackiesbouffant (not verified)

Fanboys make me laugh. If you don't want what the Titan X offers, don't buy it. If you can't afford it, don't buy it. But laughing and saying the 295X2 is still the best is just stupid. The 295X2 is 12.4B transistors and 600W of power that can only keep itself cool with an unwieldy CLC. AMD's only selling it for 700 bucks because nobody wants them. Is it still the fastest single PCB solution? Well, yes, but only sometimes... only if CF is implemented correctly. The 295X2 comes with so many compromises that nvidia clearly feels like they can charge a $450 premium over the GTX 980, and until AMD has something that can compete directly on a single GPU basis, it's going to stay that way. Oh, and a bunch of sketchy leaked slides and rumors about the 390X mean nothing. If you're using the 390X as proof of AMD's superiority you're a joke. Call me when the reviews drop.

PS- owner of 2 290x's speaking

March 18, 2015 | 05:03 PM - Posted by Anonymous (not verified)

Prices at this level have very little to do with market demand or much of anything else. The volume of these super ridiculous high end cards is so low that they are mostly a marketing tool. Nvidia releases this Titan x with probably 2x the amount of ram that it could actually use and gets a lot of media attention ahead of any AMD releases. They are not going to make much on actually selling these things, especially since it doesn't have high compute performance. They will certainly have a cut down version of this chip that will be a more reasonable price eventually.

For people actually willing to spend the money on this much performance, the r9-295x2 is still the better deal. I don't care much about the number of transistors and such. The AMD solution is cheaper and performs better. Most gamers don't care that much about the power consumption either unless it makes the card excessively noisy. I don't see much of any reason that a gamer would buy a Titan x over the 295, so why wouldn't you say that the 295 is still the better card? It seems that most modern games are handling dual gpu fine, and older games probably don't need this level of performance anyway.

March 18, 2015 | 06:18 PM - Posted by pdjblum

Agreed, despite your being too cowardly to register.

March 19, 2015 | 01:32 PM - Posted by jackiesbouffant (not verified)

The transistor counts tell you something about where each company is in terms of technology, but you're right that they don't matter directly. Instead it has a knock-on effect for the power and cooling requirements. Despite what AMD fanboys keep telling me, power DOES matter. Installing a CLC is a big logistical pain in the ass and fan speed needs to compensate to keep things cool. I couldn't stand the fan noise on my 7970's, let alone the 290X's so I built a full custom loop just to shut them the hell up. Might not have needed to do that with more efficient cards that can run quieter on air.

And as far as performance, I have lots of recent games that still only use 1 GPU reliably. How many games have serious, sometimes gamebreaking issues at launch with SLI or CF? Happens a lot... flickering textures, stuttering, framerates bouncing all over the place or just straight up crashes. Titanfall and Evolve come to mind immediately. So yea, if everything is working the 295X2 is a better perf/$ card, but its so much more dependent on software that there will always be value in a single GPU solution. How much is up to you, but people who keep squawking about the 295X2 would have you believe otherwise.

March 27, 2015 | 01:53 AM - Posted by SiliconDoc (not verified)

So now crossfire always works and amd drivers don't suck....


March 19, 2015 | 12:16 AM - Posted by ThorAxe

Stop talking sense!

March 19, 2015 | 02:07 AM - Posted by pdjblum

deleted. Was going to say deleted self, but that would have been ambiguous.

March 19, 2015 | 10:25 AM - Posted by Irishgamer01

Yes the 295x has compromises

BUT I want one, but it won't fit into the cases I own, as I already have watercooling CPU units inside. If they were air cooled they would be flying off the shelf at that price point. That's why most AMDers are just doing Crossfire.

PS. (I run AMD and Nvidia)

March 19, 2015 | 10:16 AM - Posted by Irishgamer01

Priced here in Ireland @ 1250 euros (Including tax 23%).
I want one (well, 2 in SLI), but never at that price.

The 980 Ti is basically going to be this card with less RAM, and speed aligned to whatever the AMD 390X does. Probably June/July.

So wait and see, seems to be the thing to do.
(If rumors are true then the TITAN X is about AMD 390x speeds)

PS. Never seen so many review cards issued...........
PPS. Guess I didn't win the draw...........secret word was dress (Red)

March 19, 2015 | 11:04 AM - Posted by BBMan (not verified)

I see a future with 3-phase 208 and liquid nitrogen tanks.

April 2, 2015 | 09:41 PM - Posted by ERic Lin (not verified)

C'mon guys, Titan X benchmarks on vanilla Skyrim? :( Nobody buying a Titan X is going to care about vanilla Skyrim performance at 1440p. Maybe 4K vanilla, but better yet, 1440p with mods.

May 29, 2015 | 09:43 PM - Posted by Vitali Fridman (not verified)

Want to post a score for the EVGA super-clocked version of this card. With factory settings and without additional overclocking, I got on Fire Strike Extreme: Score: 8281, Graphics score: 8884, which is about 10% higher than the reference card.
