GDC 15: NVIDIA Shows TITAN X at Epic Games Keynote

Subject: Graphics Cards | March 4, 2015 - 01:10 PM |
Tagged: titan x, nvidia, maxwell, gtx, geforce, gdc 15, GDC

For those of you worried that GDC would sneak by without any new information for NVIDIA's GeForce fans, Jen-Hsun Huang surprised everyone by showing up at Epic Games' keynote with Tim Sweeney to hijack it.

The result: the first showing of the upcoming GeForce TITAN X, based on the Maxwell GM200 GPU.

JHH stated that it would have a 12GB frame buffer and was built using 8 billion transistors! There wasn't much more information than that, but I was promised that the details would be revealed sooner rather than later.

Any guesses on performance or price?

Jen-Hsun signs the world's first TITAN X for Tim Sweeney.

Kite Demo running on TITAN X

UPDATE: I ran into the TITAN X again at the NVIDIA booth and was able to confirm a couple more things. First, the GPU will require only 6+8-pin power connections, indicating that NVIDIA is still pushing power efficiency with GM200.

Also, as you would expect, the TITAN X will support 3-way and 4-way SLI, or at the very least has the SLI bridges to support it.

March 4, 2015 | 01:19 PM - Posted by PapaDragon

Any guesses on performance or price?

Performance: 50% more than a GTX 980, if it has the rumored 3072 CUDA core count

Price: $1499
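That 50% figure follows from naive shader-count scaling against the GTX 980's 2048 CUDA cores, assuming equal clocks and per-core throughput (a big assumption; real scaling is rarely linear). A quick sketch of the arithmetic:

```python
# Naive shader-count scaling estimate, assuming identical clocks and
# per-core throughput between the GTX 980 and the rumored TITAN X.
gtx980_cores = 2048          # GM204, GTX 980
titanx_rumored_cores = 3072  # rumored GM200 count

speedup = titanx_rumored_cores / gtx980_cores
print(f"Estimated raw throughput gain: {(speedup - 1) * 100:.0f}%")  # 50%
```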

March 4, 2015 | 01:20 PM - Posted by RadioActiveLobster

Price is rumored to be $1,349, if the leak from January is to be believed (they did get the 12GB right).

March 4, 2015 | 01:24 PM - Posted by Anonymous (not verified)

So, technically it only has 11.5GB of VRAM, am I right?

March 4, 2015 | 02:04 PM - Posted by kkk (not verified)

You can say that again!!

March 4, 2015 | 03:17 PM - Posted by svnowviwvn

And be wrong both times.

March 4, 2015 | 01:24 PM - Posted by Jay1998 (not verified)

I'll take it, no tax lol

March 4, 2015 | 01:26 PM - Posted by Daniel Nielsen (not verified)

Another niche product. Wonder when they will reveal a 980 Ti.

March 4, 2015 | 01:30 PM - Posted by Anonymous (not verified)

AMD swoops in with a 390x2 and makes this card irrelevant.

March 4, 2015 | 01:43 PM - Posted by goatsecks (not verified)

Yea, lets have an argument comparing one set of unknown performance metrics to another.

March 4, 2015 | 04:26 PM - Posted by Anonymous (not verified)

Not when playing a UE4 game... Unreal Engine 4 has no multi-GPU support.

March 4, 2015 | 08:20 PM - Posted by BlackDove (not verified)

Still uses software frame metering, so it can't run TF2 or any DX9 games lol

March 4, 2015 | 08:46 PM - Posted by loccothan (not verified)

You mean a single R390X, right? Because it will be enough.

March 4, 2015 | 01:49 PM - Posted by Ophelos

I'm sorry, but that UE4 kite demo is kind of a joke for running on that graphics card.

If you go on YouTube and look up the game "Black Desert Online," you'll notice that the graphics in that MMO are just as good as the demo video, if not a lot better. And BDO doesn't even use the Unreal or Crytek game engines.

March 4, 2015 | 02:09 PM - Posted by Anonymous (not verified)

I looked up that game you mentioned and you're an idiot.

March 4, 2015 | 03:13 PM - Posted by Anonymous (not verified)

I for real lol'd

March 5, 2015 | 12:32 AM - Posted by lilchronic

lmao yep he's an idiot

March 4, 2015 | 05:33 PM - Posted by duttyfoot (not verified)

You're really comparing that horrible-looking thing to the UE4 kite demo?

March 4, 2015 | 02:07 PM - Posted by Anonymous (not verified)

So in other words, nobody should buy this card until the 390X is released to give that damn card a serious price cut. It looks like it's going to be another Titan Z incident.

March 5, 2015 | 01:22 AM - Posted by renz (not verified)

The Titan Z was silly, but not the single-GPU version of the Titan. Gamers will find the Titan stupid, but for small-scale professionals who need DP, the Titan is actually a STEAL: imagine getting something that performs as fast as the fastest Tesla but costs 3-4 times less. And the original Titan never dropped in price despite being slower than the 290X; only the GTX 700 series did.

March 4, 2015 | 02:08 PM - Posted by Rick Cain (not verified)

It still is cheaper than a typical workstation card.

March 4, 2015 | 02:10 PM - Posted by Jen-Hsun (not verified)

Hey everyone,

Some of you are disappointed that we did not clearly describe the segmented memory of the Titan X when we launched it. I can see why, so let me address it.

We invented a new memory architecture in Maxwell. This new capability was created so that reduced configurations of Maxwell can have a larger framebuffer, i.e., so that the Titan X is not limited to 11GB and can have an additional 1GB.

The Titan X is a 12GB card. However, the upper 512MB of the additional 1GB is segmented and has reduced bandwidth.

This is a good design because we were able to add an additional 1GB to the Titan X, and our software engineers can keep less frequently used data in the 512MB segment.

Unfortunately, we failed to communicate this internally to our marketing team, and externally to reviewers at launch. Since then, Jonah Alben, our senior vice president of hardware engineering, provided a technical description of the design, which was captured well by several editors. Here's one example from The Tech Report.

Instead of being excited that we invented a way to increase the memory of the Titan X from 11GB to 12GB, some were disappointed that we did not better describe the segmented nature of the architecture for that last 1GB of memory.

This is understandable. But let me be clear: our only intention was to create the best GPU for you. We wanted the Titan X to have 12GB of memory, as games are using more memory than ever.

The 12GB of memory on the Titan X is used and useful to achieve the performance you are enjoying. And as ever, our engineers will continue to enhance game performance that you can regularly download using GeForce Experience.

This new feature of Maxwell should have been clearly detailed from the beginning. We will not let this happen again. We'll do a better job next time.
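The segmented-memory scheme this (satirical) post describes boils down to a two-pool allocator: keep hot data in the full-bandwidth pool and spill infrequently used data to the slow segment. A toy sketch, with hypothetical sizes and names (the 11.5GB/512MB split mirrors the joke, not a confirmed Titan X layout):

```python
# Toy two-segment VRAM allocator: hot allocations prefer the fast pool,
# cold allocations prefer the reduced-bandwidth segment. Illustrative only.
FAST_MB = 11776  # hypothetical full-bandwidth segment (11.5 GB)
SLOW_MB = 512    # hypothetical reduced-bandwidth segment

class SegmentedVram:
    def __init__(self):
        self.fast_free = FAST_MB
        self.slow_free = SLOW_MB

    def alloc(self, size_mb, hot=True):
        """Return which segment the allocation landed in."""
        if hot and self.fast_free >= size_mb:
            self.fast_free -= size_mb
            return "fast"
        if self.slow_free >= size_mb:       # cold data (or fast pool full)
            self.slow_free -= size_mb
            return "slow"
        if self.fast_free >= size_mb:       # last resort: any free space
            self.fast_free -= size_mb
            return "fast"
        raise MemoryError("out of VRAM")

vram = SegmentedVram()
print(vram.alloc(1024, hot=True))    # frequently used textures -> "fast"
print(vram.alloc(256, hot=False))    # rarely touched data -> "slow"
```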

March 4, 2015 | 02:51 PM - Posted by pylomsquelch (not verified)

While funny, this is just a copy-paste-edit of the Jen-Hsun post on the NVIDIA blog.

Sorry to spoil the joke

March 4, 2015 | 03:16 PM - Posted by Jeremy Hellstrom

Now, now.  Don't go spoiling their version of reality with facts, it only upsets them.

March 4, 2015 | 02:18 PM - Posted by RS84 (not verified)

Titan Z - Titan Y - Titan X - Titan W - Titan V??? Backward till Titan A :P

Look at Jen's face. Unhappy?

March 4, 2015 | 02:32 PM - Posted by larsoncc

Finally some real news about progress in the GPU space. I want more details!

My guesses are 35% faster than the 980 at 1080/1440, 50% faster than 980 at 4K, $1250.

March 4, 2015 | 02:37 PM - Posted by Ophelos

You forgot about it being 10% slower than the 295X2 at 4K... :)

March 4, 2015 | 03:58 PM - Posted by arbiter

Probably draws half the power to do it.

edit: Ryan tweeted 8+6-pin power, so the card is around 200-225 watts, maybe a little more. You're comparing a two-GPU card to a single-GPU one.

March 4, 2015 | 05:33 PM - Posted by Anonymous (not verified)

How do you get 225 watts?

PCI-E = 75 watts
6-pin = 75 watts
8-pin = 150 watts
Total = 300 watts

If it were 225 watts, it would be 2x 6-pin. It needs an 8-pin because, just like all Maxwells, once they run a GPU-intensive application they suck just as much power as their Kepler counterparts.

Canned gaming benchmarks are where Maxwell power consumption shines.
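The connector arithmetic above can be sketched as a small helper that sums the PCI-SIG spec ceilings for the slot plus each auxiliary connector. Note these are spec limits, not measured draw, so a 6+8-pin card merely has a 300 W ceiling; it may draw far less:

```python
# Power-budget arithmetic from the PCIe spec ceilings (not measured draw).
PCIE_SLOT_W = 75  # PCIe x16 slot
PIN6_W = 75       # 6-pin PEG connector
PIN8_W = 150      # 8-pin PEG connector

def max_board_power(connectors):
    """Sum spec limits for the slot plus a list of '6'/'8' connectors."""
    watts = PCIE_SLOT_W
    for c in connectors:
        watts += PIN6_W if c == "6" else PIN8_W
    return watts

print(max_board_power(["6", "8"]))  # 6+8-pin -> 300 W ceiling
print(max_board_power(["6", "6"]))  # 2x 6-pin -> 225 W ceiling
```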

March 4, 2015 | 08:14 PM - Posted by BlackDove (not verified)

Software frame metering on a garbage dual-GPU card lol

March 4, 2015 | 02:32 PM - Posted by Coupe

Lol, he's signing it like he's some sort of celebrity. NVIDIA makes some great cards with great engineers, but I really dislike Jen.

March 4, 2015 | 02:38 PM - Posted by Anonymous (not verified)

The guy insults his customers on his blog while presenting his new rip-off.


March 4, 2015 | 04:29 PM - Posted by arbiter

You say that, but according to market share, you are in a pretty small minority that thinks that.

March 4, 2015 | 06:02 PM - Posted by Anonymous (not verified)

Says the one with the FUD/Terfer green merit badge and shiny gold doubloons! That's the problem when the market share is so out of whack, but market share can change, and in the mobile market the share is more evenly balanced. For sure there needs to be a third discrete GPU market player, and a few mobile-only players' SKUs are getting up there in the SP/other counts, maybe enough to make a jump at some mobile discrete GPU offerings. Those SOC SKUs may just take over the low-end discrete graphics market entirely, but for sure scaled-up discrete graphics products can be made by players formerly in the SOC-only graphics market; 2 of them come to mind already.

March 4, 2015 | 02:49 PM - Posted by Anonymous (not verified)

So, are they going to try to market this card as a gaming card at the beginning, like they did last time? Because this "surprise" appearance sure makes it look that way. I wonder how long these cards won't sell before they say it's a semi-professional/pro-user card, like they did with the first Titan. I think this "surprise" is just to test the waters and see how people react before releasing more marketing on this card, instead of calling it a gaming card right out of the gate.

March 4, 2015 | 03:27 PM - Posted by MarkT (not verified)

Nvidia punishes everybody with high prices 'cause nobody buys the Tegra stuff :-(

March 4, 2015 | 03:47 PM - Posted by Anonymous (not verified)

The hair: not very Tressy. And still, how long was the render time (per frame) of this short vignette? How about a few close-up, not-so-dynamic shots of a plant, or thick foliage, with lots of shadows cast through layers of gently swaying leaves? Was this a single GPU, or a whole rack of GPUs on a grid server? And again into Jen-Hsun Huang's backstage dressing room with the star on the door, the quick change back into the polyester and spiked shoes, back in the stretch limo, and off to the links with Money Bags, Bottom Jaw the Third, and the other country clubbers. It's pay through the nose, or kidney-sale time, for any of JHH's metal! Marketing and big production values, but that math may not add up. Well, time to second-mortgage the double-wide again.

March 4, 2015 | 08:15 PM - Posted by Anonymous (not verified)

This demo was running locked at 30 FPS. Shadows were dynamic; you could see the ones cast by the trees moving while the trees blew in the wind. I would assume this was done on one workstation from how Epic presented it, but no guess as to whether it was single- or multi-GPU. There were a few close-up shots; I could tell they were using their distance-field ambient occlusion because there were some artifacts. I'm curious if they were using their distance-field shadows and GI as well.

March 4, 2015 | 04:29 PM - Posted by Anonymous (not verified)

All this hate nowadays for new more powerful graphics cards. Fucking shit generation of kids and hipsters....

March 4, 2015 | 05:26 PM - Posted by Anonymous (not verified)

Don't you go blaming the hipsters; they are all using Apple, or other thin-and-light overpriced SKUs with crappy underpowered Intel "Ultrabook" specifications. It's more the hate for that phony, leather-jacketed JHH and his shyster tactics! Nvidia is not doing so well in the phone SKU department, or in the tablet market, hamstringing it with more Android closed-app-ecosystem junk instead of some full-Linux-distro-based tablet computers. For sure some Steam OS based SKUs have been announced, and hopefully some Steam OS based tablets are in the planning stage! More powerful ways of milking more money and such from customers have not added up for the green brand, and a lot of that hate is coming from Nvidia's own customer base! What's that math again, 3.5 = 4? The older folks are not falling for JHH's huckster song-and-dance/dog-and-pony show, and everyone is a little tired of the dishonest crap.

March 4, 2015 | 07:35 PM - Posted by Anonymous (not verified)

The hate is more along these lines: Nvidia is becoming the Intel of GPUs (look at the past 4 years of x86, *snore*). Lack of competition is never a good thing. Nvidia is upcharging each product line even more than it did when it had competition, etc., etc.

None of that is a good thing. Some of it is fanboyism, but when you create a card that's meant to sell at $500 with high profit margins and then add another $150+ because you can, that's not a good thing for the end user.

People should watch the last PCPer Podcast, where Ryan went over how drastically prices have changed in the past 5 years, with no abnormal performance increases per generation to warrant the change. Seriously, watch it. The proof is in the pudding.

March 4, 2015 | 08:17 PM - Posted by BlackDove (not verified)

It's because their dollars are worthless and they want things that are less powerful and worse than what they had before.

It has to be:

Newly packaged

Poor performance compared to what came before it

Made in China

Cheap because their money is worthless

March 4, 2015 | 04:36 PM - Posted by Anonymous (not verified)

*Also, as you would expect, the TITAN X will support 3-way and 4-way SLI, or at very least has the SLI bridges to support it.*

And what you don't expect is Unreal Engine 4 not supporting SLI.

March 5, 2015 | 02:10 AM - Posted by renz (not verified)

He only said it's not SLI-friendly, not outright that UE4 doesn't support SLI at all. I think this is also the case with CrossFire, since both SLI and CrossFire rely on AFR. Also, Daylight, which is a UE4-based game, works on my 660 SLI.

March 5, 2015 | 07:20 AM - Posted by Marty (not verified)

Daylight is not on UE4; it's using the Chrome Engine.

March 5, 2015 | 07:21 AM - Posted by Marty (not verified)


Sorry, just woke up, I was thinking of Dying Light :P

March 4, 2015 | 05:12 PM - Posted by lordbinky (not verified)

BAH! to their focus on power efficiency.

March 4, 2015 | 05:21 PM - Posted by Anonymous (not verified)

And cue DX12 and unified memory and GPU tech. That'll be goodbye to traditional SLI and CrossFire, and hello to the future.
With this, hopefully the UE4 engine will see the cards as a single GPU; this is what Microsoft has revealed up to now.

March 4, 2015 | 05:28 PM - Posted by Anonymous (not verified)


March 4, 2015 | 05:45 PM - Posted by pdjblum

Stop coming down so hard on yourself. Anonymous is a unified entity of minds distributed in time and space.

March 4, 2015 | 08:14 PM - Posted by Marty (not verified)

I don't get it...

March 4, 2015 | 05:51 PM - Posted by Chris.R (not verified)

Granted, DX12 should be an improvement, but throwing all your eggs in one basket is silly, especially when we haven't even seen DX12 in a real-time scenario outside of one benchmark.

Unified memory is cool, but again, it has nothing to do with multi-GPU; if anything it will help both single- and multi-GPU performance.

Nothing you mentioned has anything to do with "traditional SLI/CF"; dropping support at the engine level is just fucking stupid.

March 4, 2015 | 08:43 PM - Posted by edwinjamesmiller36

Price? An arm and a leg, and part of another leg.

March 5, 2015 | 02:51 AM - Posted by pdjblum

Can Anonymous be numbered? Anonymous01, for instance. There are more anonymi here than there are hippopotami in Africa. Anonymi are restricted from replying to this post; don't bother trying, as the reply button is disabled.

March 5, 2015 | 01:50 PM - Posted by Jeremy Hellstrom

Thankfully it is safe to ignore them, unlike hippopotami.

March 5, 2015 | 05:14 PM - Posted by edwinjamesmiller36

So true. Hippopotami are the most dangerous animals in Africa. They make lions look like pussycats.

March 5, 2015 | 08:48 AM - Posted by fvbounty

No back

March 5, 2015 | 03:53 PM - Posted by PhoneyVirus

This graphics card will run you $1,300; seriously, who the hell wants to purchase a graphics card like this besides developers? You'll benefit more as a gamer going with a 2x SLI configuration than you will with a developer card.


March 8, 2015 | 12:46 PM - Posted by Anonymous (not verified)

Has no one pointed out the 8-pin connector layout 90 degrees of the obvious 8-pin?
