Rumor: NVIDIA GeForce GTX TITAN Black and GTX 790 Incoming

Subject: Graphics Cards | January 21, 2014 - 03:49 PM
Tagged: rumor, nvidia, kepler, gtx titan black, gtx titan, gtx 790

How about some fresh graphics card rumors for your Tuesday afternoon?  The folks at VideoCardz.com have collected some information about two potential NVIDIA GeForce cards that are going to hit your pocketbook hard.  If you thought the mid-range GPU market was crowded, wait until you see the changes NVIDIA might have for you soon on the high end.

First up is the NVIDIA GeForce GTX TITAN Black Edition, a card that will actually have the same specifications as the GTX 780 Ti but with full-performance double-precision floating point and a move from 3GB to 6GB of memory.  The all-black version of the GeForce GTX 700-series cooler is particularly awesome looking.  

Image from VideoCardz.com

The new TITAN would sport the same GPU as the GTX 780 Ti, only the TITAN BLACK would have higher double-precision computing performance, thus more FP64 CUDA cores. The GTX TITAN Black Edition is also said to feature a 6GB memory buffer. This is twice as much as the GTX 780 Ti, and it pretty much confirms we won't be seeing any 6GB Ti's.

The rest is pretty much well known: the TITAN BLACK has 2880 CUDA cores, 240 TMUs, and 48 ROPs.

VideoCardz.com says this will come in at $999.  If true, this is a pure HPC play, as the GTX 780 Ti would still offer the same gaming performance for enthusiasts.  

Second, there looks to be an upcoming dual-GPU graphics card using a pair of GK110 GPUs that will be called the GeForce GTX 790.  The specifications that VideoCardz.com says it has indicate that each GPU will have 2496 enabled CUDA cores and a smaller 320-bit memory interface with 5GB designated for each GPU.  Cutting back on the memory interface, shader counts, and even clock speeds would allow NVIDIA to manage power consumption at the targeted 300 watt level.  
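To put that narrower 320-bit interface in perspective, here is the per-GPU bandwidth math. The GTX 780 Ti's 384-bit bus at 7 Gbps is a known figure; the 790's memory clock is not part of the rumor, so the 6 Gbps value below is purely an assumption to illustrate the effect of the cut-down bus:

```python
# Rough per-GPU memory bandwidth math for the rumored GTX 790.
# 780 Ti figures (384-bit, 7 Gbps) are known; the 790's memory
# clock is NOT in the rumor, so 6 Gbps is only an assumption.

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) * data rate."""
    return bus_width_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(384, 7.0))  # GTX 780 Ti: 336.0 GB/s
print(bandwidth_gb_s(320, 6.0))  # rumored GTX 790, per GPU: 240.0 GB/s
```

Even at the same memory clock as the 780 Ti, the 320-bit bus alone would cut peak bandwidth by a sixth, which fits the power-management story.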

Image from VideoCardz.com

Head over to VideoCardz.com for more information about these rumors, but if all goes as they expect, you'll hear quite a bit more about these products in February and March.

What do you think?  Are these new $1000 graphics cards something you are looking forward to?  

Source: VideoCardz
January 21, 2014 | 04:45 PM - Posted by Anonymous (not verified)

At $1000 I don't look forward to being ripped off.

January 22, 2014 | 01:44 AM - Posted by arbiter

Considering a 780 Ti is $700, $300 more gets you pretty much two of them. How would that be a "rip off" at $1000?

January 22, 2014 | 12:37 PM - Posted by Anonymous (not verified)

Actually, it has only a few more CUDA cores per GPU than a GTX 780 and many fewer than a 780 Ti. It's technically worth it if it's $1000, but I doubt that will be the price. I'm thinking more around $1200.

January 22, 2014 | 10:08 PM - Posted by Anonymous (not verified)

Normally in these 2x GPU cards they use a lower-clocked chip due to power/thermal limitations. Let's see what AMD is going to offer. If they manage to get a card that performs close to that at a much cheaper price, then that $1000 tag will be reduced.

January 23, 2014 | 12:44 PM - Posted by sixstringrick

and going from 3GB to 5GB per GPU

January 21, 2014 | 04:54 PM - Posted by biohazard918

I'm guessing $1200 for the 790 :(

January 21, 2014 | 05:25 PM - Posted by lightymikey (not verified)

10GB on the 790? That's 5 per GPU. I would love to see how the TITAN Black, Ti, 790, and 690* bench against each other and which ones handle multiple monitors. (*Drivers have changed since the release of the old 690.)

January 21, 2014 | 05:33 PM - Posted by Humanitarian

Gief Maxwell.

January 21, 2014 | 05:36 PM - Posted by mAxius

Well, there goes NVIDIA's timeline for MAXWELL to be released, AND NO ONE QUESTIONS THIS?!?!? REALLY NO ONE!!! THE HECK!!!

January 21, 2014 | 05:59 PM - Posted by Scott Michaud

I was actually thinking the same thing. It's a shame because Maxwell might be a gigantic leap for HPC.

January 21, 2014 | 06:44 PM - Posted by Anonymous (not verified)

Maxwell with a Denver ARMv8 core, or maybe 8 Denver ARMv8 cores in the future, like, but not exactly the same as, what AMD did with 8-core x86-64 console gaming APUs using x86 Jaguar cores. And if Nvidia can do with the Denver ARM cores what Apple did with their Cyclone ARM cores, then Nvidia may have a console of its own, or maybe Nvidia could just build out their Maxwell discrete GPU line into a Maxwell "APU" gaming system on a PCI card! AMD, seeing this, would certainly counter with their console gaming APUs, in a more pimped-out, HSA-aware console gaming APU on a PCI card form factor reply! The whole gaming enchilada moved to the PCI card, the race is about to begin! [foams at the mouth, and adjusts Tin Foil Hat] but you never Know!

January 21, 2014 | 08:33 PM - Posted by biohazard918

It's an ARM core designed for mobile applications... It's not going into desktop gaming systems. Why would you want an APU on a PCI card anyway? You would then have two CPUs running different instruction sets. You're not going to be able to boot from a PCIe card, so you will still need a host CPU, and even today's low-end CPUs can play most games reasonably well. So there is no reason for such a thing to exist. And a 4th console? Are you mad? It's an extremely tough market to get into and a good way to piss away a LOT of money.

January 21, 2014 | 09:13 PM - Posted by Anonymous (not verified)

A nice KVM VM (type 1 hypervisor) running on the motherboard-based CPU and DDR3 memory with any OS or OSs running on that, while the PCIe gaming APU could be loaded after mainboard bootup with a gaming Linux distro, and whatever gaming engine/game could be hosted on the PCI-based complete gaming platform, unencumbered by any main motherboard VM/OS overhead! The PCI gaming APU would be a discrete gaming system with a 512-bit GPU-sized bus, next to, and on the same DIE as, the CPU/s (ARM, x86), low latency for sure, directly connected to the GPU by an internal BUS; put a large shared on-Die RAM memory, and a shared CPU/GPU memory controller [hUMA style] to GDDR5 memory! Who needs a console when any old PC could be outfitted with a PCI-based complete high-power desktop graphics/gaming console on a PCI card! And the APPLE A7 benches more like a laptop chip. The Denver ARMv8 custom ARM K1 core is not the same as the 4 ARM reference cores in the early-release K1 SKUs, and Maxwell is going to have at least one, but in the future(?)! See Anand's review of the Apple A7 and the Nvidia Tegra K1! These custom ARM ISA based wide-issue superscalar designs from Apple and Nvidia are not the ARM Holdings reference designs you may be thinking about! [More mouth foaming, and Babbling Incoherently] But you never Know!

January 21, 2014 | 09:46 PM - Posted by Anonymous (not verified)

"You would then have two cpu's running different instruction sets" — well, people have been running CPUs on motherboards and discrete GPUs on PCI cards forever, and you can not get any more different than that: CPU running its business on a motherboard and GPU running its business on a PCIe card! Bus mastering took care of that! And after the gaming distro was sent over to the PCI gaming card, well, except for the initial OS load, the only PCI-based communication between motherboard CPU and complete gaming system would be to send the requested game/gaming engine over, via a PCI-based OS request to the mainboard host master KVM type 1 hypervisor / or VM-hosted motherboard OS, or use bus mastering and let the PCI-hosted OS do its own reads/writes to its own disk or disk partition share. Big Iron Linux distros have been doing that for a good while now, with loads of PCI-based CPUs and GPUs of many different CPU and GPU instruction sets, and all over PCI! [more foaming, and gnawing at the bedposts] Really!

January 21, 2014 | 10:38 PM - Posted by renz (not verified)

If I have to speculate, maybe it has something to do with TSMC's next main process node. 20nm for SoC started mass production this January, but I heard the type of process usually used for GPUs will only be ready after the second half of this year. From early speculation it was expected to be ready this quarter, with both AMD and nvidia moving to their next major architectures by Q1 or Q2 this year. There is already news about an nvidia 750 Ti based on the Maxwell architecture arriving next month, albeit still on 28nm, which is a weird move from nvidia. If the rumor about this black edition titan and 790 is true, maybe high-end Maxwell is not going to hit stores anytime soon. These cards might be expensive, but they will give partners new SKUs to sell. While regular customers might have no interest in these cards, boutique builders are definitely interested, and with nvidia's recent Battlebox program I think nvidia knows where these cards are actually marketed. Anyway, I was expecting the 790 to be a 350w card at the very least, but if they can make it a 300w card then it is really a marvel of engineering from nvidia.

January 21, 2014 | 05:47 PM - Posted by ImmenseBrick (not verified)

I just bought a 780 Ti 2 weeks ago, so while this was expected (Maxwell or otherwise), I am sad face now. In other news, I'm not a fan of dual GPUs, so maybe I should not be so sad. Either way, big green is milking the hell out of GK110. Now if AMD could make a dual 290X to compete..... Not to start a flame war, but I'm pretty sure a dual 290X would be hotter than the sun. Just saying.

January 21, 2014 | 07:39 PM - Posted by Daniel Espinoza (not verified)

Agreed

January 21, 2014 | 08:25 PM - Posted by Anonymous (not verified)

Big green is the color of money (at least on the old US drab green bills, before the bills started on their way to becoming psychedelic), and Nvidia wants tons more to fund its race with AMD's new HSA and AMD's successes with other uses like bitcoin mining, or stock market auto-trading houses, where millisecond lead times equal millions in more profits! Won't somebody think of the gamers!

January 21, 2014 | 10:38 PM - Posted by Mountainlifter

I can feel your pain, man. Ah well, play as many games as you can in the interval before these rumoured cards arrive. It will make you feel like you did the right thing by not waiting.

January 21, 2014 | 08:24 PM - Posted by armrray1 (not verified)

So with all 15 SMX units online and full FP64 enabled, is the TITAN BLACK the final, fully-for-real-this-time enabled form of the GK110? The golden 'bin'? I need some CUDA horsepower for 3D rendering but have no interest in buying into the 'professional' GPU racket, so this is looking pretty sweet...

January 21, 2014 | 09:30 PM - Posted by Anonymous (not verified)

Fake

January 21, 2014 | 10:20 PM - Posted by ThorAxe

Tossing up between 2 GTX 780s or 780 Tis, but this GTX 790 looks intriguing if real. I would really be interested in the performance per dollar at 2560x1440 and above.

January 21, 2014 | 10:35 PM - Posted by Mountainlifter

I would like to see the 790 SLi performance benchmarks. :D
What do we call the old 1SMX-crippled Titans now? That Black shroud is beautiful.

January 22, 2014 | 01:38 AM - Posted by wizpig64 (not verified)

10 gigs, jeez.

interesting to note: 4992 shader units on the 790 would mean 2496 per gpu, between the 780 and the titan. seeing as how nvidia disables clusters of shader units in groups of 192 (called an SMX unit), it makes sense that there would be a product like the 790 to use up the chips binned for 13/15 units, which hasn't existed yet, at least on geforce cards.

gtx 780: 12/15 SMX units enabled

gtx 790: 13/15 (each gpu)

gtx titan: 14/15

gtx 780 ti: 15/15

titan black: 15/15

(hat tip to Scott Wasson at The Tech Report for explaining SMX units in his GTX 780 review)
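the binning above is easy to check: each GK110 SMX contributes 192 CUDA cores, so the rumored core counts fall straight out of the unit counts. a quick sketch (192-per-SMX is the published GK110 figure; the bin assignments are just the table above):

```python
# GK110 ships CUDA cores in SMX clusters of 192; binned parts
# simply have some of the 15 total SMX units disabled.
CORES_PER_SMX = 192
TOTAL_SMX = 15

bins = {
    "gtx 780": 12,
    "gtx 790 (per gpu)": 13,
    "gtx titan": 14,
    "gtx 780 ti / titan black": 15,
}

for name, smx in bins.items():
    print(f"{name}: {smx}/{TOTAL_SMX} SMX -> {smx * CORES_PER_SMX} cuda cores")
```

which prints 2304, 2496, 2688, and 2880 cores respectively — matching the 780, the rumored 790 (per GPU), the original TITAN, and the fully enabled chips.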

January 23, 2014 | 05:57 AM - Posted by Darchon (not verified)

I'm still waiting for the 12GB cards VideoCardz rumored they would be releasing. I'm tired of these ridiculous rumors. March is kind of late if they plan on releasing Maxwell sometime soon, so one lineup is going to suffer.

January 23, 2014 | 12:46 PM - Posted by 1STANCESTOR (not verified)

Maybe I'll go GTX 790 if it crushes everything else at 4K 60Hz, and its price beats that of two GTX 780 Ti cards in SLI.

March 20, 2014 | 02:04 AM - Posted by Anonymous (not verified)

i don't know why you would use this to get to pointless resolutions; it's for programming raytracing and gigantic synapse counts for neural networks, that's what i'm doing with them. but if it ends up more than $4000 for two and it doesn't scale well with 4 gpus, i'm not going to bother... and just go for two 690's, go back to dated old 8192x8192 textures (you can get a 16-grand-by-16-grand texture on one of these things, something you little gamer girls never see) and pay a shitload less.
