NVIDIA Kepler Graphics Cards Lineup Leak To Web

Subject: Graphics Cards | February 6, 2012 - 06:23 PM |
Tagged: nvidia, kepler, graphics, gpu

Although there were quite a few rumors leading up to AMD's Radeon 7000 series launch, the Internet has been very quiet on the greener side of the graphics market. Finally, however, we have some rumors to share on the NVIDIA front. As always, take these numbers with more than your average grain of salt.

Specifically, EXPreview managed to uncover two charts that supposedly detail specifics about a range of GeForce 600 series Kepler cards, from the number of stream processors to the release date. Needless to say, it's a lot of rumored information to take in all at once.

Anyway, without further ado, let's dive into the two leaked charts.

Model    | Code Name | Die Size | Core Clock (TBD) | Shader Clock (TBD) | Stream Processors | SM Count | ROPs | Memory Clock (effective) | Bus Width | Memory Bandwidth
---------|-----------|----------|------------------|--------------------|-------------------|----------|------|--------------------------|-----------|-----------------
GTX690   | GK110 x2  | 550mm²   | ~750 MHz         | ~1.5 GHz           | 2x1024            | 2x32     | 2x56 | 4.5 GHz GDDR5            | 2x448-bit | 2x252 GB/s
GTX680   | GK110     | 550mm²   | ~850 MHz         | ~1.7 GHz           | 1024              | 32       | 64   | 5.5 GHz GDDR5            | 512-bit   | 352 GB/s
GTX670   | GK110     | 550mm²   | ~850 MHz         | ~1.7 GHz           | 896               | 28       | 56   | 5 GHz GDDR5              | 448-bit   | 280 GB/s
GTX660Ti | GK110     | 550mm²   | ~850 MHz         | ~1.7 GHz           | 768               | 24       | 48   | 5 GHz GDDR5              | 384-bit   | 240 GB/s
GTX660   | GK104     | 290mm²   | ~900 MHz         | ~1.8 GHz           | 512               | 16       | 32   | 5.8 GHz GDDR5            | 256-bit   | 186 GB/s
GTX650Ti | GK104     | 290mm²   | ~850 MHz         | ~1.7 GHz           | 448               | 14       | 28   | 5.5 GHz GDDR5            | 224-bit   | 154 GB/s
GTX650   | GK106     | 155mm²   | ~900 MHz         | ~1.8 GHz           | 256               | 8        | 24   | 5.5 GHz GDDR5            | 192-bit   | 132 GB/s
GTX640   | GK106     | 155mm²   | ~850 MHz         | ~1.7 GHz           | 192               | 6        | 16   | 5.5 GHz GDDR5            | 128-bit   | 88 GB/s


From the chart above, we can see the entire lineup of Kepler cards, from the NVIDIA GTX 640 to the dual-GPU GTX 690.  The die size of the higher-end GeForce cards is approximately 50% larger than that of the AMD Radeon HD 7970, but not much bigger than that of the GTX 580.  If only we knew the TDP of these cards!  In the next chart, we see an alleged performance comparison against the AMD competition.
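For what it's worth, the rumored bandwidth column is at least internally consistent: peak GDDR5 bandwidth is just the effective memory clock multiplied by the bus width, divided by 8 bits per byte. A quick sketch to verify a few rows (the clocks and bus widths below are taken straight from the leaked chart, so they are rumors, not confirmed specs):

```python
# Sanity check of the leaked chart's bandwidth column:
# bandwidth (GB/s) = effective memory clock (GHz) x bus width (bits) / 8
def bandwidth_gbps(mem_clock_ghz, bus_width_bits):
    """Peak memory bandwidth in GB/s for a GDDR5 bus."""
    return mem_clock_ghz * bus_width_bits / 8

# (effective clock in GHz, bus width in bits), per the rumored chart
rumored = {
    "GTX680":   (5.5, 512),   # chart lists 352 GB/s
    "GTX670":   (5.0, 448),   # chart lists 280 GB/s
    "GTX660Ti": (5.0, 384),   # chart lists 240 GB/s
    "GTX640":   (5.5, 128),   # chart lists 88 GB/s
}
for name, (clock, width) in rumored.items():
    print(f"{name}: {bandwidth_gbps(clock, width):.0f} GB/s")
```

Every row of the chart checks out against this formula, which at least means whoever assembled it did the arithmetic consistently.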

Model    | Bus Interface | Frame Buffer | Transistors (Billion) | Price Point | Release Date | Performance Scale
---------|---------------|--------------|-----------------------|-------------|--------------|-------------------
GTX690   | PCI-E 3.0 x16 | 2x1.75 GB    | 2x6.4                 | $999        | Q3 2012      |
GTX680   | PCI-E 3.0 x16 | 2 GB         | 6.4                   | $649        | April 2012   | ~45% > HD7970
GTX670   | PCI-E 3.0 x16 | 1.75 GB      | 6.4                   | $499        | April 2012   | ~20% > HD7970
GTX660Ti | PCI-E 3.0 x16 | 1.5 GB       | 6.4                   | $399        | Q2/Q3 2012   | ~10% > HD7950
GTX660   | PCI-E 3.0 x16 | 2 GB         | 3.4                   | $319        | April 2012   | ~GTX580
GTX650Ti | PCI-E 3.0 x16 | 1.75 GB      | 3.4                   | $249        | Q2/Q3 2012   | ~GTX570
GTX650   | PCI-E 3.0 x16 | 1.5 GB       | 1.8                   | $179        | May 2012     | ~GTX560
GTX640   | PCI-E 3.0 x16 | 2 GB         | 1.8                   | $139        | May 2012     | ~GTX550Ti


If these numbers hold true, NVIDIA will handily beat the current AMD offerings; however, I would wait for reviews to come out before making any purchasing decisions.  One interesting aspect is the amount of GDDR5 memory.  It seems that NVIDIA is sticking with 2GB frame buffers (or less) per GPU while AMD has really started upping the RAM.  It will be interesting to see how this affects gaming in NVIDIA Surround and/or at high resolutions.

What do you guys think about these numbers? Do you think Kepler will live up to the alleged performance scale figures?

Source: EXPreview

February 6, 2012 | 07:15 PM - Posted by Kevin (not verified)

Already got my ATI.

February 6, 2012 | 07:30 PM - Posted by Tim Verry

Which card did you get? j/c

February 6, 2012 | 07:32 PM - Posted by Ryan Shrout

I have the ATI Radeon 9700. Like, the OLD one.

February 6, 2012 | 08:45 PM - Posted by Tim Verry

Oh man, I bet that thing plays Battlefield 3 in Eyefinity resolutions like butter :D

February 6, 2012 | 07:34 PM - Posted by Anonymous (not verified)

The GK110 looks totally fake; no way they can produce a 550mm² die without huge problems.

And for $650, give me a break!

February 6, 2012 | 07:57 PM - Posted by Nilbog

The 690 seems like a rip-off on paper. $999 for a card that probably can't display 3 monitors by itself; they must really want you to buy 2 cards, I guess... I am very disappointed seeing any card with less than 2GB of buffer coming out in 2012, regardless of how great the architecture is. 3 displays for gaming needs to become the norm already, in my opinion.

February 6, 2012 | 08:54 PM - Posted by Tim Verry

I thought the dual GPU cards were the only Nvidia cards that could do Nvidia Surround with just a single physical card (except those special MDT cards of course)?

February 7, 2012 | 12:36 AM - Posted by Druac Blaise (not verified)

My 590 supports three monitors just fine...not sure what that comment was about...love my 590, but I am salivating over the thought of two EVGA GeForce GTX 680 Classified Ultra Hydro Coppers...mmmmmmmmmmmmmmmmmmmmmmmm. Everything is better with EVGA! :)

February 7, 2012 | 01:29 AM - Posted by Tim Verry

Yeah, ever since XFX stopped doing double lifetime warranty, I'm in the market for a new GPU vendor whenever I eventually upgrade. They are going the way of BFG most likely :(

February 6, 2012 | 11:02 PM - Posted by Ryan Shrout

I would think NVIDIA would be crazy to release even a single GPU card that couldn't do at least three monitors...

February 6, 2012 | 11:38 PM - Posted by chefbenito (not verified)

I would be really shocked if the 690 doesn't do 3 displays. I am sure it will be part of the feature set for this card. NV isn't stupid enough to expect people to spend 2000 dollars to run three monitors.

Otherwise, I am not at all surprised that Kepler GPUs rather handily destroy AMD's current lineup. The Maxwell generation of GPUs will widen the gap. I will happily step up my 3GB 580 Classy to that Kepler 680 2GB'er.

FWIW, this chart was out a few days ago; kudos to PCPer for vetting it a bit before they re-ran it.

February 8, 2012 | 04:06 PM - Posted by AParsh335i (not verified)

Just agreeing with the two other responses: the GTX 590 DOES do 3 monitors as a standard feature, so it would be very fair to assume the "GTX 690" would also.

February 6, 2012 | 08:01 PM - Posted by Anonymous (not verified)

I would rather see 3GB of VRAM per card because Battlefield 3 in 3D Vision uses too much memory. My GTX 580s are maxed out at 1.5GB and I only get 40fps. NVIDIA and DICE need to fix this.

February 7, 2012 | 12:37 AM - Posted by Druac Blaise (not verified)

That is where EVGA will come in...they got your back! :)

February 7, 2012 | 11:56 AM - Posted by Anonymous (not verified)

You won't get any more than around 40 fps due to the rendering setup in BF3. The 3D Vision driver in its current state can't handle all the effects within modern games. 3D Vision seeks to limit the amount of work that needs to be done for 3D, and in the process ends up taking a few shortcuts. This screws up the rendering of a lot of effects in many games while saving power.

In order for BF3 to work in 3D, the whole 3D Vision driver has to be bypassed. And as it stands, unless you settle for Crysis 2's 3D (mega crap IMHO), that means rendering whole frames for both eyes. In effect this means exactly half the fps in 3D compared to 2D. Get 80 fps in 2D using two 580s? Well then you get 40 in 3D.

I run 2 580s in (3D vision) surround and this is exactly what happens when I switch to 3D.

February 6, 2012 | 08:42 PM - Posted by ThorAxe

I'll probably get two GTX670s. 1.75GB should be enough VRAM for 2560x1440, with fewer heat issues than the GTX690.

It's going to be a tad faster than my Diamond Monster Voodoo II 12MB SLI combo. :)

February 6, 2012 | 08:52 PM - Posted by Tim Verry

true, you could prolly overclock the two individual cards further than you could the dual-GPU card as well :)

February 6, 2012 | 09:44 PM - Posted by Wolvenmoon (not verified)

Will buy a 660Ti in a custom edition w/ 2 or more gigs of RAM at the $300 mark. Otherwise my $140 6870 is enough to keep me happy for another year.

February 8, 2012 | 02:43 PM - Posted by Anonymous (not verified)

For $140 I'll just put a second 6870 in my system. One of those was a major jump from a 9800GT.

February 6, 2012 | 10:36 PM - Posted by Rion (not verified)

Wow, that's quite a price difference between the GTX 650 and 650Ti; it better be worth it. Still out of my price range. Guess I'll be getting a 650 when they start to hit $150. Unless the leftover 560s get there first.

Then again, I would rather SLI two 650s someday...
Why can't they just make all of this easier?

February 6, 2012 | 10:45 PM - Posted by Chaitanya Shukla (not verified)

NVIDIA has been working on Kepler for far too long. I remember attending a CUDA seminar a few months before the 500 series of video cards were launched, and the guy giving the seminar showed us slides of Kepler by mistake.

February 7, 2012 | 06:35 AM - Posted by Steven Z. (not verified)

This seems more factual than all the BS floating around earlier, like the claim that the mid-range card would beat the 7970 and cost $300.

I agree 100%. There really is no point to having the top AMD or NVIDIA cards without 2560 x X displays or multi-monitor setups, so it will be interesting to see what difference the extra 1GB will make.

AFAIK, Nvidia will have their version of eyefinity on single card setups on Kepler.

Final interest will be overclock comparisons and results. Good times!

February 7, 2012 | 04:36 PM - Posted by Tim Verry

heh, yeah I just don't see that happening :P

Yep, hopefully they release them soon so that Ryan can get to work testing them at high resolutions! :P

Really? That would be a positive move for them as AMD really has them beat on that front. No one wants to buy 2 Nvidia cards to do 3 displays :P

February 7, 2012 | 09:56 AM - Posted by Boggins (not verified)

What's with the stingy framebuffer sizes? Didn't Nvidia learn anything from the insufficient VRAM of the 400/500 series?

February 7, 2012 | 10:51 AM - Posted by Anonymous (not verified)

So the GTX660Ti would come out 6 months later, cost the same, have less memory, and only have 10% better performance than the 7950? I'm /impressed/! I also would have thought the GTX650 would cost less than that. I can already buy a card with that kind of performance for that kind of price.

February 7, 2012 | 04:33 PM - Posted by Tim Verry

Indeed. I'm hoping once they are released they will knock the prices way, way down!

February 7, 2012 | 03:45 PM - Posted by Anonymous (not verified)

Hope they hit the market earlier than expected. GPU budget is burning a hole in my pocket. Seriously considering going ATI. The waiting game sucks, let's play Hungry Hungry Hippos!

February 8, 2012 | 04:17 PM - Posted by AParsh335i (not verified)

I see a big price war on the horizon...

March 8, 2012 | 06:54 AM - Posted by Lok (not verified)

$1000+ for a GTX690 with under 4GB of RAM? They must be on crack... AMD's 7990 (their "flagship" card) costs around $150 less and gives you a third more RAM for a total of 6GB, and almost double per GPU (1.75GB vs 3GB).

Plus the fact you can mine bitcoins and make your money back. Yeah... I'm picking the HD 7990, just for the fact I can make my money back and then some. Plus, the mobo it will be going into isn't PCIe 3.0, so no loss of PCIe there.

The only thing NVIDIA has going for it is PCIe 3.0, and if I'm not mistaken only Intel boards have PCIe 3.0 on them right now. I have yet to see any AMD-based boards using it. I'm just not a fan of Intel's upgrade path with mobos, along with the fact they cost a LOT more.
