NVIDIA Rumored To Release 700-Series GeForce Cards At Computex 2013

Subject: Graphics Cards | April 15, 2013 - 03:34 PM
Tagged: rumor, nvidia, kepler, gtx 700, geforce 700, computex

Recent rumors suggest that NVIDIA will release its desktop-class GeForce 700 series of graphics cards later this year. The new cards will reportedly be faster than the currently available GTX 600 series but will likely remain based on the company's Kepler architecture.


According to information presented during NVIDIA's GTC keynote, its Kepler architecture will dominate 2012 and 2013, with Maxwell-based cards to follow in 2014. Notably absent from the slides are product names, meaning the publicly available information at least leaves open the possibility of a refreshed Kepler GTX 700 lineup in 2013.

Fudzilla further reports that NVIDIA will release the cards as early as May 2013, with an official launch possibly at Computex. Having actual cards available for sale by Computex is a bit unlikely, but a summer launch could be possible if the new 700 series is merely a tweaked Kepler-based design with higher clocks and/or lower power usage. The company is rumored to be accelerating the desktop launch of the GTX 700 series in response to AMD's heavy game-bundle marketing, which seems to be working well at persuading gamers to choose the red team.

What do you make of this rumor? Do you think a refreshed Kepler is coming this year?

Source: Fudzilla
April 15, 2013 | 05:46 PM - Posted by Thedarklord

I honestly hope that this turns out to be true, even though the 700 series would simply be an improved Kepler, just like the 400/500 series.

So I would expect to see cards with higher clocks, and possibly more memory, on the high end; from the mid-range down to entry level, I would expect a minor clock increase and some changes to the memory subsystem, as Kepler has shown itself to be very memory dependent.

April 15, 2013 | 06:17 PM - Posted by DeadOfKnight (not verified)

Yeah, I think that 3-4 GB should be the minimum at this point for high-performance cards like the GTX 680. Also, GPU Boost 2.0 will be a welcome addition.

April 16, 2013 | 01:24 PM - Posted by Thedarklord

Absolutely, 3GB should be the standard for top end cards (even single GPU cards).

I also forgot about GPU Boost 2.0 >.>; that would be a very welcome addition to more GPUs. Hello, much more overclocker-friendly cards :D

April 15, 2013 | 08:12 PM - Posted by RoboGeoff (not verified)

I'm still rocking a pair of really hot GTX 580s. I would love a good reason to upgrade the GPUs along with my CPU this year.

It's looking quite pointless on the CPU side.

April 15, 2013 | 09:29 PM - Posted by Anonymous Coward (not verified)

18 months ago we had the exact same offerings from AMD and NVIDIA at the same prices as today. Way to go, both of them! All they've done in the meantime is offer a $1000 card that is just more 680 cores for a cool grand. That's it. They must be learning from Intel how to stagnate LIKE A FUCKING BOSS!

April 15, 2013 | 09:33 PM - Posted by Inverted (not verified)

This is the card I would like to see:

GTX 7xx

Shaders: 2048
Texture Units: 128
Graphics Clock: 1006 MHz
Texture Fillrate: 128.8 Gtex/s
Memory Clock: 1502 MHz
Memory Bus: 384-bit
Memory Bandwidth: 264 GB/s
Graphics Ram: 4 GB GDDR5
Price: $500
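As a sanity check on wishlist numbers like these, the usual back-of-envelope GPU formulas can relate the specs (a rough sketch only; real cards vary with boost clocks and memory type). Note that for a 1502 MHz (6008 MT/s effective) GDDR5 clock on a 384-bit bus, the standard formula gives roughly 288 GB/s rather than the 264 GB/s listed, so that figure would imply a somewhat lower effective memory speed:

```python
# Back-of-envelope GPU spec math (standard formulas, not official figures)

texture_units = 128
core_clock_mhz = 1006
# Texture fillrate: texture units x core clock
fillrate_gtexs = texture_units * core_clock_mhz / 1000  # ~128.8 Gtex/s

mem_clock_mhz = 1502            # GDDR5 is quad-pumped: 4 transfers per clock
bus_width_bits = 384
bandwidth_gbs = mem_clock_mhz * 4 * bus_width_bits / 8 / 1000  # ~288 GB/s

print(f"{fillrate_gtexs:.1f} Gtex/s, {bandwidth_gbs:.1f} GB/s")
```

The fillrate matches the wishlist's 128.8 Gtex/s exactly; the bandwidth does not, which is the kind of inconsistency this quick check catches.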

April 15, 2013 | 10:15 PM - Posted by Anonymous (not verified)

A 6-core, no-integrated-graphics consumer Haswell CPU would really hit a sweet spot, but that does not appear to be the direction Intel is going! Every major player in the PC market appears to be fighting over the remaining diminishing returns and designing for the lowest common denominator! Not much innovation; there are a few technological bright spots, but those will come at a premium price!

April 16, 2013 | 01:16 AM - Posted by Scott Michaud

Actually, an integrated GPU could be an asset anyway. Some calculations for physics (collisions, fields, etc.) and artificial intelligence (pathfinding, visibility, etc.) benefit from GPU architectures. If integrated GPUs or APUs become ubiquitous enough, they can offload work from the primary GPU and CPU. This will become progressively more true as memory management gets better.

An efficient computer has each of its processing units loaded with the instructions that best suit it. From my perspective, the CPU is still doing way too many random tasks, even though GPUs are often the bottleneck anyway.

April 16, 2013 | 11:45 AM - Posted by Anonymous (not verified)

With proper HSA (Heterogeneous System Architecture) support and OpenCL support, yes, the GPU can be used for physics. But do not expect Intel to work closely with AMD or NVIDIA on making HSA a true reality! Intel would be better off, for gaming at least, focusing on its CPU strong points while AMD and NVIDIA focus on graphics; Intel will find it as hard to catch up to the main GPU makers as AMD found the x86 race! Intel's driver software reputation is not the same as AMD's or NVIDIA's, and well-maintained, regularly updated graphics drivers are essential for gamers!

April 16, 2013 | 01:21 PM - Posted by Thedarklord

Wasn't John Carmack of id talking about this at last year's QuakeCon?

I agree that as these APU/on-die CPU-GPU designs improve, it would be very cool to see some of that "simple" code get passed to the otherwise idle on-die GPU; as you said, AI and physics run better on GPUs.

From what I've been seeing, the hardest part of this from a hardware standpoint comes down to 3 things:
1) space and heat (which are improving)
2) memory controllers that are very slow compared to GPUs with dedicated RAM
3) getting the CPU and GPU components that share a die to play nice with the shared memory and caches (which again, I think, points to better memory-controller logic)

April 15, 2013 | 10:15 PM - Posted by Daniel Masterson (not verified)

Yeah, I think I will wait until next year to see how Maxwell performs.

April 15, 2013 | 11:59 PM - Posted by Mnemonicman

So . . . I guess I have to wait to buy a new card then? I was going to buy a 670, but now I'm not sure. I knew I should have bought one on Boxing Day.

April 16, 2013 | 05:04 AM - Posted by Funkatronis (not verified)

I'm going to wait for Maxwell, as my GTX 690 will hold me over till then. I'm curious what the performance will be, as the GTX 690 kicks the living snot out of my last card, a Radeon 5970.

April 16, 2013 | 07:13 AM - Posted by Howie Doohan (not verified)

Excellent news. I'm still running Crossfire 6870s (XFX 2GB versions) and was looking at a couple of 670s as a replacement, but I reckon I'll wait till Computex now.

April 16, 2013 | 07:21 AM - Posted by btdog

Instead of a 700 series, is it possible that the 600 series will get a refresh? Perhaps a 685/675/665, etc.? I'm not sure a completely new line is warranted.

My big concern is the price point. Last year, we were fortunate that NVIDIA decided to keep prices "reasonable" (a relative term, I guess). They came in and undercut AMD by ~$50 even though their cards could outperform at every level. If these are truly upgrades and even better cards, what happens to existing card prices, and where will the new ones fall?

I'd love to see a new line up, but not if it means $600-650 for a 780.

April 16, 2013 | 10:05 AM - Posted by Anonymous (not verified)

Glad to hear it, although I wonder where the Titan will fit in with these new cards, and whether they will be based on GK110 or go back to GK104. Honestly, I may end up waiting until Maxwell the following year, as my 570s in Surround are still feisty, but even now I'm seeing games max out my 2.5 GB (I have the EVGA GTX 570 HD 2.5GB cards).

So hopefully NVIDIA releases these cards with a 3-4 GB minimum.

April 16, 2013 | 01:30 PM - Posted by Anonymous (not verified)

I'm dying for a new GPU! My GTX 460 cannot overclock any higher!! I've got to give it to EVGA for making such a good card; it has stayed overclocked under heavy gaming, running like a champ, for 3 years now. I have been tempted to go AMD, even though my last card from them was a 9800 SE flashed to a 9800 PRO, back when it was ATI :)

Their Never Settle bundles are quite nice, but this is what I was waiting for to give a couple of games a second run :D

April 17, 2013 | 01:31 PM - Posted by Maester Aemon (not verified)

If this is true, the 7xx will end up being a tweaked rebranding of the 6xx, just like they did with the 9800/8800 series.

April 26, 2013 | 06:45 AM - Posted by Rokim (not verified)

They must do something, because the 7870 LE gets the same performance as a 660 Ti, comes with 3 games, and is priced the same as the non-Ti 660. For €200 I can get the performance of a 660 Ti, which costs €270, plus 3 games. The 7870 LE destroys the mid-range GTX 600 series right now.

April 26, 2013 | 06:50 AM - Posted by Rokim (not verified)

I have more than 20 friends who bought the 7870 LE. Same performance as a 660 Ti, plus 3 games, for non-Ti 660 money. Just a sick deal.

May 2, 2013 | 10:31 AM - Posted by Anonymous (not verified)

Hello, Mr. AMD fanboy.

The 7870 LE is priced within $20 of the GTX 660 Ti; you might want to find a decent store.

Performance is about 8% lower than a GTX 660 Ti's.

It's a damn nice card, but don't start talking bull just because you like something.

May 2, 2013 | 01:26 PM - Posted by Rokim (not verified)

I say the 7870 LE plus 3 games is €205, the non-Ti 660 is €200, and the 660 Ti is €270. The 660 Ti is a little better for a ~30% higher price, and I get zero games with it. That's in my country. Now do the math on which is the better buy. Even if there were no games involved, the 660 Ti is 30% more cash for slightly better performance; not worth it. When I add in the games, there's not much to think about.
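For what it's worth, the price math quoted in this thread works out as follows (prices are the commenter's local figures, used purely for illustration):

```python
# Prices as quoted in the thread (EUR); illustrative only
price_660 = 200
price_660ti = 270
price_7870le = 205  # includes 3 bundled games

# Premium of the 660 Ti over the 7870 LE bundle
premium = price_660ti / price_7870le - 1
print(f"660 Ti costs {premium:.0%} more")  # ~32%, i.e. the ~30% cited above
```

So at these prices the 660 Ti carries roughly a 30% premium over a card of similar performance that also ships with games, which is the core of the argument.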
