Rumor: NVIDIA GeForce GTX 880 Is Actually September?

Subject: General Tech, Graphics Cards | August 3, 2014 - 04:59 PM |
Tagged: nvidia, maxwell, gtx 880

Just recently, we posted a story claiming that NVIDIA was preparing to launch high-end Maxwell in the October/November time frame. Apparently, that was generous. The graphics company is now said to be announcing its GeForce GTX 880 in mid-September, with availability coming later in the month. It is expected to be based on the GM204 chip (which previous rumors claim is built on 28nm).


It is expected that the GeForce GTX 880 will be available with 4GB of video memory, with an 8GB version possible at some point. As someone who runs multiple (five) monitors, I can tell you that 2GB is not enough for my use case. Windows 7 agrees: it kicks me out of applications to tell me that it does not have enough video memory. That alone would be reason enough for me to get more GPU memory.

We still do not know how many CUDA cores will be present in the GM204 chip, or whether the GeForce GTX 880 will have all of them enabled (but I would be surprised if it didn't). Without any way to derive its theoretical performance, we cannot compare it against the GTX 780 or 780 Ti. It could be significantly faster, it could be marginally faster, or it could be somewhere in between.
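For context, a common back-of-the-envelope estimate of peak FP32 throughput is CUDA cores × 2 (one fused multiply-add per core per clock) × clock speed. Here is a minimal sketch using the known Kepler parts for comparison; any GM204 figures would be pure guesswork, so none are included:

```python
def theoretical_gflops(cuda_cores: int, clock_mhz: float) -> float:
    """Peak single-precision GFLOPS: each CUDA core can retire one
    FMA (2 floating-point ops) per clock."""
    return cuda_cores * 2 * clock_mhz / 1000.0

# Known Kepler parts, using their boost clocks:
print(theoretical_gflops(2304, 900))   # GTX 780:    ~4147 GFLOPS
print(theoretical_gflops(2880, 928))   # GTX 780 Ti: ~5345 GFLOPS
```

Once a core count and clock leak for GM204, plugging them in gives a first-order comparison point, though real-world performance also depends on memory bandwidth and architectural efficiency.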

But we will probably find out within two months.

Source: Videocardz


August 3, 2014 | 07:12 PM - Posted by H1tman_Actua1

AMD users mourn while Gsync users rejoice.

August 3, 2014 | 07:39 PM - Posted by Daniel (not verified)

That's trying way too hard.

August 4, 2014 | 12:04 AM - Posted by Anonymous (not verified)

Why would AMD users, or anyone for that matter, mourn a GPU release? It doesn't affect them in the slightest unless they choose to buy it.

August 4, 2014 | 12:46 AM - Posted by JohnGR

Oh look. The fanboy just came out of its cave.

Newer and better products are good news for everybody. It's called competition, and it leads to lower prices and higher performance.

As for GSync, what a joke. Months after its release, there are people who bought a GSync monitor only to notice extra lag when playing fast FPS games.

August 4, 2014 | 03:44 AM - Posted by arbiter

You're the fanboy who needs to go back into his cave. IF there is any lag, it's only a few ms at best: 1-2 ms, maybe 3 ms. Frame time at 144 fps is about 7 ms, so. Wanna talk about BS: AMD claims its tech is a VESA standard, but then turns around in their FAQ and says it's not, AKA straight up lying to end users yet again.

August 4, 2014 | 05:09 AM - Posted by JohnGR

I am the fanboy because I posted something that you yourself admit is there, that is true? Good one.

August 4, 2014 | 06:08 AM - Posted by arbiter

I said IF. You are one of those people hanging on AMD's every word, like they can say and do no wrong. I never said there was a delay, as there is no proof of it besides AMD's claims, and more often than not that is AMD being full of it or not saying everything. As far as we know, the work could be sent with the video frame, and the delay, IF (and I said IF) there is one, is sub-1 ms. So it's so small it won't impact anything. Under 5-6 ms won't be noticeable by anyone, even if AMD makes such a large deal about it. I am no expert designer who worked on G-Sync, so I can't say there is a delay. 60 fps is 16.67 ms per frame, 120 drops it to about 8 ms, and bumping to 144, which most G-Sync monitors set to release run at, puts frame time around 7 ms.
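The frame-time numbers being traded back and forth in this thread all come from one formula: milliseconds per frame = 1000 / frames per second. A quick sketch to check them:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame is on screen at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 120, 144):
    print(f"{fps} fps -> {frame_time_ms(fps):.2f} ms per frame")
# 60 fps is 16.67 ms, 120 fps is 8.33 ms, 144 fps is 6.94 ms
```

So a hypothetical 1-3 ms of added lag would be a fraction of a single frame even at 144 Hz, which is the crux of the disagreement here.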

August 4, 2014 | 07:52 AM - Posted by JohnGR

So now we just try to create a false image of the other person, to have an excuse to make him look bad. You are doing great.

I was reading about the lag from people who are using GSync, and they are running full Nvidia hardware.

This is my third post, and until now I hadn't written the word "AMD" once. You, on the other hand, mentioned AMD five times in two posts. They must have done some pretty bad things to you when you were very young.

August 4, 2014 | 09:34 AM - Posted by Anonymous (not verified)

Your avatar isn't helping

August 4, 2014 | 10:02 AM - Posted by JohnGR

Yes, I know, but people should focus on the text, not the avatar. Unfortunately, most people see the avatar and then just approve or reject whatever is written based on it.

August 4, 2014 | 06:43 PM - Posted by That vesa standard? (not verified)

August 4, 2014 | 07:21 PM - Posted by Scott Michaud

Arbiter's point is: that standard is NOT FreeSync. It is used by FreeSync, wrapped in proprietary bits.

August 5, 2014 | 02:10 PM - Posted by Interesting... (not verified)

That just means VESA didn't adopt everything AMD gave them...
The DockPort standard from AMD also went to VESA...


August 4, 2014 | 08:48 PM - Posted by arbiter

"How are DisplayPort Adaptive-Sync and Project FreeSync different?
DisplayPort Adaptive-Sync is an ingredient DisplayPort feature that enables real-time adjustment of monitor refresh rates required by technologies like Project FreeSync. Project FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync protocols to enable user-facing benefits: smooth, tearing-free and low-latency gameplay and video. Users are encouraged to read this interview to learn more."

AMD claimed months ago that their solution would be a VESA spec, but like most other AMD claims they ended up not being 100% truthful and had to backpedal because they couldn't live up to their claim.

August 4, 2014 | 12:25 PM - Posted by PCPer Fan - Thedarklord (not verified)

Haha, actually you threw the first stone: "Oh look. The fanboy just came out of IT'S cave." And judging by your other replies, I see you go with the tactic of "if you can't attack the argument, attack the person".

Setting that aside and going back to the original post, and even to your original off-topic rant about GSync:

The GTX 800 series is coming soon and we should all be excited, even AMD fans. It's progress in the industry, which on the GPU side has been slowing down as of late. Really looking forward to some new GPUs.

Also, on a side note, AMD has been doing the "reuse/re-brand" method too, and is just as guilty of it as NVIDIA. I personally don't have that big of an issue with it; they basically make last year's flagship cheaper and move it down the product stack.

As for GSync, I think it is the right direction to go in. Do I wish it were more open, sure. But I can also understand NVIDIA's $$$ investment, so it's understandable. And at its core it is a technology that improves the life of EVERY gamer: no more tear lines and no more frame stutter.

:) Feel free to object and reply.

August 4, 2014 | 01:27 PM - Posted by JohnGR

First paragraph.
Nope and nope.

Second paragraph.
Rant about GSync? Nope. Facts.

Third paragraph.
You are copying me.

Fourth paragraph.
Who mentioned rebrand? I think you are the first.

Last paragraph: GSync was good for waking up the industry. The same goes for Mantle. But both should go away the day free techs replace them. And NO, proprietary standards are NOT for EVERY gamer, just for some: those who pay a specific price to a specific manufacturer.

What do you prefer?
DirectX 12 (OpenGL), OpenCL, Adaptive Sync, some open physics engine

or

Mantle, CUDA, GSync, PhysX

August 4, 2014 | 02:40 AM - Posted by Anonymous (not verified)

This guy is delusional. Even the moderator over at the ASUS ROG Forums is realistic when responding to SWIFT G-Sync issues:

"It can't undo the fundamental effect of very low frame rates and it doesn't do anything below 30FPS, but it does make it far more tolerable and the transition from high to low smooth. IMO it's not so suitable for very fast action games, where you're better off using the ULMB option with normal or extreme pixel response setting instead."

August 3, 2014 | 11:06 PM - Posted by Anonymous (not verified)

Because dat 6GB GTX 780 Ti in the wild >=)

8GB lol

Not impressed at all, because 28nm is going to have a severe impact on Maxwell, which has already lost everything that made it interesting over the past 4 years.

We can keep dreaming though.

August 3, 2014 | 11:49 PM - Posted by renz (not verified)

Making an efficiency improvement over Kepler on the same 28nm node is already impressive. But I know what you mean. Then again, Nvidia can't keep delaying their next product just because the tech they need (TSMC's die shrink) hasn't arrived in time. They still need to show investors some sort of progress, even though things have not gone as they hoped/expected. Me? Maybe I will wait for their third-gen Maxwell before pulling the trigger.

August 4, 2014 | 03:45 AM - Posted by arbiter

I am sure there is some loss, but Nvidia has proven what can be done with 28nm in the 750 Ti.

August 5, 2014 | 06:09 AM - Posted by Jade (not verified)

So far... you really want to defend Nvidia. I smell fanboyism ahahaha!!

Anyway, GSync or FreeSync, whatever you call it, I'll just go for whichever does the same thing with no extra hardware cost.

August 5, 2014 | 01:01 PM - Posted by renz (not verified)

So you're waiting for a solution that will give you that capability without needing to upgrade your current monitor and GPU?

August 5, 2014 | 10:46 PM - Posted by arbiter

G-Sync is Nvidia; the not-so-"FreeSync" is AMD.

At least with G-Sync, if you have a mid-range or higher Kepler-based GPU, you just need the monitor.

With AMD, on the other hand, even though they claim some monitors just need a firmware update (which I doubt), you need a new monitor, and only 3 models of their dedicated Radeon series support it, so you more than likely need a new GPU as well. AMD's solution, in the end, depending on what you've got, isn't going to be cheaper.

August 3, 2014 | 11:51 PM - Posted by renz (not verified)

I don't know how reliable this is, but it seems Gigabyte has confirmed that they will be releasing the GTX 880 in late September:

August 4, 2014 | 03:16 AM - Posted by Max fail (not verified)


The 880 Maxwell specs are finally here!!!
2560 SP / 1050MHz / 7GHz / 256-bit / 4GB or 8GB / 230W TDP
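Taking the rumored spec line above at face value (every number in it is unconfirmed), peak FP32 throughput and memory bandwidth fall out directly:

```python
# All figures below are the rumored, unconfirmed GTX 880 specs.
cores = 2560              # stream processors
core_clock_mhz = 1050
mem_effective_mhz = 7000  # effective GDDR5 data rate
bus_width_bits = 256

# Peak FP32: cores x 2 ops (one FMA per core per clock) x clock
tflops = cores * 2 * core_clock_mhz / 1e6
# Memory bandwidth: data rate x bus width, converted bits -> bytes
bandwidth_gbs = mem_effective_mhz * bus_width_bits / 8 / 1000

print(tflops)         # 5.376 TFLOPS
print(bandwidth_gbs)  # 224.0 GB/s
```

For comparison, the GTX 780 Ti pushes about 336 GB/s over its 384-bit bus, so if these numbers are real, a 256-bit GM204 would be leaning on Maxwell's caching improvements to compensate.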

August 4, 2014 | 04:47 AM - Posted by Anonymous (not verified)

The new rip-off is on its way.
Hopefully the driver "quality" will be the same as the 600/700 series, meaning one release out of two will crash and the other will produce artifacts.

nvidia FTW !

August 4, 2014 | 06:00 AM - Posted by arbiter

Take Nvidia out of the battle: do you think AMD wouldn't pull the same crap if they didn't have to compete with a company beating them down in the market?

August 4, 2014 | 04:09 PM - Posted by qicy (not verified)

lol do you blow the nvidia logo every morning when you wake up.

August 4, 2014 | 06:21 PM - Posted by Anonymous (not verified)

Judging by his rationale in that last one, I believe he does.

August 4, 2014 | 09:04 PM - Posted by arbiter

Judging by your 2 moronic comments, you blow the AMD logo every morning and confirm the whole "AMD can do no wrong" viewpoint.

August 4, 2014 | 11:03 PM - Posted by Anonymous (not verified)

Might want to wipe that JHH love juice off your chin there, buddy. You think people haven't noticed?

Anyone who regularly reads the comments here on PCPer can tell you do. I don't know why you're surprised when someone says it.

August 5, 2014 | 01:06 AM - Posted by Scott Michaud

Whoa people. This has gone way too far.

August 5, 2014 | 04:50 AM - Posted by Anonymous (not verified)

Yes, AMD would pull the same crap.
And I would be defending nVidia.
The question is, would the current nVidia defenders switch to defending AMD? After all, if they defend the philosophy of expensive, slow crap, they should.
That would be an interesting test.

August 4, 2014 | 09:40 AM - Posted by Anonymous (not verified)

Yeah, you know what you're talking about...

August 4, 2014 | 09:44 AM - Posted by Anonymous (not verified)

I really hope they don't milk Maxwell like Kepler.
I'm afraid that even if I get one of the first cards, something better will arrive 6 months later.

August 4, 2014 | 03:07 PM - Posted by Anonymous (not verified)

I promise they will. After everyone suckled the teat of the Titan because of the shock-and-awe factor, nobody questioned what Nvidia was doing until the Titan Z came out and people realized the delicate bullshit game Nvidia has played throughout the Kepler generation.

You will see another Titan, most likely at the $1,000 price mark, on 20nm. You will see a 20nm refresh with slight improvements, and definitely an 880 Ti at $649 at least, if the 880 isn't already priced there, screwing those owners just like the 700 series did for $150 net profit.

August 4, 2014 | 03:49 PM - Posted by GG Nvidia (not verified)

The massively overhyped whale known as the Maxwell 880 is over.
Now it's official: the GTX 880 is confirmed weak.
No chance the weak 880 is faster than the 780 Ti.

August 5, 2014 | 12:13 AM - Posted by Anonymous (not verified)

From what I heard, a Titan Z part 2 will be coming next year. I'm still running a GTX 560 Ti SC and have no intention of moving from that card until I think it's really necessary. Really glad I didn't run out and buy any of the 600 or 700 series cards yet; same goes for the Titans.

August 5, 2014 | 12:43 AM - Posted by Sonic4Spuds

What you are saying makes some sense, but not necessarily in the way you present it. The 560 Ti SC is a good card and can still play any game you want, so unless you have a bunch of extra money burning a hole in your pocket, why spend it?

August 5, 2014 | 12:40 AM - Posted by Sonic4Spuds

Here's hoping these cards keep the compute gains seen in the 750 and 750 Ti. I know NVIDIA isn't as fast at compute if you are running a simple task, but unfortunately in the rendering industry it is usually the only option, as CUDA actually works for complex path-tracing algorithms, unlike OpenCL.

August 5, 2014 | 10:50 PM - Posted by arbiter

Mostly, compute hasn't been needed by normal end users, hence why they don't focus on making their consumer cards do that work. Usually it was all limited to the pro end.

August 5, 2014 | 01:18 AM - Posted by snook

This is the PCPer comments section now?

August 5, 2014 | 04:08 AM - Posted by collie

I think this crazy hot summer is getting to everyone. We all seem to be bitching at each other, A LOT.

August 5, 2014 | 06:13 AM - Posted by Jade (not verified)

There are too many off-topic people here... most are those fanboys ahahaha!

I don't care if you love AMD or Nvidia. Please, if you are a fanboy, just go camp out at Nvidia or AMD HQ to support them and stop ranting here. You guys are destroying a good article here.

August 5, 2014 | 10:52 PM - Posted by arbiter

The massive AMD lovers can only attack anything Nvidia does, even when AMD does the same thing. When you point out the truth or the facts, they start throwing insults and slander at people.

August 5, 2014 | 09:59 AM - Posted by don't care (not verified)

I used to get excited about these graphics releases back in the old days, but what is the point now? Most upcoming games are total failures. No real progress has been made in game development. The so-called physics engines are so bad you don't even feel like you're doing anything when playing a game. Game audio still sucks, and the visuals aren't that good either for the amount of money you pay for these graphics cards. I mean, tessellation isn't used to a noticeable extent in games yet, and I haven't seen any decent texture design in these modern games, including the upcoming ones.

BTW, I haven't played a game in over two years, I think I've already quit this boring activity that I used to enjoy years ago.

August 5, 2014 | 03:13 PM - Posted by Anonymous (not verified)

Back in the day they had nice releases; now they try to nickel-and-dime loyal customers with fragmented releases, and on top of that you get lazy devs porting over broken, half-assed games from consoles. My 780s are it for me. I'm done after this unless something revolutionary comes out, but even then you still deal with incompetent game devs.

August 5, 2014 | 12:07 PM - Posted by mAxius

Ah, another 28nm refresh for the big 2 GPU vendors. GG, TSMC.

August 5, 2014 | 10:55 PM - Posted by arbiter

It's not a refresh; the GPU is based on Nvidia's new Maxwell architecture, which has power and performance improvements over Kepler. If you want an idea of what Maxwell brings, look at the GTX 750 Ti. It competes with AMD's 260X and 265 parts while using about half the power: the 260X has a 115-watt TDP, the 265 is 150 watts, and the Nvidia 750 Ti is listed at 60 watts.
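The "half the power" claim can be sanity-checked directly from the TDPs quoted in that comment:

```python
# Vendor-listed board power (watts) for the cards named above.
tdp_watts = {"GTX 750 Ti": 60, "R7 260X": 115, "R7 265": 150}

# TDP ratio relative to the 750 Ti at roughly comparable performance:
for card in ("R7 260X", "R7 265"):
    ratio = tdp_watts[card] / tdp_watts["GTX 750 Ti"]
    print(f"{card}: {ratio:.2f}x the 750 Ti's TDP")
```

TDP is a board-power rating, not measured draw under load, so treat these as rough figures; still, the roughly 1.9x-2.5x gap is what the efficiency argument rests on.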

August 5, 2014 | 11:31 PM - Posted by Mandrake

Nice to see some new GPUs, even if they are still on the 28nm node. The Maxwell architecture has shown impressive power efficiency with the 750 Ti though, so we should see a reasonable performance increase with the 870 and 880 over the 780 and 780 Ti.

I'm rather content with my GTX 780 for the while, it handles all my games at 2560x1440 perfectly. Might be looking at one of those ASUS ROG Swift monitors though. :D

The childish arguments are lame too, just saying.

August 9, 2014 | 05:34 PM - Posted by PhoneyVirus

@Mandrake Stay away from the ASUS ROG Swift monitors. Why? Because of G-Sync; sorry, but FreeSync will kill that as soon as it hits the market later this year.

Yes, I'm an Nvidia fanboy, but you would have to listen to the Maximum PC podcast to know what I'm saying.


August 16, 2014 | 03:06 PM - Posted by jayden (not verified)

How is GSync dead? Because AMD, a company that only sells to a third of the market, is coming up with a knock-off? "FreeSync" will only work with AMD GPUs, and AMD only has a fraction of the dGPU market.

Or do you think GSync is dead because a function of the eDP standard has been added to the desktop DP standard? News flash: the ability to change the refresh rate has existed for years as part of the eDP spec. And no one to this day, not even AMD themselves, was using it to give a GSync-like experience. No one.

So how is moving this eDP feature to the desktop supposed to have any effect on GSync? People need to get their facts straight. There is a heck of a lot more to FreeSync than the spec change. No one is using the spec to create a GSync experience now, and that part of the spec has been part of eDP for years. The only one trying now is AMD. With their small fraction of the market, there is no way GSync is automatically dead.
