Rumor: NVIDIA GeForce GTX 880 Actually Coming in September?

Subject: General Tech, Graphics Cards | August 3, 2014 - 01:59 PM
Tagged: nvidia, maxwell, gtx 880

Just recently, we posted a story claiming NVIDIA was preparing to launch high-end Maxwell in the October/November time frame. Apparently, that was generous. The graphics company is now said to be announcing its GeForce GTX 880 in mid-September, with availability coming later in the month. It is expected to be based on the GM204 chip (which previous rumors claim is built on 28nm).


It is expected that the GeForce GTX 880 will be available with 4GB of video memory, with an 8GB version possible at some point. As someone who runs multiple (five) monitors, I can tell you that 2GB is not enough for my use case. Windows 7 says the same: it kicks me out of applications to tell me that it does not have enough video memory. That alone would be reason enough for me to get more GPU memory.

We still do not know how many CUDA cores will be present in the GM204 chip, or whether the GeForce GTX 880 will have all of them enabled (though I would be surprised if it didn't). Without any way to derive its theoretical performance, we cannot compare it against the GTX 780 or 780 Ti. It could be significantly faster, marginally faster, or somewhere in between.
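For context, the usual back-of-the-envelope estimate for theoretical single-precision throughput is 2 FLOPs (one fused multiply-add) per CUDA core per clock. A minimal sketch using the published GTX 780 and 780 Ti base-clock figures, so there is a baseline ready once GM204's core count and clocks leak:

```python
def peak_gflops(cuda_cores: int, clock_mhz: float) -> float:
    """Theoretical single-precision GFLOPS: 2 FLOPs (one FMA) per core per clock."""
    return 2 * cuda_cores * clock_mhz / 1000.0

# Published Kepler parts, base clocks
gtx_780 = peak_gflops(2304, 863)     # ~3977 GFLOPS
gtx_780_ti = peak_gflops(2880, 875)  # 5040 GFLOPS
print(gtx_780, gtx_780_ti)
```

Whatever GM204 turns out to hold, plugging its numbers into the same formula gives a first rough comparison against the Kepler parts.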

But we will probably find out within two months.

Source: Videocardz
August 3, 2014 | 04:12 PM - Posted by H1tman_Actua1

AMD users mourn while Gsync users rejoice.

August 3, 2014 | 04:39 PM - Posted by Daniel (not verified)

That's trying way too hard.

August 3, 2014 | 09:04 PM - Posted by Anonymous (not verified)

Why would AMD users, or anyone for that matter, mourn a GPU release? It doesn't affect them in the slightest unless they choose to buy it.

August 3, 2014 | 09:46 PM - Posted by JohnGR

Oh look. The fanboy just came out of its cave.

Newer and better products are good news for everybody. It's called competition and leads to lower prices and higher performance.

As for G-Sync, what a joke. Months after its release, there are people who bought a G-Sync monitor only to notice extra lag when playing fast FPS games.

August 4, 2014 | 12:44 AM - Posted by arbiter

You're the fanboy that needs to go back into his cave. IF there is any lag, it's only a few ms at best: 1-2 ms, maybe 3 ms. Frame time at 144 FPS is about 7 ms. Want to talk about BS? AMD claims its tech is a VESA standard, but then turns around in their FAQ and contradicts that, straight up misleading end users yet again.

August 4, 2014 | 02:09 AM - Posted by JohnGR

I am the fanboy because I posted something that you yourself admit is there, that is true? Good one.

August 4, 2014 | 03:08 AM - Posted by arbiter

I said IF. You are one of those people hanging on AMD's every word like they can say and do no wrong. I never said there was a delay, as there is no proof of it besides AMD's claims, and more often than not that is AMD being full of it or not saying everything. As far as we know, any delay, IF (and I said IF) there is one, is sub-1 ms, so small it won't impact anything. Under 5-6 ms won't be noticeable by anyone, even if AMD makes such a large deal about it. I am no expert designer who worked on G-Sync, so I can't say there is a delay. 60 FPS is 16.67 ms per frame, 120 drops it to about 8 ms, and bumping up to 144, which most G-Sync monitors set to release are, puts frame time at around 7 ms.
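For what it's worth, the frame-time figures being argued about here are just the reciprocal of the refresh rate; a quick sketch to check the arithmetic:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 120, 144):
    print(f"{fps} FPS -> {frame_time_ms(fps):.2f} ms/frame")
# 60 FPS -> 16.67, 120 FPS -> 8.33, 144 FPS -> 6.94
```

At 144 Hz a frame lasts just under 7 ms, so the "6ms" figure quoted in the comment above is close.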

August 4, 2014 | 04:52 AM - Posted by JohnGR

So now we just try to create a false image of the other person, to have an excuse to make him look bad. You are doing great.

I was reading about the lag from people who are using G-Sync, and their systems are full of Nvidia hardware.

This is my third post, and I hadn't written the word "AMD" once until now. On the other hand, in two posts you mentioned AMD 5 times. They must have done some pretty bad things to you when you were very young.

August 4, 2014 | 06:34 AM - Posted by Anonymous (not verified)

Your avatar isn't helping

August 4, 2014 | 07:02 AM - Posted by JohnGR

Yes, I know, but people should focus on the text, not the avatar. Unfortunately, most people see the avatar and then just approve or reject whatever is written based on that avatar.

August 4, 2014 | 03:43 PM - Posted by That vesa standard? (not verified)

http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-...

August 4, 2014 | 04:21 PM - Posted by Scott Michaud

Arbiter's point is: that standard is NOT FreeSync. It is used by FreeSync, wrapped in proprietary bits.

August 5, 2014 | 11:10 AM - Posted by Interesting... (not verified)

That just means VESA didn't adopt everything AMD gave them...
The DockPort standard from AMD also went to VESA...

Meh.

August 4, 2014 | 05:48 PM - Posted by arbiter

"How are DisplayPort Adaptive-Sync and Project FreeSync different?
DisplayPort Adaptive-Sync is an ingredient DisplayPort feature that enables real-time adjustment of monitor refresh rates required by technologies like Project FreeSync. Project FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync protocols to enable user-facing benefits: smooth, tearing-free and low-latency gameplay and video. Users are encouraged to read this interview to learn more."

http://support.amd.com/en-us/kb-articles/Pages/freesync-faq.aspx

AMD claimed months ago that their solution would be a VESA spec, but like many other AMD claims, it ended up not being 100% accurate and they had to backpedal because they couldn't live up to it.

August 4, 2014 | 09:25 AM - Posted by PCPer Fan - Thedarklord (not verified)

Haha, actually you threw the first stone: "Oh look. The fanboy just came out of ITS cave." And judging by your other replies, I see you go with the tactic "if you can't attack the argument, attack the person".

Setting that aside and going back to the original post, and even to your original off-topic rant about G-Sync:

The GTX 800 series is coming soon, and we should all be excited, even AMD fans. It's progress in the industry, which on the GPU side has been slowing down as of late. Really looking forward to some new GPUs.

Also, on a side note, AMD has also been using the "reuse/re-brand" method and is just as guilty of it as NVIDIA. I personally don't have that big of an issue with it; they basically make last year's flagship cheaper and move it down the product stack.

As for G-Sync, I think it is the right direction to go in. Do I wish it were more open, sure, but I can also understand NVIDIA's $$$ investment. And at its core it is a technology that improves the life of EVERY gamer: no more tear lines and no more frame stutter.

:) Feel free to object and reply.

August 4, 2014 | 10:27 AM - Posted by JohnGR

First paragraph.
Nope and nope.

Second paragraph.
Rant about G-Sync? Nope. Facts.

Third paragraph.
You are copying me.

Fourth paragraph.
Who mentioned rebrand? I think you are the first.

Last paragraph. G-Sync was good for waking up the industry. The same goes for Mantle. But both should go away the day open technologies replace them. And NO, proprietary standards are NOT for EVERY gamer, only for some: those who pay a specific price to a specific manufacturer.

What do you prefer?
DirectX 12 (OpenGL), OpenCL, Adaptive Sync, some open Physics engine

or

Mantle, CUDA, GSync, PhysX

August 3, 2014 | 11:40 PM - Posted by Anonymous (not verified)

This guy is delusional. Even the moderator over at the ASUS ROG forums is realistic when responding to SWIFT G-Sync issues:

"It can't undo the fundamental effect of very low frame rates and it doesn't do anything below 30 FPS, but it does make it far more tolerable and the transition from high to low smooth. IMO it's not so suitable for very fast action games, where you're better off using the ULMB option with normal or extreme pixel response setting instead."

August 3, 2014 | 08:06 PM - Posted by Anonymous (not verified)

Because dat 6GB GTX 780 Ti in the wild >=)

8GB lol

Not impressed at all, because 28nm is going to have a severe impact on Maxwell, which has already lost everything that made it interesting over the past 4 years.

We can keep dreaming though.

August 3, 2014 | 08:49 PM - Posted by renz (not verified)

making the efficiency improvement over kepler on the same 28nm node is already impressive. but i know what you mean. then again, nvidia can't keep delaying their next product just because the tech they need (TSMC's die shrink) hasn't arrived in time. they still need to show investors some sort of progress, even if things are not going as they hoped/expected. me? maybe i will wait for their third-gen maxwell before pulling the trigger.

August 4, 2014 | 12:45 AM - Posted by arbiter

I am sure there is some loss, but Nvidia has proven what can be done with 28nm in the 750 Ti.

August 5, 2014 | 03:09 AM - Posted by Jade (not verified)

So far... you really want to defend Nvidia. I smell fanboyism ahahaha!!

Anyway, G-Sync or FreeSync, whatever you call it, I'll just go for whichever does the same thing with no extra hardware cost.

August 5, 2014 | 10:01 AM - Posted by renz (not verified)

so you're waiting for a solution that will give you the capability without the need to upgrade your current monitor and gpu?

August 5, 2014 | 07:46 PM - Posted by arbiter

G-Sync is Nvidia; the not-so-"free" FreeSync is AMD.

At least with G-Sync, if you have a mid-range or higher Kepler-based GPU, you just need the monitor.

With AMD, on the other hand, even though they claim some monitors just need a firmware update (which I doubt), you need a new monitor, and only 3 Radeon models support it, so you will more than likely need a new GPU as well. AMD's solution, depending on what you have, isn't going to end up cheaper.

August 3, 2014 | 08:51 PM - Posted by renz (not verified)

i don't know how reliable this is, but it seems Gigabyte has confirmed that they will be releasing the GTX 880 in late september:

http://videocardz.com/51133/gigabyte-launch-geforce-gtx-880-g1-gaming-se...

August 4, 2014 | 12:16 AM - Posted by Max fail (not verified)

Booyah!!!

The 880 Maxwell specs are finally here!!!
2560 SP / 1050 MHz / 7 GHz / 256-bit / 4GB or 8GB / 230W TDP
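If those rumored numbers were accurate (and they are unconfirmed), the standard formulas give the implied theoretical throughput and memory bandwidth; a sketch, not anything NVIDIA has stated:

```python
# Rumored GTX 880 figures from the comment above (unconfirmed)
shaders = 2560        # stream processors / CUDA cores
core_mhz = 1050       # core clock
mem_gbps = 7.0        # effective memory data rate per pin
bus_bits = 256        # memory bus width

gflops = 2 * shaders * core_mhz / 1000.0   # 2 FLOPs (one FMA) per shader per clock
bandwidth_gbs = mem_gbps * bus_bits / 8    # Gbps per pin * bus width, 8 bits per byte

print(gflops)         # 5376.0
print(bandwidth_gbs)  # 224.0
```

That would put it just above a stock GTX 780 Ti's 5040 GFLOPS on paper, but well short of the Ti's 336 GB/s of memory bandwidth (7 GHz over a 384-bit bus).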

August 4, 2014 | 01:47 AM - Posted by Anonymous (not verified)

The new rip-off is on its way.
Hopefully the driver "quality" will be the same as the 600/700 series, meaning one release out of two will crash and the other will produce artifacts.

nvidia FTW !

August 4, 2014 | 03:00 AM - Posted by arbiter

Take Nvidia out of the battle; do you think AMD wouldn't pull the same crap if they didn't have to compete with a company beating them down in the market?

August 4, 2014 | 01:09 PM - Posted by qicy (not verified)

lol do you blow the nvidia logo every morning when you wake up.

August 4, 2014 | 03:21 PM - Posted by Anonymous (not verified)

Judging by his rationale in that last one, I believe he does.

August 4, 2014 | 06:04 PM - Posted by arbiter

Judging by your 2 moronic comments, you blow the AMD logo every morning and confirm the whole "AMD can do no wrong" viewpoint.
