PCPer Live! GeForce GTX 980 Ti, G-Sync Live Stream and Giveaway!

Subject: Graphics Cards | June 9, 2015 - 08:33 AM |
Tagged: video, tom petersen, nvidia, maxwell, live, GTX 980 Ti, gtx, gsync, gm200, giveaway, geforce, g-sync, contest

UPDATE: Did you miss the event? No worries, you can still learn all about the GTX 980 Ti, G-Sync changes and even how NVIDIA is changing VR! Once again, a HUGE thanks to NVIDIA and Tom Petersen for coming out to visit.

Even though it's a week after the official release, we are hosting a live stream from the PC Perspective offices with NVIDIA's Tom Petersen to discuss the new GeForce GTX 980 Ti graphics card as well as the changes and updates the company has made to the G-Sync brand. Why would NVIDIA undercut the GTX TITAN X by such a wide margin? Are they worried about AMD's Fiji GPU? Now that we are seeing new form factors and screen types for G-Sync monitors, will prices come down? How does G-Sync for notebooks work without a module?

All of this and more will be covered on Tuesday, June 9th.


And what's a live stream without a prize? One lucky live viewer will win an EVGA GeForce GTX 980 Ti 6GB graphics card of their very own! That's right - all you have to do is tune in for the live stream Tuesday afternoon and you could win a 980 Ti!!


NVIDIA GeForce GTX 980 Ti / G-Sync Live Stream and Giveaway

12pm PT / 3pm ET - June 9th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

The event will take place Tuesday, June 9th at 12pm PT / 3pm ET at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience. To win the prize you will have to be watching the live stream; exact details of how we'll hand out the goods will be announced during the event.

Tom has a history of being both informative and entertaining, and these live streaming events are always full of fun and technical information that you can get literally nowhere else.

If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Tom or me?

So join us! Set your calendar for this coming Tuesday at 12pm PT / 3pm ET and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!



June 5, 2015 | 04:40 PM - Posted by CalebMcCarty

Can't wait!

June 5, 2015 | 02:53 PM - Posted by Mountainlifter

I always enjoy the Tom Petersen live streams. I'll be watching live in the middle of the night from halfway around the world!!!

June 5, 2015 | 03:54 PM - Posted by YTech

Great! I should be able to watch a good portion of it.

Questions:
- What's all the hype, and what are the issues, with hair tessellation? Why is it said to be a burden on performance? Any solutions -- driver updates, etc.?

- What about PhysX? Is it being utilized in games less than before? Couldn't hair effects be implemented via PhysX?

- How will gaming with the GTX 900 series perform in Windows 10 with DX 12 support?
Rumour has it that the GTX 900 series was released without fully utilizing DX 12, and that AMD's secret GPU complies with DX 12 better than NVIDIA's products.

I am an NVIDIA fan; however, I am debating between a GTX 700 series and a GTX 900 series card. Depending on these answers, I may go with one over the other.

After the GTX 900 series, is there a plan for a GTX 1K series? -- primarily the naming, as hardware-wise there will be something.

June 5, 2015 | 04:24 PM - Posted by Anonymous (not verified)

Is GameWorks VR NVIDIA-only? Why do the latest drivers crash so much with Chrome? Are there new, significant features planned for G-Sync in desktop monitors?

June 5, 2015 | 05:32 PM - Posted by nevzim (not verified)

Tom, will DirectX 12 make game specific driver optimizations less important?

June 6, 2015 | 03:29 PM - Posted by Anonymous (not verified)

No. DX12 focuses more on making CPU calls more efficient.

June 5, 2015 | 05:38 PM - Posted by nevzim (not verified)

Tom, will there be an 8GB 970 or 980?

June 6, 2015 | 03:16 AM - Posted by Goldmember (not verified)

A 970 with 8GB wouldn't make much sense, would it?

June 6, 2015 | 09:29 AM - Posted by nevzim (not verified)

You are probably right; it would be a marketing nightmare. It's more likely that Nvidia will release a 3GB and/or 6GB 960 Ti (like the 970M).

Still, an 8GB 980 would be a very nice card, so please, Nvidia, release one already.

June 5, 2015 | 05:46 PM - Posted by nevzim (not verified)

Tom, how important is OpenCL for Nvidia?

June 5, 2015 | 09:46 PM - Posted by mikep (not verified)

Hello,

A couple of random questions if you get around to them:

- Will G-Sync ever find its way to Android/Android TV?
- When can we expect to see the first AAA games benefiting from DX12 Feature Level 12_1?
- Does Nvidia plan to support the Steam Controller for Shield Android TV?

Thanks!

June 6, 2015 | 03:30 AM - Posted by mohamedkamel

Nice! I hope I find time to watch it :)

June 6, 2015 | 11:48 PM - Posted by decoy11

I have a few questions for Tom.

1. Over the past few weeks GameWorks and PhysX have again sparked outrage and have been blamed for performance issues in The Witcher 3 and Project CARS. Most notably, AMD has put up a video where they claim the proprietary black-box code of GameWorks is bad for PC gaming and that Nvidia forced tessellation beyond what is required for HairWorks. Is there any rebuttal or comment from Nvidia on the GameWorks program or to AMD?

2. Many people already think Nvidia gives financial assistance to developers, and while many developers deny receiving financial assistance, I assume Nvidia provides assistance in other ways, like PC systems, middleware (GameWorks, PhysX), and programmers to help develop the game. It seems that Nvidia is almost a video game publisher, and I was curious whether Nvidia has any thoughts about getting into that business? Maybe GRID-exclusive games?

3. Any comment on the internet claims that Nvidia is using drivers for planned obsolescence of Kepler to force users to buy Maxwell cards? Also, why do you think people are so quick to believe Nvidia would do that?

4. G-Sync over HDMI? I also must ask again: G-Sync + ULMB mode?

5. Just saw the "NVIDIA VR Headset Patent Found" article; ask Tom what that is about?

June 6, 2015 | 03:40 AM - Posted by Anonymous (not verified)

I'm thinking about buying the 980 Ti. I currently have 3 x 1920x1080 screens (approx. 5 years old), but I would like to replace the middle one with a G-Sync 144Hz 2560x1440 screen. Would I still be able to utilize all three, like my current AMD Eyefinity layout?

June 6, 2015 | 11:46 AM - Posted by gerard (not verified)

Question:

A while back Tom was talking about the "acoustic experience", but going by a lot of reviews and user feedback, this card and the Titan X seem noticeably louder than previous models using the same heatsink design. Some users are even reporting that their Ti and Titan X throttle even in single-card use. It would seem that this heatsink design has basically run out of headroom, and the balance between performance and noise levels is being blurred, especially when cards are throttling. Is there much that can be done to improve this design, given it already uses a vapour chamber and a two-slot layout, or is it time for something new?

June 6, 2015 | 05:47 PM - Posted by Cannonaire

Maxwell has much better effective memory bandwidth than older architectures, meaning that, for example, a 128-bit interface doesn't necessarily mean the card will be slow. I was wondering if these improvements have much of an effect on antialiasing (because antialiasing is notoriously bandwidth-limited) - will we see an improvement there, or will the antialiasing be limited to the performance you would expect from an older 128-bit interface (as in the example)?

June 6, 2015 | 06:42 PM - Posted by Aibohphobia

Question for Tom:

Are there any plans for an Nvidia single-spaced (60mm) 2-way SLI bridge?

June 8, 2015 | 11:47 AM - Posted by Ghostmop

Question for Tom:

Many GTX 970/980 owners are experiencing a black/blank screen on boot up and/or resume from sleep when using DisplayPort connections. https://goo.gl/ecM0iK

I have the latest drivers and vBIOS for my MSI GTX 980 Gaming 4G card. I am using DVI connections as a workaround for the DisplayPort problem.

Can nVidia offer a solution or replacement cards if this issue cannot be corrected?

June 8, 2015 | 03:11 PM - Posted by slugbug55

I have a doctor's appointment at 2:00 PM; I doubt I'll be back in time.

June 8, 2015 | 04:12 PM - Posted by db87

Tom, will we also see G-Sync Direct (G-Sync without a module) for the desktop?

If so, will you keep naming it G-Sync even though it's technically using the VESA implementation (invented by a competitor)?

June 8, 2015 | 05:41 PM - Posted by arbiter

Um, not likely. On the desktop the module does so much of the work, whereas on a laptop the GPU is connected directly to the panel.

June 8, 2015 | 04:19 PM - Posted by db87

Tom, will Nvidia keep pushing GameWorks (and PhysX) with game developers who are rushing games to market without proper optimization?

How does Nvidia feel about having an Nvidia / GameWorks logo on the box of a game that runs like sh*t?

June 8, 2015 | 04:28 PM - Posted by db87

Tom, Nvidia's Maxwell 2 supports the optional DirectX 12 feature level 12_1, but feature level 11.2 is missing. Can you demystify DirectX 12 support?

June 9, 2015 | 12:00 AM - Posted by KyonCoraeL

Hi Tom, are there any laptops currently on the market that you know of which will retroactively get mobile G-Sync?

For example the Razer Blade 2014.

June 9, 2015 | 10:16 AM - Posted by BBMan (not verified)

First paragraph:

"GTX TITNA X"

I'd like to get my hands on a TITNA too.

June 9, 2015 | 10:35 AM - Posted by Dusty

We see with the 980 Ti that the 70-90 extra MHz on the boost clock are making up for the missing cores, and that the 6 GB of memory is not being fully utilized. Does this mean that with an aftermarket cooling system and an overclock, the 980 Ti might actually do better than the Titan on gaming benchmarks? Are any partners going to be releasing custom coolers for the 980 Ti? Is the stock cooling system a limitation when benchmarking, or is the limitation somewhere else in the PC?

June 9, 2015 | 10:45 AM - Posted by funandjam

Question for Tom Petersen:

Considering that AMD just demonstrated variable refresh over HDMI, and that this points to mainstream monitors and/or TVs with VRR, does Nvidia have plans to develop G-Sync to go that route?

June 9, 2015 | 05:41 PM - Posted by funandjam

It's funny how Petersen blows off VRR over HDMI, as if pretending that VRR is not important on cheaper monitors and TVs. BUT Ryan only asked part of my question and not the whole thing; otherwise I think Tom might have answered differently and given us a clue about what Nvidia is planning.

June 10, 2015 | 05:49 PM - Posted by arbiter

They would have to modify hardware on the card, since the HDMI spec is fixed-rate. As for TVs, there has really been no reason for VRR; it's like 3D TVs.

June 9, 2015 | 10:49 AM - Posted by Anonymous (not verified)

Since mobile G-Sync does not need a module, how does mobile G-Sync handle frame rates below the VRR window?

June 10, 2015 | 05:59 PM - Posted by arbiter

It does it the same way it does on the desktop.

June 9, 2015 | 11:34 AM - Posted by RenderB

I don't really expect to get an answer, but:

As a disabled gamer I am often dependent on magnification software, such as the one built into Windows 7. This includes running it in combination with games in windowed mode.

When this setup causes a driver issue on my AMD graphics cards, the driver resets itself and Aero is disabled for a while, but games keep working afterwards.

On my Nvidia machines the driver will recover, but the game will have stopped displaying any video: either crashing, or getting stuck on a white/black screen while the game itself keeps running and accepting inputs.

This isn't specific to a driver, machine, ShadowPlay, etc. I've been seeing it for a few years now on both desktops and laptops. Can you shine any light on this difference?

June 9, 2015 | 11:35 AM - Posted by razor512

What are the power limits of the card? One of the main issues with the newer Nvidia cards is the power and voltage limits getting in the way of overclocks. On some cards the voltage regulator can be controlled more directly by the BIOS, allowing higher voltages and power limits, but for many recent Nvidia cards the power/TDP limits have been set well below the maximums for 6- and 8-pin PCIe power connectors.

Does the 980 Ti break free of these limitations?

June 9, 2015 | 11:45 AM - Posted by HipsDontLie (not verified)

Tom, will the new G-Sync monitors due out later this year have new G-Sync modules (i.e., G-Sync 2.0)? Some of the monitors coming out (e.g., the Asus PG279Q) have both HDMI and DisplayPort connections. Is this a result of those monitors using separate modules, or are we going to get an updated G-Sync module (with additional improvements)?

Would also love to hear if there's any update on the planned additional features that G-Sync modules will be able to provide. I know Nvidia has referred to other benefits they plan to implement in the future. Thanks!

June 9, 2015 | 01:25 PM - Posted by Anonymous (not verified)

I'm also very interested in more details about G-Sync v2. Can you ask Tom for more details about it?

I see that the new 3440x1440 screens coming out soon will use that module. Any other changes/advantages over the v1 module besides the additional HDMI input?

June 9, 2015 | 03:03 PM - Posted by HipsDontLie (not verified)

About to watch the livestream so I'll ask again in the livechat and try and get an answer anyway! :)

June 9, 2015 | 01:45 PM - Posted by Thedarklord

Question for Tom;

Why did NVIDIA choose not to include a backplate with the GTX TITAN X or the GTX 980 Ti? The idea that it's for SLI does not make sense, considering not many people run SLI, and those that do mostly run 2-way SLI and have space between the cards anyway.

Adding a backplate to flagship cards may serve no practical purpose, but it certainly makes the card feel like a premium product, which is obviously something people want and enjoy.

June 9, 2015 | 02:09 PM - Posted by cmadrid

How much of an effect will DirectX 12 have on GPU design? Will nVidia be able to leverage its greater driver support to gain even more of an edge over the competition?

June 9, 2015 | 02:32 PM - Posted by Barry S (not verified)

With the era of Thunderbolt 3 dawning, does Nvidia think there is a possibility for truly portable, swappable, external graphics cards in the near future? How might DirectX 12's multi-GPU options make this more or less desirable? Can I use 2 or 3 of them? Lastly, is the bandwidth still too low, or the latency too high, for external graphics to be considered reasonable for anything other than laptops?

June 9, 2015 | 02:35 PM - Posted by Martin Lewis (not verified)

Hi Tom, just ordered an EVGA 980 Ti SC ACX 2.0 card, upgrading from 780s, as I use G-Sync and want the extra VRAM.
We had some special Kepler-boosting drivers in the past with real-world performance increases; can we expect some special Maxwell-boosting drivers in a future driver branch?
Thanks!

June 9, 2015 | 02:36 PM - Posted by Titan_V (not verified)

AMD's TressFX runs well on Nvidia GPUs. Why won't Nvidia design Hairworks such that it doesn't cripple performance for AMD hardware?

June 10, 2015 | 05:58 PM - Posted by arbiter

Well, the problem for AMD hardware is that HairWorks uses DX11 tessellation, and AMD cards really SUCK at tessellation. Yeah, HairWorks uses the DX11 standard; funny how AMD whines about that now?

If you want an idea of how badly they suck, an R9 290X is SLOWER than a GTX 750 Ti.

June 9, 2015 | 02:36 PM - Posted by Martin Lewis (not verified)

I game at 4K.
Thanks

June 9, 2015 | 02:41 PM - Posted by fingolfin33

Question for Tom
Any differences in CUDA support between the GTX 980 Ti and GTX Titan other than the physical differences in core count, memory, and fill rate?

June 9, 2015 | 02:43 PM - Posted by ZoyosJD

Comparing the situations of the original Titan and 780 Ti to the Titan X and 980 Ti:

The original Titan had 2x the RAM and double-precision compute, was $300 more than the 780 Ti, and was out for 9 months uncontested.

The Titan X still has 2x the RAM, but lost double-precision compute, yet is $350 more than the 980 Ti, which was released significantly sooner after it.

Has this had any impact on the uptake of the Titan brand?

June 9, 2015 | 02:45 PM - Posted by Cataclysm_ZA

A few questions for Tom:

1) What challenges would there be in moving from DisplayPort on the G-Sync FPGA to HDMI? Does the FPGA still function in the same way when working with HDMI signaling?

2) DX12 now has this new "multi-adapter" thing. In Nvidia's testing, are there any pros to teaming up GPUs with disparate sets of hardware? Can you then add two GTX 580s together and scale up the double-precision compute capabilities of the cards as if they were a single unit?

3) With Nvidia exiting the mobile market with regard to phones, what other future does Tegra have besides home consoles and cars? Do you expect Tegra to be competitive in the server space, like AMD's Seattle server chips?

4) Titan X is the first high-end GPU from your stable to reduce double-precision compute to almost nothing. Why was this decision taken, and is it solely based on power/TDP constraints?

5) Why is the HairWorks feature in The Witcher 3 so taxing on Kepler GPUs? What magic does Maxwell have that lets the GPU perform so much better with the same workload?

6) Why are Nvidia and her contemporaries so set on solving current memory bandwidth issues with tricks like colour compression, when Microsoft is talking about bandwidth-saving techniques like tiled resources? Is there a future problem that requires both technologies in order for us to work around it effectively?

Thanks! - Wesley

June 9, 2015 | 02:51 PM - Posted by Thedarklord

Question for Tom;

Can we expect a new Shield Tablet? If so, any thoughts on a larger size (10-inch+)?

June 9, 2015 | 02:52 PM - Posted by Nemex

Great stuff and very good time for European viewers. Always enjoy the NVIDIA live streams!

One question for Tom:

I really love windowed G-SYNC, a great surprise! One big issue with it though: brightness flicker seems more noticeable than in fullscreen G-SYNC, especially at low FPS. Is NVIDIA aware of this, and can it be improved / is it already being actively worked on?

June 9, 2015 | 02:58 PM - Posted by glfnchf24

So everyone by now knows it's a gaming beast! But I'm sure a lot of us want to know how well it works with editing software like Adobe Premiere, Sony Vegas, etc.

June 9, 2015 | 03:11 PM - Posted by Titan_V (not verified)

Any updates on the two lawsuits Nvidia is involved in?

Nvidia vs. Samsung
970 Class action

June 9, 2015 | 03:15 PM - Posted by Danny74

Hi Guys...

June 9, 2015 | 03:51 PM - Posted by Ordsmed (not verified)

1. Can you reveal anything about the new GPU processes (14nm and 16nm, I think) that are rumored for next year? The internet doomsayers are proclaiming a paradigm shift that will render all current GPUs obsolete (including the TITAN X and GTX 980 Ti).

2. Does Nvidia have any plans to implement High Bandwidth Memory in coming products?

3. Is there a particular reason for the lack of diversity in "blower"-style coolers? Both in regard to the Nvidia reference cooler not changing for several iterations, and to nearly all aftermarket coolers being non-blower/open-air designs.

June 10, 2015 | 05:54 PM - Posted by arbiter

1. Tom would never talk about any future products, so asking is pointless.

2. Same answer as question one, though roadmaps say the next GPU Nvidia releases will use HBM2. Reports say taped-out Pascal chips (prototypes) using HBM2 have been produced for internal testing.

3. The cooler they use is a very good one; it does the job of keeping the GPU cool enough for most uses.

June 9, 2015 | 04:07 PM - Posted by Danomite (not verified)

Windowed G-Sync: love it, this was one of my biggest wishes for G-Sync.
However, I ran into a few issues testing it this weekend on an Asus ROG Swift.
Most things worked completely as expected, but I've run into two situations where it does not work as it should.
With windowed G-Sync, the windowed frame rate in World of Tanks and SWTOR drops from 60+ fps to the low 20s.
With fullscreen G-Sync, or with G-Sync disabled, they work fine and the framerate does not change.
I should stress these are the exceptions to something that seems to work with everything else I tried.
I hope it gets fixed soon!

June 9, 2015 | 04:15 PM - Posted by glfnchf24

Just wondering why there are no cards utilizing the new DP 1.3 standard, which was ratified quite some time ago.

June 9, 2015 | 04:29 PM - Posted by Anonymous (not verified)

Questions for Tom:

1. Has the stereo/warp injector (previously part of VR Direct) been abandoned completely?

2. Does the hardware that helps speed up multiviewport rendering also help somewhat with stereo rendering - using 2 viewports?

June 9, 2015 | 04:35 PM - Posted by Ashun (not verified)

Shader aliasing is one of the most distracting things in modern games. Forcing sparse grid supersampling is great for DX9 games, but for newer games, MSAA and even DSR sometimes leave aliasing.

How does sparse grid supersampling work and why doesn't it work in DX10+ games?

June 9, 2015 | 05:04 PM - Posted by Anonymous (not verified)

I was driving and couldn't see the entire thing and didn't get entered in time. I am not much of a gamer anyway, and the only system I have is an old i7-920 Dell workstation. It has an 8600 GT or something in it (256 MB). I was thinking of getting a 960 so I can run a few Steam games under Linux. Anyway, 12 pm is not a convenient time if you actually live in the Pacific time zone.

June 10, 2015 | 07:35 AM - Posted by Geran (not verified)

Well I got stuck on the train and couldn't tune in to try and win this or watch the stream which I was looking forward to. Sigh...

June 11, 2015 | 07:51 AM - Posted by Anonymous (not verified)

Since Tom Petersen said G-Sync is their problem from end to end:

Will Nvidia be upgrading all old and current panels that have issues, or will they toss the lawyers in front of you?

http://www.pcper.com/reviews/Displays/Acer-XB270HU-27-1440P-144Hz-IPS-G-...

He also denied a lot of claims that were made about G-Sync in the past on your 1-on-1 podcast, and all of a sudden he started singing like a bird.

Kepler G-Sync isn't as good as Maxwell G-Sync; Kepler G-Sync had lag. That was a WTF moment. He said on your podcast that AMD didn't know what they were talking about, yet here he is saying it does.

Love marketing. Give out more free stuff to distract people from their short memory spans.

June 25, 2015 | 12:32 PM - Posted by db87

I disagree on the GameWorks matter:

Most often GameWorks gets added at a very late stage. Far Cry 4 version 1.0.0, for example, didn't have tessellation or fur (HairWorks). The ambient lighting was also altered and optimized in 1.0.1 (the day-one patch). So even if AMD had access to version 1.0.0 prior to release, they couldn't optimize what wasn't there yet...

We also have to realize that games sometimes get rushed to meet the release date. Nvidia is given only a few weeks to add the GameWorks features, and it also forwards that version to its driver team for optimization. On such occasions AMD gets the retail version just days prior to release, which means AMD is at least three weeks behind Nvidia. AMD then releases a beta driver within two weeks after the release.

Nvidia also argues that the GameWorks source code is available to the game developer. While that's true, the developer is prohibited from sharing it with third parties like AMD. This makes GameWorks a black box for AMD.
