NVIDIA Live Stream: We Want Your Questions!

Subject: Graphics Cards, Displays, Mobile | August 21, 2014 - 05:23 PM |
Tagged: nvidia, video, live, shield, shield tablet, g-sync, gsync, tom petersen

Tomorrow at 12pm EDT / 9am PDT, NVIDIA's Tom Petersen will be stopping by the PC Perspective office to discuss some topics of interest. There has been no lack of topics floating around the world of graphics cards, displays, refresh rates, and tablets recently, and I expect the show tomorrow to be incredibly interesting and educational.

On hand we'll be running demonstrations of G-Sync Surround (3 panels!) with the ASUS ROG Swift PG278Q display (our review here), and we'll also show off the SHIELD Tablet (we have a review of that too) with some multiplayer action. If you thought the experience with a single G-Sync monitor was impressive, you will want to hear what a set of three of them can be like.


NVIDIA Live Stream with Tom Petersen

9am PT / 12pm ET - August 22nd

PC Perspective Live! Page

The topic list is going to include (but is not limited to):

  • ASUS PG278Q G-Sync monitor
  • G-Sync availability and pricing
  • G-Sync Surround setup, use and requirements
  • Technical issues surrounding G-Sync: latency, buffers, etc.
  • Comparisons of G-Sync to Adaptive Sync
  • SHIELD Tablet game play
  • Altoids?


But we want your questions! Do you have burning issues about G-Sync, FreeSync, GameWorks, Tegra, tablets, GPUs, and more that you think need to be addressed by Tom and the NVIDIA team? Nothing is off limits here, though obviously Tom may be cagey about future announcements. Please use the comments section on this news post below (registration not required) to ask your questions so we can organize them before the event tomorrow. We MIGHT even be able to come up with a couple of prizes to give away for live viewers as well...

See you tomorrow!!

August 21, 2014 | 05:34 PM - Posted by Julian (not verified)

Hi,

I've asked this question to NVIDIA via Twitter so many times and no one has ever bothered replying. Does the Kepler series (GTX 660 Ti) natively support DX11.2? Also, when is the GTX 800 series coming out?

Thanks

August 21, 2014 | 11:22 PM - Posted by arbiter

He likely won't say anything about the 800 series here. If I remember right, regarding DX11.2, the cards support some of the 3D gaming parts of it but not the 2D desktop parts, so they can't officially be called 11.2.

August 21, 2014 | 05:40 PM - Posted by Anonymous (not verified)

Question about G-sync modules.

G-Sync partner companies like Acer, AOC, ASUS, BenQ, and Philips are rumored to be coming out with non-G-Sync 4K monitors, such as the ASUS 32" 4K ProArt PA328Q and the Philips 40" 4K monitor.

Will NVIDIA make it possible to buy the module separately and install it in a monitor of our choice, the way they did with the ASUS VG248QE? I don't want to be confined to 28" TN panels.

If I had the ability to do this, there'd be a 100% chance that I'd buy a G-Sync module and as many GeForce cards as I can afford.

Thanks!

August 21, 2014 | 11:24 PM - Posted by arbiter

The problem with doing that is that not all monitors are the same in terms of internal space or connections. So unless they start using standardized internal connections and set aside space for the hardware, it's not likely to happen.

August 22, 2014 | 08:57 AM - Posted by mutantmagnet (not verified)

You can't install a module in just any display. A module has to be calibrated specifically to the specs of a display.

Read this post from a smaller display company to get a sense of what NVIDIA is doing now for display manufacturers.

http://overlordforum.com/topic/603-nvidia-g-sync/

August 22, 2014 | 11:21 AM - Posted by Anonymous (not verified)

I hear what y'all are saying. I was just curious given that ASUS has had three displays with G-Sync capability: the VG248QE, the PQ321Q that was shown at CES (not for sale, of course), and the Swift. It seems like ASUS might be headed in the direction of making as many of their panels as possible compatible with G-Sync in one way or another. Or at least that's what I'm dreaming of. lol.

I'm just wondering if other manufacturers might start to develop some kind of standard that would allow for the insertion of a G-Sync module?

August 21, 2014 | 06:08 PM - Posted by Martin Trautvetter

Will future reference cards have a different display connector layout to accommodate more than one G-sync monitor per card?

August 21, 2014 | 06:21 PM - Posted by Xander R (not verified)

Hello again,

A couple questions regarding G-Sync:

How does G-Sync behave when the render rate of the GPU is equal to or greater than the maximum refresh rate of the monitor? Is vsync being automatically invoked by the driver in real time, similar to Adaptive VSync, or is the frame rate being limited by some other means, such as a frame rate limiter?

And secondly, since G-Sync requires DisplayPort, has NVIDIA considered equipping future generations of graphics cards (the 800 series) with more than one DisplayPort output, so that G-Sync Surround becomes possible with fewer than the three graphics cards it currently requires?

August 21, 2014 | 06:50 PM - Posted by Leonid (not verified)

I have two GTX 780s running in SLI. Will the PG278Q work in G-Sync mode if connected via a DisplayPort link?

August 21, 2014 | 06:55 PM - Posted by Rick (not verified)

Why do you block PhysX if the primary card is AMD?

August 21, 2014 | 07:04 PM - Posted by larsoncc

Two questions -
1. Will STANDARD (non-G-Sync) Surround work if only one monitor has a G-Sync module? I'm currently running 60Hz Surround with the side monitors at 60Hz (ASUS VE248H) and the center monitor an ASUS VG248QE (144Hz). This allows me to have 144Hz for FPS games on the center monitor, and Surround for adventure games (60Hz). Basically, if I throw the G-Sync kit into the center monitor, will the rest of the setup operate as-is (for instance, are refresh rates / sync polarity still the same with the module)?

2. Is everything that you're showing / talking about (Surround G-Sync) applicable to the DIY G-Sync folks?

2.5 Something you could probably quickly say "no" to: has anyone over there tested the G-Sync module on a monitor that is of a similar make to the VG248QE? (I think the panel of the VE248H is probably the same as the VG248QE.)

August 21, 2014 | 07:06 PM - Posted by Anonymous (not verified)

I know it's been a while, but any chance he wants to offer a retort to the AMD live cast? Then rinse and repeat.

August 21, 2014 | 11:38 PM - Posted by arbiter

The problem with AMD is that they have made a lot of claims over the last few years that are, well, not the full truth. Like their claim that their A10 mobile CPU goes head to head with an i7 part, when the only proof they provide is benchmarks like 3DMark and a benchmark that measures OpenCL work, all of which favor the AMD chip because of the GPU-accelerated workload. So I wouldn't take any AMD claims completely at face value when they come straight from them.

August 21, 2014 | 07:07 PM - Posted by larsoncc

OH - another quick question!!
Can you guys post the answers to the questions in the thread as a follow-up to the event, just in case we can't make the live event (I might have business meetings)?

August 21, 2014 | 07:52 PM - Posted by gloomfrost

Question for Tom about G-Sync:

1.) AMD-Sync only has one-way communication between cards and monitors.

Are there any benefits of G-Sync's two-way communication that we have not yet seen, or that you have not yet been asked about?

2.) Film/Movies - Will G-Sync have a way to LOCK in 24 or 48 frames per second for viewing movies originally shot at 24 frames a second?

Thanks Ryan and Tom!

August 21, 2014 | 08:22 PM - Posted by Anonymous (not verified)

I would have to say yes about G-Sync using two-way communication:

Frame limiting.

August 21, 2014 | 08:26 PM - Posted by Anonymous (not verified)

In theory there is no reason to lock in a frame rate with G-Sync or FreeSync, as they feed the display as soon as the frame is ready, regardless of the display's refresh rate.
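
To illustrate that point, here is a minimal sketch (my own simplification with made-up frame times, not NVIDIA's or AMD's actual pipeline) of when frames reach the screen with a fixed refresh versus a variable refresh:

# Rough sketch: with variable refresh a finished frame is scanned out
# immediately, so there is nothing to gain from locking the frame rate.
REFRESH_HZ = 60                            # hypothetical fixed-refresh panel
REFRESH_MS = 1000.0 / REFRESH_HZ           # time between fixed scan-out ticks

frame_times_ms = [20.0, 41.7, 24.0, 35.0]  # made-up render times per frame

def fixed_refresh(frame_times):
    """Classic vsync: a finished frame waits for the next refresh tick."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft                            # frame finishes rendering at t
        ticks = -(-t // REFRESH_MS)        # ceiling: next tick at or after t
        shown.append(round(ticks * REFRESH_MS, 1))
    return shown

def variable_refresh(frame_times):
    """G-Sync/FreeSync idea: scan-out starts as soon as the frame is ready."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft
        shown.append(round(t, 1))
    return shown

print("fixed 60Hz:", fixed_refresh(frame_times_ms))     # [33.3, 66.7, 100.0, 133.3]
print("variable  :", variable_refresh(frame_times_ms))  # [20.0, 61.7, 85.7, 120.7]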

August 22, 2014 | 10:31 AM - Posted by Paul Keeble (not verified)

These are my top two questions as well. It seems a lot hinges on whether this two-way communication with a relatively expensive monitor module is actually better than a cheaper but dumber monitor module.

August 21, 2014 | 08:18 PM - Posted by Anonymous (not verified)

Hey guys,

1. When will the price of the ASUS G-Sync 1440p display be reasonable? (QNIX QX2710 Evolution II: $323.98)

2. Can a Surround (three-panel) G-Sync-enabled setup be driven from a single DP++ port?

3. Will G-Sync/FreeSync ever be Oculus compatible?

4. Does G-Sync, as it is, support FreeSync compatibility?

5. Can we expect the major manufacturers of LCD panels to increase the resolutions of released displays in the future?
(My phone has a higher resolution than my laptop and desktop :)
YES.......

August 21, 2014 | 11:33 PM - Posted by arbiter

On #1, that is a 60Hz panel; at the moment that ASUS monitor is the only 1440p 144Hz panel you can buy, hence why it costs a lot more.

On #4, FreeSync is proprietary AMD code based off of Adaptive Sync, but still proprietary, so it's not likely it ever would.

August 21, 2014 | 08:37 PM - Posted by Anonymous (not verified)

1) What additional changes might we see in the next version of G-Sync, 2.0?

2) Why does the G-Sync module require 768MB of memory? Is it a buffer for frame-and-hold, and does the memory need to increase as the display resolution increases?

3) When are the IPS, VA, and Eizo panels coming, and will they be limited to only a DP 1.2 connector?

4) When you first showcased G-Sync, the goal was to get the cost down to $50. The initial kits were $150 USD. The price has since gone in the opposite direction, up to $199 USD.
The technology seems to benefit people in the middle and at the lower end of performance, not the top end. I see the price premium as counterintuitive, especially with the recent VESA adoption of Adaptive-Sync in DisplayPort 1.2a and AMD showcasing "FreeSync" on IPS panels, with samples supposedly coming next month. Are we likely to see the price premium fall?

5) Will NVIDIA be using the VESA Adaptive-Sync standard in the future?

August 21, 2014 | 08:38 PM - Posted by Richard Burton (not verified)

Given the public release of G-Sync and the fact that it requires a DP connection: firstly, is there a reason modern GeForce graphics cards only have one DP connection, and can we expect more than one DP connector in the future?

August 21, 2014 | 08:44 PM - Posted by ChaosBahamut (not verified)

One question:

Any news on when the next series of desktop GPUs will be announced/released?

August 21, 2014 | 11:48 PM - Posted by arbiter

Until NVIDIA publicly announces it, he won't be able to say anything on that. Not to diss PCPer, but NVIDIA will do it on a bigger stage than a site's live stream.

August 21, 2014 | 09:11 PM - Posted by duttyfoot (not verified)

When will NVIDIA drop DVI and go full HDMI and DP?

August 21, 2014 | 09:22 PM - Posted by Anonymous (not verified)

One of the touted benefits of G-Sync is latency, but it's beholden to the panel.

Aside from eliminating tearing, what benefits does G-Sync's variable refresh rate have over a set refresh rate and latency of 120Hz or 144Hz, versus G-Sync on, where latency is tied to the refresh rate and is constantly changing frame to frame?

At 120Hz or 144Hz you're ensuring the lowest latency possible; when you turn on G-Sync that advantage goes away, and if the game dips to 60fps or 30fps your latency is at the disadvantage of being doubled or tripled compared to the same monitor running at its maximum set refresh rate.

Is there a firmware update to fix this or is this just an inherent issue?

August 22, 2014 | 03:54 AM - Posted by Joakim (not verified)

I think I can answer that. G-Sync or not, you will only benefit from the low latency of a 144Hz panel when the GPU is able to keep up with the current refresh rate. It doesn't matter if the refresh rate is locked at 144Hz if your GPU suddenly drops to 30fps. The benefit of the lower latency in a scenario like this would go away, simply because there would be no new content to show on the screen. Meaning: you wouldn't be able to notice if the latency was lower or not, since you're staring at the same picture (the same frame) you saw just before you moved your mouse. And furthermore, if the frame rate took a dive like this (144fps -> 30fps), you would feel stutter without G-Sync, and that would be much more noticeable and annoying than the extra input lag at that time.
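
Some back-of-the-envelope numbers for that argument (assumed round figures, not measurements):

# Assumed values: how long a just-finished frame waits before scan-out begins.
for panel_hz in (144, 60):
    refresh_ms = 1000.0 / panel_hz
    # With a fixed refresh, a finished frame waits on average about half a
    # refresh period for the next scan-out tick.
    print(f"{panel_hz}Hz fixed refresh: average wait ~{refresh_ms / 2:.1f} ms")

# With variable refresh, scan-out starts as soon as the frame is done, so that
# wait is roughly zero at any frame rate. But if the GPU only finishes a frame
# every 33.3 ms (30 fps), the newest image on screen is already up to 33.3 ms
# old either way; the render rate, not the scan-out timing, dominates the lag.
print(f"30 fps frame interval: {1000.0 / 30:.1f} ms")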

August 21, 2014 | 09:26 PM - Posted by Anonymous (not verified)

Will G-Sync ever make its way into laptops or Nvidia-powered Android devices (Android TV, Shield, etc.)?

August 21, 2014 | 09:56 PM - Posted by larsoncc

Oh heck, I guess I have more questions! Why is it so ridiculous to switch from "normal" mode to Surround? Shouldn't this be a hotkey by now (even on multi-card setups)?

What are NVIDIA's plans to support monitor setups that are outside their current capabilities, i.e., eliminating concerns with sync polarity, or supporting monitors running different resolutions (like Eyefinity)? How about PLP setups?

Why, in triple-monitor setups, can't a person run 3 screens with one receiving the SLI boost (the main monitor in Windows, for example)? Currently you can't have 3 monitors active and SLI on at the same time (except in Surround). The best you can do is SLI plus an accessory display. Can that be improved?

How does G-Sync work with 3D Vision? Do the glasses vary their shutter speed?

August 21, 2014 | 10:15 PM - Posted by Aaron (not verified)

Does G-Sync help with micro-stuttering with cards in SLI?

When will Amazon or Newegg take my money for an ASUS ROG Swift PG278Q?!
