NVIDIA Live Stream: We Want Your Questions!

Subject: Graphics Cards, Displays, Mobile | August 21, 2014 - 05:23 PM |
Tagged: nvidia, video, live, shield, shield tablet, g-sync, gsync, tom petersen

Tomorrow at 12pm EDT / 9am PDT, NVIDIA's Tom Petersen will be stopping by the PC Perspective office to discuss some topics of interest. There has been no lack of topics floating around the world of graphics cards, displays, refresh rates, and tablets recently, and I expect the show tomorrow to be incredibly interesting and educational.

On hand we'll be doing demonstrations of G-Sync Surround (3 panels!) with the ASUS ROG Swift PG278Q display (our review here) and also showing off the SHIELD Tablet (we have a review of that too) with some multiplayer action. If you thought the experience with a single G-Sync monitor was impressive, you will want to hear what a set of three of them can be like.


NVIDIA Live Stream with Tom Petersen

9am PT / 12pm ET - August 22nd

PC Perspective Live! Page

The topic list is going to include (but is not limited to):

  • ASUS PG278Q G-Sync monitor
  • G-Sync availability and pricing
  • G-Sync Surround setup, use and requirements
  • Technical issues surrounding G-Sync: latency, buffers, etc.
  • Comparisons of G-Sync to Adaptive Sync
  • SHIELD Tablet game play
  • Altoids?


But we want your questions! Do you have burning issues that you think need to be addressed by Tom and the NVIDIA team about G-Sync, FreeSync, GameWorks, Tegra, tablets, GPUs and more? Nothing is off limits here, though obviously Tom may be cagey on future announcements. Please use the comments section on this news post below (registration not required) to ask your questions and we can organize them before the event tomorrow. We MIGHT even be able to come up with a couple of prizes to give away for live viewers as well...

See you tomorrow!!


August 21, 2014 | 05:34 PM - Posted by Julian (not verified)

Hi,

I've asked this question via Twitter to NVIDIA so many times and no one has ever bothered replying. Does the Kepler series (GTX 660 Ti) natively support DX11.2? Also, when is the GTX 800 series coming out?

Thanks

August 21, 2014 | 11:22 PM - Posted by arbiter

He likely won't say anything about the 800 series here. If I remember right, on DX11.2 the cards support some of the 3D gaming parts but not the 2D desktop parts, so they can't officially be called 11.2.

August 21, 2014 | 05:40 PM - Posted by Anonymous (not verified)

Question about G-sync modules.

G-Sync partner companies like Acer, AOC, ASUS, BenQ, and Philips are rumored to be coming out with non-G-Sync 4K monitors, such as the ASUS 32" 4K ProArt PA328Q and the Philips 40" 4K monitor.

Will NVIDIA make it possible to buy the module separately and install it on our monitor of choice, the way they did with the ASUS VG248QE? I don't want to be confined to 28" TN panels.

If I had the ability to do this, there'd be a 100% chance that I'd buy a G-Sync module and as many GeForce cards as I can afford.

Thanks!

August 21, 2014 | 11:24 PM - Posted by arbiter

The problem with doing that is that not all monitors are the same in terms of internal space or connections. Unless they start using standardized internal connections and space for the hardware, it's not likely to happen.

August 22, 2014 | 08:57 AM - Posted by mutantmagnet (not verified)

You can't install a module into any display. A module has to be calibrated specifically to the specs of a display.

Read this post from a smaller display company to get a sense of what NVIDIA is doing now for display manufacturers.

http://overlordforum.com/topic/603-nvidia-g-sync/

August 22, 2014 | 11:21 AM - Posted by Anonymous (not verified)

I hear what y'all are saying. I was just curious given that ASUS has had three displays with G-Sync capability: the VG248QE, the PQ321Q that was shown at CES (not for sale, of course), and the Swift. It seems like ASUS might be headed in the direction of making as many of their panels as possible compatible with G-Sync in one way or another. Or at least that's what I'm dreaming of, lol.

I'm just wondering if other manufacturers might start to develop some kind of standard that would allow for the insertion of a G-Sync module?

August 21, 2014 | 06:08 PM - Posted by Martin Trautvetter

Will future reference cards have a different display connector layout to accommodate more than one G-sync monitor per card?

August 21, 2014 | 06:21 PM - Posted by Xander R (not verified)

Hello again,

A couple questions regarding G-Sync:

How does G-Sync behave when the render rate of the GPU is equal to or greater than the maximum refresh rate of the monitor? Is vsync automatically invoked by the driver in real time, similar to Adaptive Vsync, or is the frame rate limited by some other means, such as a frame rate limiter?

And secondly, since G-Sync requires DisplayPort, has NVIDIA considered equipping future generations of graphics cards (the 800 series) with more than one DisplayPort output, so that G-Sync Surround is possible with fewer than the three graphics cards it currently requires?

August 21, 2014 | 06:50 PM - Posted by Leonid (not verified)

I have two GTX 780s running in SLI. Will the PG278Q work in G-Sync mode if connected via a DisplayPort link?

August 21, 2014 | 06:55 PM - Posted by Rick (not verified)

Why do you block PhysX if the primary card is AMD?

August 21, 2014 | 07:04 PM - Posted by larsoncc

Two questions -
1. Will STANDARD (non-G-Sync) Surround work if only one monitor has a G-Sync module? I'm currently running 60Hz Surround with the side monitors at 60Hz (ASUS VE248H) and the center monitor an ASUS VG248QE (144Hz). This lets me have 144Hz for FPS games on the center monitor and Surround for adventure games (60Hz). Basically, if I throw the G-Sync kit into the center monitor, will the rest of the setup operate as-is (for instance, are refresh rates / sync polarity still the same with the module)?

2. Is everything that you're showing / talking about (Surround G-Sync) applicable to the DIY G-Sync folks?

2.5 Something you could probably quickly say "no" to: has anyone over there tested the G-Sync module on a monitor of a similar make to the VG248QE? (I think the panel of the VE248H is probably the same as the VG248QE.)

August 21, 2014 | 07:06 PM - Posted by Anonymous (not verified)

I know it's been a while, but any chance he wants to respond to the AMD live cast? Then rinse, repeat.

August 21, 2014 | 11:38 PM - Posted by arbiter

The problem with AMD is that they have made a lot of claims over the last few years that are, well, not the full truth. Like their claim that their A10 mobile CPU goes head to head with an i7 part, when the only proof they provide is benchmarks like 3DMark and a benchmark that measures OpenCL work, all of which favor the AMD chip because of the GPU-accelerated work. So I wouldn't take any AMD claims completely at face value when they come straight from them.

August 21, 2014 | 07:07 PM - Posted by larsoncc

OH - another quick question!!
Can you guys post the answers to the questions in the thread as a follow-up to the event, just in case we can't make the live event (I might have business meetings)?

August 21, 2014 | 07:52 PM - Posted by gloomfrost

Question for Tom - about G-Sync:

1.) AMD's sync only has one-way communication between cards and monitors.

Are there any benefits of G-Sync's two-way communication that we have not yet seen, or that you have not yet been asked about?

2.) Film/Movies - will G-Sync have a way to LOCK in 24 or 48 frames for viewing movies originally shot at 24 frames per second?

Thanks Ryan and Tom!

August 21, 2014 | 08:22 PM - Posted by Anonymous (not verified)

I would have to say yes about G-Sync using two-way communication:

Frame limiting.

August 21, 2014 | 08:26 PM - Posted by Anonymous (not verified)

In theory there is no reason to lock in a frame rate with G-Sync or FreeSync, as they feed the display as soon as a frame is ready, regardless of the display's refresh rate.
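
A minimal sketch of that behavior, with made-up frame times and an assumed 144Hz panel cap (illustrative only, not NVIDIA's implementation):

import math

def fixed_refresh_display_time(render_done_ms, refresh_hz=60):
    # On a fixed-refresh display, a finished frame waits for the next vsync boundary.
    period = 1000.0 / refresh_hz
    return math.ceil(render_done_ms / period) * period

def vrr_display_time(render_done_ms, last_display_ms, max_hz=144):
    # On a VRR display, a frame is scanned out as soon as it is ready,
    # limited only by the panel's maximum refresh rate.
    min_interval = 1000.0 / max_hz
    return max(render_done_ms, last_display_ms + min_interval)

# Frames finishing at irregular times (ms), like a game with fluctuating load.
render_times = [0.0, 21.0, 35.0, 59.0, 70.0]
last = float("-inf")
for t in render_times:
    fixed = fixed_refresh_display_time(t)
    vrr = vrr_display_time(t, last)
    last = vrr
    print(f"ready {t:5.1f} ms -> 60Hz fixed shows at {fixed:5.1f} ms, VRR at {vrr:5.1f} ms")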

August 22, 2014 | 10:31 AM - Posted by Paul Keeble (not verified)

These are my top two questions as well. A lot seems to hinge on whether this two-way communication with a relatively expensive monitor module is actually better than a cheaper but dumber monitor module.

August 21, 2014 | 08:18 PM - Posted by Anonymous (not verified)

Hey guys,

1. When will the price of the ASUS G-Sync 1440p display be reasonable? (QNIX QX2710 Evolution II: $323.98)

2. Can a Surround (three-panel) G-Sync-enabled setup be driven from a single DP++ port?

3. Will G-Sync/FreeSync ever be Oculus compatible?

4. Does G-Sync, as it is, support FreeSync compatibility?

5. Can we expect the major manufacturers of LCD panels to increase the resolutions of released displays in the future?
(My phone has a higher resolution than my laptop and desktop :)
YES.......

August 21, 2014 | 11:33 PM - Posted by arbiter

On #1: that is a 60Hz panel. At the moment the ASUS monitor is the only 1440p 144Hz panel you can buy, hence why it costs a lot more.

On #4: FreeSync is proprietary AMD code based off of Adaptive Sync, but still proprietary, so it's not like it ever would.

August 21, 2014 | 08:37 PM - Posted by Anonymous (not verified)

1) What additional changes might we see in the next G-Sync version 2.0?

2) Why does the G-Sync module require 768MB of memory? Is it a buffer for frame-and-hold, and does the memory need to increase as the display resolution increases?

3) When are the IPS, VA, and Eizo panels coming, and will they be limited to a DP 1.2 connector?

4) When you first showcased G-Sync the goal was to get the cost to $50. The initial kits were $150 USD. The price has since gone in the opposite direction, up to $199 USD.
The technology seems to benefit people in the middle and lower end of performance, not the top end, so I see the price premium as counterintuitive. With the recent VESA adoption of Adaptive-Sync in DisplayPort 1.2a, and AMD showcasing "FreeSync" on IPS panels with samples supposedly coming next month, are we likely to see the price premium fall?

5) Will NVIDIA be using the VESA Adaptive-Sync standard in the future?

August 21, 2014 | 08:38 PM - Posted by Richard Burton (not verified)

Given the public release of G-Sync and the fact that it requires a DP connection: firstly, is there a reason modern GeForce graphics cards only have one DP connection, and can we expect more than one DP connector in the future?

August 21, 2014 | 08:44 PM - Posted by ChaosBahamut (not verified)

One question:

Any news on when the next series of desktop GPUs will be announced/released?

August 21, 2014 | 11:48 PM - Posted by arbiter

Until NVIDIA publicly announces it, he won't be able to say anything on that. Not to diss PCPer, but NVIDIA will do it on a bigger stage than a site's live stream.

August 21, 2014 | 09:11 PM - Posted by duttyfoot (not verified)

When will NVIDIA drop DVI and go full HDMI and DP?

August 21, 2014 | 09:22 PM - Posted by Anonymous (not verified)

One of the touted benefits of G-Sync is latency, but it's beholden to the panel.

Aside from tearing, what benefits does G-Sync's variable refresh rate have over a set refresh rate and latency of 120Hz or 144Hz, versus G-Sync on, where latency is tied to the refresh rate and is constantly changing frame to frame?

At 120Hz or 144Hz you're ensuring the lowest latency possible, and when you turn on G-Sync that advantage goes away: if the game dips to 60fps or 30fps, your latency is at the disadvantage of being doubled or tripled compared to the same monitor running at its max set Hz.

Is there a firmware update to fix this, or is this just an inherent issue?

August 22, 2014 | 03:54 AM - Posted by Joakim (not verified)

I think I can answer that. G-Sync or not, you will only benefit from the low latency of a 144Hz panel when the GPU is able to keep up with the current refresh rate. It doesn't matter if the refresh rate is locked at 144Hz if your GPU suddenly drops to 30fps. The benefit of the lower latency in a scenario like this would go away, simply because there would be no new content to show on the screen. Meaning: you wouldn't be able to notice whether the latency was lower or not, since you're staring at the same picture (the same frame) you did just before you moved your mouse. And furthermore, if the frame rate took a dive like this (144fps -> 30fps), you would feel stutter without G-Sync, and that would be much more noticeable and annoying than the extra input lag at that time.
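
Back-of-the-envelope numbers for that point (a sketch with assumed round figures, nothing measured):

for refresh_hz in (60, 144):
    for gpu_fps in (144, 30):
        render_ms = 1000 / gpu_fps        # time the GPU needs per frame
        scan_wait_ms = 1000 / refresh_hz  # worst-case wait for the next refresh
        print(f"{refresh_hz:3d}Hz panel, {gpu_fps:3d}fps GPU: "
              f"render {render_ms:5.1f} ms + up to {scan_wait_ms:4.1f} ms refresh wait")

At 30fps the 33 ms render time dominates either way; the panel's refresh interval only bounds how long a finished frame waits.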

August 21, 2014 | 09:26 PM - Posted by Anonymous (not verified)

Will G-Sync ever make its way into laptops or Nvidia-powered Android devices (Android TV, Shield, etc.)?

August 21, 2014 | 09:56 PM - Posted by larsoncc

Oh heck, I guess I have more questions! Why is it so ridiculous to switch from "normal" mode to Surround? Shouldn't this be a hotkey by now (even on multi-card setups)?

What are NVIDIA's plans to support monitor setups that are outside their current capabilities, i.e. eliminating concerns with sync polarity, or supporting monitors running different resolutions (like Eyefinity)? How about PLP setups?

Why, in triple monitor setups, can't a person run 3 screens with one receiving the SLI boost (the main monitor in Windows, for example)? Currently, you can't have 3 monitors active and SLI on at the same time (except Surround); the best you can do is SLI and an accessory display. Can that be improved?

How does G-Sync work with 3D Vision? Do the glasses vary their shutter speed?

August 21, 2014 | 10:15 PM - Posted by Aaron (not verified)

Does G-Sync help with micro-stuttering with cards in SLI?

When will Amazon or Newegg take my money for an ASUS ROG Swift PG278Q?!

August 21, 2014 | 10:36 PM - Posted by H1tman_Actua1

Thank you NVIDIA for doing the work and bringing us G-Sync!!!

Pre-ordered my ROG Swift last Sunday from PCNation in IL.

Can't wait to see that tracking #.

Also, Newegg is prepping:

http://www.newegg.com/Product/Product.aspx?Item=N82E16824236405&cm_re=PG...

August 22, 2014 | 12:02 AM - Posted by Anonymous (not verified)

I'd like to ask Tom how much he pays you for shilling

August 22, 2014 | 01:30 AM - Posted by H1tman_Actua1

A lot more than I pay your mom for a hando.

August 22, 2014 | 04:53 AM - Posted by Anonymous (not verified)

I'm having second thoughts on buying a G-Sync monitor. If buying one makes you act like this idiot, I'll definitely pass on it.

August 22, 2014 | 12:20 AM - Posted by JCCIII

How is NVIDIA working with companies like Blizzard, whose games, like Diablo III, have issues with G-Sync?

With G-Sync, will desktops, like Windows', be static, alleviating the need for a refresh until a call to the GPU is made, hopefully reducing eyestrain and also allowing GPUs to use less power?

Why was the decision made to avoid high-quality panels, meaning using TN panels? Certainly 5 to 6 ms on a 120 Hz panel is arguably better for a first showing.

What is the compatibility issue with Ultra Low Motion Blur such that it will not work with G-Sync? It seems odd that one will not work with the other.

Has NVidia taken the necessary steps to ensure our thousand-dollar cards will work with FreeSync?

Tom Petersen, hearing from and seeing you again will be a pleasure; thank you.

Thank you PC Perspective crew, Ryan Shrout, your work is excellent. Do not let Mr. Petersen answer a question in a way that only creates another question.

Blessings to you all and to all those whom you love.

Sincerely, Joseph C. Carbone III; 21 August 2014

August 22, 2014 | 12:20 AM - Posted by arbiter

As a person who plays Diablo III: that game has its own issues, chugging to a crawl in fights as it is.

Avoid HQ panels? Do you mean IPS? If so, how many 120Hz ones are there even at this time? IPS, even though it has better color than TN, is slower on the Hz side; 60Hz/fps is easy in most games, so there's no real reason to use it.

NVIDIA cards, I think they said, work with the VESA Adaptive-Sync standard, BUT FreeSync is proprietary AMD code, so you can't expect them to support it out of the box, if they ever do.

August 22, 2014 | 01:14 AM - Posted by Anonymous (not verified)

You really need to get out more.

The ASUS ROG Swift is a 12ms panel at 60Hz. IPS panels are natively faster than that, at 4-8ms. At 144Hz the ASUS ROG Swift is a 7ms panel; not until it's overdriven and color is degraded can it get below 4ms.

Take any of the current Korean-brand IPS panels that are being overclocked to 120Hz. They are going to get you a faster response time than the ASUS ROG Swift's 7ms at 144Hz, without the need for any color degradation.

You can buy two of these and overclock them for the price of one ASUS ROG Swift:
http://www.amazon.com/dp/B00BUI44US/?tag=ufoghost-20

A $300 premium on a non-IPS panel just seems silly when the advantage goes to those using lower-end systems, who don't spend much, let alone $750 USD on a monitor, or to high-end multi-monitor systems, where even then G-Sync isn't functioning properly, needing one panel per GPU.
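
Quick arithmetic behind the response-time comparison above (assumed round numbers, no measurements):

for hz in (60, 120, 144):
    frame_ms = 1000 / hz  # how long one refresh lasts
    print(f"{hz:3d}Hz -> each refresh lasts {frame_ms:.2f} ms")

# 60Hz -> 16.67 ms, 120Hz -> 8.33 ms, 144Hz -> 6.94 ms.
# So a ~7 ms pixel response spans roughly a whole frame at 144Hz,
# while a 4 ms response fits comfortably inside it.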

August 22, 2014 | 02:54 AM - Posted by JCCIII

Like cherry-picking CPUs and GPUs, there are IPS/PLS panels that would do the job at 120 Hz, and cleanly. There are several examples that have been on the market for at least eight months.

Understanding AMD's direction: the company intends to implement FreeSync by working with a standard already supported by LCD controllers and connectors, which is the Embedded DisplayPort standard and a compatible controller used by notebooks, other portable devices, and some all-in-one PCs, the connector being available with DisplayPort 1.2a for the PC.

What is needed from companies like AMD and NVIDIA is drivers to work with the existing technology, and some monitors are believed to already have controllers that recognize VESA's Adaptive-Sync, in the same way the Toshiba notebook AMD used to demonstrate FreeSync had the right controller. The effort needed from NVIDIA to make an LCD recognize the technology AMD demonstrated is minor compared to G-Sync: drivers! Which they have likely already written for their mobile devices.

I do not trust NVIDIA's adoration of being reasonable to do anything that does not first require you and me to put cash in their big pockets. However, if they have avoided what has been available at little extra cost, then what they are selling us needs to be superior; if this proves to be contrary (NVIDIA has a reputation for less-than-admirable marketing strategies), then we will want our cards to work without NVIDIA's premium monitors and will need Adaptive-Sync drivers from the company.

---

Thank you PC Perspective crew, Ryan Shrout, your work is excellent. Do not let it go unnoticed when Mr. Petersen answers a question in a way that only creates another question.

Sincerely, Joseph C. Carbone III; 21 August 2014

August 22, 2014 | 03:44 AM - Posted by arbiter

AMD's FreeSync isn't what VESA ratified as the DP 1.2a standard. AMD said their FreeSync was going to be a standard, which was nothing more than a half-truth: it's based off the standard but is proprietary code in the end.

August 22, 2014 | 04:12 AM - Posted by Joakim (not verified)

About Ultra Low Motion Blur: since that technique inserts a black frame (switches off the LED backlight) in between the visible frames that the GPU produces, it would lead to a screen that very visibly starts to flicker when the frame rate varies. Have you, for example, seen how Sony LCD TVs work using Impulse mode? That mode also inserts a black frame between its native refresh interval. But because the normal refresh rate is only 60Hz, the screen flickers quite a lot and also gets somewhat darker when this mode is activated. I suspect this is one of the reasons for it not working together with G-Sync. Plus the fact that it would most likely add some extra latency when synchronizing the panel's LED backlight to the variable refresh rate of G-Sync.
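
Rough numbers for that flicker concern (the ~85Hz visibility threshold below is an assumed rule of thumb, not an NVIDIA spec, and varies by viewer):

VISIBLE_FLICKER_HZ = 85  # assumed rule-of-thumb threshold
for fps in (144, 120, 85, 60, 40, 30):
    note = "likely visible flicker" if fps < VISIBLE_FLICKER_HZ else "generally OK"
    print(f"strobe at {fps:3d}Hz (one backlight pulse per frame): {note}")

With ULMB tied to a variable refresh, the strobe rate would follow the game's frame rate straight down into that visibly flickering range.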

August 22, 2014 | 12:27 AM - Posted by Arkamwest

Hello,
I just built my PC; I used as a reference a PC that you recommended for playing Titanfall at 1080p, but I changed the graphics card to an ASUS GTX 780. I really want a G-Sync monitor, but I want a 1080p G-Sync monitor. So my question is: when can I buy one?
And does NVIDIA have an official online retailer in Mexico?
The ASUS monitor is too expensive; I think a 24" G-Sync 1080p monitor would be nice.

Thanks

August 22, 2014 | 01:53 AM - Posted by arbiter

You can buy the ASUS VG248QE 144Hz monitor and the G-Sync DIY kit; that is a 1080p monitor, and there are a few sites that sell it pre-installed.

August 22, 2014 | 12:19 PM - Posted by Thedarklord

Some quick questions; thanks again to PCPer and NVIDIA for this.

1) When can we expect NVIDIA SHIELD PC streaming to support multiple controllers, say when using a SHIELD Tablet in "console mode" playing PC games?

2) Any plans to put the Denver version of Tegra K1 into an NVIDIA SHIELD Tablet?

3) Does NVIDIA have more OEMs planning to produce G-Sync monitors than are currently announced? Maybe Dell?

I for one would be very excited for a 24-inch+ Dell IPS UltraSharp with G-Sync built in.

4) Any plans for NVIDIA to license its Tegra/GeForce line to other ARM SoC developers? E.g., an Apple ARM SoC with a GeForce GPU.

5) What is NVIDIA's current opinion on the state of multi-GPU systems with regard to the problem of micro-stutter?

August 22, 2014 | 03:45 AM - Posted by JCCIII

Okay, one of the most important questions (did I mention importance?): what the hell is it with the PG278Q being another 16:9 at 2560 x 1440? Did NVIDIA have any say? Please, do not say that was ASUS.

For Children: | For Adults:

1920 x 1080 | 1920 x 1200
2560 x 1440 | 2560 x 1600
3840 x 2160 | 3840 x 2400

The SHIELD Tablet got it right with its 1,920 x 1,200 display.

August 22, 2014 | 04:16 AM - Posted by Joakim (not verified)

That's kind of a matter of taste. At work I prefer 16:10, but when gaming, 16:9 or wider.

August 22, 2014 | 04:54 AM - Posted by razor512

Why doesn't NVIDIA implement a built-in auto-overclocker in their drivers, like some other graphics card brands do?

While manual overclocking is simple enough, there are some users who are unable to get a stable overclock.

August 22, 2014 | 05:55 AM - Posted by Balamute (not verified)

With VR supposedly about to have a big impact on gaming, are we going to see GPU designs optimized for driving two 1920x2160 displays?
Or perhaps efforts to improve drivers to make 2-way SLI setups better at rendering pairs of slightly spatially offset frames?

August 22, 2014 | 06:03 AM - Posted by Wesley (not verified)

I have a few questions for Tom:

1) What is the technical reason for ASUS to have Lightboost/ULMB disabled while running in GSync mode on the PG278Q? Is there an electrical issue with doing this, or is the GSync scaler not capable of coordinating both at the same time? If electrical, what's required to make ULMB work at the full 144Hz refresh rate?

2) ULMB can be supported on Radeon GPUs, but it's intentionally blocked for PG278Q owners. Why go through the trouble of blocking this out? Why not leave ULMB open for Radeon users, which would allow them to consider buying this monitor and later upgrading to a Geforce GPU that would support GSync? Most people change GPUs to keep up with the times, but they don't swap out their monitors as much. By blocking ULMB, you're locking out a fairly big part of the market that's currently on the fence on whether they want variable refresh rates now or in 2015. Giving them ULMB now would probably make them your customers later when upgrade time comes around.

3) What tricks can the GSync module do that aren't related to performance in games? Can a faux panel self-refresh ability be patched in?

4) Once Displayport 1.3 monitors come to market, will Nvidia continue to invest in GSync and custom scaler hardware, or will it eventually be deprecated with the rise of DP1.3?

5) Monitors with GSync don't have variable refresh rates below 30Hz for now. What is Nvidia doing to make the fabled 23.976Hz mode work for watching videos?

August 22, 2014 | 11:05 AM - Posted by Joakim (not verified)

5) Monitors with GSync don't have variable refresh rates below 30Hz for now. What is Nvidia doing to make the fabled 23.976Hz mode work for watching videos?

I want to know this as well. It's kind of a biggie for those of us who use the computer to watch Blu-rays. Having to manually change the refresh rate every time, or deactivate G-Sync to watch a movie, is not that nice.
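
A sketch of the frame-repetition approach often suggested for this (the 30-144Hz window is assumed from the question above; none of this is a confirmed NVIDIA mechanism):

def repeat_factor(content_fps, vrr_min_hz=30.0, vrr_max_hz=144.0):
    # Repeat each video frame n times so the panel refreshes at n * content_fps,
    # landing inside the panel's supported variable-refresh window.
    n = 1
    while content_fps * n < vrr_min_hz:
        n += 1
    rate = content_fps * n
    if rate > vrr_max_hz:
        raise ValueError("content rate cannot be fit into the VRR window")
    return n, rate

for fps in (23.976, 24.0, 25.0, 29.97):
    n, rate = repeat_factor(fps)
    print(f"{fps:6.3f}fps video -> show each frame {n}x, panel runs at {rate:.3f}Hz")

So 23.976fps content could be shown with each frame doubled, at a steady 47.952Hz, with no pulldown judder.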

August 22, 2014 | 06:32 AM - Posted by Balamute (not verified)

What is easier for a GTX Titan to render:
100 duck sized horses with their hair modelled with tools from Nvidia GameWorks or
1 horse sized duck with its feathers modelled with TressFX?

August 22, 2014 | 07:28 AM - Posted by Anonymous (not verified)

Just what kind of game are you making there?

August 22, 2014 | 06:41 AM - Posted by Decoy (not verified)

I have a few questions regarding G-Sync that I haven't seen answered anywhere yet.

1. G-Sync so far has been marketed at gaming, but another thing I think we all know is that it can fix the video judder from TV and movies. What I mean is that TV and movies don't run at 60fps; they run at different rates (movies 24fps, PAL 25fps, 48fps, etc.). Is NVIDIA working on a solution for this, or can some sort of developer tap into G-Sync to make a solution? (A quick sketch of where that judder comes from follows this list.)

2. So far all G-Sync monitors come with ULMB; is that going to be standard going forward for all G-Sync monitors?

3. PhysX 2 vs PhysX 3: is there any way to benchmark between the two versions? Also, why do some developers still use PhysX 2 (Gearbox, Rocksteady)?

4. G-Sync IPS when? I checked the Overlord Computer forums; they said they were on the list, but that was six months ago.
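
The judder mentioned in question 1, illustrated with standard 3:2 pulldown arithmetic (nothing NVIDIA-specific; assumed textbook cadence):

refresh_ms = 1000 / 60
for i in range(6):                    # six consecutive film frames
    repeats = 3 if i % 2 == 0 else 2  # the alternating 3:2 cadence
    print(f"film frame {i}: held {repeats} refreshes = {repeats * refresh_ms:.1f} ms")
print(f"even pacing would be {1000 / 24:.1f} ms per frame")

Frames alternate between 50.0 ms and 33.3 ms on screen instead of an even 41.7 ms; a variable-refresh display could hold every frame for exactly 1/24 s.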

August 22, 2014 | 06:42 AM - Posted by suiken_2mieu (not verified)

In order to get three G-Sync monitors in Surround, does it require three video cards? Or are there any video cards coming out with multiple DisplayPort ports?

Can I have a G-Sync monitor in the middle and two non-G-Sync monitors on the sides? I assume this would create issues with tearing on the side monitors.

August 22, 2014 | 07:57 AM - Posted by larsoncc

In the GeForce forums, NVIDIA reps have previously said it takes 3 cards (because of DisplayPort needs), but this would be good to confirm, same with your 'cards with multiple DisplayPort' question.

The way that I understand it, running Surround the way you describe isn't possible with GSync *active*. Surround treats all the monitors as one single monitor, so Surround is supported to whatever degree the monitors are exactly the same. So best case scenario - if all three don't support GSync, then that feature simply isn't available. This is similar to my setup, where I run two 60Hz monitors and one 144Hz monitor; Surround is only available to me at 60Hz (which is fine!).

But my earlier question still stands, given the fickle nature of Surround: is Surround in this manner possible AT ALL? Meaning, if I have a GSync monitor in the middle of two 60Hz monitors, would 60Hz normal Surround be available? While I think it should be, no one I've talked to has any idea - and I don't want to spend hundreds without knowing for sure.

August 22, 2014 | 07:09 AM - Posted by James Hadley (not verified)

Question.

Is there ever a chance that G-Sync will work with AMD cards? Could this be implemented with a driver, or is there some proprietary hardware on Nvidia cards that would prevent this?

August 22, 2014 | 02:52 PM - Posted by arbiter

AMD could maybe add it if they licensed the tech from NVIDIA, but it's AMD, and they are too cheap to ever do that.

August 22, 2014 | 08:05 AM - Posted by ChangWang

Not really a question, more of a message to pass along.

Hey Tom, please ask the Shield team to consider building a 64GB SKU for future products.

If it were possible to get a Shield Tablet that was WiFi-only and 64GB for around $350-400, I'd have two on pre-order already.

August 22, 2014 | 08:09 AM - Posted by Rod (not verified)

When will NVIDIA come up with a fix for the SHIELD Tablet's WiFi problems and case cracking at the edges?
Will they replace my tablet?

August 22, 2014 | 08:41 AM - Posted by mutantmagnet (not verified)

1) With the advancements you are making with G-Sync and Project Denver, have you been laying the groundwork to enter the laptop/tablet market to offer a competitive (hopefully lower-cost) alternative to AMD/Intel?

2) GPUs can't share memory because of latency over PCIe, but with the improvements you made with Kepler, such as GPUDirect, can't you offer a dual-GPU system in the future where all the memory on a single card is available?

August 22, 2014 | 08:50 AM - Posted by mutantmagnet (not verified)

Just to expand on question 2: can't you, in the future, use the technology in GK110 so a dual-GPU system can be seen as a single GPU through a virtualized instance? That way you'd increase the number of physical GPUs that can be SLIed from 4 to 8.

August 22, 2014 | 11:32 AM - Posted by Chas Boss (not verified)

At what point will DX12 be an issue for cards that are farther back in the lineup, cards like the 560, 660, and such? With some elements not working so well with DX11.2, what does that mean for those cards with DX12?

August 22, 2014 | 12:04 PM - Posted by NamelessTed

How hard was the decision to market the SHIELD Tablet as a gaming device? The Note 7 is my favorite tablet that I have ever owned, and I don't use it for gaming at all. It seems that the SHIELD Tablet would be a great all-day tablet for everything but gaming, yet even as a gamer I don't find any of the gaming features compelling enough as a selling point in that form factor.

I would actually be much more interested in buying a Tegra K1-powered SHIELD in the form factor of the original SHIELD rather than a tablet that just has a separate controller. Or at least some sort of controller that the tablet can dock to; having two separate pieces of hardware that don't connect just doesn't make any sense to me. If I have to set up on a desk/table to play a game, what is the point?

August 22, 2014 | 12:08 PM - Posted by Jani Turunen (not verified)

When is the NVIDIA SHIELD Tablet LTE coming? I ordered one.

August 22, 2014 | 12:08 PM - Posted by JohnS (not verified)

When will NVIDIA EVER start offering true 10-bit color on the HDMI and DisplayPort outputs of GTX and Titan cards for occasional content creation and gaming?

I do not want to buy a Quadro; I'm not a design company...

August 22, 2014 | 12:42 PM - Posted by Anonymous (not verified)

Why are we so behind on eGPU technology??!!!!??

I'd like to run powerful software from portable laptops that have great battery life.

August 22, 2014 | 01:10 PM - Posted by Aaron McQuaid (not verified)

Hello Tom,

Thanks for all of your excellent work on behalf of gamers.

1. Are you aware of any OLED based monitors or TVs that are in development which will also include Gsync functionality?

2. Are you aware of any 4K IPS monitors that are in development that will also include Gsync Functionality?

3. Will the next gen cards (880) have DP 1.3 or HDMI 2.0 connectivity?

4. Is Nvidia working directly with Oculus to enable Gsync support in the Consumer version of the Rift?

5. Rumors state that the 20nm process at TSMC has run into issues. When can we expect 20nm Maxwell parts?

6. What is the approximate performance delta between a 28nm and a 20nm Maxwell part?

August 22, 2014 | 01:25 PM - Posted by AndyKoopa

Does the SHIELD Tablet have support for AC wireless?

August 22, 2014 | 05:39 PM - Posted by JCCIII

Okay Ryan, here's a question for Tom: why do the monitors have only one input? Bypass the module for legacy connections, and/or add some more DP inputs through the module if it's too expensive or involved to support legacy connections.

August 22, 2014 | 02:06 PM - Posted by Anonymous (not verified)

What are the unique challenges in dealing with 4K content, specifically with regard to gaming, G-Sync, streaming 4K content, etc.?

How is NVIDIA dealing with these issues?

August 23, 2014 | 12:43 AM - Posted by Danomite (not verified)

So I have a combined suggestion/comment/question.

Now, I believe I have read that GSYNC does not work with fullscreen windowed mode (as apparently it requires full-screen rendering to work), and I find this incredibly disappointing.

I find myself constantly tabbing and doing other things while I am also gaming (in particular MMOs), and losing the ability to multitask (traditional fullscreen tabbing is not acceptable) is a major drawback in my opinion.

Thus my question is: could NVIDIA adjust/improve GSYNC to allow it to work without real fullscreen rendering?

The way I imagine it working is that it would adaptively set the entire screen's refresh rate to match whatever application is in the foreground, regardless of whether it is fullscreen or not.

Thus a game in a window or fullscreen windowed mode might have its refresh rate varying while it is in the foreground, while Windows might be set to, say, 60 or 120, and another application, say a movie, could be tied to the film's frame rate (say 24).

Ideally it would be able to switch to whatever is being interacted with, so a movie would get its perfect frame rate, as would Windows, and as would any game, or whatever else is being used.

In short: GSYNC everywhere, all the time, without the need for an application to be fullscreen, just in focus.
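
A purely hypothetical sketch of that policy (all names here are invented for illustration; nothing like this is known to exist in NVIDIA's driver):

from dataclasses import dataclass
from typing import Optional

@dataclass
class Window:
    title: str
    preferred_hz: Optional[float]  # None = variable refresh (a game), else a fixed rate

def refresh_policy(foreground: Window) -> str:
    # Pick the panel's refresh behavior from whichever window has focus.
    if foreground.preferred_hz is None:
        return f"'{foreground.title}': drive the panel with variable refresh"
    return f"'{foreground.title}': fixed {foreground.preferred_hz:g}Hz"

for win in (Window("MMO in borderless window", None),
            Window("Movie player", 23.976),
            Window("Desktop", 60.0)):
    print(refresh_policy(win))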
