
NVIDIA G-Sync DIY Upgrade Kit Installation and Performance Review

Manufacturer: NVIDIA G-SYNC

Internals and Documentation

Internals:

Since the G-Sync module and PCB are already, well, 'exposed', here are a few shots of the externals:


This side of the PCB holds the G-Sync module itself, along with the necessary logic to interface it with the ASUS LCD panel controls (power, brightness, etc.).


The other side takes care of all of the power handling circuitry. Most of the work is done by an external power adapter, so things are sparse here.


The back of the PCB is mostly empty, save some leftover power handling components.

If you guys know me, you know I *have* to take apart *something* for a review. The G-Sync module seemed a fitting victim:


On the front of the PCB we have the logic and memory required to perform the G-Sync goodness:


I was sort of expecting a spiffy G-Sync logo here, but I'll take Altera instead. The back story is that NVIDIA chose a more expensive FPGA for this first round of G-Sync hardware, the reasoning being that programmable logic gets a product to market faster. This is typical for first-run hardware, and purpose-built ASICs would logically follow in future generations of G-Sync hardware.


The rear of the PCB contains minimal power / filtering circuitry, in addition to firmware.

Documentation:


I must say this is the best documentation I've ever seen for anything DIY - and it is absolutely necessary, as this is not the sort of mod that most folks will be familiar with. Let's dive in and install this puppy!

January 5, 2014 | 05:46 AM - Posted by Fernandk (not verified)

My dream is to see this on TVs...

January 5, 2014 | 09:06 AM - Posted by Branthog

My Sony has an Impulse mode for gaming, which is essentially Lightboost. It's terrific. It kills the intensity of the lighting to accomplish it (so you need to crank the backlight up) and it introduces flicker that some people notice to varying degrees (but seems fine to me - and I'm picky) and only really benefits you at 60fps, so should be disabled for sub 60fps content, but . . . man is it really fantastic.

January 5, 2014 | 01:11 PM - Posted by Anonymous (not verified)

Keep dreaming. It doesn't work in non-3D applications. Not to mention it's awful below 30fps.

January 7, 2014 | 05:56 PM - Posted by StewartGraham (not verified)

Of course it works in non-3D applications. I played with G-Sync at the recent SC2 Tournament in NYC where it was first debuted to the public and there was absolutely no indication that it was "awful" below 30fps. As far as I could tell, FPS had no ill-effect on the manner in which G-Sync worked (at least on the two demo-displays).

January 9, 2014 | 01:00 AM - Posted by Anonymous (not verified)

Are you calling Allyn a liar?

He has a half-page write up on the last page on it.

January 7, 2014 | 06:05 PM - Posted by StewartGraham (not verified)

I'm not sure this would benefit a television unless the source was a PC's GPU.

January 7, 2014 | 10:33 PM - Posted by gm (not verified)

There's no point to this on a TV - TV content is a fixed framerate and latency doesn't matter as long as the sound is in sync...

January 8, 2014 | 10:14 PM - Posted by kin (not verified)

step 1. get a 55" TV with good panel quality and sub-1-frame input lag
step 2. get a $5 HDMI cable
step 3. ...
step 4. profit

Now, if there was G-Sync in there, I wouldn't mind it. Trust me when I say that playing BF4 on max settings on Sony's W905 55" is not too shabby.

January 5, 2014 | 02:09 PM - Posted by Anonymous (not verified)

Please check your spelling before publishing the article.

January 7, 2014 | 06:01 PM - Posted by StewartGraham (not verified)

Oh be quiet, you obviously have no clue what you're talking about... Allyn worked very hard to have this article ready for PCper fans before CES. Also, if you knew anything about PCper, you'd know that Allyn is not the Editor in Chief, Ryan is. So if you'd like to direct your comment towards someone, perhaps you should email Ryan.

January 28, 2014 | 09:22 AM - Posted by Anonymous (not verified)

stfu - gtfo

January 5, 2014 | 03:29 PM - Posted by Lou (not verified)

I'm assuming this kit is only available for this one particular Asus monitor. Are there other kits for different monitors or brands?

January 5, 2014 | 03:50 PM - Posted by Tim Verry

Correct, and not at the moment.

January 5, 2014 | 05:58 PM - Posted by Fishbait

Whelp, my original post was blocked by the spam filter. I'll make this a tad more concise.

Excellent and very thorough review Allyn.

Could you confirm whether or not LightBoost is built in?

January 5, 2014 | 07:10 PM - Posted by Allyn Malventano

Apologies, we are tweaking our spam filtering. 

I haven't tested it personally, but the display should handle it with the gsync module installed - but not in gsync mode. It's either or. 

January 6, 2014 | 10:44 AM - Posted by Lord Binky (not verified)

I want this... but I am not going back from 2560x1440. Not that I'd complain about IPS 2560x1440 @ 120Hz.

January 6, 2014 | 12:02 PM - Posted by Lord Binky (not verified)

Also, what happened to thin bezels? Thin bezels look better than a thin monitor. I've taken my monitor apart, and it attaches the panel to the bezel with an L bracket that requires a larger bezel. Why not use an L bracket that moves the attachment point BEHIND the panel, reducing the bezel width? Then, for multi-monitor users, make the front bezel detachable so they only have the panel + <1mm of bracket between monitors. Better yet, attach the brackets to the top and bottom of the panel. In either configuration, the OSD controls could be recessed from the panel and attached to the bracket for an even slicker profile.

January 6, 2014 | 11:35 PM - Posted by Anonymous (not verified)

AMD FREE-SYNC --> NVIDIA G-SYNC

January 7, 2014 | 10:20 AM - Posted by gamerk2 (not verified)

Let's wait and see; I'm interested to see whether latency is going to be a major problem with FreeSync, given how AMD attempts to fix the problem in software. Never mind that most monitors don't support dynamic VBLANK intervals yet anyway. If NVIDIA gets its hardware in first...

January 7, 2014 | 11:35 PM - Posted by chefbenito (not verified)

This is a whole lot of work and way too expensive for what it's trying to do. The article is well written, if in need of a quick spell check/re-read. It is totally fair of readers to nitpick spelling mistakes in a tech article. It also does not negate the content.

G-sync is a great idea, just save us all the cost and sell monitors that are G-Sync ready. Have a logo on the box or corner of the bezel that says G-Sync Inside or something. In my experience better drivers and tweaking settings can reduce or eliminate tearing almost every time.

January 9, 2014 | 09:45 PM - Posted by Anonymous (not verified)

I think this is really dangerous for the average person, and if someone gets killed, NVIDIA, PCPer, and everyone else pushing this upgrade is liable to get sued by the dead person's family.

Monitors, and especially TVs, are very dangerous and should not be opened unless you are a qualified electrician. There is plenty of charge stored in components well after the set has been switched off; some components hold a charge for days or weeks and could easily kill you.

This is an accident waiting to happen.

January 10, 2014 | 01:07 PM - Posted by Jeremy Hellstrom

Seriously? Apart from the fact that LCD capacitors are significantly smaller and hold less power than the ones in a CRT, you feel that people should be able to sue a website for providing information that could be used improperly?

January 10, 2014 | 08:31 PM - Posted by Keith everitt (not verified)

Just purchased mine. One question though: does it come with a DisplayPort cable, or will I need to purchase one?

January 14, 2014 | 05:01 PM - Posted by Anonymous (not verified)

It comes with a DisplayPort cable. Installation is easy; just watch the orientation of the 'Y' style connector, as mentioned in the article.

The posted YouTube video shows a slightly different board than what ships to us.

The ribbon cable is also a pain to install.

January 14, 2014 | 10:30 PM - Posted by armrray1 (not verified)

Eagerly awaiting my kit! The only thing that worries me a bit is that this kit is, by necessity, late-prototype, engineering-sample, production-candidate custom hardware, and what goes into the new monitors (as Allyn says) will almost certainly not be this, but fully baked production ASICs. But, as we see here, the modules in these kits are FPGAs, which makes me wonder (not knowing a whole lot about FPGAs): could these modules be re-flashed or somehow patched in case something needs fixing before the retail monitors go live? (I do know this is bleeding-edge stuff and fully accept the risks the kits bring.) Either way, I still think it's very cool of NVIDIA to let us in on at least a little bit of bleeding-edge hardware hackery.

January 20, 2014 | 01:18 AM - Posted by Anonymous (not verified)

As someone who owns this monitor and GTX 760s in SLI, I can usually push 100-120 fps in most games with max settings at 1080p. Would G-Sync have any benefit for me? Or is it mainly beneficial when you can push 60fps?

January 26, 2014 | 03:32 AM - Posted by Anonymous (not verified)

Does anyone know if the power brick with the kit is 100-240V?
