NVIDIA G-Sync DIY Upgrade Kit Installation and Performance Review

Manufacturer: NVIDIA

Installation

As a test of the integrity of the documentation, I'm going to perform this mod while limiting myself to the required tools stated on page 1 of the included instructions:

[Photo]

Be sure to give yourself some extra room and use something to cushion the LCD as you'll have to lay it face down. You will definitely be sliding it around a bit during the next step.

[Photo]

[Photo]

The hole above is handy for getting things started. You'll need to open a gap in the bezel here with a screwdriver.

[Photo]

With a gap started, the included plastic tool makes short work of getting the panel apart. Work slowly along the bottom *first*, as that's where you are likely to break something until you get used to how to work these particular clasps apart. I found the best method was to insert the tool ~1/4" as pictured above and then simply rotate the end parallel with the table. Leverage is your friend.

[Photo]

With the panel separated, we reveal the patient. Everything must be disconnected. Yes, the metal box you see here is only held to the back of the LCD with tape.

[Photo]

The electronics are out; now let's really get to work.

[Photo]

The two PCBs above are removed, and then the LVDS and LCD power cables are disconnected from the old logic module.

[Photo]

After transferring the cables to the G-Sync module (be sure to read the next page for a correction / clarification on that step), it gets installed in the old shield box.

[Photo]

Replacement plates block off the now-unused HDMI/DVI openings. G-Sync only uses DisplayPort.

[Photo]

Cables are re-connected to the G-Sync module, with the exception of the built-in speakers. Real gamers don't use LCD panel audio!

[Photo]

Another look at the new port cover plate with everything buttoned up.

[Photo]

...and these are the parts left over from the upgrade process.

January 5, 2014 | 02:46 AM - Posted by Fernandk (not verified)

My dream is to see this on TVs...

January 5, 2014 | 06:06 AM - Posted by Branthog

My Sony has an Impulse mode for gaming, which is essentially LightBoost. It's terrific. It kills the intensity of the lighting to accomplish it (so you need to crank the backlight up), and it introduces flicker that some people notice to varying degrees (though it seems fine to me, and I'm picky). It only really benefits you at 60 fps, so it should be disabled for sub-60 fps content, but... man, is it really fantastic.

January 5, 2014 | 10:11 AM - Posted by Anonymous (not verified)

Keep dreaming. It doesn't work in non-3D applications. Not to mention below 30 fps it's awful.

January 7, 2014 | 02:56 PM - Posted by StewartGraham (not verified)

Of course it works in non-3D applications. I played with G-Sync at the recent SC2 tournament in NYC where it made its public debut, and there was absolutely no indication that it was "awful" below 30 fps. As far as I could tell, FPS had no ill effect on the manner in which G-Sync worked (at least on the two demo displays).

January 8, 2014 | 10:00 PM - Posted by Anonymous (not verified)

Are you calling Allyn a liar?

He has a half-page write up on the last page on it.

January 7, 2014 | 03:05 PM - Posted by StewartGraham (not verified)

I'm not sure this would benefit a television unless the source was a PC's GPU.

January 7, 2014 | 07:33 PM - Posted by gm (not verified)

There's no point to this on a TV: TV content is a fixed framerate, and latency doesn't matter as long as the sound is in sync...

January 8, 2014 | 07:14 PM - Posted by kin (not verified)

step 1. get a 55" TV with good panel quality and sub-1-frame input lag
step 2. get a $5 HDMI cable
step 3. ...
step 4. profit

Now, if there were G-Sync in there, I wouldn't mind it. Trust me when I say that playing BF4 on max settings on Sony's W905 55" is not too shabby.

January 5, 2014 | 11:09 AM - Posted by Anonymous (not verified)

Please check your spelling before publishing the article.

January 7, 2014 | 03:01 PM - Posted by StewartGraham (not verified)

Oh be quiet, you obviously have no clue what you're talking about... Allyn worked very hard to have this article ready for PCper fans before CES. Also, if you knew anything about PCper, you'd know that Allyn is not the Editor in Chief, Ryan is. So if you'd like to direct your comment towards someone, perhaps you should email Ryan.

January 28, 2014 | 06:22 AM - Posted by Anonymous (not verified)

stfu - gtfo

January 5, 2014 | 12:29 PM - Posted by Lou (not verified)

I'm assuming this kit is only available for this one particular Asus monitor. Are there other kits for different monitors or brands?

January 5, 2014 | 12:50 PM - Posted by Tim Verry

Correct, and not at the moment.

January 5, 2014 | 02:58 PM - Posted by Fishbait

Whelp, my original post was blocked by the spam filter. I'll make this a tad more concise.

Excellent and very thorough review Allyn.

Could you confirm whether or not LightBoost is built in?

January 5, 2014 | 04:10 PM - Posted by Allyn Malventano

Apologies, we are tweaking our spam filtering.

I haven't tested it personally, but the display should handle it with the G-Sync module installed, just not in G-Sync mode. It's either/or.

January 6, 2014 | 07:44 AM - Posted by Lord Binky (not verified)

I want this... but I am not going back from 2560x1440. Not that I'm complaining about IPS 2560x1440 @ 120 Hz.

January 6, 2014 | 09:02 AM - Posted by Lord Binky (not verified)

Also, what happened to thin bezels? Thin bezels look better than thin monitors. I've taken my monitor apart, and it attaches the panel to the bezel with an L-bracket that requires a larger bezel. Why not use an L-bracket that moves the attachment point behind the panel, and reduce the bezel width? Then, for multi-monitor users, make the front bezel detachable so they only have the panel plus a <1 mm bracket between monitors. Better yet, attach the brackets to the top and bottom of the panel. In either configuration, the OSD controls could be recessed from the panel and attached to the bracket for an even slicker profile.

January 6, 2014 | 08:35 PM - Posted by Anonymous (not verified)

AMD FREE-SYNC --> NVIDIA G-SYNC

January 7, 2014 | 07:20 AM - Posted by gamerk2 (not verified)

Let's wait and see; I'm interested to see whether latency is going to be a major problem with FreeSync, given how AMD attempts to fix the problem in software. Never mind that most monitors don't support dynamic VBlank intervals yet anyway. If NVIDIA gets its hardware in first...
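For readers wondering what dynamic VBlank intervals actually buy you, here is a toy model in Python (my own sketch, with an assumed 144 Hz panel maximum; it is not NVIDIA's or AMD's actual implementation). With a fixed 60 Hz scanout, a finished frame has to wait for the next refresh boundary before it can appear; with variable refresh, scanout begins as soon as the frame is ready.

```python
# Toy model of display latency: fixed 60 Hz refresh vs. variable refresh.
# Illustration only -- not how G-Sync or FreeSync is actually implemented.
import math

VBLANK = 1.0 / 60          # fixed refresh interval (s)
MIN_INTERVAL = 1.0 / 144   # assumed panel maximum of 144 Hz

def fixed_refresh(finish_times):
    """Each frame is held until the next fixed VBlank boundary."""
    return [math.ceil(t / VBLANK) * VBLANK for t in finish_times]

def variable_refresh(finish_times):
    """Scanout starts when the frame is ready, spaced no closer than
    the panel's minimum refresh interval."""
    shown, last = [], -MIN_INTERVAL
    for t in finish_times:
        last = max(t, last + MIN_INTERVAL)
        shown.append(last)
    return shown

# Frames finishing every 22 ms (~45 fps):
finishes = [0.022 * (i + 1) for i in range(5)]
for name, fn in (("fixed   ", fixed_refresh), ("variable", variable_refresh)):
    waits = ["%4.1f ms" % ((s - f) * 1000) for f, s in zip(finishes, fn(finishes))]
    print(name, *waits)
```

The fixed-refresh case adds an uneven 0-12 ms wait to each frame (felt as both latency and stutter); the variable-refresh case shows every frame immediately. Whether a software-assisted approach can do the same without extra latency is exactly the open question raised above.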

January 7, 2014 | 08:35 PM - Posted by chefbenito (not verified)

This is a whole lot of work and way too expensive for what it's trying to do. The article is well written, if in need of a quick spell check/re-read. It is totally fair of readers to nitpick spelling mistakes in a tech article, but that doesn't negate the content.

G-Sync is a great idea, but just save us all the cost and sell monitors that are G-Sync ready. Have a logo on the box or corner of the bezel that says "G-Sync Inside" or something. In my experience, better drivers and tweaked settings can reduce or eliminate tearing almost every time.

January 9, 2014 | 06:45 PM - Posted by Anonymous (not verified)

I think this is really dangerous for the average person, and if someone gets killed, NVIDIA and PCPer and everyone else pushing this upgrade is liable to get sued by the dead person's family.

Monitors, and especially TVs, are very dangerous and should not be opened unless you are a qualified electrician. There is plenty of charge stored in the components well after the set has been switched off; some components hold a charge for days or weeks and could easily kill you.

This is an accident waiting to happen.

January 10, 2014 | 10:07 AM - Posted by Jeremy Hellstrom

Seriously? Apart from the fact that LCD capacitors are significantly smaller and store less energy than the ones in a CRT, you feel that people should be able to sue a website for providing information that could be used improperly?

January 10, 2014 | 05:31 PM - Posted by Keith everitt (not verified)

Just purchased mine. One question, though: does it come with a DisplayPort cable, or will I need to purchase one?

January 14, 2014 | 02:01 PM - Posted by Anonymous (not verified)

It comes with a DisplayPort cable. Installation is easy; just watch the orientation of the 'Y' style connector as mentioned in the article.

The posted YouTube video shows a slightly different board than what ships to us.

The ribbon cable is also a pain to install.

January 14, 2014 | 07:30 PM - Posted by armrray1 (not verified)

Eagerly awaiting my kit! The only thing that worries me a bit is that this kit is, by necessity, late-prototype, engineering-sample, production-candidate custom hardware, and what goes in the new monitors (as Allyn says) will almost certainly not be this, but fully baked production ASICs. But, as we see here, the modules in these kits are FPGAs, which makes me wonder (not knowing a whole lot about FPGAs): could these modules be re-flashed or somehow patched in case something needs fixing before the retail monitors go live? (I do know this is bleeding-edge stuff and fully accept the risks the kits bring.) Either way, I still think it's very cool of NVIDIA to let us in on at least a little bit of bleeding-edge hardware hackery.

January 19, 2014 | 10:18 PM - Posted by Anonymous (not verified)

As someone who owns this monitor and GTX 760s in SLI, I can usually push 100-120 fps in most games with max settings at 1080p. Would G-Sync have any benefit for me? Or is it mainly beneficial when you can only push 60 fps?

August 15, 2014 | 09:25 AM - Posted by 80yearoldGrandpaGamer (not verified)

Yes, it would. G-Sync makes even low framerates like 40 fps feel far more fluid than they are. In fact, you're likely to notice it a little less if you're used to a high framerate on a 120 or 144 Hz monitor, but either way it's going to be better. This should have happened in the monitor world a long time ago.
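To put rough numbers on the 40 fps case (back-of-the-envelope arithmetic on my part, not a claim from the review): on a fixed 60 Hz panel with V-Sync, frames can only be swapped at 16.7 ms VBlank boundaries, so a steady 40 fps shows up on screen as alternating 16.7 ms and 33.3 ms frame durations, which reads as judder. A variable-refresh panel would simply show every frame for a uniform 25 ms.

```python
# 40 fps on a fixed 60 Hz panel with V-Sync: frames render every 25 ms
# but can only be swapped at 16.7 ms VBlank boundaries. Sketch only.
import math

VBLANK = 1000 / 60                        # refresh interval in ms
renders = [25 * i for i in range(1, 9)]   # frame completion times (ms)
swaps = [math.ceil(t / VBLANK) * VBLANK for t in renders]
print([round(b - a, 1) for a, b in zip(swaps, swaps[1:])])
# -> [16.7, 33.3, 16.7, 33.3, ...]: visible judder.
# With variable refresh, every frame would display for a flat 25 ms.
```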

January 26, 2014 | 12:32 AM - Posted by Anonymous (not verified)

Does anyone know if the power brick included with the kit is 100-240V?
