The Waiting Game

The first retail-ready NVIDIA G-Sync monitor is finally in for review: the ASUS PG278Q ROG Swift.

NVIDIA G-Sync was announced at a media event held in Montreal way back in October, and promised to revolutionize the way the display and graphics card worked together to present images on the screen. It was designed to remove hitching, stutter, and tearing — almost completely. Since that fateful day in October of 2013, we have been waiting. Patiently waiting. We were waiting for NVIDIA and its partners to actually release a monitor that utilizes the technology and that can, you know, be purchased.

In December of 2013 we took a look at the ASUS VG248QE monitor, the display for which NVIDIA released a mod kit that allowed users who already owned it to upgrade to G-Sync compatibility. It worked, and I even came away impressed. I noted in my conclusion that, “there isn't a single doubt that I want a G-Sync monitor on my desk” and, “my short time with the NVIDIA G-Sync prototype display has been truly impressive…”. That was nearly 7 months ago, and I don’t think anyone at that time really believed it would be THIS LONG before the real monitors began to show up in the hands of gamers around the world.

Since NVIDIA’s October announcement, AMD has been on a marketing path with a technology they call “FreeSync” that claims to be a cheaper, standards-based alternative to NVIDIA G-Sync. They first previewed the idea of FreeSync on a notebook device during CES in January and then showed off a prototype monitor in June during Computex. Even more recently, AMD has posted a public FAQ that gives more details on the FreeSync technology and how it differs from NVIDIA’s creation; it has raised something of a stir with its claims on performance and cost advantages.

That doesn’t change the product we are reviewing today, of course. The ASUS ROG Swift PG278Q, a 27-in WQHD display with a 144 Hz refresh rate, is truly an awesome monitor. What has changed is the landscape between NVIDIA's original announcement and now.

What is G-Sync? A Quick Refresher

Last year, I spent a lot of time learning about the technology behind NVIDIA G-Sync and even spoke with several game developers in the build-up to the announcement about its potential impact on PC gaming. I wrote an article that looked at the historical background of refresh rates and how they were tied to archaic standards that are no longer needed in the world of LCDs, entitled: NVIDIA G-Sync: Death of the Refresh Rate. We also have a very in-depth interview with NVIDIA’s Tom Petersen that goes through the technology in an easy to understand, step by step manner, and I would encourage readers to watch it for background on the game-changing feature in this display.

The idea of G-Sync is pretty easy to understand, though the implementation can get a bit more hairy. G-Sync introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating to the PC what rate this refresh occurs at, in a properly configured G-Sync setup the graphics card tells the monitor when to refresh. This allows the monitor to match its refresh rate to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.

Gamers today are likely to be very familiar with V-Sync, short for vertical sync, which is an option in your graphics card’s control panel and in your game options menu. When enabled, it forces the monitor to draw a new image on the screen at a fixed interval. In theory this works well, and the image is presented to the gamer without artifacts. The problem is that games rendered in real time rarely hold to a specific frame rate. With only a couple of exceptions, a game's frame rate will fluctuate based on the activity happening on the screen: a rush of enemies, a changed camera angle, an explosion or falling building. Instantaneous frame rates can vary drastically, from 30, to 60, to 90 FPS, and V-Sync forces frames to be displayed only at set fractions of the monitor's refresh rate, which causes problems.

If a frame takes longer than the monitor's standard refresh time to draw (if the frame rate drops low), then you will see a stutter, or hitch, in the gameplay caused by the display having to re-draw the same frame for a second consecutive interval. Any movement tracking across the screen suddenly appears to stop, and then quickly "jumps" to the next location faster than your mind thinks it should. V-Sync also inherently introduces input latency to games when these lags and stutters take place.
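To put rough numbers on that, here is a minimal sketch of the idea (my own illustration, with assumed timings rather than anything measured on this monitor) of how V-Sync rounds a frame's on-screen time up to the next refresh boundary on a 60 Hz panel:

```python
# Back-of-the-napkin illustration, not vendor code: how V-Sync quantizes a
# frame's on-screen time on a 60 Hz panel.
import math

REFRESH_INTERVAL_MS = 1000 / 60  # ~16.7 ms between refreshes at 60 Hz

def vsync_display_ms(render_ms):
    """With V-Sync on, a frame can only be swapped on a refresh boundary,
    so its on-screen time rounds up to the next multiple of the interval."""
    return math.ceil(render_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

for render_ms in (10, 16, 17, 25, 40):
    print(f"rendered in {render_ms:>2} ms -> shown for {vsync_display_ms(render_ms):.1f} ms")

# Missing the 16.7 ms deadline by even 1 ms holds the previous frame for a
# full 33.3 ms -- that doubled hold time is the hitch described above.
```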

The common alternative for gamers worried about latency and stutter is to disable V-Sync in the control panel or in the game. This solves the stutter and latency issues (kind of) but causes another, much more noticeable issue called "tearing". With V-Sync disabled, the graphics card sends a new frame to the monitor at any point in the monitor's refresh cycle, even if the LCD is in the middle of drawing the previous one. The result is that, at some point down the screen, the user sees the previous frame above and the current frame below, split at a sharp line. You will literally be seeing an image where the geometry no longer lines up and, depending on the game and scene, it can be incredibly distracting.

Monitors with refresh rates higher than 60 Hz reduce this tearing by refreshing the screen more frequently, so a tear is less likely to occur in any single refresh cycle, but the tearing is impossible to remove completely.
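For a rough picture of where that tear line comes from, here is a small illustrative sketch (my own simplification, assuming an idealized top-to-bottom scanout of a WQHD panel's 1440 rows and ignoring blanking time) of how the timing of a mid-scan buffer flip maps to a tear position, and how much shorter each refresh becomes at 144 Hz:

```python
# Illustrative only: an idealized top-to-bottom scanout, ignoring blanking
# intervals and other real-world panel timing details.

PANEL_ROWS = 1440  # vertical resolution of a WQHD panel

def tear_row(flip_time_ms, refresh_hz):
    """Approximate row being scanned when a new frame arrives mid-refresh."""
    interval_ms = 1000 / refresh_hz
    progress = (flip_time_ms % interval_ms) / interval_ms
    return int(progress * PANEL_ROWS)

for hz in (60, 144):
    print(f"{hz:>3} Hz: refresh takes {1000 / hz:.1f} ms, "
          f"a flip 5 ms into the scan tears near row {tear_row(5, hz)}")
```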

NVIDIA G-Sync switches things up by having the monitor refresh its screen only when a new frame is ready from the GPU. As soon as the next frame is drawn, it can be passed to the display and drawn on the screen without tearing. If the next frame is ready in 16 ms, it can be sent immediately. If it takes 25 ms or only 10 ms, it doesn’t matter; the monitor will wait for information from the GPU before drawing the new frame. The result is an incredibly smooth and fluid animation that doesn’t stutter and doesn’t tear.
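As a companion to the V-Sync sketch above, here is the same kind of illustration for a variable refresh display. The exact bounds are an assumption on my part, based on the panel's 144 Hz ceiling and the roughly 30 FPS floor discussed below:

```python
# Companion to the V-Sync sketch above, again purely illustrative: with a
# variable refresh display, on-screen time simply tracks render time within
# the panel's supported refresh range.

def gsync_display_ms(render_ms, min_hz=30, max_hz=144):
    """On-screen time follows the frame time, bounded by the refresh range."""
    shortest = 1000 / max_hz  # ~6.9 ms at 144 Hz
    longest = 1000 / min_hz   # ~33.3 ms at 30 Hz
    return min(max(render_ms, shortest), longest)

for render_ms in (10, 16, 25):
    print(f"rendered in {render_ms} ms -> shown for {gsync_display_ms(render_ms):.1f} ms")
```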

There are a couple of fringe cases that NVIDIA has needed to build for, including frame times above 33 ms (frame rates under 30 FPS), where the image on the panel might visibly darken or decay if it isn’t refreshed automatically, even when a new frame isn’t ready. Also, some games have issues with G-Sync (Diablo III, for example, doesn’t have a true full screen mode), and the feature has to be disabled for them, either through a profile or manually, to avoid artifacts.
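NVIDIA hasn't published the module's exact behavior in that low-frame-rate case, but a reasonable mental model (and it is only a model on my part) is that the previous frame gets re-scanned whenever nothing new has arrived within the roughly 33 ms window:

```python
# A guess at the low-frame-rate edge case, for illustration only: if no new
# frame arrives within ~33 ms, re-scan the previous frame so the panel's
# image doesn't visibly decay, then draw the real frame when it shows up.

MAX_HOLD_MS = 1000 / 30  # ~33.3 ms, a 30 Hz minimum refresh

def panel_activity(render_ms):
    """List what the panel does while waiting on a single slow frame."""
    scans = []
    waited = 0.0
    while render_ms - waited > MAX_HOLD_MS:
        scans.append("re-scan previous frame")
        waited += MAX_HOLD_MS
    scans.append("scan new frame")
    return scans

print(panel_activity(25))  # ['scan new frame']
print(panel_activity(50))  # ['re-scan previous frame', 'scan new frame']
```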

Ultra Low Motion Blur Technology

Another feature present on the ASUS PG278Q monitor is ULMB, or Ultra Low Motion Blur. Originally built as part of the NVIDIA 3D Vision infrastructure, ULMB is a technology used to decrease motion blur on the screen and remove or reduce ghosting of fast moving images. It does this by turning on the backlight in time with the screen refresh and then quickly darkening it again after the pixels have been “strobed”. The effect is that, with ULMB enabled, images are sharper and appear to have less motion blur from frame to frame.

This sounds great! But the side effect is a much lower total brightness perceived by the gamer. Just as we saw with 3D Vision throughout its development, enabling this mode effectively drops the light output of the screen by half. For some gamers and in some situations, this trade-off will be worth it. Particular genres, like RTS games with lots of small text and units scrolling across the scene very quickly, can see dramatic sharpness increases.

It's difficult to capture with stills, but with ULMB animations are darker, yet sharper

It’s important to note that ULMB can only be used when G-Sync is not enabled, and it only works at 85 Hz, 100 Hz, and 120 Hz. Most games, at least in my experience thus far, will see much more benefit from the variable refresh rate technology of G-Sync than from ULMB. If brightness is a concern (like playing in a well-lit room) then ULMB could be a non-starter, as the halved light output will be very noticeable.

Enabling ULMB is as easy as navigating the monitor's menu and selecting it, and you’ll also be able to adjust the strobe pulse width there. I tested the capability through the fantastic website testufo.com, which offers users a host of options to test the motion blur of their displays. It was easy to find instances in which the ULMB feature allowed for sharper animations, but the brightness difference was also very apparent.
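The brightness hit follows almost directly from the strobing itself. As a back-of-the-envelope illustration (the brightness and pulse width figures here are assumptions for the sake of the example, not measurements of the PG278Q), perceived brightness scales with the fraction of each refresh the backlight is actually lit:

```python
# Back-of-the-envelope only; the 300-nit and 4 ms figures are assumptions
# for illustration, not measurements of the PG278Q.

def strobed_brightness(full_nits, pulse_width_ms, refresh_hz):
    """Average light output scales with the backlight's duty cycle."""
    refresh_interval_ms = 1000 / refresh_hz
    duty_cycle = min(pulse_width_ms / refresh_interval_ms, 1.0)
    return full_nits * duty_cycle

# A hypothetical 300-nit panel strobed for 4 ms of each 120 Hz refresh:
print(strobed_brightness(300, 4.0, 120))  # ~144 nits, roughly half
```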
