ASUS ROG Swift PG278Q 27-in Monitor Review - NVIDIA G-Sync at 2560x1440

Manufacturer: ASUS

The Waiting Game

NVIDIA G-Sync was announced at a media event held in Montreal way back in October of 2013, and promised to revolutionize the way the display and graphics card work together to present images on the screen. It was designed to remove hitching, stutter, and tearing almost completely. Since that fateful day, we have been waiting. Patiently waiting. We were waiting for NVIDIA and its partners to actually release a monitor that utilizes the technology and that can, you know, be purchased.

In December of 2013 we took a look at the ASUS VG248QE monitor, the display for which NVIDIA released a mod kit that allowed users who already owned it to upgrade to G-Sync compatibility. It worked, and I came away impressed. I noted in my conclusion that "there isn't a single doubt that I want a G-Sync monitor on my desk" and that "my short time with the NVIDIA G-Sync prototype display has been truly impressive." That was nearly seven months ago, and I don't think anyone at the time really believed it would be THIS LONG before the real monitors began to show up in the hands of gamers around the world.


Since NVIDIA’s October announcement, AMD has been on a marketing path with a technology they call “FreeSync” that claims to be a cheaper, standards-based alternative to NVIDIA G-Sync. They first previewed the idea of FreeSync on a notebook device during CES in January and then showed off a prototype monitor in June during Computex. Even more recently, AMD has posted a public FAQ that gives more details on the FreeSync technology and how it differs from NVIDIA’s creation; it has raised something of a stir with its claims on performance and cost advantages.

That doesn't change the product we are reviewing today, of course. The ASUS ROG Swift PG278Q, a 27-in WQHD display with a 144 Hz refresh rate, is truly an awesome monitor. What has changed is the landscape between NVIDIA's original announcement and now.

Continue reading our review of the ASUS ROG Swift PG278Q 2560x1440 G-Sync Monitor!

What is G-Sync? A Quick Refresher

Last year, I spent a lot of time learning about the technology behind NVIDIA G-Sync and even spoke with several game developers in the build-up to the announcement about its potential impact on PC gaming. I wrote an article that looked at the historical background of refresh rates and how they were tied to archaic standards that are no longer needed in the world of LCDs, entitled NVIDIA G-Sync: Death of the Refresh Rate. We also have a very in-depth interview with NVIDIA's Tom Petersen that goes through the technology in an easy-to-understand, step-by-step manner, which I would encourage readers to watch for background on the game-changing feature in this display.


The idea of G-Sync is pretty easy to understand, though the implementation can get a bit hairy. G-Sync introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating to the PC when each refresh occurs, in a properly configured G-Sync setup the graphics card tells the monitor when to refresh. This allows the monitor to match its refresh rate to the render rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.

Gamers today are likely to be very familiar with V-Sync, short for vertical sync, an option in your graphics card's control panel and in many in-game options menus. When enabled, it forces the monitor to draw a new image on the screen at a fixed interval. In theory this works well, and the image is presented to the gamer without artifacts. The problem is that games rendered in real time rarely hold a specific frame rate. With only a couple of exceptions, a game's frame rate will fluctuate based on the activity happening on the screen: a rush of enemies, a changed camera angle, an explosion or falling building. Instantaneous frame rates can vary drastically, from 30, to 60, to 90 FPS, yet V-Sync forces each frame to be displayed only on the monitor's fixed refresh boundaries, and that mismatch causes problems.


If a frame takes longer than the monitor's refresh interval to draw (if the frame rate drops low), you will see a stutter, or hitch, in the gameplay, caused by the display having to re-draw the same frame for a second consecutive interval. Any movement being tracked on the screen suddenly appears to stop, then quickly "jumps" to the next location faster than your mind expects. V-Sync also inherently introduces input latency, which gets worse when these lags and stutters take place.
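The math behind that stutter is straightforward. Here is a minimal Python sketch (purely illustrative, not anything NVIDIA or ASUS ships) that quantizes hypothetical frame completion times to a 60 Hz monitor's fixed refresh boundaries. A steady 25 ms render time, a perfectly smooth 40 FPS from the GPU's perspective, comes out of the display as an uneven 16.7/33.3 ms cadence:

```python
import math

REFRESH_MS = 1000 / 60  # a 60 Hz monitor refreshes every ~16.7 ms

def vsync_present_times(render_times_ms):
    """Return the times (ms) at which each frame actually appears on screen."""
    presented = []
    ready = 0.0
    for rt in render_times_ms:
        ready += rt
        # V-Sync: the finished frame must wait for the next fixed refresh boundary
        slot = math.ceil(ready / REFRESH_MS) * REFRESH_MS
        presented.append(slot)
    return presented

# A GPU steadily delivering a frame every 25 ms (40 FPS)...
times = vsync_present_times([25.0] * 4)
# ...is shown at alternating 16.7 ms and 33.3 ms intervals: visible judder.
intervals = [round(b - a, 1) for a, b in zip(times, times[1:])]
```

The alternating intervals are exactly the "stop, then jump" effect described above: the display holds one frame for two refreshes, then rushes the next one out.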


The common alternative for gamers worried about latency and stutter is to disable V-Sync in the control panel or in the game. This solves the stutter and latency issues (kind of) but causes another, much more noticeable issue called "tearing". With V-Sync disabled, the graphics card sends a new frame to the monitor at any point in the monitor's refresh cycle, even if the LCD is currently drawing. The result is that, at some point down the screen, the user sees the previous frame above and the current frame below, split at a sharp line. You will literally see an image where the geometry no longer lines up which, depending on the game and scene, can be incredibly distracting.


Monitors with refresh rates higher than 60 Hz reduce the impact of tearing: with more frequent screen refreshes, each torn frame remains on screen for less time and tears are less noticeable, but tearing is impossible to remove completely.


NVIDIA G-Sync switches things up by having the monitor refresh its screen only when a new frame is ready from the GPU. As soon as the next frame is drawn it can be passed to the display and drawn on the screen without tearing. If the next frame is ready in 16ms, it can be sent immediately. If it takes 25ms or only 10ms, it doesn't matter; the monitor waits for information from the GPU before drawing the new frame. The result is incredibly smooth and fluid animation that doesn't stutter and doesn't tear.
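The G-Sync model is even simpler to sketch. In this hedged illustration (a hypothetical 144 Hz panel, made-up render times; not NVIDIA's actual module logic), the present time is simply the frame's completion time, limited only by the panel's maximum refresh rate, so the on-screen intervals track the GPU directly:

```python
def gsync_present_times(render_times_ms, min_interval_ms=1000 / 144):
    """Variable refresh: the panel redraws as soon as a frame is ready,
    limited only by its maximum refresh rate (144 Hz here)."""
    presented, ready, last = [], 0.0, float("-inf")
    for rt in render_times_ms:
        ready += rt
        # The only wait is the panel's minimum gap between refreshes (~6.9 ms)
        t = max(ready, last + min_interval_ms)
        presented.append(t)
        last = t
    return presented

times = gsync_present_times([25.0, 10.0, 16.0, 25.0])
intervals = [round(b - a, 1) for a, b in zip(times, times[1:])]
# intervals track render times exactly: [10.0, 16.0, 25.0]
```

No quantization to fixed boundaries means no duplicated refreshes (stutter) and no mid-refresh frame swaps (tearing), which is the whole point of the technology.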


There are a couple of fringe cases that NVIDIA has needed to account for, including frame times above 33ms (under 30 FPS), where the image on the panel might visibly darken or decay if it isn't refreshed, so the display redraws the frame automatically even if a new one isn't ready. Also, some games have issues with G-Sync (Diablo III, for example, doesn't have a true full screen mode), and for them G-Sync has to be disabled, either through a profile or manually, to avoid artifacts.

Ultra Low Motion Blur Technology

Another feature present on the ASUS PG278Q is ULMB, or Ultra Low Motion Blur. Originally built as part of the NVIDIA 3D Vision infrastructure, ULMB is a technology used to decrease motion blur on the screen and remove or reduce ghosting of fast-moving images. It does this by turning on the backlight in time with the screen refresh and then quickly darkening it after the pixels have been "strobed". The effect is that, with ULMB enabled, images are sharper and appear to have less motion blur from frame to frame.

This sounds great! But the side effect is a much lower total brightness perceived by the gamer. Just as we saw with 3D Vision throughout its development, enabling this mode effectively drops the light output of the screen by half. For some gamers and in some situations, this trade-off will be worth it. Certain games, like RTS titles that include lots of small text and units scrolling across the scene very quickly, can see dramatic sharpness increases.


It's difficult to capture with stills, but with ULMB animations are darker yet sharper

It's important to note that ULMB can only be used when G-Sync is not enabled, and it only works at 85 Hz, 100 Hz, and 120 Hz refresh rates. Most games, at least in my experience thus far, will see much more benefit from the variable refresh rate technology of G-Sync than from ULMB. If brightness is a concern (like playing in a well-lit room), ULMB could be a non-starter, as the halved light output is very noticeable.
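The brightness hit is simple duty-cycle arithmetic. In this sketch the 350-nit full-on output and the pulse widths are hypothetical numbers for illustration, not ASUS specifications: when the backlight is strobed once per refresh, average light output scales with the fraction of each refresh period the pulse stays lit.

```python
def perceived_brightness(pulse_width_ms, refresh_hz, full_on_nits=350.0):
    """Average light output of a backlight strobed once per refresh period."""
    period_ms = 1000.0 / refresh_hz
    duty_cycle = pulse_width_ms / period_ms  # fraction of time the backlight is on
    return full_on_nits * duty_cycle

# At 120 Hz the refresh period is ~8.33 ms; a pulse half that long
# halves the perceived brightness, matching the "half the light output" effect.
half_period_pulse = 1000.0 / 240  # ~4.17 ms
print(perceived_brightness(half_period_pulse, 120))  # 175.0
```

This is also why the adjustable strobe pulse width mentioned below matters: a shorter pulse means sharper motion but an even dimmer image.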


Enabling ULMB is as easy as navigating the monitor menu and selecting it, and you'll be able to adjust the strobe pulse width. I tested the capability through a fantastic website that offers users a host of options to test the motion blur of their displays. It was easy to find instances in which the ULMB feature allowed for sharper animations, but the brightness variance was also very apparent.

August 11, 2014 | 09:32 AM - Posted by arbiter

Asus VG248QE + g-sync kit is ~480$ if you do it yourself. A site by name of overlordcomputer is selling it pre installed for 500$. For people that are not feeling up to doing it themself.

August 11, 2014 | 10:22 AM - Posted by H1tman_Actua1

I bought my Asus VG248QE + g-sync from digital storm for 499 installed.


August 12, 2014 | 11:58 AM - Posted by Anonymous (not verified)

It's also worth mentioning that that is a 24" 1080p display.

October 13, 2014 | 09:24 AM - Posted by Anonymous (not verified)

Asus VG248QE is a 24' 1080 display, and you are suggesting people spend $500 on it?

August 11, 2014 | 09:38 AM - Posted by Edkiefer (not verified)

question on resolution , how does desktop look at say 1080 if you didn't want to scale windows settings to 125% because some apps not like that .
I see many times comments anything but native gives blurry text .
I have old 2007FP Dell 1600x1200 and run a custom res and don't really notice any degrade .

August 11, 2014 | 10:08 AM - Posted by Anonymous (not verified)

I'll just quote Allyn Malventano

Now for people saying that it 'feels smoother', with no actual frame capture to back it up, makes it a bit of a bogus claim, IMO.

August 11, 2014 | 10:36 AM - Posted by Ryan Shrout

It's interesting to follow you around...

August 11, 2014 | 04:38 PM - Posted by Allyn Malventano

Thanks for quoting me, but I was talking about people saying one card 'felt smoother' than another card, where frame pacing had been fixed on both cards. My point was, how could people 'feel' that particular difference when the actual pacing difference between both platforms is negligible. in this context, pretty much anyone you sit down in front of a GSYNC display is definitely going to see the difference. It pretty much smacks you in the face, especially at low FPS.

August 12, 2014 | 11:40 AM - Posted by Anonymous (not verified)

Thanks you for bringing the truth, and clearing up the usual misunderstanding. I agree GSync is hard to miss.

August 11, 2014 | 10:12 AM - Posted by Anonymous (not verified)

Hate the "gamer" look but love the specs.

August 11, 2014 | 10:22 AM - Posted by H1tman_Actua1

BAM! suck it "free"sync!

August 11, 2014 | 11:00 AM - Posted by Anonymous (not verified)

Why didn't this get an "Editor's Choice" award instead of a gold award? What's the difference between those two awards anyway? lol.

August 11, 2014 | 11:49 AM - Posted by Pholostan

My guess would be the price. Very expensive for a TN panel, albeit a nice one.

I checked my go-to stores here, they have it in stock or on order. But they ask more than 1000 USD (7500 SEK including taxes). I think I'll keep my cheap Korean 1440p Samsung PLS that does 100Hz.

August 11, 2014 | 02:35 PM - Posted by Ryan Shrout

Editor's Choice would be higher than Gold, yes. And you had it right - the exceptionally high price is what led me to not dive 100% in with this product. And also we have a 4K G-Sync option coming from Acer and a 1080p AOC model coming soon as well.

August 11, 2014 | 04:56 PM - Posted by Anonymous (not verified)

please don't tell me that 4k gsync monitor will be a 27-28" TN. :-(

When will we see a 32" or larger gsync non-TN 4k monitor?

August 12, 2014 | 04:55 PM - Posted by Klimax (not verified)

Heh, and I would love 24". (25"+ wouldn't fit space)

August 15, 2014 | 06:44 AM - Posted by JCCIII

Hey Ryan, longtime… I don’t know if I’d be so generous with even Gold. Would you please explain the technical reasons why the G-Sync module appears to have limited manufacturers to one DisplayPort, not even bypassing or rerouting for the monitor to switch to other sources?

Also, of the several monitors I have found being released shortly, none is above the color challenged, off-axis fading, TN; are there legitimate technical reasons for this limitation? Are there no high-quality panels capable of G-Sync’s or FreeSync’s requirements; relatively speaking, anything TN is not high-quality. I would really appreciate knowing and cannot find a legitimate answer, thank you for your time!

As I said before, your work is outstanding. Blessings…

Sincerely, Joseph C. Carbone III 14 August 2014

August 15, 2014 | 09:18 PM - Posted by JCCIII

Hello... :(

August 11, 2014 | 11:01 AM - Posted by remon (not verified)

It's a pitty, in eyefinity, the best cards to run those 3 monitors don't support the one feature that makes this monitor special.

August 11, 2014 | 11:08 AM - Posted by Anonymous (not verified)

does Ultra Low Motion Blur work with Intel/AMD graphics card?

August 11, 2014 | 04:49 PM - Posted by Wesley (not verified)

ULMB still has workarounds for the VG248QE that do work, but ULMB on something like the ROG Swift will be very difficult to crack because it's a custom scaler designed by Nvidia to drive the monitor. Even Nvidia has been paying attention to how people are using ULMB, disabling it for GSync.

If you'd like any indication of how much Nvidia hates supporting ULMB, they appear to have gotten ASUS to disable it entirely for any Radeon GPUs, even though the option is in the monitor's OSD menu and not part of GSync at all.

August 11, 2014 | 06:14 PM - Posted by Anonymous (not verified)

this really sucks, they are already making money with the gsync module thing inside the monitor, and it shouldn't require any work from forceware or geforce;

personally 120-144Hz and ULMB sounds a lot more exciting than Gsync.

I play games with no vsync and don't care much about tearing, but LCD motion blur is horrible

also being a vendor specific feature just kills it for me;

August 12, 2014 | 12:19 PM - Posted by H1tman_Actua1

You're a total fucken retard.... please find out how variable refresh rate works...kook

August 11, 2014 | 11:09 AM - Posted by Anonymous (not verified)

also, can you disable the annoying red light thing?

August 11, 2014 | 11:44 AM - Posted by Anonymous (not verified)

yes, you can. 4th post in the thread.

August 11, 2014 | 11:38 AM - Posted by Anonymous (not verified)

G-Sync out of range bug

August 11, 2014 | 11:56 AM - Posted by H1tman_Actua1

beta driver 340.43 is broken with Gsync.

August 11, 2014 | 12:18 PM - Posted by Anonymous (not verified)

So are the 340.52 WHQLs.

Beta or WHQL the bug is there. Not very comforting given how long G-Sync has been in development to still have a bug like this persist.

August 11, 2014 | 02:42 PM - Posted by Ryan Shrout

Odd, I used BOTH of those drivers in my testing.

August 11, 2014 | 03:52 PM - Posted by Anonymous (not verified)

You also said the 30fps dip is gone. Linus and others noted its still there in their reviews.
