
ASUS MG278Q 2560x1440 144Hz FreeSync Monitor Review

Author: Ryan Shrout
Subject: Displays
Manufacturer: ASUS

Calibration and Viewing Angles

Most monitor vendors today advertise the color accuracy of their displays to some degree in marketing material, and the MG278Q is no different. On the landing page of the ASUS website, the company calls out "75% color saturation" along with viewing angles and contrast ratios. But more often than not we find that the out-of-box experience with monitors can be quite poor, favoring heavy blues and oversaturated colors rather than actual color accuracy.


This screenshot above shows the output report from dispcalGUI on an uncalibrated MG278Q. The only change we made to the monitor before running the calibration was switching the color preset from the default to sRGB. In theory, this should get the monitor pretty close to what ASUS considers the calibrated state of the panel. But a look at the uncalibrated report shows that this monitor was FAR from producing accurate color! The grays were off significantly, as were the greens and blues.


After running through our hours-long calibration process, the monitor was able to move into a very accurate color space. Only a couple of hues remained outside of the acceptable range (light purple and dark purple, for example), but you would be hard-pressed to find another TN panel running at 120Hz+ that exhibits such impressive calibration results.
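Calibration reports like these boil accuracy down to delta-E values: the distance between a measured color and its target. As a rough illustration of what "a hue outside the acceptable range" means numerically, here is the classic CIE76 calculation in Python (our own sketch; dispcalGUI's reports use newer delta-E formulas, but the idea is the same):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space.

    A delta-E below roughly 1.0 is generally considered imperceptible;
    calibration reports flag patches whose delta-E exceeds a tolerance.
    """
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Example: a gray patch measured slightly green and yellow of its target
target = (50.0, 0.0, 0.0)
measured = (50.0, -3.0, 2.0)
print(round(delta_e_76(target, measured), 2))  # 3.61 -- a visible error
```

Uncalibrated panels commonly show delta-E values well above 3 on grays and saturated hues, which is what the "FAR from accurate" report above reflects.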


This diagram shows the result in the typical xy chromaticity fashion: the color triangle represents the before-calibration result while the dashed line represents (close to) the final result.

Calibration Profile Download

The Windows color profile management interface is a bit of a mess, with the need to select and enable a profile in multiple layers of the interface. The best guide for loading and enabling a profile can be found over at TFTCentral. We used the following tools to generate our own calibration profile:

Our calibration profile was created using the lowest calibration speed in a dimly lit room. Here are the required settings if you wish to use our profile:

  • FPS mode
  • Brightness: 25
  • Red: 89
  • Green: 97
  • Blue: 100
  • Profile download: (HERE)

The above profile was created specifically for a color temperature target of 6500K at a luminance of 120 cd/m² (nits), with a gamma of 2.2. Remember that the only way to get a correct calibration on your specific panel is by using a colorimeter on that very panel. The above settings and profile will only get *your* display to a perfect calibration if it has the exact same properties as our test sample. A perfect match is unlikely, but this should get you far closer to calibrated than just running with defaults.
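To make those targets concrete, here is a small Python sketch (our own illustration, not ASUS or dispcalGUI code) of the tone curve the numbers describe: a panel calibrated to gamma 2.2 with a 120 cd/m² white point should emit roughly the following luminance at each normalized input level:

```python
# Calibration targets from the profile above
TARGET_LUMINANCE = 120.0  # cd/m2 (nits) at full white
GAMMA = 2.2

def target_nits(signal: float) -> float:
    """Luminance the calibrated panel should emit for a given
    input level, where signal is normalized to the range 0..1."""
    return TARGET_LUMINANCE * (signal ** GAMMA)

print(target_nits(1.0))         # 120.0 -- full white
print(round(target_nits(0.5)))  # 26 -- mid-gray sits well below half luminance
```

The steep fall-off at mid-gray is exactly why an uncalibrated panel with the wrong gamma looks washed out or crushed even when its white point is correct.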

Viewing Angles

The ASUS MG278Q is a TN panel, a good quality one, but it still exhibits the properties we see with nearly all TN panels when it comes to viewing angles. Straight on, the screen looks great, but as we rotate or move around the screen, we see some pretty dramatic color shifting. Notice that from the bottom view (looking up at the display), we get nearly 100% inversion.


Center


From Bottom


From Top


From Left

You can see that even from the middle of the screen you get some moderate color shift with the MG278Q as you look at the top or bottom of the 27-inch panel. Gamers will likely not notice a big change in games, though designers and anyone doing color-critical work (Photoshop, etc.) will have issues getting their eyes in the correct place to see the image as it was intended.


December 30, 2015 | 12:58 PM - Posted by Anonymous (not verified)

I know Nvidia has 70+% market share in the discrete GPU market. But they will have to come up with something really great if they want to keep a $200 premium on their monitors. Or (and this would be ideal) cut the G-Sync crap and let's all work on Adaptive-Sync implementations (will never happen, I know).

December 30, 2015 | 02:38 PM - Posted by Anonymous (not verified)

They have 80%+, and the more they sh1t on their customers, the more people buy their products. So don't expect them to cut the crap. Expect them to get worse.

December 30, 2015 | 06:31 PM - Posted by Anonymous (not verified)

They sell partly on their perception as a high-end brand, so even if AMD offers a much better value in the buyer's price range, many of them will still buy Nvidia. They can sell G-Sync based on the same type of marketing even though FreeSync is almost exactly equivalent now.

January 3, 2016 | 08:10 PM - Posted by Anonymous (not verified)

NVidia doesn't want a premium on their monitors. They don't sell monitors, only the GSync module which is currently expensive.

NVidia wants to see monitors sold to drive GPU adoption so ideally they would want GSync monitors to be cheaper than Freesync.

Since the GSync module replaces the "guts" (scaler, etc.) of the monitor, they can likely do things that Freesync cannot.

At some point monitor manufacturers will have to add newer features like light strobing while asynchronous mode is active. If GSync ends up supporting that natively but Freesync requires the manufacturers themselves to implement the feature, we could see Freesync costing more than GSync, or simply not offering certain features.

January 5, 2016 | 10:40 PM - Posted by Anonymous (not verified)

The GSync parts are not worth shit; they charge royalty fees to manufacturers to use them.

December 30, 2015 | 04:02 PM - Posted by Anonymous (not verified)

Another ASUS monitor with aggressive matte coating.
Have the balls to release a glossy panel.

December 30, 2015 | 04:09 PM - Posted by Ryan Shrout

Do you really want that? If so, why?

December 30, 2015 | 08:38 PM - Posted by Anonymous (not verified)

I'm not the person you're quoting, but I prefer glossy because it looks better in a good (dark) environment and doesn't have the blurring that many aggressive coatings have. Matte may look slightly less atrocious under direct light, but every display looks so terrible in those conditions that anyone who cares in the slightest about image quality will make sure to avoid direct light anyway.

January 1, 2016 | 11:55 AM - Posted by Anonymous (not verified)

That was my comment.

Colors look ten times better without it.
I can tolerate the lighter IPS matte that Acer/ASUS use on their IPS G-Sync panels, but the film on these TN panels is too heavy. TN colors look bad enough; they don't need aggressive matte on top of that.

Also, there's not a single glossy display for gaming; it would be nice not to be all one-sided.
We have control over the lights in the room; we have no control when they slap that ugly coating on the panel.
Even tablets/phones and TVs don't use it anymore, for obvious reasons.

January 1, 2016 | 12:27 PM - Posted by Anonymous (not verified)

edit
*Also not a single glossy display*
Not a single glossy gsync/freesync display

January 3, 2016 | 12:32 AM - Posted by Anonymous (not verified)

I was under the impression that most TVs have anti-reflective coatings, at a minimum.

January 3, 2016 | 08:21 PM - Posted by Anonymous (not verified)

There are glossy monitors with anti-reflective coatings. The terminology can get confusing though.

December 30, 2015 | 04:10 PM - Posted by Anonymous (not verified)

"thanks to the latest AMD software update it can now support frame doubling to allow for smooth, variable refresh rates under 42 FPS as well"

This is great news for Adaptive Sync. Now we just need to get NVIDIA to support Adaptive Sync monitors too. Adaptive Sync monitors outnumber G-Sync about 3:1 in terms of number of models available worldwide. What did people think would happen with a free vs. expensive implementation?

G-Sync, it's been nice knowing you.

December 30, 2015 | 04:12 PM - Posted by Anonymous (not verified)

^ and I mean that sincerely as an ASUS VG24QE owner.

January 3, 2016 | 08:29 PM - Posted by Anonymous (not verified)

AMD's frame doubling still requires a 2.5X ratio minimum (i.e. 30Hz to 75Hz, but NOT 30Hz to 60Hz). This is due to the frame creation being driver (software) driven.

People keep saying NVidia's GSync will disappear, but keep in mind that the cost of the GSync module will continue to drop, and the fact that they are working on newer versions that will do things that Freesync simply can't (like light strobing whilst asynchronous mode is active).

Sure, AMD Freesync monitors can have features added like I just mentioned, but that would be done by the monitor manufacturer, which will cost them money and thus be reflected in more expensive monitors.

Conversely, monitor manufacturers will be able to incorporate features that GSync already supports. So the cost of the GSync module will shrink but the features offered will increase.

So it's not quite as simple as "free vs expensive" if you understand the monitor market.

(We wouldn't even have Freesync if NVidia had not created GSync... and NVidia had no incentive to do that without making it proprietary so why hate on NVidia? It may not be ideal, but at least we have asynchronous monitors for smooth gaming and that's awesome.)

January 5, 2016 | 10:41 PM - Posted by Anonymous (not verified)

Hello, it's not the parts; it's a fee the manufacturers pay to Nvidia.

December 30, 2015 | 04:47 PM - Posted by rl (not verified)

Hey Ryan/PCPer

Why not test the range for every VRR monitor, not just FreeSync?

It would be interesting to know how low the PG279Q panel can be driven as well as the other x-Sync monitors to see whether the module gives an advantage or not. Afaik, you only did that for the original (TN) ROG Swift and then stopped. :(

December 30, 2015 | 06:19 PM - Posted by Anonymous (not verified)

If the frame rate multiplication works as it should, then there isn't much to report. I would hope that they actually verified that it is working correctly.

December 30, 2015 | 06:25 PM - Posted by JohnGR

It wouldn't matter. G-Sync, FreeSync, Adaptive Sync or whatever Sync: if you are at 10-15 fps, the motion will NOT be fluid.

December 30, 2015 | 08:57 PM - Posted by rl (not verified)

It may not matter to you. It surely matters to Ryan, since he tested it and chose to mention it in the review.

And I'm not speaking about 15 fps/Hz like you tried to insinuate but rather the range down to 30 fps/Hz, or even 23.976 fps/Hz.

It's simply a matter of curiosity. Does the G-Sync module allow the OEM/Nvidia to control the VRR range better or not?

Is the module needed to drive those VRR ranges beyond 144Hz to 165Hz and 200Hz, or not?

Does the panel behave differently when controlled by the G-Sync module versus a normal scaler with Adaptive Sync capabilities?

Are there differences between G-Sync and FreeSync monitors using the same or a similar panel, beyond software/hardware features from the IHVs?

December 31, 2015 | 03:23 AM - Posted by JohnGR

If you are getting at least 24fps from your card, then GSync, or the LFC mode of FreeSync, will double the frame rate, and you only need the monitor to support a minimum adaptive sync limit of 48Hz. Almost every monitor does that today. The only problem is with monitors whose upper limit is not at least double their lower limit, which is required to support LFC. With those monitors the best option is to lower the graphics settings and get a good frame rate.
So I wasn't trying to insinuate anything, only guessing that wanting to go as low as possible would only make sense if you were thinking of frame drops down to 10-15fps. In that case I don't think adaptive sync of any kind would manage to magically create fluid motion.

Now, comparing GSync and FreeSync when those two techs come with a $100-$200 difference in price was never, in my opinion, valid. The difference in pricing is huge. Even without LFC, FreeSync was a better option because those $200, in the case of the more expensive monitors, can buy you a better card. You can go from a 390X to a Fury X. Even those $100 in the case of cheaper monitors can move you from a 380 to an 8GB 390. So even if GSync gets more points, the fact is that it's very bad value for money and an option only if you have already paid for a 980Ti or a Titan.

In the end, every reader at PCPer could be curious about a number of things. Even a 50-page review would leave something out.

January 3, 2016 | 09:00 PM - Posted by Anonymous (not verified)

You don't have a clear understanding of how frame doubling via AMD's software implementation works. There's a good article you can find at PCPer relating to the Crimson drivers.

First, to work properly you need a range of at least 2.5X (i.e. 30Hz to 75Hz).

Secondly, the AMD driver is simply telling the monitor to repeat the same frame so that it stays in its supported asynchronous range. If we talk FPS (not frame times), then if the GPU outputs 16FPS normally, it's told to REPEAT each frame so we get 32FPS (32 refreshes) on the physical monitor.

So it's still effectively 16FPS in terms of new content on the screen, but it is a lot smoother than normal 16FPS since we are again staying in asynchronous mode.

(I don't know where you came up with the "48Hz" value either. If it was a 30Hz to 60Hz monitor for example then any FPS the GPU is generating within that range keeps you in asynchronous mode. Frame doubling is ONLY required when your GPU can't output at least 30FPS.)

I assume we need "2.5X" not 2X due to some latency in the driver software.
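The frame-repetition logic described in this thread can be sketched in a few lines of Python (a simplified illustration of the LFC idea, not AMD's or NVIDIA's actual driver code):

```python
def lfc_multiplier(fps: float, vrr_min: float, vrr_max: float) -> int:
    """Smallest frame-repeat factor that keeps the effective refresh
    rate inside the monitor's variable-refresh window.

    Real drivers add pacing heuristics; this only shows the arithmetic.
    """
    if fps >= vrr_min:
        return 1  # already inside the window, no repetition needed
    n = 2
    while fps * n < vrr_min:
        n += 1
    # The repeated rate must also fit under the window's ceiling.
    if fps * n > vrr_max:
        raise ValueError("frame rate cannot be mapped into the VRR window")
    return n

# 16 FPS on a 30-75Hz panel: each frame shown twice -> 32Hz refresh
print(lfc_multiplier(16, 30, 75))  # 2
```

Note how a narrow window breaks the scheme: 59 FPS on a hypothetical 60-75Hz panel would need doubling to 118Hz, above the ceiling, which is why a comfortably wide minimum-to-maximum ratio matters for frame doubling to engage cleanly.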

January 3, 2016 | 09:07 PM - Posted by Anonymous (not verified)

If it's not clear, it's a doubling on the computer side, such as 16x2=32FPS (output of the GPU). So AMD handles this in its driver software (which works pretty well), whereas NVidia has a lookaside buffer in the GSYNC module to handle this.

(It's actually not a big deal, but perhaps AMD can stop talking about NVidia's supposed latency due to the GSync module, since AMD's solution requires a driver fix which is obviously not instantaneous. In fact, that's why the range of the monitor needs to be "2.5X", such as 30Hz to 75Hz, which is a big problem for monitors like 4K displays with a 30Hz to 60Hz asynchronous range.)

January 3, 2016 | 09:18 PM - Posted by Anonymous (not verified)

Okay, I see where you got "48Hz" from. Maybe you understand clearly, or maybe not but note the "2.5X" range for proper support.

Again though, as per my other comment note that many games drop to very low FPS values for short periods of time so staying in asynchronous mode is especially important here.

It's not about "magically" creating fluid motion but rather getting the smoothest motion possible. If we normally got about 15FPS but were in VSYNC mode (to avoid tearing), the screen stuttering would likely be horrible. Low FPS + stutter + buffer latency is arguably the worst-case scenario for gaming.

On the other hand 15FPS in asynchronous mode is much, much better.

(I think you might be surprised how often the average game drops below 30FPS. In some cases you need to crank up the average FPS by at least 4X to avoid these drops completely which then obviously has a big visual trade-off. Very few people seem to understand how much these drops affect perceived smoothness.)

January 3, 2016 | 08:34 PM - Posted by Anonymous (not verified)

Sudden DROPS to a low FPS range is where asynchronous support makes the biggest difference.

You wouldn't want to game around 15FPS all the time, but it's still very common to have sudden drops so anything that improves that is important.

December 30, 2015 | 05:08 PM - Posted by jimecherry

Even if the ratio of FreeSync to GSync goes up, which I suspect it will just on ease of implementation, what value does NVidia get from supporting FreeSync? By supporting their own VRR standard they get licensing revenue and are able to co-market their video cards with monitors. The only way I see NVidia getting on board is if FreeSync out-evolves GSync, thus making GSync a marketing liability.

December 30, 2015 | 05:30 PM - Posted by Anonymous (not verified)

G-Sync is like PhysX: Nvidia will ride this all the way to obscurity. They will not drop it; I think the market will mostly shift away from it due to the ease of implementation of Adaptive Sync/FreeSync.

December 30, 2015 | 06:04 PM - Posted by Anonymous (not verified)

Nvidia seems to have put research into how to implement variable refresh without changing the standard but the proper way to do it was to just change the standard. The only advantage right now seems to be slightly better overdrive, but it is unclear how much of this is better overdrive calculations and how much is choice of panels. I don't think the slightly increased ghosting on free sync panels is going to be noticeable under normal use so the price premium on a g-sync panel is not worth it. For the Nvidia fans, I guess they just have to pay the price premium. Nvidia will probably continue to deny free sync support for a while and then quietly as possible start supporting free sync; maybe they will try to come up with some new feature on top of free sync to differentiate.

December 31, 2015 | 12:45 PM - Posted by rl (not verified)

Variable Refresh was already a part of VESA Embedded Displayport 1.3 (eDP) in form of the Panel Self Refresh (PSR) spec. Nvidia is a member of VESA. They could have easily suggested VRR to be implemented in the DisplayPort Spec. AMD did just that. They took the PSR spec and suggested it to be enhanced and implemented as Adaptive Sync. Now it's part of both Specs and Nvidia actually benefits from it by using it to enable Mobile G-Sync on Laptops through the eDP Standard.

Without eDP and PSR there would be no Mobile G-Sync as we know it. And OEMs have no desire to put a huge FPGA inside their ever so smaller Laptops.

December 30, 2015 | 06:13 PM - Posted by aquinoe (not verified)

Here a happy owner of a MG278Q for two months now. I have it connected to a R9 390.

AMD has implemented frame rate target control (FRTC) that prevents the GPU from exceeding certain fps of your choice. So, if you wish, you can play on moderate fps (reducing noise and consumption) by activating FRTC.
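The capping behavior described above can be sketched as a simple frame-budget loop in Python (our own toy illustration of the idea behind FRTC, not AMD's implementation):

```python
import time

def run_capped(render_frame, target_fps: float, frames: int) -> None:
    """Toy frame loop with a frame-rate target: sleep off any time
    left in the frame budget so the GPU never renders faster than
    the cap (saving power, heat, and fan noise)."""
    budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)

# Cap a trivial "render" at 60 FPS for a handful of frames
run_capped(lambda: None, target_fps=60.0, frames=5)
```

On a FreeSync monitor the capped rate simply becomes the refresh rate, as long as it stays inside the panel's variable-refresh window.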

On the other hand, variable refresh rate comes in handy when you prefer to "unchain" the R9 390. This card, by the way, is a true gem for 2560x1440, especially when you combine it with a FreeSync monitor.

On other aspects: good enough colors, excellent resolution (much better than FHD, without running into the high-DPI problems I've suffered on 2160p screens) and better blacks than I expected.

Viewing angles on this PC monitor are really not a concern (though I wouldn't tolerate them at all in a TV).

The major disadvantage: it isn't cheap.

December 31, 2015 | 02:52 AM - Posted by Master Chen (not verified)

>End of 2015
>TN
INTO THE TRASH!

December 31, 2015 | 03:59 AM - Posted by Anonymous (not verified)

Who are you quoting?

December 31, 2015 | 11:37 AM - Posted by Master Chen (not verified)

My thoughts and feels.

January 3, 2016 | 08:45 PM - Posted by Anonymous (not verified)

TN panels have continued to improve and top-end TN panels have superior response times to minimize ghosting which is important to some people.

Ignoring price, IPS (or similar) has to be superior in all aspects to be a complete drop-in replacement for TN.

Back to price: not everybody can afford IPS, especially among the cheaper monitors. Technology like this is always phased out slowly.

December 31, 2015 | 12:45 PM - Posted by Bezzell

I cannot in good conscience pay 200 bucks for variable refresh rate from team green. Seems as though AMD is on the right track. If AMD's next GPU cycle is impressive, I know which direction I'll be leaning.

January 2, 2016 | 08:39 AM - Posted by Mac (not verified)

Nice review. I was hoping you'd pull out the old oscilloscope for below-VRR-window testing, because the last video with it was quite informative and fascinating.

January 3, 2016 | 12:45 AM - Posted by Anonymous (not verified)

I would like to see that also. It would be interesting to see how both implementations handle the transition from VRR within the display's window to crossing the boundary into frame multiplication. Although just using it for a while and seeing if they notice anything is probably the most interesting test. If they played at the boundary, they would have noticed if crossing it was causing any artifacts like stuttering, judder, or tearing. Tearing really should not occur, though.

January 5, 2016 | 10:43 PM - Posted by Anonymous (not verified)

It's like everything Nvidia does: they create a closed standard on top of an open-standard system, then charge through the roof for it.

January 6, 2016 | 12:47 PM - Posted by 8secondz28 (not verified)

I purchased this exact monitor, the ASUS MG278Q, and I can't get it to display 144Hz; 120Hz is the highest it will let me go. I have used the supplied DisplayPort cable and a DVI cable and still can't get it higher; even in Battlefield 4 my highest option is 2560x1440 @ 120Hz. So please help. I have the ASRock Fatal1ty Z97 gaming motherboard, an i7 4790K, 16GB DDR3, and two GTX 660 SC cards in SLI.

January 9, 2016 | 10:19 AM - Posted by Anonymous (not verified)

GTX 6xx GPUs don't support that range over DisplayPort; you are locked to 120Hz. I'm not sure, but you should be able to get 144Hz via the DVI-D port.
