ASUS MG278Q 2560x1440 144Hz FreeSync Monitor Review

Subject: Displays
Manufacturer: ASUS

Another TN Option for FreeSync Fans

If you had asked me a year ago how many monitors we would be able to store in the PC Perspective offices, I would have vastly underestimated the true answer. Not only is reader demand for information about the latest and greatest display technology higher than we have ever seen, but the vendors selling high quality monitors for enthusiasts and gamers are pumping out more models than I can keep track of.

But this is good, right? The more options we have, the more likely we are to find the best choice for each user, for each budget and for each required feature set. But more choices can also lead to confusion - that's where we continue to chime in. Today we are taking a look at the ASUS MG278Q monitor, a 27-inch 2560x1440 display with support for AMD FreeSync technology and sporting a maximum refresh rate of 144Hz. With a TN panel rather than IPS, the MG278Q has a current selling price of just $399, well under the price of equivalent G-Sync monitors.


Even better, since we started our evaluation of the display, AMD released the Radeon Crimson driver, introducing a new feature called Low Frame Rate Compensation. This essentially allows most of the FreeSync displays on the market to match NVIDIA G-Sync's ability to handle lower frame rates without falling back to V-Sync or exhibiting tearing. If you haven't read about it, do so in the link above.

Continue reading our review of the ASUS MG278Q 2560x1440 144Hz FreeSync monitor!

Let's take a look at the raw specs from the ASUS website.

  • Display
    Panel Size : Wide Screen 27.0"(68.47cm) 16:9
    Color Saturation : 75%(NTSC) 
    Panel Type : TN
    True Resolution : 2560x1440 *1
    Display Surface Non-glare
    Pixel Pitch : 0.233mm
    Brightness(Max) : 350 cd/㎡
    Contrast Ratio (Max) : 1000:1
    ASUS Smart Contrast Ratio (ASCR) : 100000000:1 
    Viewing Angle (CR≧10) : 170°(H)/160°(V)
    Response Time : 1ms (Gray to Gray)
    Display Colors : 16.7M
    Flicker free
  • Video Feature
    Trace Free Technology : Yes
    Skin-Tone Selection : 3 Modes
    Color Temperature Selection : 4 Modes
    GamePlus(modes) : Yes (Crosshair/Timer/FPS Counter)
    Low Blue Light : Yes
    HDCP support : Yes
    VividPixel : Yes
    GameVisual : 6 Modes (Scenery/Racing/Cinema/RTS/RPG/FPS/sRGB Modes)
    FreeSync™ technology supported
  • Audio Features
    Stereo Speakers : 2W x 2 Stereo RMS
  • Convenient Hotkey
    Input Selection
    GamePlus
    5-way OSD Navigation Joystick
    GameVisual
  • I/O Ports
    Signal Input : HDMI x 2, DisplayPort 1.2, Dual-link DVI-D
    Earphone jack : 3.5mm Mini-Jack 
    USB Port(s) : 3.0x2
  • Signal Frequency
    Digital Signal Frequency : 51.2~221.97 kHz (H) / 35~144 Hz (V)
  • Power Consumption
    Power Consumption < 38.7W (based on ES 6.0) 
    Power Saving Mode <0.5W
    Power Off Mode <0.5W
  • Mechanical Design
    Chassis Colors : Black
    Tilt : +20°~-5°
    Swivel : Yes
    Pivot : Yes
    Height Adjustment : Yes
    VESA Wall Mounting : 100x100mm
    Super Narrow Bezel Design : Yes
    Quick Release Stand Design : Yes
  • Security
    Kensington lock
  • Dimensions
    Phys. Dimension with Stand(WxHxD) : 625 x 563 x 233 mm
    Phys. Dimension without Stand (WxHxD) : 625 x 368 x 63 mm
    Box Dimension(WxHxD) : 753 x 452 x 224 mm
  • Weight
    Net Weight (Esti.) : 7.65 kg 
    Gross Weight (Esti.) : 11.5kg
  • Accessories
    Dual-link DVI cable 
    Audio cable
    Power cord 
    DisplayPort cable 
    Quick start guide
    HDMI cable 
    Support CD
    Warranty Card
  • Compliance and Standards
    Energy Star®, BSMI, CB, CCC, CE, ErP, FCC, J-MOSS, KCC, PSE, RoHS, UL/cUL, VCCI, WEEE, WHQL (Windows 8.1, Windows 7), MEPS, RCM, TUV Flicker-free, KC, eStandby, TUV Low Blue Light, CU(EAC)
  • Note
    FreeSync™ technology supported 

    *1 
    2560 x 1440 at 144Hz (DisplayPort and HDMI-1)
    1920 x 1080 at 120Hz (HDMI-2)

The standout features from this list include the spacious 27-inch size, an upgrade for most of our readers I'm sure. The 2560x1440 resolution means you'll have 77% more pixels than a 1080p Full HD screen, so prepare your GPU and wallet accordingly. A 1ms response time, common on modern TN panels, means minimal ghosting and low input latency, something PC gamers are often concerned with.
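As a quick sanity check on that figure, the pixel math works out like this (simple arithmetic, not taken from ASUS's specs):

```python
# Pixel-count comparison between 2560x1440 (QHD) and 1920x1080 (Full HD).
def pixel_count(width: int, height: int) -> int:
    return width * height

qhd = pixel_count(2560, 1440)  # 3,686,400 pixels
fhd = pixel_count(1920, 1080)  # 2,073,600 pixels

extra = (qhd / fhd - 1) * 100  # ~77.8% more pixels for the GPU to render
print(f"{extra:.1f}% more pixels than 1080p")
```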


Gaming on the ASUS MG278Q

Though it isn't listed in the official specs, we measured the FreeSync range of the monitor at 42Hz to 144Hz through our internal testing process. While that VRR lower limit of 42Hz would have been a sizeable disadvantage just a few weeks ago compared to NVIDIA G-Sync's inherent ability to maintain tear-free gaming at any low frame rate, the Radeon Crimson release has softened the blow with a software-based solution.
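For readers curious how frame multiplication behaves around that 42Hz floor, here is a rough sketch of the idea. AMD's actual driver logic is not public, so treat this as illustrative only:

```python
# Sketch of LFC-style frame multiplication: when the game's frame rate falls
# below the VRR window, each frame is repeated enough times that the
# effective refresh rate lands back inside the window.
VRR_MIN, VRR_MAX = 42, 144  # this monitor's tested window, in Hz

def effective_refresh(fps: float) -> float:
    """Refresh rate the display would run at for a given game frame rate."""
    if fps >= VRR_MIN:
        return min(fps, VRR_MAX)  # inside the window: one refresh per frame
    # Below the window: repeat each frame n times so n * fps is in range.
    n = 2
    while fps * n < VRR_MIN:
        n += 1
    return fps * n

print(effective_refresh(100))  # 100 Hz, shown 1:1
print(effective_refresh(30))   # 60 Hz, each frame shown twice
print(effective_refresh(20))   # 60 Hz, each frame shown three times
```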


December 30, 2015 | 12:58 PM - Posted by Anonymous (not verified)

I know Nvidia has 70+% market share in the discrete GPU market. But they will have to come up with something really great if they want to keep a $200 premium on their monitors. Or (and this would be ideal) cut the G-Sync crap and let's all work on Adaptive-Sync implementations (will never happen, I know).

December 30, 2015 | 02:38 PM - Posted by Anonymous (not verified)

They have 80%+, and the worse they treat their customers, the more people buy their products. So don't expect them to cut the crap. Expect them to get worse.

December 30, 2015 | 06:31 PM - Posted by Anonymous (not verified)

They sell partially on having a perception as a high-end brand, so even if AMD has a much better value in the buyer's price range, many of them will still buy Nvidia. They can sell g-sync based on the same type of marketing even though free sync is almost exactly equivalent now.

January 3, 2016 | 08:10 PM - Posted by Anonymous (not verified)

NVidia doesn't want a premium on their monitors. They don't sell monitors, only the GSync module which is currently expensive.

NVidia wants to see monitors sold to drive GPU adoption so ideally they would want GSync monitors to be cheaper than Freesync.

Since the GSync module replaces the "guts" (scaler etc) of the monitor, they can likely do things that Freesync cannot.

At some point monitor manufacturers will have to add newer features like light strobing while asynchronous mode is active. If GSync ends up supporting that natively but Freesync requires the manufacturers themselves to implement the feature, we could see Freesync costing more than GSync, or simply not offering certain features.

January 5, 2016 | 10:40 PM - Posted by Anonymous (not verified)

The G-Sync parts aren't worth that much; they charge royalty fees to manufacturers to use them.

December 30, 2015 | 04:02 PM - Posted by Anonymous (not verified)

Another ASUS monitor with aggressive matte coating.
Have the balls to release a glossy panel.

December 30, 2015 | 04:09 PM - Posted by Ryan Shrout

Do you really want that? If so, why?

December 30, 2015 | 08:38 PM - Posted by Anonymous (not verified)

I'm not the person you're quoting, but I prefer glossy because it looks better in a good (dark) environment and doesn't have the blurring that many aggressive coatings have. Matte may look slightly less atrocious under direct light, but every display looks so terrible in those conditions that anyone who cares in the slightest about image quality will make sure to avoid direct light anyway.

January 1, 2016 | 11:55 AM - Posted by Anonymous (not verified)

That was my comment.

Colors look ten times better without it.
I can tolerate the lighter IPS matte that Acer/ASUS use on their IPS G-Sync panels, but the film on these TN panels is too heavy. TN colors look bad enough; they don't need aggressive matte on top of that.

Also, there's not a single glossy display for gaming; it would be nice not to be all one-sided.
We have control over the lights in the room; we have no control when they slap that ugly coating on the panel.
Even tablets/phones and TVs don't use it anymore, for obvious reasons.

January 1, 2016 | 12:27 PM - Posted by Anonymous (not verified)

edit
*Also not a single glossy display*
Not a single glossy gsync/freesync display

January 3, 2016 | 12:32 AM - Posted by Anonymous (not verified)

I was under the impression that most TVs have anti-reflective coatings, at a minimum.

January 3, 2016 | 08:21 PM - Posted by Anonymous (not verified)

There are glossy monitors with anti-reflective coatings. The terminology can get confusing though.

December 30, 2015 | 04:10 PM - Posted by Anonymous (not verified)

"thanks to the latest AMD software update it can now support frame doubling to allow for smooth, variable refresh rates under 42 FPS as well"

This is great news for Adaptive Sync. Now we just need to get NVIDIA to support Adaptive Sync monitors too. Adaptive Sync monitors outnumber G-Sync about 3:1 in terms of number of models available worldwide. What did people think would happen with a free vs. expensive implementation?

G-Sync, it's been nice knowing you.

December 30, 2015 | 04:12 PM - Posted by Anonymous (not verified)

^ and I mean that sincerely as an ASUS VG24QE owner.

January 3, 2016 | 08:29 PM - Posted by Anonymous (not verified)

AMD's frame doubling still requires a 2.5X ratio minimum (i.e. 30Hz to 75Hz, but NOT 30Hz to 60Hz). This is due to the frame creation being driver (software) driven.
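For reference, AMD's stated requirement for enabling LFC is that the display's maximum refresh be at least 2.5x its minimum, which a quick check expresses directly (the function name here is ours, not AMD's):

```python
# Check the LFC eligibility rule: VRR maximum must be >= 2.5x the minimum.
def lfc_capable(vrr_min: float, vrr_max: float) -> bool:
    return vrr_max >= 2.5 * vrr_min

print(lfc_capable(30, 75))    # True:  75 / 30 = 2.5 exactly
print(lfc_capable(30, 60))    # False: only a 2x ratio
print(lfc_capable(42, 144))   # True:  the MG278Q's tested window qualifies
```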

People keep saying NVidia's GSync will disappear, but keep in mind that the cost of the GSync module will continue to drop, and the fact that they are working on newer versions that will do things that Freesync simply can't (like light strobing whilst asynchronous mode is active).

Sure, AMD Freesync monitors can have features added like I just mentioned but that would be done by the monitor manufacturer which will cost them money thus be reflected in more expensive monitors.

Conversely, monitor manufacturers will be able to incorporate features that GSync already supports. So the cost of the GSync module will shrink but the features offered will increase.

So it's not quite as simple as "free vs expensive" if you understand the monitor market.

(We wouldn't even have Freesync if NVidia had not created GSync... and NVidia had no incentive to do that without making it proprietary so why hate on NVidia? It may not be ideal, but at least we have asynchronous monitors for smooth gaming and that's awesome.)

January 5, 2016 | 10:41 PM - Posted by Anonymous (not verified)

Hello, it's not the parts, it's a fee the manufacturers pay to Nvidia.

December 30, 2015 | 04:47 PM - Posted by rl (not verified)

Hey Ryan/PCPer

Why not test the range for every VRR monitor, not just FreeSync?

It would be interesting to know how low the PG279Q panel can be driven as well as the other x-Sync monitors to see whether the module gives an advantage or not. Afaik, you only did that for the original (TN) ROG Swift and then stopped. :(

December 30, 2015 | 06:19 PM - Posted by Anonymous (not verified)

If the frame rate multiplication works as it should, then there isn't much to report. I would hope that they actually verified that it is working correctly.

December 30, 2015 | 06:25 PM - Posted by JohnGR

It wouldn't matter. G-Sync, FreeSync, Adaptive Sync or whatever Sync, if you are at 10-15 fps, the motion will NOT be fluid.

December 30, 2015 | 08:57 PM - Posted by rl (not verified)

It may not matter to you. It surely matters to Ryan, since he tested it and chose to mention it in the review.

And I'm not speaking about 15 fps/Hz like you tried to insinuate but rather the range down to 30 fps/Hz, or even 23.976 fps/Hz.

It's simply a matter of curiosity. Does the G-Sync module allow the OEM/Nvidia to control the VRR range better or not.

Is the module needed to drive those VRR ranges beyond 144Hz to 165Hz and 200Hz... or not.

Does the panel behave different when controlled by the G-Sync module or a normal scaler with Adaptive Sync capabilities.

Are there differences between G-Sync and FreeSync monitors using the same or a similar panel beyond software/hardware features by the IHVs.

December 31, 2015 | 03:23 AM - Posted by JohnGR

If you are getting at least a minimum of 24fps from your card, then G-Sync, or the LFC mode of FreeSync, will double the frame rate, and you only need the monitor to support an adaptive sync lower limit of 48Hz. Almost every monitor does that today. The only problem is with monitors whose upper limit is not at least double the lower limit, which is required to support LFC. With those monitors the best option is to lower the graphics settings and get a good frame rate.
So I wasn't trying to insinuate anything, only guessing that wanting to go as low as possible would only make sense if you were thinking of frame drops down to 10-15fps. In that case I don't think adaptive sync of any kind would manage to magically create fluid motion.

Now, comparing G-Sync and FreeSync when those two techs come with a $100-$200 difference in price was never, in my opinion, valid. The difference in pricing is huge. Even without LFC, FreeSync was a better option because those $200, in the case of the more expensive monitors, can buy you a better card. You can go from a 390X to a Fury X. Even those $100 in the case of cheaper monitors can move you from a 380 to an 8GB 390. So, even if G-Sync gets more points, the fact is that it's very bad value for money and an option only if you already paid for a 980Ti or a Titan.

In the end, every reader in PCPer could be curious for a number of things. Even a 50 page review would leave something out.

January 3, 2016 | 09:00 PM - Posted by Anonymous (not verified)

You don't have a clear understanding of how frame doubling via AMD's software implementation works. There's a good article at PCPer relating to the Crimson drivers.

First, to work properly you need a range of at least 2.5X (i.e. 30Hz to 75Hz).

Secondly, the AMD driver is simply telling the monitor to repeat the same frame so that it stays in its supported asynchronous range. If we talk FPS (not frame times): if the GPU outputs 16FPS normally, it's told to REPEAT each frame so we get 32FPS (32 refreshes) on the physical monitor.

So it's still effectively 16FPS in terms of new content on the screen, but it is a lot smoother than normal 16FPS since we are again staying in asynchronous mode.

(I don't know where you came up with the "48Hz" value either. If it was a 30Hz to 60Hz monitor for example then any FPS the GPU is generating within that range keeps you in asynchronous mode. Frame doubling is ONLY required when your GPU can't output at least 30FPS.)

I assume we need "2.5X" not 2X due to some latency in the driver software.

January 3, 2016 | 09:07 PM - Posted by Anonymous (not verified)

If not clear, it's a doubling on the computer side such as 16x2=32FPS (output of GPU). So, AMD handles this in its AMD driver software (which works pretty well), however NVidia has a lookaside buffer in the GSYNC module to handle this.

(it's actually not a big deal, but perhaps AMD can stop talking about NVidia's supposed latency due to the GSync module since AMD's solution requires a driver fix which is obviously not instantaneous.. in fact it's why the range of the monitor needs to be "2.5X" such as 30Hz to 75Hz which is a big problem for monitors like 4K, 30Hz to 60Hz asynch range).

January 3, 2016 | 09:18 PM - Posted by Anonymous (not verified)

Okay, I see where you got "48Hz" from. Maybe you understand clearly, or maybe not but note the "2.5X" range for proper support.

Again though, as per my other comment note that many games drop to very low FPS values for short periods of time so staying in asynchronous mode is especially important here.

It's not about "magically" creating fluid motion but rather getting the smoothest motion possible. If we normally got about 15FPS but were in VSYNC mode (to avoid tearing) the screen stuttering would likely be horrible. Low FPS + stutter + buffer latency is arguably the worst-case scenario for gaming.

On the other hand 15FPS in asynchronous mode is much, much better.

(I think you might be surprised how often the average game drops below 30FPS. In some cases you need to crank up the average FPS by at least 4X to avoid these drops completely which then obviously has a big visual trade-off. Very few people seem to understand how much these drops affect perceived smoothness.)

January 3, 2016 | 08:34 PM - Posted by Anonymous (not verified)

Sudden DROPS to a low FPS range is where asynchronous support makes the biggest difference.

You wouldn't want to game around 15FPS all the time, but it's still very common to have sudden drops so anything that improves that is important.

December 30, 2015 | 05:08 PM - Posted by jimecherry

Even if the ratio of FreeSync to G-Sync monitors goes up, which I suspect it will just on ease of implementation, what value does NVidia get from supporting FreeSync? By supporting their own VRR standard they get licensing revenue and are able to co-market their video cards with monitors. The only way I see NVidia getting on board is if FreeSync out-evolves G-Sync, thus making G-Sync a marketing liability.

December 30, 2015 | 05:30 PM - Posted by Anonymous (not verified)

G-Sync is like PhysX: Nvidia will ride this all the way to obscurity. They will not drop it; I think the market will mostly shift away from it due to the ease of implementation of Adaptive Sync/FreeSync.

December 30, 2015 | 06:04 PM - Posted by Anonymous (not verified)

Nvidia seems to have put research into how to implement variable refresh without changing the standard but the proper way to do it was to just change the standard. The only advantage right now seems to be slightly better overdrive, but it is unclear how much of this is better overdrive calculations and how much is choice of panels. I don't think the slightly increased ghosting on free sync panels is going to be noticeable under normal use so the price premium on a g-sync panel is not worth it. For the Nvidia fans, I guess they just have to pay the price premium. Nvidia will probably continue to deny free sync support for a while and then quietly as possible start supporting free sync; maybe they will try to come up with some new feature on top of free sync to differentiate.

December 31, 2015 | 12:45 PM - Posted by rl (not verified)

Variable Refresh was already a part of VESA Embedded Displayport 1.3 (eDP) in form of the Panel Self Refresh (PSR) spec. Nvidia is a member of VESA. They could have easily suggested VRR to be implemented in the DisplayPort Spec. AMD did just that. They took the PSR spec and suggested it to be enhanced and implemented as Adaptive Sync. Now it's part of both Specs and Nvidia actually benefits from it by using it to enable Mobile G-Sync on Laptops through the eDP Standard.

Without eDP and PSR there would be no Mobile G-Sync as we know it. And OEMs have no desire to put a huge FPGA inside their ever so smaller Laptops.

December 30, 2015 | 06:13 PM - Posted by aquinoe (not verified)

I've been a happy owner of an MG278Q for two months now. I have it connected to an R9 390.

AMD has implemented frame rate target control (FRTC) that prevents the GPU from exceeding certain fps of your choice. So, if you wish, you can play on moderate fps (reducing noise and consumption) by activating FRTC.
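The capping behavior FRTC provides can be approximated in application code as a frame-time limiter, along these lines (an illustrative sketch; the real driver enforces the cap below the application level):

```python
import time

# Frame-rate cap sketch: if a frame finishes early, sleep out the remainder
# of the target frame time instead of immediately rendering the next frame,
# which reduces GPU load, noise, and power draw.
def run_capped(render_frame, target_fps: float, frames: int) -> None:
    frame_time = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # the game's actual per-frame work
        elapsed = time.perf_counter() - start
        if elapsed < frame_time:
            time.sleep(frame_time - elapsed)  # idle instead of racing ahead
```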

On the other hand, variable refresh comes in handy when you prefer to "unchain" the R9 390. This card, by the way, is truly a gem for 2560x1440, especially when you combine it with a FreeSync monitor.

On other aspects: good enough colors, excellent resolution (much better than FHD without running into the high-DPI problems I've suffered on 2160p screens), and better blacks than I expected.

Viewing angles on this PC monitor are really not a concern for me (though in a TV I wouldn't accept them at all).

The major disadvantage: it isn't cheap.

December 31, 2015 | 02:52 AM - Posted by Master Chen (not verified)

>End of 2015
>TN
INTO THE TRASH!

December 31, 2015 | 03:59 AM - Posted by Anonymous (not verified)

Who are you quoting?

December 31, 2015 | 11:37 AM - Posted by Master Chen (not verified)

My thoughts and feels.

January 3, 2016 | 08:45 PM - Posted by Anonymous (not verified)

TN panels have continued to improve and top-end TN panels have superior response times to minimize ghosting which is important to some people.

Ignoring price, IPS (or similar) has to be superior in all aspects to be a complete drop-in replacement for TN.

Back to price, not everybody can afford IPS especially for the cheaper monitors. Technology like this is always slowly phased out.

December 31, 2015 | 12:45 PM - Posted by Bezzell

I cannot in good conscience pay 200 bucks for variable refresh rate from team green. Seems as though AMD is on the right track. If AMD's next GPU cycle is impressive, I know which direction I'll be leaning.

January 2, 2016 | 08:39 AM - Posted by Mac (not verified)

Nice review. I was hoping you'd pull out the old oscilloscope for below-VRR-window testing, because the last video with it was quite informative and fascinating.

January 3, 2016 | 12:45 AM - Posted by Anonymous (not verified)

I would like to see that also. It would be interesting to see how both implementations handle the transition from VRR within the display's window to crossing the boundary into frame multiplication. Although, just using it for a while and seeing if they notice anything is probably the most interesting test. If they played at the boundary, they would have noticed if crossing it was causing any artifacts like stuttering, judder, or tearing. Tearing really should not occur, though.

January 5, 2016 | 10:43 PM - Posted by Anonymous (not verified)

It's like everything Nvidia does: they create a closed standard on top of an open-standard system, then charge through the roof for it.

January 6, 2016 | 12:47 PM - Posted by 8secondz28 (not verified)

I purchased this exact monitor, the ASUS MG278Q, and I can't get it to display 144Hz; 120Hz is the highest it will let me go. I have used the supplied DisplayPort cable and the DVI cable and still can't get it there. Even in Battlefield 4 my highest option is 2560x1440 @ 120Hz, so please help. I have an ASRock Fatal1ty Z97 gaming mobo, i7-4790K, 16GB DDR3, and two GTX 660 SC in SLI.

January 9, 2016 | 10:19 AM - Posted by Anonymous (not verified)

A GTX 6xx GPU doesn't support that range over DisplayPort; you are locked to 120Hz. Not sure, but you should get 144Hz via the DVI-D port.
