ASUS ROG Swift PG27UQ 27" 4K 144Hz G-SYNC Monitor: True HDR Arrives on the Desktop

Manufacturer: ASUS

A long time coming

To say that the ASUS ROG Swift PG27UQ has been a long time coming is a bit of an understatement. In a computer hardware world where we are generally lucky to know about a product six months ahead of launch, the PG27UQ has been around in some form or another for at least 18 months.

Originally demonstrated at CES 2017, the ASUS ROG Swift PG27UQ debuted alongside the Acer Predator X27 as the world's first G-SYNC displays supporting HDR. With promised brightness levels of 1000 nits, G-SYNC HDR was a surprising and aggressive announcement considering that HDR was just starting to pick up steam on TVs, and was unheard of for PC monitors. On top of the HDR support, these monitors were the first announced displays sporting a 144Hz refresh rate at 4K, due to their DisplayPort 1.4 connections.

However, delays led to the PG27UQ being shown yet again at CES this year, with a promised release date of Q1 2018. Further slips in the release schedule bring us to today, where the ASUS PG27UQ is available for pre-order for a staggering $2,000 and set to ship at some point this month.

In some ways, the launch of the PG27UQ very much mirrors the launch of the original G-SYNC display, the ROG Swift PG278Q. Both displays marked the debut of a long-awaited technology in a 27" form factor, and both were seen as extremely expensive at the time of their release.

Finally, we have our hands on a production model of the ASUS PG27UQ, the first monitor to support G-SYNC HDR, as well as 144Hz refresh rate at 4K. Can a PC monitor really be worth a $2,000 price tag? 

Continue reading our review of the ASUS ROG PG27UQ G-SYNC HDR Monitor!

Specifications

There's a lot of ground to cover with the specifications of the ASUS PG27UQ, and many of them represent industry firsts. 

While there has been one other Korean monitor supporting 4K 144Hz, the ASUS PG27UQ is the first widely available display to take advantage of the new 4K high-refresh-rate mode offered by DisplayPort 1.4.

On the HDR side, you have HDR10 decoding powered by a 384-zone Full Array Local Dimming (FALD) backlight, capable of reaching 1000 nits in certain scenarios. Per the DisplayHDR 1000 specification, the display must be capable of both flashing the entire screen at an instantaneous brightness of 1000 nits and sustaining a 10% patch in the middle of the display at 1000 nits indefinitely.

The typical brightness of 600 nits is still higher than the measured peak brightness of any other HDR monitor we have taken a look at so far, an impressive feat.

Of course, brightness isn't the only important aspect of HDR. ASUS claims the PG27UQ covers 99% of the AdobeRGB gamut and 97% of DCI-P3. While these aren't the absolute highest color reproduction claims we've seen on a display, they still place the PG27UQ in the top echelon of PC monitors.

Edit: For clarification, the 98Hz limit discussed below represents the refresh rate above which the monitor switches from a full 4:4:4 chroma signal to 4:2:2 chroma subsampling. While this shouldn't noticeably affect things such as games and movies, 4:2:2 can result in blurry or difficult-to-read text in certain scenarios. For more information on chroma subsampling, see this great article over at Rtings.
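To illustrate the trade-off, here is a minimal NumPy sketch of what 4:2:2 does to fine colored detail. It is purely illustrative; the helper functions and toy test pattern are our own, not the monitor's actual scaler logic.

```python
# Illustrative only: what 4:2:2 subsampling does to a Y'CbCr image.
# (Not the monitor's scaler logic; helper names are just for this sketch.)
import numpy as np

def subsample_422(ycbcr):
    """Keep luma (Y) at full resolution, halve chroma (Cb, Cr) horizontally."""
    y, cb, cr = ycbcr[..., 0], ycbcr[..., 1], ycbcr[..., 2]
    return y, cb[:, ::2], cr[:, ::2]          # drop every other chroma column

def reconstruct_444(y, cb_sub, cr_sub):
    """Upsample chroma back to full width by repeating columns."""
    cb = np.repeat(cb_sub, 2, axis=1)[:, :y.shape[1]]
    cr = np.repeat(cr_sub, 2, axis=1)[:, :y.shape[1]]
    return np.stack([y, cb, cr], axis=-1)

# Worst case: one-pixel-wide alternating colors, similar to colored text edges.
h, w = 4, 8
luma   = np.full((h, w), 0.5)
chroma = np.tile([0.2, 0.8], (h, w // 2))     # chroma flips every column
img = np.stack([luma, chroma, chroma[:, ::-1]], axis=-1)

rec = reconstruct_444(*subsample_422(img))
print(np.abs(rec - img)[..., 0].max())        # 0.0 -> luma detail survives
print(np.abs(rec - img)[..., 1].max())        # ~0.6 -> fine chroma detail is lost
```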

For the moment, given the lack of GPUs able to push games beyond 98Hz at 4K, we feel that keeping the monitor in 98Hz mode is a reasonable compromise. However, for a display this expensive, it's a drawback that may make the monitor age faster than expected.

Here is the full capability of the PG27UQ across different refresh rates and modes:

SDR: 98 Hz 10-bit RGB, 120 Hz 8-bit RGB, 144 Hz (overclocked) 8-bit YCbCr422

HDR: 98 Hz 10-bit RGB, 120 Hz 8-bit RGB (dithered, only with Win10 RS4), 120 Hz 10-bit YCbCr422, 144 Hz (overclocked) 10-bit YCbCr422

One caveat, though: with HDR enabled at 4K, the maximum refresh rate at full chroma resolution is limited to 98Hz. While this isn't a problem today, when graphics cards can barely hit 60 FPS at 4K, it is certainly something to be aware of when buying a $2,000 product that you would expect to last for quite a long time to come.
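To see where those limits come from, here's a quick back-of-the-envelope check. It assumes DisplayPort 1.4's HBR3 payload rate of roughly 25.92 Gbit/s and a flat 5% blanking allowance rather than exact CVT-R2 timings, so treat the numbers as approximate.

```python
# Back-of-the-envelope DisplayPort 1.4 (HBR3) refresh-rate ceiling.
# Assumes ~25.92 Gbit/s of video payload (32.4 Gbit/s raw minus 8b/10b overhead)
# and a rough 5% blanking allowance; actual CVT-R2 timings differ slightly.
HBR3_PAYLOAD_GBPS = 25.92

def max_refresh(width, height, bits_per_channel, channels=3, blanking=1.05):
    bits_per_frame = width * height * bits_per_channel * channels * blanking
    return HBR3_PAYLOAD_GBPS * 1e9 / bits_per_frame

print(round(max_refresh(3840, 2160, 10)))   # ~99 Hz -> the 98 Hz 10-bit RGB limit
print(round(max_refresh(3840, 2160, 8)))    # ~124 Hz -> why 120 Hz works at 8-bit RGB
```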

One of the unique advantages that ASUS has with the PG27UQ over the similarly specced Acer Predator X27 is DisplayHDR 1000 certification. While it's unclear if this is due to ASUS getting access to slightly higher quality panels than Acer, or Acer just not going through the certification process, it still reflects the confidence that ASUS has in this product.

Design

The first surprising aspect of the PG27UQ is the form factor. While monitors have been trending towards thinner and thinner chassis and bezels due to advancements in LCD and LED backlighting technology, the PG27UQ goes in the opposite direction.

The bezels are thicker than we saw with even the original ROG Swift, the chassis is thick, and the monitor overall is quite heavy. It seems that some sacrifices had to be made in order to fit things like a Full Array Local Dimming backlight into a 27" display form factor.

Inclusions like ASUS Aura Sync-compatible RGB lighting help make this feel like a unique product if you are into that sort of thing.

However, for those of you like me who aren't into RGB, these lighting effects can be easily disabled in the monitor's OSD.

Like we've seen with a lot of G-SYNC displays, the input options are a bit barren on the PG27UQ. You get the primary DisplayPort 1.4 connector for all of your G-SYNC-enabled content, an HDMI 2.0 port, a dual-port USB 3.0 hub, and a headphone jack.

We were pleasantly surprised to see that the HDMI 2.0 port supports full 4K60 with HDR enabled, allowing you to hook up a device other than your PC, like a game console, and still get the fullest possible experience from this premium product. Keep in mind, though, that Dolby Vision HDR is not supported, so your mileage may vary when playing back certain video content.

Early impressions of the similar Acer Predator X27 uncovered that it used active cooling for the electronics, which was annoyingly loud when used with Acer's VESA mounting kit.

While the ASUS PG27UQ also features a blower-style fan with an intake in the VESA mount area, ASUS includes standoffs that provide plenty of clearance between the back of the monitor and the VESA mounting plate, solving the issue.

It's worth noting that despite the monitor having an always-on active cooling fan, it was only noticeable with no other ambient noise in the room, and in close proximity to the display. With any other noise in the room, such as our test PC turned on, the fan noise was drowned out.

Still, since the cooling fan is always on, if you shut down your PC when you're done and keep your setup in a quiet room, the best course of action is to power the monitor off completely as well.

Speaking of the fan, let's take a closer look at the internals of the ASUS PG27UQ with a teardown.

Review Terms and Disclosure
All Information as of the Date of Publication
How product was obtained: The product is on loan from ASUS for the purpose of this review.
What happens to the product after review: The product remains the property of ASUS but is on extended loan for future testing and product comparisons.
Company involvement: ASUS had no control over the content of the review and was not consulted prior to publication.
PC Perspective Compensation: Neither PC Perspective nor any of its staff were paid or compensated in any way by ASUS for this review.
Advertising Disclosure: ASUS has purchased advertising at PC Perspective during the past twelve months.
Affiliate links: This article contains affiliate links to online retailers. PC Perspective may receive compensation for purchases through those links.
Consulting Disclosure: ASUS is not a current client of Shrout Research for products or services related to this review. 

June 22, 2018 | 06:12 PM - Posted by zme-ul

3840 x 2160 is not 4K, what the hell mates! this is UHD
if you don't know the difference, what the hell are you doing in the tech review media?

June 22, 2018 | 06:19 PM - Posted by Anonymouse (not verified)

Everyone knows. Nobody cares. Shut up.

June 22, 2018 | 06:19 PM - Posted by 3840x2160 is 4K (not verified)

The term "4K" is generic, and refers to any resolution with a horizontal pixel count of approximately 4000 pixels. Several different 4K resolutions have been standardized by various organizations.

June 22, 2018 | 07:19 PM - Posted by amdcanprymy8086frommycolddeadhands (not verified)

yea, doesn't the marketing term "HD" refer to both 720 and 1080p? At least in the tech world?

June 23, 2018 | 06:58 AM - Posted by Apersonrespondinghere (not verified)

No, the term for 720p is HD and 1080p is called FullHD/FHD.

June 22, 2018 | 09:40 PM - Posted by Photonboy

It DOES mean 4K if that's the accepted usage by enough people. In fact, dictionaries are constantly modified to change words to what people use whether wrong or not.

Try going into WalMart, stand in the TV section, and tell people that those aren't 4K HDTV's... no they are 3840x2160 so DAMN IT... Samsung should know better!!

June 23, 2018 | 08:38 AM - Posted by zme-ul-is-wrong (not verified)

You need to stop. You're wrong. It's a fact 3840x2160 is 4K. It's widely accepted, except by people who think they're smart when they're not.

Accept you're wrong and move on with your life.

June 25, 2018 | 04:43 PM - Posted by WolfWings (not verified)

TL;DR: 4K = 2160p, that's the standard the industry adopted.

The default aspect ratio these days is 16:9, so 16:9 2160p = 3840x2160

There *ARE* 4096x2160 monitors, but it's a wonky aspect ratio only used by the Digital Cinema Initiatives standards body so it's almost exclusively used by movie projection to efficiently support 2.39:1 through 1.85:1 aspect ratio projections. This is indeed where the term "4K" started.

But 4096x2304 monitor panels aren't made, never have been, never will be, however 3840x2160 has numerous benefits in relation to the existing 1080p/720p sizes because scaling is exact and precise.

So kindly do shut the heck up about "That isn't 4K!" because every standards body has accepted that it is.

June 30, 2018 | 05:17 AM - Posted by Andon M. Coleman (not verified)

Equally meaningless. You speak of DCI-4K as if it were the only thing referred to as 4K while being completely unaware that UHD is not, in fact, a single resolution.

UHD-8K exists, DCI-8K doesn't. So, let's continue this stupid discussion in the future when digital cinema catches up to UHD and you'll swear up and down that movie theaters aren't 8K.

June 22, 2018 | 07:06 PM - Posted by Anony mouse (not verified)

You guys didn't go into as much detail as YouTubers are providing.

Is it running 4:2:0 / 4:2:2 / 4:4:4 ?

Is the FALD noticeable during letterboxed in-game cinematics or media content (movies, online videos)?

June 23, 2018 | 02:15 AM - Posted by Aldo (not verified)

Should be running 4:2:2 if you go above 98Hz. Really this is a 4K 98Hz monitor. The 144Hz stuff was sort of marketing BS. Chroma subsampling has never really been used on desktop monitors.

June 23, 2018 | 01:45 PM - Posted by ZhaZha (not verified)

Actually, if you don't allow subsampling, it can reach 120Hz for 4K SDR and 98Hz for 4K HDR. Anything above that will require subsampling (4:2:2). So I would say it's a 4K 120Hz monitor.

June 22, 2018 | 08:00 PM - Posted by Axle Grease

It doesn't look good.

http://www.guru3d.com/news-story/asus-and-acer-uhd-g-sync-hdr-monitors-f...

June 22, 2018 | 10:01 PM - Posted by Photonboy

PCPER can you find out about THIS?

I'm confused still:
https://en.wikipedia.org/wiki/DisplayPort

Go down roughly half way and there's a chart (Bandwidth is based on uncompressed 4:4:4.) that shows 4K at 144Hz using 4:2:2 with DP1.3...

Then right next to it we see 4K at 144Hz using DP1.4 under DSC (lossless compression). Actually supports up to 240Hz via 4:2:0 or 4:4:4 via DSC.

My understanding is that the need to drop to 98Hz or drop to 4:2:2 is a problem that DSC resolves.

NVidia PASCAL does support DSC:
http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-1080-review,2.html

(correct me if I'm wrong but if DSC is supported by v1.4 it's required for a card stating "DP1.4" to have DSC?)

It's inconceivable this monitor would not.

*So we should be able to get up to 4K 4:4:4 on this monitor?!!

June 22, 2018 | 10:41 PM - Posted by asdkjfhlaskhdfhlasdfka (not verified)

DSC is "visually" lossless, not mathematically.

June 23, 2018 | 02:08 AM - Posted by Aldo (not verified)

It's not a requirement for cards with DP1.4 to support DSC. I believe that is an optional feature. Regardless, Nvidia's Gsync module does not support DSC, even though Freesync scalers do.

June 25, 2018 | 06:13 AM - Posted by Spunjji

Nice to know that $500 in the BOM goes to a module that was out of date before it launched. xD

June 26, 2018 | 07:33 PM - Posted by Photonboy

1) Do you have any PROOF that GSync Modules lack DSC support? I find that highly unlikely considering how long the standard has been out AND its high relevance to the purpose of the GSync Module.

2) As for "visually lossless" above. While I appreciate the distinction there are two issues then:

a) First off, I thought that implied it was difficult if not impossible for the average person to tell the difference between true lossless and visually lossless, and

b) if DSC was available then why bother switching to 4:2:2 subsampling which apparently DOES have obvious artifacting and/or color differences?

It seems likely to me that if DSC was available it should be utilized.

So either one, some or all of the following lack support for DSC:
a) NVidia Pascal (and other) GPU's,
b) the Asus monitor
c) NVidia drivers
d) Windows?

I thought DSC was already utilized for workstation setups, though I could be wrong, but either way I'm baffled why DSC isn't apparently working here.

At this point my BEST GUESS is that it's most GPU's (even Pascal?) that lack support and that the MONITOR probably has it.

June 23, 2018 | 12:50 AM - Posted by CraigN (not verified)

So, just curious, is HDR *only* able to be enabled at 4K? I know that using monitors at their non-native resolution is not ideal, but is there anything stopping a user from buying this and running it at 2560x1440 at 144 Hz with HDR enabled?

Thanks!

June 23, 2018 | 02:02 AM - Posted by jckl (not verified)

Think its possible to remove and repurpose the FPGA?

June 24, 2018 | 05:25 AM - Posted by ET3D

Best comment! :)

So true. The FPGA is over $2000 to buy separately, and here you pay $2000 and get not only the FPGA, but 3GB of RAM on the board and a very good LCD panel.

June 28, 2018 | 10:29 AM - Posted by Stef (not verified)

Best comment, really! XD

June 23, 2018 | 02:18 AM - Posted by Aldo (not verified)

Really disappointed with this product. Chroma subsampling on a $2000 monitor? Are you kidding me? And on top of that we have to deal with shady marketing which doesn't even mention that you have to run the monitor at 98Hz to avoid it. Not to mention the fan noise and other quality control issues. Asus should have spent another couple of months and gotten this thing right. For a $2000 product this seems to have a lot of compromises.

June 23, 2018 | 01:57 PM - Posted by ZhaZha (not verified)

Mine arrived two days ago and I am very satisfied with it. Haven't noticed any fan noise, HDR is really amazing, it turned Far Cry 5 into a totally different game! Now I feel it's really worth the price.

June 26, 2018 | 08:30 PM - Posted by Photonboy

Glad Far Cry 5 looks great, but there may be times when the subsampling artifacts are noticeable according to at least one review.

Also, did the game run above 98FPS for you?

I assume you used GSYNC so unless you output more than 98FPS the monitor was under 98Hz and thus running full 4:4:4 and not reverting to subsampling.

To COMPARE I wonder if there's a way to force 4:2:2 instead of 4:4:4 then switch back and forth (below 98FPS/98Hz) to compare the difference.

June 24, 2018 | 06:30 PM - Posted by James

Pretty much all of the video you see, unless you happen to be a professional video producer, is going to be chroma sub-sampled in some manner. It would only be noticeable for small text really. If you are playing a game where you really think you need 144 Hz at 4K, then you probably are not playing a game where you are reading a lot of small text. You probably aren’t going to be able to achieve 144 Hz in most games at 4K with good quality anyway. If the FPS is really, really important to you, then you could always run it at 1080p with all of the bells and whistles and 144 Hz.

I am not a fan of g-sync at all. Nvidia went to a lot of trouble to try to keep it proprietary without changing the standards. This is exactly the type of thing that should be added to standards though. There should be no vendor lock-in between video cards and displays. It would be nice if they just made variable refresh a required part of the standard; then nvidia would essentially have to support it. The continued expense of using FPGAs is also a problem. It is going to be difficult to do HDR color accuracy and brightness accuracy with a variable refresh rate, but this still doesn't require a g-sync TCON. It is just going to take a while for standard ASIC TCONs to catch up. It will require a lot of work to get the controller to achieve good color and brightness accuracy, but that is required for FreeSync 2. Designing an ASIC for the job will take longer, but it will be a lot cheaper in volume and probably will not require an active cooling solution, although with the bandwidth required, who knows.

Hopefully we will get at least quantum dot enhanced computer displays; I don't know if there are any available yet. That makes it easier to achieve the color accuracy and brightness for HDR. It will still require very precise control to make it work with varying frame times. OLED probably isn't coming to computer displays due to burn-in issues with static content, so the best solution seems to be quantum dot enhanced LCD right now.

The difficulties with 144 Hz aren’t really anything to do with variable refresh. If you wanted to run at a constant refresh of 144 Hz HDR, you would still have those limitations due to the higher bit rates required. This just doesn’t come up with TVs since video content generally never exceeds 60 FPS. It would generally be better to implement compression rather than sub-sampling, so that is something that could be done until we have the entire system capable of supporting the necessary bit rates. Running at the ridiculously high bit rates that is required for 4K HDR at high refresh rate is just going to be problematic. A lot of old and/or cheap cables will not be able to do it, which will cause all kinds of problems.

June 26, 2018 | 09:31 PM - Posted by Photonboy

I'm not sure I agree with all your points though they are valid concerns.

1) Good luck FORCING asynchronous to be part of a standard and being forced to implement it (do you have to support EVERYTHING in DP1.4 or DirectX12 on every GPU? No.) Yes, it would be nice but I don't know how you could force that.

2) This monitor is Quantum Dot already.

What we still need is further improvements on the LCD panel or preferably switch completely to OLED once OLED manufacturing improves image retention (not burn-in) and reduces cost.

Multiple LED backlights are used to provide local dimming but that would not be necessary with better LCD panel control to avoid light leakage or OLED that produces its own light per sub-pixel.

3) How come AMD escapes peoples wrath anyway? They could have allowed NVidia cards to work with Freesync. NVidia probably would have put support in their drivers as it would sell more GPU's (which is probably more important than selling more GSync monitors).

NVidia worked to create asynchronous monitors which we likely would not even have without their investment. Somebody had to spend the cash to get this started. Hey, I'd prefer an open standard too instead of a walled garden but if we're going to play the blame game then AMD should be in there too.

Hopefully Freesync 2 HDR (I believe it's "2" and "HDR" both) HDTV's and monitors for gaming consoles helps put pressure on NVidia at some point but so long as there's no financial incentive good luck. That's just how business works.

(XBox One X is experimenting with Freesync 2 HDR... good video at Digital Foundry. FYI, it's still got some big issues so hopefully that can be mostly sorted out in driver support. For example, one game stuttered when it was in the asynchronous range. Why? Well, it's a lot more complicated than you may think it should be.)

4) NVidia made a proprietary module as they apparently discovered the current methods would be limited, such as the difficulty of overdrive working with a variable refresh rate, so we agree on that. They discussed adding other features later on too.

You may question the use of an FPGA, but programmability is the REASON for that. Plus, it stands to reason they would find ways to reduce cost when it makes sense, because I've got to assume there's little profit for NVidia in this and perhaps they are even LOSING money... in fact, they mentioned early on that they wanted to add support in later modules for V-by-One instead of using LVDS to reduce cost, so cost has certainly been on their minds from early on; if they pay up to $500 per module it's not sustainable.

So certainly an ASIC is coming. I have no doubt a low-cost solution will show up in the near future (next couple years). Possibly they still need the flexibility of programming the FPGA via firmware until they get the bugs ironed out, including programming for each PANEL type (AHVA etc).

A custom module might make it as cheap or even CHEAPER to build a similar monitor than Freesync later on too, as the (ASIC) module price should plummet, and much of the work done by NVidia won't need to be repeated by individual monitor manufacturers (like they are being forced to do now with Freesync 2 HDR, which adds to the cost).

So if the cost can drop to $50 or whatever for a drop-in solution (which already also eliminates the scaler), AND if it gives some benefit vs Freesync then it should be all worth it for NVidia. I have to assume that's the long-term plan.

Probably iron out all the HARDWARE bugs to the point they can just load up individual panel specs from a separate chip into the ASIC.

5) Compression vs sub-sampling: not my area but if cost is negligible then we should simply have BOTH options then implement whichever makes the most sense such as DSC 4:4:4 should the GPU and monitor support it or 4:2:2 sub-sampling if not.

6) Cables: don't disagree, but that's progress. Not sure if a proper DP1.4 cable is included or not but for $2000USD I certainly hope so.

June 28, 2018 | 05:19 PM - Posted by Nerdist (not verified)

3) How come AMD escapes peoples wrath anyway? They could have allowed NVidia cards to work with Freesync. NVidia probably would have put support in their drivers as it would sell more GPU's (which is probably more important than selling more GSync monitors).

Your bias appears to be clouding your technical comprehension.

Any GPU can work with Freesync. It's an open VESA standard, meaning any VESA member has access to the spec and can implement it.

AMD isn't preventing anybody from supporting FreeSync. nVidia is deliberately not supporting FreeSync for obvious reasons.

This issue isn't a "blame game". It's just facts. Like most companies, nVidia has an economic interest in tying customers to their "ecosystem" and making it more difficult for customers to switch to the competition. That's part of what G-SYNC does. That's not evil. It's just business. But it's clearly also not what is best for consumers.

AMD doesn't do this. If the tables were turned and AMD was in the dominant position, AMD would behave no differently however. It's business!

July 12, 2018 | 12:53 PM - Posted by Photonboy

Ah, well Freesync original may be open source but "Freesync 2 HDR" which is what they are transitioning to DOES require an AMD graphics card. I guess I'm half right so I'm half wrong too.
https://www.tomshardware.com/news/amd-freesync-2-hdr-lfc,33248.html

"There’s a lot of coordination that needs to happen between game developers, AMD, and display vendors, so it remains to be seen how enthusiastically AMD’s partners embrace FreeSync 2, particularly because the technology is going to be proprietary for now."

June 30, 2018 | 05:32 AM - Posted by Andon M. Coleman (not verified)

Actually, OLED burn-in is an issue that needs to be resolved. Even with LG's panel aging nonsense that goes on when you turn their set off, my 4K OLED has the LG calibration menu permanently burnt-in.

Of course LG is stupid for using a white menu system while manipulating the TV's built-in controls, but they at least did us all a favor and put to rest any doubt that OLED will burn in.

Panasonic or any respectable vendor would have DCC calibration. Probably better to just use a separate video processor until LG figures out what the hell they're doing.

July 12, 2018 | 01:08 PM - Posted by Photonboy

Burn-In is apparently a non-issue for newer HDTV's. The issue now is Image Retention. The difference is that burn-in is permanent or at least can be difficult to get rid of whereas Image Retention is similar to ghosting in that a pixel can retain its color for too long if held at the same or similar frequency and/or brightness (I don't know the exact details).

THIS sort of agrees AND conflicts with what I said though (you have to just read it):
https://www.cnet.com/news/oled-screen-burn-in-what-you-need-to-know/

It's partly confusing because OLED screens vary in type and of course PHONES typically have a more static image.

*anyway, some articles say that on a recent, quality OLED HDTV it is almost impossible to cause BURN-IN, whereas Image Retention remains an issue. One is permanent, the other is not.

Either way it makes screens unsuitable as desktop monitors until this is sorted out.

June 23, 2018 | 03:40 AM - Posted by Dark_wizzie

Any panel uniformity issues? Based on your naked eye. The 1440p 27in 144hz gsync displays were garbage in QC.

June 23, 2018 | 05:16 AM - Posted by Kommando Kodiak (not verified)

Reading this article on the monitor being reviewed, I got mine yesterday

June 23, 2018 | 01:58 PM - Posted by ZhaZha (not verified)

lol, me too.

June 23, 2018 | 02:38 PM - Posted by Curtis Vincent (not verified)

What's it like? I'm wanting to get one as I'm a games art student; I want great colour accuracy but also high refresh so I can use it for work and games. Not out in the UK yet and prices are stupid high, but I'm hoping it's worth it?

June 23, 2018 | 07:15 AM - Posted by DM (not verified)

Hopefully nobody buys it so nvidia will be forced to support freesync.

June 25, 2018 | 06:16 AM - Posted by Spunjji

There are at least 2 people in the comments here claiming to have bought it already. I really wish people wouldn't for the exact reason you've posted but some people don't care and others need it for their epeen, so who knows.

June 27, 2018 | 11:10 AM - Posted by Stef (not verified)

Freesync 2 requires the use of a proprietary API by game developers to work with HDR.

June 23, 2018 | 07:26 AM - Posted by Jabbadap

Hmm, does gsync work with HDMI? Or was that old rumor a dud?

June 23, 2018 | 08:31 AM - Posted by Anonymouse (not verified)

"For clarification, the 98Hz limit represents the refresh rate at which the monitor switches from 4:4:4 chroma subsampling to 4:2:2 subsampling. While this shouldn't affect things such as games and movies to a noticeable extent, 4:2:2 can result in blurry or difficult to read text in certain scenarios. "

You can go up to UHD @ 120Hz at 8 bits per pixel, the 98Hz DisplayPort limit is for 10bpp. While you would be advised against using a wide gamut output at 8bpp (due to more noticeable banding on gradients), you can still enable FALD and use the larger dynamic range at 120Hz. This also means 120Hz sRGB desktop use is not an issue.

------

I'd also be willing to wager that with the 11xx/20xx series cards, 144Hz 4:4:4 may be an option with RAMDACs 'overachieving' over the DP 1.3/1.4 clock rate of 810MHz. That could be using the notional 'DP 1.5' pre-final standard (as Nvidia are part of VESA) or it could be a 'just for Nvidia' clock rate.

June 23, 2018 | 10:01 AM - Posted by Jabbadap

There's no RAMDAC on new graphics cards. No analog outputs, no digital-to-analog converter. I really don't know if that DVI-like pixel overclock works through DisplayPort. But yeah, if that could work, the G-Sync module has to be able to take that clock too.

June 25, 2018 | 03:52 AM - Posted by psuedonymous

RAMDACs have been the norm for so long it just ends up as the colloquial term for "that PHY layer bit" even if it's actually a TMDS modulator!

DisplayPort does not 'overclock' like DVI or HDMI does, as it is not a pixel-clock-based transport but a packetised transport. There are 4 valid symbol rates (162MHz, 270MHz, 540MHz, 810MHz) and if your resolution/refresh rate does not 'use up' all the bandwidth available at that symbol rate that bandwidth is just 'wasted'.
For Nvidia to push more bandwidth via DP, they would either need to be using the notional DP 1.5 symbol rate (with no guarantee their cards will ever be DP 1.5 compliant when the standard is ratified, because there are more changes than that and all sorts of timing requirements to meet), or just an arbitrary rate that they can pick because they make the devices on both ends of the cable (the GPU, and the G-Sync panel controller).
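As a quick illustration of those symbol rates (assuming the usual 4-lane link, 8b/10b encoding, and no DSC or FEC), the per-rate video payload works out like this:

```python
# Rough DisplayPort payload math: 4 lanes, 8b/10b encoding, no DSC/FEC assumed.
# Each symbol carries 8 data bits, so payload = symbol_rate * 8 bits * 4 lanes.
SYMBOL_RATES_MHZ = {"RBR": 162, "HBR": 270, "HBR2": 540, "HBR3": 810}

for name, rate_mhz in SYMBOL_RATES_MHZ.items():
    payload_gbps = rate_mhz * 1e6 * 8 * 4 / 1e9
    print(f"{name}: {payload_gbps:.2f} Gbit/s")   # HBR3 -> 25.92 Gbit/s of video payload
```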

June 23, 2018 | 09:53 AM - Posted by Anonymous34ffds322 (not verified)

Rumored ASUS has put a hold on stores selling these, and anyone who purchased should return theirs for a firmware update. These monitors are garbage for what you are paying, not even 10-bit panels lol

https://old.reddit.com/r/nvidia/comments/8t406e/asus_recalls_their_4khdr...

June 23, 2018 | 11:57 AM - Posted by jcchg

How does it compare with the Samsung CHG70?

June 23, 2018 | 12:06 PM - Posted by KingKookaluke (not verified)

1-99-9999! Ha!~

June 23, 2018 | 12:26 PM - Posted by Monstieur (not verified)

There is no need to run the monitor in 10-bit YCbCr422 120 Hz mode for HDR. 8-bit RGB 120 Hz is visually identical, with no banding. It's only required that the application render to a 10-bit DirectX 11 surface. The NVIDIA driver automatically performs 10-bit to 8-bit dithering which eliminates banding, allowing you to run the game in 8-bit RGB 120 Hz mode and avoid chroma subsampling. The 10-bit YCbCr422 mode is only really needed for consoles or Blu-Ray players on the HDMI port.

June 24, 2018 | 02:13 AM - Posted by Allyn Malventano

In my previous observation of HDR panels, games running in 8-bit HDR certainly had noticeable banding in areas of sky / other areas with low contrast. Also, if there was no need for 10-bit, then it wouldn’t be part of the standard.

June 25, 2018 | 05:32 AM - Posted by Monstieur (not verified)

It's still required that the game render to a 10-bit DirectX 11 surface to prevent banding. The GPU to monitor signal alone can be 8-bit RGB since the GPU performs dithering automatically.

Switching on HDR in a game should make it switch from an 8-bit surface to a 10-bit surface. If it doesn't happen automatically then it'll cause banding.

June 25, 2018 | 05:38 AM - Posted by Monstieur (not verified)

10-bit is part of the standard because the *source content* needs to be 10-bit to present small enough steps that don't result in banding in high luminance scenes. Because it should be possible to transmit 10-bit YCbCr422 video from a Blu-ray player to the display without further processing (like dithering), HDR10 displays are required to accept a 10-bit signal.

However if the source can perform dithering, an 8-bit signal is sufficient to transmit 10-bit content without banding. In the case of a PC, 8-bit RGB is clearly preferable to 10-bit YCbCr422. If some games are reverting to 8-bit rendering in HDR mode, then it's still a software issue that can be addressed.
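For anyone wondering why dithering makes an 8-bit link "good enough" for 10-bit content, here is a toy NumPy model; it uses plain random dithering as a stand-in, not whatever the NVIDIA driver actually implements.

```python
# Toy 10-bit -> 8-bit quantization model (random dither; not NVIDIA's algorithm).
import numpy as np

rng = np.random.default_rng(0)
ramp_10bit = np.linspace(0, 1023, 4096)                    # smooth 10-bit gradient

truncated = np.floor(ramp_10bit / 4)                       # plain truncation to 8-bit
dithered  = np.floor(ramp_10bit / 4 + rng.random(4096))    # add sub-LSB noise first

# Truncation maps 4 input codes onto 1 output code, producing visible bands.
# Dithering is unbiased, so averaging nearby pixels (as the eye does) recovers
# close to the original 10-bit level.
err_trunc = np.abs(truncated.reshape(-1, 16).mean(axis=1) * 4
                   - ramp_10bit.reshape(-1, 16).mean(axis=1)).mean()
err_dith  = np.abs(dithered.reshape(-1, 16).mean(axis=1) * 4
                   - ramp_10bit.reshape(-1, 16).mean(axis=1)).mean()
print(err_trunc, err_dith)   # truncation error ~1.5 codes, dithered error well under 1
```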

June 30, 2018 | 05:43 AM - Posted by Andon M. Coleman (not verified)

The NVIDIA driver creates banding in this scenario, actually. If allowed to run in fullscreen exclusive mode, any 10-bit RGB:A surface on all G-Sync displays I have tested gets a delightful red/green banding that resembles chromatic aberration.

The only workaround is to force the game into windowed mode, and engage flip model presentation rather than BitBlit. This allows G-Sync to work and prevents said artifacts.

July 24, 2018 | 03:24 PM - Posted by Monstieur (not verified)

I don't have this issue with my LG OLED. I always set the display to 8-bit RGB, and a 10-bit DX11 surface is always dithered without banding in both borderless window and fullscreen exclusive mode.

June 23, 2018 | 12:39 PM - Posted by JohnGR

True Blurring Arrives. :p

June 23, 2018 | 01:40 PM - Posted by EJCRV (not verified)

Who would really use 4K on a 27" monitor? Way too small for that kind of resolution.

June 25, 2018 | 06:39 AM - Posted by Spunjji

I have a 24" 4K monitor. Also had a laptop with a 15" 4K screen. Your comment is inaccurate.

June 23, 2018 | 11:46 PM - Posted by SeriousWriteDownsForGreenThisTime (not verified)

Ha ha, Nvidia made too many GPUs thinking that the GPU coin mining craze was a stable market and now Nvidia has Pascal SKU stocks up to the rafters. So no new generation until the older generation's inventories are moved.

Really Nvidia, the only reason your stripped-of-compute Pascal SKUs were purchased by miners was because they could not get their hands on enough AMD Polaris/Vega units, owing to AMD getting burned the first time around with so much unsold inventory to write off.

Now that there are plenty of Polaris/Vega SKUs in stock, with their shader-heavy designs able to be used for mining, the miners are not needing any Pascal (fewer shader cores than AMD's SKUs) for mining. The price of coin has fallen and now Nvidia is having to put off introducing Pascal's successor for gaming.

Old JHH over at Nvidia has taken to wearing a London Fog full-length trench coat while he stands in the back alley trying to sell some overstocked inventory.

June 24, 2018 | 12:33 AM - Posted by MrB (not verified)

True HDR? A "True HDR" display doesn't have a *REAL* native contrast ratio of only 1000:1

It's like the awful dynamic contrast ratio BS that has been pushed for years, only slightly less bad.

Real HDR displays don't exist, and won't exist until this industry makes real concrete improvements to the foundation of display technology instead of adding more issues on top of the existing ones.

June 28, 2018 | 09:11 AM - Posted by Photonboy

I'd say you are PARTLY correct... don't forget there is localized backlight control (384 zones) so that works in tandem with the brightness of pixels.

For example if an area is supposed to be light grey then the backlight for that area drops the light a lot to minimize light leakage.

It's not the same as individual pixel control, however it's also far better than an identical monitor without individual LED zones to control.

It's obviously still problematic to have pixels that are very BRIGHT in the same LED zone as pixels that are meant to be very DARK since you need to keep the LED bright in that case thus the pixels that should be dark will be more GREY.

Not perfect but a big improvement. There are monitors coming with more than a thousand zones, plus OLED will eventually get sorted out and probably replace everything.
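To put rough numbers on that, here is a simplified model of a 384-zone FALD backlight. It assumes a naive "zone LED follows its brightest pixel" policy and a ~1000:1 native LCD contrast ratio; the real monitor's dimming algorithm is certainly more sophisticated.

```python
# Simplified FALD model (not ASUS's actual dimming algorithm).
# Assumption: each zone's LED is driven by the brightest pixel it contains,
# so dark pixels that share a zone with a bright highlight leak light.
import numpy as np

PANEL_CONTRAST = 1000            # assumed native LCD contrast ratio (~1000:1)
ZONES_X, ZONES_Y = 24, 16        # 24 x 16 = 384 zones

def zone_black_levels(frame_nits):
    """frame_nits: 2D array of target luminance. Returns best achievable black per zone."""
    h, w = frame_nits.shape
    zh, zw = h // ZONES_Y, w // ZONES_X
    blacks = np.zeros((ZONES_Y, ZONES_X))
    for j in range(ZONES_Y):
        for i in range(ZONES_X):
            zone = frame_nits[j*zh:(j+1)*zh, i*zw:(i+1)*zw]
            backlight = zone.max()                    # LED follows the brightest pixel
            blacks[j, i] = backlight / PANEL_CONTRAST # darkest the LCD can go in that zone
    return blacks

# Example: a single 1000-nit highlight in the corner of an otherwise black frame.
frame = np.zeros((2160, 3840))
frame[:135, :160] = 1000.0
b = zone_black_levels(frame)
print(b.max(), b.min())   # ~1.0 nit in the highlight's zone, 0.0 everywhere else
```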

June 24, 2018 | 06:23 AM - Posted by Bakath

It's not even real 10-bit; I believe it is 8-bit + 2-bit FRC, which is plain stupid.

June 24, 2018 | 10:24 AM - Posted by grovdna (not verified)

As soon as I read "active fan," I switched off. $2,000 for an actively cooled monitor? No thanks very much.

June 24, 2018 | 10:46 AM - Posted by RS (not verified)

Same here...not a big fan of the "Fan"....no pun intended

June 24, 2018 | 06:01 PM - Posted by James

Curious as to how the desktop at SDR is handled when displaying content in HDR in a window. The rest of the desktop getting dimmer would seem to be the expected result. SDR is supposed to specify a range of 0 to 100 nits while HDR can specify a range of 1000 nits, 4000 nits, or more. When you switch the display to some HDR mode, anything that is SDR will have to be converted to HDR in some manner. If you just convert it into 0 to 100 of a 1000 nit range, it will look very dim. Is there a shift in the whole screen when you start an HDR video? You can’t blow up the SDR content to the full HDR range since it would probably cause significant banding.

June 25, 2018 | 05:43 AM - Posted by Monstieur (not verified)

Windows has a slider that effectively sets the SDR content peak brightness level. By default it's at 0 which looks like 100 nits to my eyes.

However it does not perform gamma correction, and the SDR content is incorrectly displayed on the display's PQ gamma curve.
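For reference, the PQ curve in question is the SMPTE ST 2084 EOTF. The constants below are transcribed from the published spec; the sample code values are just illustrative points.

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value in [0, 1] to nits.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(code):
    e = code ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

print(round(pq_eotf(0.508)))   # ~100 nits: roughly where SDR reference white lands
print(round(pq_eotf(0.752)))   # ~1000 nits: this monitor's peak brightness
print(round(pq_eotf(1.0)))     # 10000 nits: PQ's absolute ceiling
```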

June 28, 2018 | 06:08 PM - Posted by Nerdist (not verified)

Curious as to how the desktop at SDR is handled when displaying content in HDR in a window.

Monitors don't work the way you think they do.

SDR doesn't define any brightness range at all. The brightest 8-bit value (255,255,255) simply means "brightest", whereas what that actually means in terms of luminance depends on the brightness setting of your monitor. Nothing more.

HDR values map to actual luminance levels, so the brightness setting on the monitor is typically deactivated/irrelevant. You only get HDR in full screen mode. In a window on the Windows desktop you can only ever get an SDR image, not least because consumer grade GPU's refuse to deliver a 10-bit signal unless forced into full screen HDR mode.

June 24, 2018 | 06:06 PM - Posted by Curtis Vincent (not verified)

So if we are fine with 98Hz, is this then going to be able to deliver the colour accuracy and contrast as advertised? I want one as I'm an artist who does gaming and wants the best of both without the need for separate displays.

June 25, 2018 | 09:00 AM - Posted by Power (not verified)

Good first impressions. Now, please, send the monitor to Rtings for the review.

June 25, 2018 | 04:19 PM - Posted by GonnaWaitForHDMI2 (not verified)

What's the minimum refresh rate for GSync?

June 25, 2018 | 07:04 PM - Posted by John Pombrio

How quickly we forget. I remember buying the 24 inch SONY CRT monitor GDM-FW900 for $2 grand and it weighed a ton. Best CRT Trinitron monitor I ever had. Now I have the ASUS PG279Q monitor and I love it but it was also not cheap. I would get this monitor but even my EVGA GTX 1080 Ti FTW3 would not be able to do it justice. I suppose I should get a 4Kish TV first, heh.

June 26, 2018 | 12:07 PM - Posted by Quentin (not verified)

Your comment about games not being able to push UHD / 4K at 144 Hz is far from correct. You are forgetting about older games. For instance, the original Far Cry is still a great game and runs at 180 fps at 4k.

However, it's clear that DisplayPort 1.4 has insufficient bandwidth, so it's probably worth waiting for HDMI 2.1, which should have enough, along with a refreshed monitor and a new GPU to run it.

June 26, 2018 | 03:09 PM - Posted by Colin Pastuch (not verified)

In what universe does a 27 inch LCD with bad backlighting compare to a 55 inch LG OLED? I have a 1080 Ti and I wouldn't buy this with YOUR money.

June 30, 2018 | 06:45 AM - Posted by Andon M. Coleman (not verified)

This one? I have both of these things and there's no comparison whatsoever. In fact, I was the first consumer in the US to purchase and review LG's 4K OLED when they went on sale in 2015.

You need to treat a 55 inch LG OLED like a plasma, which means even in an ideal light controlled environment, the damn thing's going to dim itself noticeably in a typical PC color scheme with white dominating everything.

June 28, 2018 | 01:13 AM - Posted by Hypetrain (not verified)

So on DisplayPort 1.4 it's 3840x2160+10bit+HDR+4:4:4 @98 Hz,
how high will the upcoming ultra-wide models go at 3440x1440+10bit+HDR+4:4:4?

Not sure if I did the math correctly but shouldn't it be 165 Hz (panel's max is 200 Hz according to announcements)?

June 28, 2018 | 09:25 AM - Posted by Photonboy

You are correct that scaling is proportional to the pixel count assuming everything else is identical.

Also, I still haven't been given an answer about DSC, which allows visually lossless compression; if the monitor and GPU support it you can actually achieve 4K@240Hz.

But the chart (about halfway down in link below) also shows 4K@120Hz using 4:4:4 without DSC so why is this monitor only at 98Hz?

Something doesn't add up.
https://en.wikipedia.org/wiki/DisplayPort

Update: I'm an IDIOT. That's 4:4:4 at 8-bit, not 10-bit.

June 28, 2018 | 06:22 PM - Posted by Nerdist (not verified)

These are the numbers using exact bandwidth calculations:

10-bit = 159 Hz
8-bit = 198 Hz

June 29, 2018 | 04:38 PM - Posted by Danny (not verified)

Coming from 1080P TN Monitor, should I buy 4k 144hz IPS Monitor?

June 30, 2018 | 10:46 AM - Posted by DakTannon(forgotpassword) (not verified)

I actually own a PG27UQ and it is BEAUTIFUL, BUT... I have noticed problems. Running a 3930K@4.8GHz, 32GB 1866MHz CAS9, 2x Titan X (Pascal), Windows 7 Pro 64 on a Samsung 840 with the latest NVIDIA drivers, and Windows 10 Pro 64 on a Samsung 850 EVO, also with the latest drivers. Now with that out of the way: the monitor seems to cause tons of microstutter in games. Even with SLI disabled I get serious microstutter, even in titles like Destiny 2 that ran fine on my XB280HK 4K 60Hz. I am getting 90-120 fps in Destiny, but sometimes it goes into stutter mode, even in the menus, and the only solution is to alt-tab or alt-enter repeatedly and hope, or to close and reopen the game and hope. In games like Fallout 4 that only go to 60 fps it becomes a horrible, unplayable, stuttery mess with momentary stalls followed by black screens before gameplay resumes, and there is no fixing it; even lowering the monitor to 60Hz just makes it worse. Has anyone else experienced this?

July 3, 2018 | 07:31 PM - Posted by Photonboy

Do you have GSYNC enabled or not?

If NO turn it on and if YES try disabling it temporarily... possibly try GSYNC OFF but force Adaptive VSync Half Refresh (which should cap to 72FPS VSYNC ON but disable VSYNC if you can't output 72FPS).

Can you force an FPS cap using NVInspector or some other tool?

I assume GSYNC is ON and you obviously don't want it off but for now maybe try the different options so you can at least narrow down the problem.

Maybe it's an NVidia driver issue, but a quick Google doesn't show much, so maybe it's simply early adopter blues that will get sorted out.

Also, maybe try shutting down and physically removing a video card. I know SLI is supposedly off in software but it's all I can think of if the obvious isn't helping.

Plus of course contacting the monitor manufacturer but I do think it's a software issue.

July 13, 2018 | 11:24 AM - Posted by Onyx1640

I really hope we start seeing more size and resolution options; 1440p at 27 and 32 inches would be a great start. I'd also love to see a 35 or so inch 4k model.

July 28, 2018 | 01:53 AM - Posted by Tom Kaminski (not verified)

Great review! Considering that it has an active fan, I am curious what the power consumption is under different settings/scenarios. Does the power consumption vary with refresh rate, G-SYNC, HDR, etc.? The specs say max 180W, which is a *lot* for a monitor. It would be great to update the review with this info. Thanks.
