ASUS ROG Swift PG27UQ 27" 4K 144Hz G-SYNC Monitor: True HDR Arrives on the Desktop

Author: Ken Addison
Manufacturer: ASUS

Teardown and Hardware Analysis

Once we removed the stock monitor stand to investigate potential fan intake issues with the VESA mount, we couldn't help but keep going to see what makes this display tick.

I'm actually impressed at how easily the PG27UQ came apart compared to some other displays we've disassembled. After removing the stand, we used a metal spudger to release the plastic clips holding the two pieces of the enclosure together. After carefully disconnecting the cables for the LEDs and controls on the back panel, we were left with access to the innards of the panel.

From the general layout, it appears that the two modules on the left and right sides of the display are likely controlling the 384-zone FALD backlight. They connect to a PCB in the middle, which carries the display outputs and is responsible for interfacing directly with the LCD panel.

The LCD panel is the M270QAN02.2 from AU Optronics. While datasheets are available for similar models from AUO, this panel does not yet appear in databases like Panelook.

After disconnecting the cables running to the PCB in the middle, and removing the bracket, we gained access to the electronics responsible for controlling the LCD panel itself.

A New G-SYNC Module

Now that we have a better view of the PCB, we can see exactly what the aforementioned blower fan and heatsink assembly are cooling: the all-new G-SYNC module.

Over the years, there has been a lot of speculation about if and when NVIDIA would move from an FPGA to a cheaper and smaller ASIC solution for controlling G-SYNC monitors. While extensible due to their programmability, FPGAs are generally significantly more expensive than ASICs and take up more physical space.

Removing the heatsink and thermal paste, we get our first peek at the G-SYNC module itself. 

As it turns out, G-SYNC HDR, like its predecessor, is powered by an FPGA from Altera. In this case, NVIDIA is using an Intel Altera Arria 10 GX 480 FPGA. Thanks to the extensive documentation from Intel, including a model number decoder, we are able to get some more information about this particular FPGA.

A mid-range option in the Arria 10 lineup, the GX 480 provides 480,000 reprogrammable logic elements, as well as twenty-four 17.4 Gbps transceivers for I/O. Importantly for this application, the GX 480 also supports 222 pairs of LVDS I/O.

DRAM from Micron can also be spotted on this G-SYNC module. From the datasheet, we can confirm that this is, in fact, a total of 3 GB of DDR4-2400 memory. This memory is likely serving the same lookaside buffer role as the 768 MB on the original G-SYNC module, but it is much higher in capacity and much faster.
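As a quick sanity check on that capacity, a single 3840x2160 frame at 10 bits per color channel works out to roughly 30 MB, so 3 GB is enough for on the order of 100 buffered frames. The arithmetic below is just that back-of-the-envelope math; the lookaside-buffer usage itself is our speculation, not something NVIDIA has documented.

```python
# Rough estimate of how many full frames fit in the G-SYNC HDR module's 3 GB
# of DDR4. Only the panel resolution, bit depth, and DRAM capacity are known;
# the buffering scheme itself is speculative.

WIDTH, HEIGHT = 3840, 2160      # panel resolution
BITS_PER_CHANNEL = 10           # 10-bit HDR color
CHANNELS = 3                    # RGB

bits_per_frame = WIDTH * HEIGHT * BITS_PER_CHANNEL * CHANNELS
bytes_per_frame = bits_per_frame / 8
dram_bytes = 3 * 1024**3        # 3 GB of DDR4-2400 on the module

print(f"One frame: {bytes_per_frame / 2**20:.1f} MiB")          # ~29.7 MiB
print(f"Frames that fit: {dram_bytes / bytes_per_frame:.0f}")   # ~103
```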

While there's not a whole lot we can glean from the specs of the FPGA itself, it starts to paint a clearer picture of the current G-SYNC HDR situation. While our original speculation about the $2,000 price point of the first G-SYNC HDR monitors was mostly based on potential LCD panel cost, it's now clear that the new G-SYNC module accounts for a substantial portion of that cost.

It's an unstocked item without a large bulk-quantity price break, but you can actually find this exact FPGA available to buy on both Digi-Key and Mouser. NVIDIA clearly isn't paying the $2,600 per FPGA that both sites are asking, but it shows that these are not cheap components in the least. I wouldn't be surprised if this FPGA alone makes up $500 of the final price of these new displays, let alone the costly DDR4 memory.


June 22, 2018 | 06:12 PM - Posted by zme-ul

3840 x 2160 is not 4K, what the hell mates! this is UHD
if you don't know the difference, what the hell are you doing in the tech review media?

June 22, 2018 | 06:19 PM - Posted by Anonymouse (not verified)

Everyone knows. Nobody cares. Shut up.

June 22, 2018 | 06:19 PM - Posted by 3840x2160 is 4K (not verified)

The term "4K" is generic, and refers to any resolution with a horizontal pixel count of approximately 4000 pixels. Several different 4K resolutions have been standardized by various organizations.

June 22, 2018 | 07:19 PM - Posted by amdcanprymy8086frommycolddeadhands (not verified)

yea, doesn't the marketing term "HD" refer to both 720 and 1080p? At least in the tech world?

June 23, 2018 | 06:58 AM - Posted by Apersonrespondinghere (not verified)

No, the term for 720p is HD and 1080p is called FullHD/FHD.

June 22, 2018 | 09:40 PM - Posted by Photonboy

It DOES mean 4K if that's the accepted usage by enough people. In fact, dictionaries are constantly modified to change words to what people use whether wrong or not.

Try going into WalMart, stand in the TV section, and tell people that those aren't 4K HDTV's... no they are 3840x2160 so DAMN IT... Samsung should know better!!

June 23, 2018 | 08:38 AM - Posted by zme-ul-is-wrong (not verified)

You need to stop. You're wrong. It's a fact 3840x2160 is 4K. It's widely accepted, except by people who think they're smart when they're not.

Accept you're wrong and move on with your life.

June 25, 2018 | 04:43 PM - Posted by WolfWings (not verified)

TL;DR: 4K = 2160p, that's the standard the industry adopted.

The default aspect ratio these days is 16:9, so 16:9 2160p = 3840x2160

There *ARE* 4096x2160 monitors, but it's a wonky aspect ratio only used by the Digital Cinema Initiatives standards body so it's almost exclusively used by movie projection to efficiently support 2.39:1 through 1.85:1 aspect ratio projections. This is indeed where the term "4K" started.

But 4096x2304 monitor panels aren't made, never have been, never will be, however 3840x2160 has numerous benefits in relation to the existing 1080p/720p sizes because scaling is exact and precise.

So kindly do shut the heck up about "That isn't 4K!" because every standards body has accepted that it is.

June 30, 2018 | 05:17 AM - Posted by Andon M. Coleman (not verified)

Equally meaningless. You speak of DCI-4K as if it were the only thing referred to as 4K while being completely unaware that UHD is not, in fact, a single resolution.

UHD-8K exists, DCI-8K doesn't. So, let's continue this stupid discussion in the future when digital cinema catches up to UHD and you'll swear up and down that movie theaters aren't 8K.

June 22, 2018 | 07:06 PM - Posted by Anony mouse (not verified)

You guys didn't go into as much detail as the YouTubers are providing.

Is it running 4:2:0 / 4:2:2 / 4:4:4 ?

Is the FALD noticeable during letterboxed in-game cinematics or media content (movies, online videos)?

June 23, 2018 | 02:15 AM - Posted by Aldo (not verified)

Should be running 4:2:2 if you go above 98Hz. Really this is a 4K 98Hz monitor. The 144Hz stuff was sort of marketing BS. Chroma subsampling has never really been used on desktop monitors.

June 23, 2018 | 01:45 PM - Posted by ZhaZha (not verified)

Actually, without sub-sampling it can reach 120Hz for 4K SDR and 98Hz for 4K HDR. Anything above that will require subsampling (4:2:2). So I would say it's a 4K 120Hz monitor.
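Those figures line up with a back-of-the-envelope bandwidth check. Taking DisplayPort 1.4's HBR3 payload as 4 lanes x 8.1 Gbps x 0.8 (after 8b/10b coding), or about 25.92 Gbps, and counting active pixels only (real timings add a few percent of blanking, which is why the actual HDR ceiling is 98 Hz rather than slightly higher), a rough sketch:

```python
# Approximate data rates for 3840x2160 at different bit depths and chroma
# formats versus the DisplayPort 1.4 HBR3 payload. Active pixels only; real
# video timings add blanking, so the true limits sit a little lower.

DP14_PAYLOAD_GBPS = 4 * 8.1 * 0.8   # 4 lanes x 8.1 Gbps, 8b/10b coding ~= 25.92

def data_rate_gbps(refresh_hz, bits_per_pixel, width=3840, height=2160):
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = [
    ("8-bit RGB 4:4:4 @ 120 Hz (SDR)", 120, 24),   # 3 channels x 8 bits
    ("10-bit RGB 4:4:4 @ 98 Hz (HDR)",  98, 30),   # 3 channels x 10 bits
    ("10-bit RGB 4:4:4 @ 120 Hz",      120, 30),
    ("10-bit YCbCr 4:2:2 @ 144 Hz",    144, 20),   # chroma halved: ~2 samples/pixel
]

for name, hz, bpp in modes:
    rate = data_rate_gbps(hz, bpp)
    verdict = "fits" if rate <= DP14_PAYLOAD_GBPS else "does NOT fit"
    print(f"{name}: {rate:.1f} Gbps -> {verdict}")
```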

June 22, 2018 | 08:00 PM - Posted by Axle Grease

It doesn't look good.

http://www.guru3d.com/news-story/asus-and-acer-uhd-g-sync-hdr-monitors-f...

June 22, 2018 | 10:01 PM - Posted by Photonboy

PCPER can you find out about THIS?

I'm confused still:
https://en.wikipedia.org/wiki/DisplayPort

Go down roughly half way and there's a chart (Bandwidth is based on uncompressed 4:4:4.) that shows 4K at 144Hz using 4:2:2 with DP1.3...

Then right next to it we see 4K at 144Hz using DP1.4 under DSC (lossless compression). Actually supports up to 240Hz via 4:2:0 or 4:4:4 via DSC.

My understanding is that the need to drop to 98Hz or drop to 4:2:2 is a problem that DSC resolves.

NVidia PASCAL does support DSC:
http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-1080-review,2.html

(correct me if I'm wrong but if DSC is supported by v1.4 it's required for a card stating "DP1.4" to have DSC?)

It's inconceivable this monitor would not.

*So we should be able to get up to 4K 4:4:4 on this monitor?!!

June 22, 2018 | 10:41 PM - Posted by asdkjfhlaskhdfhlasdfka (not verified)

DSC is "visually" lossless, not mathematically.

June 23, 2018 | 02:08 AM - Posted by Aldo (not verified)

It's not a requirement for cards with DP1.4 to support DSC. I believe that is an optional feature. Regardless, Nvidia's Gsync module does not support DSC, even though Freesync scalers do.

June 25, 2018 | 06:13 AM - Posted by Spunjji

Nice to know that $500 in the BOM goes to a module that was out of date before it launched. xD

June 26, 2018 | 07:33 PM - Posted by Photonboy

1) Do you have any PROOF that GSync Modules lack DSC support? I find that highly unlikely considering how long the standard has been out AND its high relevance to the purpose of the GSync Module.

2) As for "visually lossless" above. While I appreciate the distinction there are two issues then:

a) First off, I thought that implied it was difficult if not impossible for the average person to tell the difference between true lossless and visually lossless, and

b) if DSC was available then why bother switching to 4:2:2 subsampling which apparently DOES have obvious artifacting and/or color differences?

It seems likely to me that if DSC was available it should be utilized.

So either one, some or all of the following lack support for DSC:
a) NVidia Pascal (and other) GPU's,
b) the Asus monitor
c) NVidia drivers
d) Windows?

I thought DSC was already utilized for Workstation setups though I could be wrong but either way I'm baffled why DSC isn't apparently working here.

At this point my BEST GUESS is that it's most GPU's (even Pascal?) that lack support and that the MONITOR probably has it.

June 23, 2018 | 12:50 AM - Posted by CraigN (not verified)

So, just curious, is HDR *only* able to be enabled at 4K? I know that using monitors at their non-native resolution is not ideal, but is there anything stopping a user from buying this and running it at 2560x1440 at 144 Hz with HDR enabled?

Thanks!

June 23, 2018 | 02:02 AM - Posted by jckl (not verified)

Think its possible to remove and repurpose the FPGA?

June 24, 2018 | 05:25 AM - Posted by ET3D

Best comment! :)

So true. The FPGA is over $2000 to buy separately, and here you pay $2000 and get not only the FPGA, but 3GB of RAM on the board and a very good LCD panel.

June 28, 2018 | 10:29 AM - Posted by Stef (not verified)

Best comment, really! XD

June 23, 2018 | 02:18 AM - Posted by Aldo (not verified)

Really disappointed with this product. Chroma subsampling on a $2000 monitor? Are you kidding me? And on top of that we have to deal with shady marketing which doesn't even mention that you have to run the monitor at 98Hz to avoid it. Not to mention the fan noise and other quality control issues. Asus should have spent another couple of months and got this thing right. For a $2000 product this seems to have a lot of compromises.

June 23, 2018 | 01:57 PM - Posted by ZhaZha (not verified)

Mine arrived two days ago and I am very satisfied with it. I haven't noticed any fan noise, and HDR is really amazing; it turned Far Cry 5 into a totally different game! Now I feel it's really worth the price.

June 26, 2018 | 08:30 PM - Posted by Photonboy

Glad Far Cry 5 looks great, but there may be times when the subsampling artifacts are noticeable according to at least one review.

Also, did the game run above 98FPS for you?

I assume you used GSYNC so unless you output more than 98FPS the monitor was under 98Hz and thus running full 4:4:4 and not reverting to subsampling.

To COMPARE I wonder if there's a way to force 4:2:2 instead of 4:4:4 then switch back and forth (below 98FPS/98Hz) to compare the difference.

June 24, 2018 | 06:30 PM - Posted by James

Pretty much all of the video you see, unless you happen to be a professional video producer, is going to be chroma sub-sampled in some manner. It would only be noticeable for small text really. If you are playing a game where you really think you need 144 Hz at 4K, then you probably are not playing a game where you are reading a lot of small text. You probably aren’t going to be able to achieve 144 Hz in most games at 4K with good quality anyway. If the FPS is really, really important to you, then you could always run it at 1080p with all of the bells and whistles and 144 Hz.

I am not a fan of g-sync at all. Nvidia went to a lot of trouble to try to keep it proprietary without changing the standards. This is exactly the type of thing that should be added to standards though. There should be no vendor lock-in between video cards and displays. It would be nice if they just made variable refresh a required part of the standard; then nvidia would essentially have to support it. The continued expense of using FPGAs is also a problem. It is going to be difficult to do HDR color accuracy and brightness accuracy with a variable refresh rate, but this still doesn’t require a g-sync TCON. It is just going to take a while for standard ASIC TCONs to catch up. It will require a lot of work to get the controller to achieve good color and brightness accuracy, but it is required for FreeSync 2. Designing an ASIC for the job will take longer, but it will be a lot cheaper in volume and probably will not require an active cooling solution, although with the bandwidth required, who knows. Hopefully we will get at least quantum dot enhanced computer displays; I don’t know if there are any available yet. That makes it easier to achieve the color accuracy and brightness for HDR. It will still require very precise control to make it work with varying frame times. OLED probably isn’t coming to computer displays due to burn-in issues with static content, so the best solution seems to be quantum dot enhanced LCD right now.

The difficulties with 144 Hz aren’t really anything to do with variable refresh. If you wanted to run at a constant refresh of 144 Hz HDR, you would still have those limitations due to the higher bit rates required. This just doesn’t come up with TVs since video content generally never exceeds 60 FPS. It would generally be better to implement compression rather than sub-sampling, so that is something that could be done until we have the entire system capable of supporting the necessary bit rates. Running at the ridiculously high bit rates that is required for 4K HDR at high refresh rate is just going to be problematic. A lot of old and/or cheap cables will not be able to do it, which will cause all kinds of problems.

June 26, 2018 | 09:31 PM - Posted by Photonboy

I'm not sure I agree with all your points though they are valid concerns.

1) Good luck FORCING asynchronous to be part of a standard and being forced to implement it (do you have to support EVERYTHING in DP1.4 or DirectX12 on every GPU? No.) Yes, it would be nice but I don't know how you could force that.

2) This monitor is Quantum Dot already.

What we still need is further improvements on the LCD panel or preferably switch completely to OLED once OLED manufacturing improves image retention (not burn-in) and reduces cost.

Multiple LED backlights are used to provide local dimming but that would not be necessary with better LCD panel control to avoid light leakage or OLED that produces its own light per sub-pixel.

3) How come AMD escapes peoples wrath anyway? They could have allowed NVidia cards to work with Freesync. NVidia probably would have put support in their drivers as it would sell more GPU's (which is probably more important than selling more GSync monitors).

NVidia worked to create asynchronous monitors which we likely would not even have without their investment. Somebody had to spend the cash to get this started. Hey, I'd prefer an open standard too instead of a walled garden but if we're going to play the blame game then AMD should be in there too.

Hopefully Freesync 2 HDR (I believe it's "2" and "HDR" both) HDTV's and monitors for gaming consoles helps put pressure on NVidia at some point but so long as there's no financial incentive good luck. That's just how business works.

(XBox One X is experimenting with Freesync 2 HDR... good video at Digital Foundry. FYI, it's still got some big issues so hopefully that can be mostly sorted out in driver support. For example, one game stuttered when it was in the asynchronous range. Why? Well, it's a lot more complicated than you may think it should be.)

4) NVidia made a proprietary module as they apparently discovered the current methods would be limited such as the difficult of overdrive working with a variable refresh rate so we agree on that. They discussed adding other features later on too.

You may question the use of an FPGA, but programmability is the REASON for that. Plus, it stands to reason they will find ways to reduce cost when it makes sense, because I've got to assume there's little profit for NVidia in this and perhaps they are even LOSING money... in fact they mentioned early on that they wanted to add support in later modules for V-by-One instead of LVDS to reduce cost, so cost has certainly been on their minds from the start. If they are paying up to $500 per module, it's not sustainable.

So certainly an ASIC is coming. I have no doubt a low-cost solution will show up in the near future (next couple years). Possibly they still need the flexibility of programming the FPGA via firmware until they get the bugs ironed out, including programming for each PANEL type (AHVA etc).

A custom (ASIC) module might later make it as cheap or even CHEAPER to build a similar monitor than going Freesync, as the module price should plummet, and much of the work done by NVidia won't need to be repeated by individual monitor manufacturers (like they are being forced to do now with Freesync 2 HDR, which adds to the cost).

So if the cost can drop to $50 or whatever for a drop-in solution (which already also eliminates the scaler), AND if it gives some benefit vs Freesync then it should be all worth it for NVidia. I have to assume that's the long-term plan.

Probably iron out all the HARDWARE bugs to the point they can just load up individual panel specs from a separate chip into the ASIC.

5) Compression vs sub-sampling: not my area, but if the cost is negligible then we should simply have BOTH options and implement whichever makes the most sense: DSC 4:4:4 should the GPU and monitor support it, or 4:2:2 sub-sampling if not.

6) Cables: don't disagree, but that's progress. Not sure if a proper DP1.4 cable is included or not but for $2000USD I certainly hope so.

June 28, 2018 | 05:19 PM - Posted by Nerdist (not verified)

3) How come AMD escapes peoples wrath anyway? They could have allowed NVidia cards to work with Freesync. NVidia probably would have put support in their drivers as it would sell more GPU's (which is probably more important than selling more GSync monitors).

Your bias appears to be clouding your technical comprehension.

Any GPU can work with Freesync. It's an open VESA standard, meaning any VESA member has access to the spec and can implement it.

AMD isn't preventing anybody from supporting FreeSync. nVidia is deliberately not supporting FreeSync for obvious reasons.

This issue isn't a "blame game". It's just facts. Like most companies, nVidia has an economic interest in tying customers to their "ecosystem" and making it more difficult for customers to switch to the competition. That's part of what G-SYNC does. That's not evil. It's just business. But it's clearly also not what is best for consumers.

AMD doesn't do this. If the tables were turned and AMD was in the dominant position, AMD would behave no differently however. It's business!

July 12, 2018 | 12:53 PM - Posted by Photonboy

Ah, well Freesync original may be open source but "Freesync 2 HDR" which is what they are transitioning to DOES require an AMD graphics card. I guess I'm half right so I'm half wrong too.
https://www.tomshardware.com/news/amd-freesync-2-hdr-lfc,33248.html

"There’s a lot of coordination that needs to happen between game developers, AMD, and display vendors, so it remains to be seen how enthusiastically AMD’s partners embrace FreeSync 2, particularly because the technology is going to be proprietary for now."

June 30, 2018 | 05:32 AM - Posted by Andon M. Coleman (not verified)

Actually, OLED burn-in is an issue that needs to be resolved. Even with LG's panel aging nonsense that goes on when you turn their set off, my 4K OLED has the LG calibration menu permanently burnt-in.

Of course LG is stupid for using a white menu system while manipulating the TV's built-in controls, but they at least did us all a favor and put to rest any doubt that OLED will burn in.

Panasonic or any respectable vendor would have DCC calibration. Probably better to just use a separate video processor until LG figures out what the hell they're doing.

July 12, 2018 | 01:08 PM - Posted by Photonboy

Burn-In is apparently a non-issue for newer HDTV's. The issue now is Image Retention. The difference is that burn-in is permanent or at least can be difficult to get rid of whereas Image Retention is similar to ghosting in that a pixel can retain its color for too long if held at the same or similar frequency and/or brightness (I don't know the exact details).

THIS sort of agrees AND conflicts with what I said though (you have to just read it):
https://www.cnet.com/news/oled-screen-burn-in-what-you-need-to-know/

It's partly confusing because OLED screens vary in type and of course PHONES typically have a more static image.

*anyway, some articles say a recent, quality OLED HDTV is almost impossible to achieve BURN-IN whereas Image Retention remains an issue. One is permanent the other is not.

Either way it makes screens unsuitable as desktop monitors until this is sorted out.

June 23, 2018 | 03:40 AM - Posted by Dark_wizzie

Any panel uniformity issues, based on your naked eye? The 1440p 27in 144Hz G-Sync displays had garbage QC.

June 23, 2018 | 05:16 AM - Posted by Kommando Kodiak (not verified)

Reading this article on the monitor being reviewed, I got mine yesterday

June 23, 2018 | 01:58 PM - Posted by ZhaZha (not verified)

lol, me too.

June 23, 2018 | 02:38 PM - Posted by Curtis Vincent (not verified)

What's it like? I'm wanting to get one as I'm a games art student; I want great colour accuracy but also high refresh so I can use it for work and games. It's not out in the UK yet and prices are stupidly high, but I'm hoping it's worth it?

June 23, 2018 | 07:15 AM - Posted by DM (not verified)

Hopefully nobody buys it so nvidia will be forced to support freesync.

June 25, 2018 | 06:16 AM - Posted by Spunjji

There are at least 2 people in the comments here claiming to have bought it already. I really wish people wouldn't for the exact reason you've posted but some people don't care and others need it for their epeen, so who knows.

June 27, 2018 | 11:10 AM - Posted by Stef (not verified)

Freesync 2 requires the use of a proprietary API by game developers to work with HDR.

June 23, 2018 | 07:26 AM - Posted by Jabbadap

Hmm, does gsync work with hdmi? Or was that old rumor a dud?

June 23, 2018 | 08:31 AM - Posted by Anonymouse (not verified)

"For clarification, the 98Hz limit represents the refresh rate at which the monitor switches from 4:4:4 chroma subsampling to 4:2:2 subsampling. While this shouldn't affect things such as games and movies to a noticeable extent, 4:2:2 can result in blurry or difficult to read text in certain scenarios. "

You can go up to UHD @ 120Hz at 8 bits per channel; the 98Hz DisplayPort limit is for 10 bpc. While you would be advised against using a wide gamut output at 8 bpc (due to more noticeable banding on gradients), you can still enable FALD and use the larger dynamic range at 120Hz. This also means 120Hz sRGB desktop use is not an issue.

------

I'd also be willing to wager that with the 11xx/20xx series cards, 144Hz 4:4:4 may be an option with RAMDACs 'overachieving' over the DP 1.3/1.4 clock rate of 810MHz. That could be using the notional 'DP 1.5' pre-final standard (as Nvidia are part of VESA) or it could be a 'just for Nvidia' clock rate.

June 23, 2018 | 10:01 AM - Posted by Jabbadap

There are no RAMDACs on new graphics cards. No analog outputs, no digital-to-analog converter. I really don't know if that DVI-like pixel overclock works through DisplayPort. But yeah, if that could work, the G-Sync module has to be able to take that clock too.

June 25, 2018 | 03:52 AM - Posted by psuedonymous

RAMDACs have been the norm for so long it just ends up as the colloquial term for "that PHY layer bit" even if it's actually a TMDS modulator!

DisplayPort does not 'overclock' like DVI or HDMI does, as it is not a pixel-clock-based transport but a packetised transport. There are 4 valid symbol rates (162MHz, 270MHz, 540MHz, 810MHz) and if your resolution/refresh rate does not 'use up' all the bandwidth available at that symbol rate that bandwidth is just 'wasted'.
For Nvidia to push more bandwidth via DP, they would either need to be using the notional DP 1.5 symbol rate (with no guarantee their cards will ever be DP 1.5 compliant when the standard is ratified, because there are more changes than that and all sorts of timing requirements to meet), or just an arbitrary rate that they can pick because they make the devices on both ends of the cable (the GPU, and the G-Sync panel controller).

June 23, 2018 | 09:53 AM - Posted by Anonymous34ffds322 (not verified)

Rumor is ASUS has put a hold on stores selling these, and anyone who purchased one should return it for a firmware update. These monitors are garbage for what you are paying; not even 10-bit panels lol

https://old.reddit.com/r/nvidia/comments/8t406e/asus_recalls_their_4khdr...

June 23, 2018 | 11:57 AM - Posted by jcchg

How does it compare with the Samsung CHG70?

June 23, 2018 | 12:06 PM - Posted by KingKookaluke (not verified)

1-99-9999! Ha!~

June 23, 2018 | 12:26 PM - Posted by Monstieur (not verified)

There is no need to run the monitor in 10-bit YCbCr422 120 Hz mode for HDR. 8-bit RGB 120 Hz is visually identical, with no banding. It's only required that the application render to a 10-bit DirectX 11 surface. The NVIDIA driver automatically performs 10-bit to 8-bit dithering which eliminates banding, allowing you to run the game in 8-bit RGB 120 Hz mode and avoid chroma subsampling. The 10-bit YCbCr422 mode is only really needed for consoles or Blu-Ray players on the HDMI port.

June 24, 2018 | 02:13 AM - Posted by Allyn Malventano

In my previous observation of HDR panels, games running in 8-bit HDR certainly had noticeable banding in areas of sky / other areas with low contrast. Also, if there was no need for 10-bit, then it wouldn’t be part of the standard.

June 25, 2018 | 05:32 AM - Posted by Monstieur (not verified)

It's still required that the game render to a 10-bit DirectX 11 surface to prevent banding. The GPU to monitor signal alone can be 8-bit RGB since the GPU performs dithering automatically.

Switching on HDR in a game should make it switch from a 8-bit surface to a 10-bit surface. If it doesn't happen automatically then it'll cause banding.

June 25, 2018 | 05:38 AM - Posted by Monstieur (not verified)

10-bit is part of the standard because the *source content* needs to be 10-bit to present small enough steps that don't result in banding in high luminance scenes. Because it should be possible to transmit 10-bit YCbCr422 video from a Blu-ray player to the display without further processing (like dithering), HDR10 displays are required to accept a 10-bit signal.

However, if the source can perform dithering, an 8-bit signal is sufficient to transmit 10-bit content without banding. In the case of a PC, 8-bit RGB is clearly preferable to 10-bit YCbCr422. If some games are reverting to 8-bit rendering in HDR mode, then it's still a software issue that can be addressed.
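The truncation-versus-dithering argument is easy to demonstrate numerically. A minimal sketch follows, assuming nothing about NVIDIA's actual dither algorithm; it simply compares naive truncation of a 10-bit ramp to 8 bits against quantization with a small amount of added noise.

```python
# Why a dithered 8-bit signal can carry 10-bit content without visible bands:
# dithering trades flat quantization steps for fine noise, so local averages
# still track the 10-bit source. Generic dither, not NVIDIA's implementation.
import numpy as np

rng = np.random.default_rng(0)
gradient_10bit = np.linspace(0, 1023, 4096)          # smooth 10-bit ramp

# Naive truncation to 8 bits: flat steps of four 10-bit codes -> banding.
truncated = np.floor(gradient_10bit / 4)

# Dithered: add up to one 8-bit step of uniform noise before quantizing.
dithered = np.clip(np.floor(gradient_10bit / 4 + rng.uniform(0, 1, 4096)), 0, 255)

# Compare local averages (a crude stand-in for how the eye integrates fine
# noise but perceives flat bands) against the original 10-bit ramp.
block = 64
src   = gradient_10bit.reshape(-1, block).mean(axis=1)
trunc = (truncated.reshape(-1, block) * 4).mean(axis=1)
dith  = (dithered.reshape(-1, block) * 4).mean(axis=1)

print("max local error, truncated:", np.abs(trunc - src).max())  # ~2 codes, biased low
print("max local error, dithered: ", np.abs(dith - src).max())   # a small fraction of a code
```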

June 30, 2018 | 05:43 AM - Posted by Andon M. Coleman (not verified)

The NVIDIA driver creates banding in this scenario, actually. If allowed to run in fullscreen exclusive mode, any 10-bit RGB:A surface on all G-Sync displays I have tested get a delightful red/green banding that resembles chromatic aberration.

The only workaround is to force the game into windowed mode, and engage flip model presentation rather than BitBlit. This allows G-Sync to work and prevents said artifacts.

July 24, 2018 | 03:24 PM - Posted by Monstieur (not verified)

I don't have this issue with my LG OLED. I always set the display to 8-bit RGB, and a 10-bit DX11 surface is always dithered without banding in both borderless window and fullscreen exclusive mode.

June 23, 2018 | 12:39 PM - Posted by JohnGR

True Blurring Arrives. :p

June 23, 2018 | 01:40 PM - Posted by EJCRV (not verified)

Who would really use 4K on a 27" monitor? Way too small for that kind of resolution.

June 25, 2018 | 06:39 AM - Posted by Spunjji

I have a 24" 4K monitor. Also had a laptop with a 15" 4K screen. Your comment is inaccurate.

June 23, 2018 | 11:46 PM - Posted by SeriousWriteDownsForGreenThisTime (not verified)

Ha ha, Nvidia made too many GPUs thinking that the GPU coin mining craze was a stable market, and now Nvidia has Pascal SKU stocks up to the rafters. So no new generation until the older generation's inventories are moved.

Really, Nvidia, the only reason your stripped-of-compute Pascal SKUs were purchased by miners was that they could not get their hands on enough AMD Polaris/Vega units, owing to AMD getting burned the first time around with so much unsold inventory to write off.

Now that there are plenty of Polaris/Vega SKUs in stock, with all their shader-heavy designs able to be used for mining, the miners don't need any Pascal (fewer shader cores than AMD's SKUs) for mining. The price of coin has fallen and now Nvidia is having to put off introducing Pascal's successor for gaming.

Old JHH over at Nvidia has taken to wearing a London Fog full-length trench coat while he stands in the back alley trying to sell some overstocked inventory.

June 24, 2018 | 12:33 AM - Posted by MrB (not verified)

True HDR? A "True HDR" display doesn't have a *REAL* native contrast ratio of only 1000:1

It's like the awful dynamic contrast ratio BS that has been pushed for years, only slightly less bad.

Real HDR displays don't exist, and won't exist until this industry makes real concrete improvements to the foundation of display technology instead of adding more issues on top of the existing ones.

June 28, 2018 | 09:11 AM - Posted by Photonboy

I'd say you are PARTLY correct... don't forget there is localized backlight control (384 zones) so that works in tandem with the brightness of pixels.

For example if an area is supposed to be light grey then the backlight for that area drops the light a lot to minimize light leakage.

It's not the same as individual pixel control, however it's also far better than an identical monitor without individual LED zones to control.

It's obviously still problematic to have pixels that are very BRIGHT in the same LED zone as pixels that are meant to be very DARK, since you need to keep the LED bright in that case and thus the pixels that should be dark will look more GREY.

Not perfect but a big improvement. There are monitors coming with more than a thousand zones, plus OLED will eventually get sorted out and probably replace everything.
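A toy model of that trade-off is sketched below, assuming the 384 zones are arranged in a 24 x 16 grid and that each zone simply follows the brightest pixel it covers; the zone count matches the panel, but the control algorithm here is purely illustrative.

```python
# Toy full-array local dimming model: each backlight zone follows the
# brightest pixel it covers, then the LCD attenuates per pixel. A black pixel
# sharing a zone with a bright highlight comes out grey because the panel's
# native contrast is only ~1000:1. Illustrative only, not ASUS's algorithm.
import numpy as np

ZONES_X, ZONES_Y = 24, 16           # 384 zones
PANEL_CONTRAST = 1000.0             # approximate native LCD contrast

target = np.zeros((2160, 3840))     # mostly black frame (normalized 0..1)
target[1000:1100, 1800:1900] = 1.0  # one small bright highlight

zone_h, zone_w = 2160 // ZONES_Y, 3840 // ZONES_X
zones = target.reshape(ZONES_Y, zone_h, ZONES_X, zone_w).max(axis=(1, 3))
backlight = np.kron(zones, np.ones((zone_h, zone_w)))   # per-pixel backlight level
backlight = np.maximum(backlight, 1e-3)                 # zones never fully off

# LCD can only attenuate down to 1/contrast of whatever the backlight emits.
transmittance = np.clip(target / backlight, 1.0 / PANEL_CONTRAST, 1.0)
displayed = backlight * transmittance

print(f"black far from the highlight:  {displayed[0, 0]:.1e}")      # deep black
print(f"black in the highlight's zone: {displayed[1000, 1790]:.1e}")  # ~1e-3, grey
```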

June 24, 2018 | 06:23 AM - Posted by Bakath

It's not even real 10-bit; I believe it is 8-bit + 2-bit FRC, which is plain stupid.

June 24, 2018 | 10:24 AM - Posted by grovdna (not verified)

As soon as I read "active fan", I switched off. $2,000 for an actively cooled monitor? No thanks very much.

June 24, 2018 | 10:46 AM - Posted by RS (not verified)

Same here...not a big fan of the "Fan"....no pun intended

June 24, 2018 | 06:01 PM - Posted by James

Curious as to how the desktop at SDR is handled when displaying content in HDR in a window. The rest of the desktop getting dimmer would seem to be the expected result. SDR is supposed to specify a range of 0 to 100 nits while HDR can specify a range of 1000 nits, 4000 nits, or more. When you switch the display to some HDR mode, anything that is SDR will have to be converted to HDR in some manner. If you just convert it into 0 to 100 of a 1000 nit range, it will look very dim. Is there a shift in the whole screen when you start an HDR video? You can’t blow up the SDR content to the full HDR range since it would probably cause significant banding.

June 25, 2018 | 05:43 AM - Posted by Monstieur (not verified)

Windows has a slider that effectively sets the SDR content peak brightness level. By default it's at 0 which looks like 100 nits to my eyes.

However, it does not perform gamma correction, and the SDR content is incorrectly displayed on the display's PQ gamma curve.

June 28, 2018 | 06:08 PM - Posted by Nerdist (not verified)

Curious as to how the desktop at SDR is handled when displaying content in HDR in a window.

Monitors don't work the way you think they do.

SDR doesn't define any brightness range at all. The brightest 8-bit value (255,255,255) simply means "brightest", whereas what that actually means in terms of luminance depends on the brightness setting of your monitor. Nothing more.

HDR values map to actual luminance levels, so the brightness setting on the monitor is typically deactivated/irrelevant. You only get HDR in full screen mode. In a window on the Windows desktop you can only ever get an SDR image, not least because consumer grade GPU's refuse to deliver a 10-bit signal unless forced into full screen HDR mode.
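For reference, HDR10 signaling uses the SMPTE ST 2084 "PQ" transfer function, which maps code values to absolute luminance. The sketch below implements the standard's inverse EOTF to show where a 100-nit SDR white lands on that curve; the constants come from the standard, while the Windows/GPU mapping behavior described above is not something this code verifies.

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal 0..1.
# Shows why 100-nit SDR white sits only about halfway up the 10-bit HDR code
# range even though it is "full brightness" in SDR terms.

m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map luminance in cd/m^2 (0..10000) to a normalized PQ signal."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (0.1, 1, 100, 1000, 10000):
    signal = pq_encode(nits)
    print(f"{nits:>7} nits -> PQ {signal:.3f} -> 10-bit code {round(signal * 1023)}")
# 100 nits lands near code 520 of 1023; 1000 nits (this panel's rated peak)
# lands roughly three-quarters of the way up the range.
```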

June 24, 2018 | 06:06 PM - Posted by Curtis Vincent (not verified)

So if we are fine with 98Hz, is this then going to be able to deliver the colour accuracy and contrast as advertised? I want one as I'm an artist that does gaming and wants the best of both without the need for separate displays.

June 25, 2018 | 09:00 AM - Posted by Power (not verified)

Good first impressions. Now, please, send the monitor to Rtings for the review.

June 25, 2018 | 04:19 PM - Posted by GonnaWaitForHDMI2 (not verified)

What's the minimum refresh rate for GSync?

June 25, 2018 | 07:04 PM - Posted by John Pombrio

How quickly we forget. I remember buying the 24 inch SONY CRT monitor GDM-FW900 for $2 grand and it weighed a ton. Best CRT Trinitron monitor I ever had. Now I have the ASUS PG279Q monitor and I love it but it was also not cheap. I would get this monitor but even my EVGA GTX 1080 Ti FTW3 would not be able to do it justice. I suppose I should get a 4Kish TV first, heh.

June 26, 2018 | 12:07 PM - Posted by Quentin (not verified)

Your comment about games not being able to push UHD / 4K at 144 Hz is far from correct. You are forgetting about older games. For instance, the original Far Cry is still a great game and runs at 180 fps at 4k.

However, it's clear that DisplayPort 1.4 has insufficient bandwidth, so it's probably worth waiting for HDMI 2.1, which should have enough, along with the refreshed monitor and a new GPU to run it.

June 26, 2018 | 03:09 PM - Posted by Colin Pastuch (not verified)

In what universe does a 27 inch LCD with bad backlighting compare to a 55 inch LG OLED? I have a 1080 Ti and I wouldn't buy this with YOUR money.

June 30, 2018 | 06:45 AM - Posted by Andon M. Coleman (not verified)

This one? I have both of these things and there's no comparison whatsoever. In fact, I was the first consumer in the US to purchase and review LG's 4K OLED when they went on sale in 2015.

You need to treat a 55 inch LG OLED like a plasma, which means even in an ideal light controlled environment, the damn thing's going to dim itself noticeably in a typical PC color scheme with white dominating everything.

June 28, 2018 | 01:13 AM - Posted by Hypetrain (not verified)

So on DisplayPort 1.4 it's 3840x2160+10bit+HDR+4:4:4 @98 Hz,
how high will the upcoming ultra-wide models go at 3440x1440+10bit+HDR+4:4:4?

Not sure if I did the math correctly but shouldn't it be 165 Hz (panel's max is 200 Hz according to announcements)?

June 28, 2018 | 09:25 AM - Posted by Photonboy

You are correct that scaling is proportional to the pixel count assuming everything else is identical.

Also, I still haven't been given an answer about DSC which allows visually lossless compression and if the monitor and GPU support it you can actually achieve 4K@240Hz.

But the chart (about halfway down in link below) also shows 4K@120Hz using 4:4:4 without DSC so why is this monitor only at 98Hz?

Something doesn't add up.
https://en.wikipedia.org/wiki/DisplayPort

Update: I'm an IDIOT. That's 4:4:4 at 8-bit, not 10-bit.

June 28, 2018 | 06:22 PM - Posted by Nerdist (not verified)

These are the numbers using exact bandwidth calculations:

10-bit = 159 Hz
8-bit = 198 Hz
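Those numbers are consistent with dividing DisplayPort 1.4's roughly 25.92 Gbps payload by the per-frame data once blanking overhead is included. A sketch, assuming about 10% blanking overhead as a stand-in for the exact CVT-R2 timings (which vary slightly with resolution):

```python
# Rough maximum refresh rate over DisplayPort 1.4 (HBR3) for a given mode.
# The 10% blanking overhead is an assumed approximation of real CVT-R2
# timings, which is why results land close to, not exactly on, quoted figures.

DP14_PAYLOAD_BPS = 4 * 8.1e9 * 0.8      # 4 lanes x 8.1 Gbps, 8b/10b coding
BLANKING_OVERHEAD = 1.10                # assumed ~10% extra for blanking

def max_refresh_hz(width, height, bits_per_pixel):
    bits_per_frame = width * height * bits_per_pixel * BLANKING_OVERHEAD
    return DP14_PAYLOAD_BPS / bits_per_frame

print(f"3440x1440 10-bit 4:4:4: {max_refresh_hz(3440, 1440, 30):.0f} Hz")  # ~159
print(f"3440x1440  8-bit 4:4:4: {max_refresh_hz(3440, 1440, 24):.0f} Hz")  # ~198
print(f"3840x2160 10-bit 4:4:4: {max_refresh_hz(3840, 2160, 30):.0f} Hz")  # mid-90s
```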

June 29, 2018 | 04:38 PM - Posted by Danny (not verified)

Coming from a 1080p TN monitor, should I buy a 4K 144Hz IPS monitor?

June 30, 2018 | 10:46 AM - Posted by DakTannon(forgotpassword) (not verified)

I actually own a PG27UQ and it is BEAUTIFUL, BUT... I have noticed problems. Running a 3930K @ 4.8GHz, 32GB 1866MHz CAS 9, 2x Titan X (Pascal), Windows 7 Pro 64 on a Samsung 840 with the latest NVIDIA drivers, and Windows 10 Pro 64 on a Samsung 850 EVO, also latest drivers. Now with that out of the way: the monitor seems to cause tons of microstutter in games. Even with SLI disabled I get serious microstutter, even in titles like Destiny 2 that ran fine on my XB280HK 4K 60Hz. I am getting 90-120 fps in Destiny, but sometimes it goes into stutter mode, even in the menus, and the only solution is to alt-tab (or press enter repeatedly and hope, or close and reopen and hope). In games like Fallout 4 that only go to 60 fps it becomes a horrible, unplayable, stuttery mess with momentary stalls followed by black screens before gameplay resumes, and there is no fixing it; even lowering the monitor to 60Hz just makes it worse. Has anyone else experienced this?

July 3, 2018 | 07:31 PM - Posted by Photonboy

Do you have GSYNC enabled or not?

If NO turn it on and if YES try disabling it temporarily... possibly try GSYNC OFF but force Adaptive VSync Half Refresh (which should cap to 72FPS VSYNC ON but disable VSYNC if you can't output 72FPS).

Can you force an FPS cap using NVInspector or some other tool?

I assume GSYNC is ON and you obviously don't want it off but for now maybe try the different options so you can at least narrow down the problem.

Maybe it's an NVidia driver issue, but a quick Google doesn't show much, so maybe it's simply early adopter blues that will get sorted out.

Also, maybe try shutting down and physically removing a video card. I know SLI is supposedly off in software but it's all I can think of if the obvious isn't helping.

Plus of course contacting the monitor manufacturer but I do think it's a software issue.

July 13, 2018 | 11:24 AM - Posted by Onyx1640

I really hope we start seeing more size and resolution options; 1440p at 27 and 32 inches would be a great start. I'd also love to see a 35 or so inch 4k model.

July 28, 2018 | 01:53 AM - Posted by Tom Kaminski (not verified)

Great review! Considering that it has an active fan, I am curious what the power consumption is under different settings/scenarios. Does the power consumption vary with refresh rate, GSYNC, HDR, etc.? The specs say max 180W, which is a *lot* for a monitor. It would be great to update the review with this info. Thanks.
