Subject: Displays
Manufacturer: ASUS

Introduction, Specifications, and Packaging

AMD fans have been patiently waiting for a proper FreeSync display to be released. The first round of displays using the Adaptive Sync variable refresh rate technology arrived with an ineffective or otherwise disabled overdrive feature, resulting in less than optimal pixel response times and overall visual quality, especially when operating in variable refresh rate modes. Meanwhile, G-Sync users enjoyed properly functioning overdrive, as well as a recently introduced 1440P IPS panel from Acer. The FreeSync camp was overdue for a 1440P IPS display superior to that first round of releases, hopefully with those overdrive issues corrected. Well, it appears that ASUS, the makers of the ROG Swift, have just rectified that situation with a panel we can finally recommend to AMD users:

DSC02594.jpg

Before we get into the full review, here is a sampling of our recent display reviews from both sides of the camp:

  • ASUS PG278Q 27in TN 1440P 144Hz G-Sync
  • Acer XB270H 27in TN 1080P 144Hz G-Sync
  • Acer XB280HK 28in TN 4K 60Hz G-Sync
  • Acer XB270HU 27in IPS 1440P 144Hz G-Sync
  • LG 34UM67 34in IPS 2560x1080 21:9 48-75Hz FreeSync
  • BenQ XL2730Z 27in TN 1440P 40-144Hz FreeSync
  • Acer XG270HU 27in TN 1440P 40-144Hz FreeSync
  • ASUS MG279Q 27in IPS 1440P 144Hz FreeSync (35-90Hz) < You are here

The reason there is no minimum refresh rating listed for the G-Sync panels above is explained in our article 'Dissecting G-Sync and FreeSync - How the Technologies Differ', though the short version is that G-Sync can effectively remain in VRR mode down to <1 FPS regardless of the hardware minimum of the display panel itself.

Continue reading as we will look at this new ASUS MG279Q 27" 144Hz 1440P IPS FreeSync display!

Subject: Displays
Manufacturer: Seiki

Introduction and Specifications

Seiki has spent the past few years making quite the entrance into the display market. Starting with LCD TVs, they seemingly came out of nowhere back in April of 2013 with a 50” 4K display that was available at a very competitive price for the time. Since then, we’ve seen a few more display releases out of Seiki, and they were becoming popular among home theater enthusiasts on a budget and gamers who wanted a bigger panel in front of them. Last June, Seiki announced a desktop line of 4K monitors. These would not just be repurposed televisions, but ground-up designs intended for desktop professionals and gamers alike. The most eagerly awaited part of this announcement was the promised 60 Hz support at 4K resolution.

Just under a year later, we are happy to bring you a review of the first entry in this new Seiki Pro lineup:

seikipro1.jpg

Behold, 40 inches of 4K professional desktop display goodness!

Subject: Displays
Manufacturer: Acer

Introduction and Specifications

Displays have been a hot item as of late here at PC Perspective. Today we are looking at the new Acer XB270HU. In short, this is an IPS version of the ASUS ROG Swift. For the long version, it is a 1440P, 144Hz, G-Sync enabled 27 inch display. This is the first G-Sync display released with an IPS panel, which is what makes this release such a big deal. Acer has been pushing hard on the display front, with recent releases of the following variable refresh capable displays:

  • XB270H 27in 1080P 144Hz G-Sync
  • XB280HK 28in 4K 60Hz G-Sync
  • XG270HU 27in 1440P 40-144Hz FreeSync
  • XB270HU 27in 1440P 144Hz G-Sync < You are here

The last entry in that list is the subject of today's review, and it should look familiar to those who have been tracking Acer's previous G-Sync display releases:

DSC01299.JPG

Here's our video overview of this new display. I encourage you to flip through the review as there are more comparison pictures and information to go along with it.

Continue reading our review of the Acer XB270HU 1440P 144Hz IPS G-Sync Monitor!!

Subject: Displays
Manufacturer: LG

A monitor for those that like it long

It takes a lot to really impress someone that sits in front of dual 2560x1600 30-in IPS screens all day, but the LG 34UM95 did just that. With a 34-in diagonal 3440x1440 resolution panel forming a 21:9 aspect ratio, built on LG IPS technology for flawless viewing angles, this monitor creates a work and gaming experience that is basically unmatched in today's market. Whether you need to open up a half-dozen Excel or Word documents, keep an eye on your Twitter feed while looking at 12 browsers or run games at near Eyefinity/Surround levels without bezels, the LG 34UM95 is a perfect option.

Originally priced north of $1200, the 34UM95 and many in LG's 21:9 lineup have dropped in price considerably, giving them more avenues into users' homes. There are obvious gaming advantages to the 34-in display compared to a pair of 1920x1080 panels (no bezel, 20% more pixels) but if you have a pair of 2560x1440 screens you are going to be giving up a bit. Some games might not handle 21:9 resolutions well either, just as we continue to see Eyefinity/Surround unsupported occasionally.

Productivity users will immediately see an improvement, both for those of us inundated with spreadsheets, web pages and text documents as well as the more creative types with Adobe Premiere timelines. I know that Ken would definitely have approved us keeping this monitor here at the office for his use.

Check out the video above for more thoughts on the LG 34UM95!

Subject: Displays
Manufacturer: AMD

What is FreeSync?

FreeSync: What began as merely a term for AMD’s plans to counter NVIDIA’s launch of G-Sync (and a mocking play on NVIDIA’s trade name) has finally come to fruition, keeping the name - and the attitude. As we have discussed, AMD’s Mantle API was crucial to pushing the industry in the correct and necessary direction of lower level APIs, and NVIDIA’s G-Sync deserves the same credit for recognizing and demonstrating the necessity of a move to variable refresh display technology. Variable refresh displays can fundamentally change the way that PC gaming looks and feels when they are built correctly and implemented with care, and we have seen that time and time again with many different G-Sync enabled monitors at our offices. It might finally be time to make the same claims about FreeSync.

But what exactly is FreeSync? AMD has been discussing it since CES in early 2014, claiming that they would bypass the idea of a custom module that needs to be built into a monitor to support VRR, and instead go the route of open standards using a modification to DisplayPort 1.2a from VESA. FreeSync is based on Adaptive Sync, an optional portion of the DP standard that enables a variable refresh rate by expanding the vBlank timings of a display, and it also provides a way of updating EDID (display ID information) to communicate those capabilities to the graphics card. FreeSync itself is simply the AMD brand for this implementation, combining monitors with correctly implemented drivers and GPUs that support the variable refresh technology.
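As a rough sketch of what that vBlank manipulation accomplishes (illustrative only, not AMD's driver logic; the 40-144 Hz window below is just an example of a range a panel might report via its EDID), the display simply waits in vBlank until the next frame is ready, within the limits the panel supports:

# Toy model of variable refresh via vBlank extension. Illustrative only --
# not AMD's actual driver code; the refresh window is an example value.
MIN_HZ, MAX_HZ = 40.0, 144.0           # hypothetical range reported by the panel
MIN_INTERVAL = 1.0 / MAX_HZ            # panel cannot scan out faster than this
MAX_INTERVAL = 1.0 / MIN_HZ            # panel must be refreshed at least this often

def scanout_interval(render_time_s):
    # How long the display sits in vBlank before drawing the next frame.
    # Frame finished early: still honor the panel's minimum scanout interval.
    # Frame running late: the panel cannot wait forever, so it re-shows the
    # previous frame once MAX_INTERVAL is reached.
    return min(max(render_time_s, MIN_INTERVAL), MAX_INTERVAL)

for fps in (144, 90, 60, 45, 30):
    hz = 1.0 / scanout_interval(1.0 / fps)
    print(f"{fps} FPS from the GPU -> panel refreshes at ~{hz:.0f} Hz")

The key point is that, within that window, the refresh interval simply tracks the GPU's render time instead of fighting it.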

disp4.jpg

A set of three new FreeSync monitors from Acer, LG and BenQ.

Fundamentally, FreeSync works in a very similar fashion to G-Sync, utilizing the idea of the vBlank timings of a monitor to change how and when it updates the screen. The vBlank signal is what tells the monitor to begin drawing the next frame, representing the end of the current data set and marking the beginning of a new one. By varying the length of time this vBlank signal is set to, you can force the monitor to wait any amount of time necessary, allowing the GPU to end the vBlank instance exactly when a new frame is done drawing. The result is a variable refresh rate monitor, one that is in tune with the GPU render rate, rather than opposed to it. Why is that important? I wrote in great detail about this previously, and it still applies in this case:

The idea of G-Sync (and FreeSync) is pretty easy to understand, though the implementation method can get a bit more hairy. G-Sync (and FreeSync) introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating to the PC what rate this refresh occurs at, the graphics card now tells the monitor when to refresh in a properly configured G-Sync (and FreeSync) setup. This allows a monitor to match the refresh rate of the screen to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.

slides01.jpg

Gamers today are likely to be very familiar with V-Sync, short for vertical sync, which is an option in your graphics card’s control panel and in your game options menu. When enabled, it forces the monitor to draw a new image on the screen at a fixed interval. In theory, this would work well and the image is presented to the gamer without artifacts. The problem is that games that are played and rendered in real time rarely hold a single specific frame rate. With only a couple of exceptions, game frame rates will fluctuate based on the activity happening on the screen: a rush of enemies, a changed camera angle, an explosion or falling building. Instantaneous frame rates can vary drastically, from 30, to 60, to 90, yet V-Sync forces the image to be displayed only at set fractions of the monitor's refresh rate, which causes problems.
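To put some numbers on that, here is a small sketch of how double-buffered V-Sync quantizes frame delivery on a 60 Hz monitor (a simplified model for illustration, not any particular driver's exact behavior):

import math

REFRESH_HZ = 60.0
REFRESH_INTERVAL = 1.0 / REFRESH_HZ    # ~16.7 ms between fixed refreshes

def vsync_fps(render_time_s):
    # With V-Sync on, a finished frame waits for the next refresh boundary,
    # so every frame ends up occupying a whole number of refresh periods.
    periods = max(1, math.ceil(render_time_s / REFRESH_INTERVAL))
    return REFRESH_HZ / periods

for raw_fps in (90, 60, 55, 45, 29):
    print(f"{raw_fps} FPS rendered -> {vsync_fps(1.0 / raw_fps):.0f} FPS displayed")
# 90 -> 60, 60 -> 60, 55 -> 30, 45 -> 30, 29 -> 20: small swings in render time
# snap the displayed rate between divisors of 60, which is the stutter you feel.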

Continue reading our first impressions of the newly released AMD FreeSync technology!!

Manufacturer: NVIDIA

Introduction

It has been an abnormal week for us here at PC Perspective. Our typical review schedule has pretty much flown out the window, and the past seven days have been filled with learning, researching, retesting, and publishing. That might sound like the norm, but in these cases the process was initiated by tips from our readers. Last Saturday (24 Jan), a few things were brewing:

We had to do a bit of triage here of course, as we can only research and write so quickly. Ryan worked the GTX 970 piece as it was the hottest item. I began a few days of research and testing on the 840 EVO slow down issue reappearing on some drives, and we kept tabs on that third thing, which at the time seemed really farfetched. With the first two items taken care of, Ryan shifted his efforts to GTX 970 SLI testing while I shifted my focus to finding out if there was any credence to this G-Sync laptop thing.

A few weeks ago, an ASUS Nordic Support rep inadvertently leaked an interim build of the NVIDIA driver. This was a mobile driver build (version 346.87) focused on their G751 line of laptops. One recipient of this driver link posted it to the ROG forum back on the 20th. A fellow by the name of Gamenab, owning the same laptop cited in that thread, presumably stumbled across this driver, tried it out, and was more than likely greeted by this popup after the installation completed:

gsync panel connected-.png

Now I know what you’re thinking, and it’s probably the same thing anyone would think. How on earth is this possible? To cut a long story short, while the link to the 346.87 driver was removed shortly after being posted to that forum, we managed to get our hands on a copy of it, installed it on the ASUS G751 that we had in for review, and wouldn’t you know it, we were greeted by the same popup!

Ok, so it’s a popup, could it be a bug? We checked NVIDIA control panel and the options were consistent with that of a G-Sync connected system. We fired up the pendulum demo and watched the screen carefully, passing the machine around the office to be inspected by all. We then fired up some graphics benchmarks that were well suited to show off the technology (Unigine Heaven, Metro: Last Light, etc), and everything looked great – smooth steady pans with no juddering or tearing to be seen. Ken Addison, our Video Editor and jack of all trades, researched the panel type and found that it was likely capable of 100 Hz refresh. We quickly created a custom refresh rate profile, hit apply, and our 75 Hz G-Sync laptop was instantly transformed into a 100 Hz G-Sync laptop!

Ryan's Note: I think it is important here to point out that we didn't just look at demos and benchmarks for this evaluation but actually looked at real-world gameplay situations. Playing through Metro: Last Light showed very smooth pans and rotation, Assassin's Creed played smoothly as well, and flying through Unigine Heaven manually was a great experience. Crysis 3, Battlefield 4, and others played just as well. This was NOT just a couple of demos that we ran through - the variable refresh portion of this mobile G-Sync enabled panel was working and working very well.

custom hz--.png

At this point in our tinkering, we had no idea how or why this was working, but there was no doubt that we were getting a similar experience as we have seen with G-Sync panels. As I digested what was going on, I thought surely this can’t be as good as it seems to be… Let’s find out, shall we?

Continue reading our story on Mobile G-Sync and impressions of our early testing!!

Subject: Displays
Manufacturer: Acer

Technical Specifications

NVIDIA's G-Sync technology and the monitors that integrate it continue to be one of the hottest discussion topics surrounding PC technology and PC gaming. We at PC Perspective have dived into the world of variable refresh rate displays in great detail, discussing the technological reasons for its existence, talking with co-creator Tom Petersen in studio, doing the first triple-panel Surround G-Sync testing, and reviewing several different G-Sync monitors available on the market. We were even the first to find the reason behind the reported flickering at 0 FPS on G-Sync monitors.

IMG_0643.JPG

A lot has happened in the world of displays in the year or more since NVIDIA first announced G-Sync technology, including a proliferation of low cost 4K panels as well as discussion of FreeSync, AMD's standards-based alternative to G-Sync. We are still waiting for our first hands-on time (other than a static demo) with monitors supporting FreeSync / AdaptiveSync, and it is quite likely that will occur at CES this January. If it doesn't, AMD is going to have some serious explaining to do...

But today we are looking at the new Acer XB270H, a 1920x1080 27-in monitor with G-Sync support and a 144 Hz refresh rate; a unique combination. In fact, there is no other 27-in 144 Hz 1080p G-Sync monitor on the market that we are aware of after a quick search of Newegg.com and Amazon.com. But does this monitor offer the same kind of experience as the ASUS ROG Swift PG278Q or even the Acer XB280HK 4K G-Sync panels?

Continue reading our review of the Acer XB270H 1080p 144 Hz G-Sync Monitor!!

Manufacturer: PC Perspective

Overview

We’ve been tracking NVIDIA’s G-Sync for quite a while now. The comments section on Ryan’s initial article erupted with questions, and many of those were answered in a follow-on interview with NVIDIA’s Tom Petersen. The idea was radical – do away with the traditional fixed refresh rate and only send a new frame to the display when the GPU has just completed rendering it. There are many benefits here, but the short version is that you get the low-latency benefit of V-SYNC OFF gaming combined with the image quality (lack of tearing) that you would see if V-SYNC was ON. Despite the many benefits, there are some potential disadvantages that come from attempting to drive an LCD panel at varying periods of time, as opposed to the fixed intervals that have been the norm for over a decade.

IMG_9328.JPG

As the first round of samples came to us for review, the current leader appeared to be the ASUS ROG Swift. A G-Sync 144 Hz display at 1440P was sure to appeal to gamers who wanted faster response than the 4K 60 Hz G-Sync alternative was capable of. Due to what seemed to be large consumer demand, it has taken some time to get these panels into the hands of consumers. As our Storage Editor, I decided it was time to upgrade my home system, placed a pre-order, and waited in anticipation of finally being able to shift from my trusty Dell 3007WFP-HC to a large panel that can handle >2x the FPS.

Fast forward to last week. My pair of ROG Swifts arrived, and some other folks I knew had also received theirs. Before I could set mine up and get some quality gaming time in, my bro FifthDread and his wife both noted a very obvious flicker on their Swifts within the first few minutes of hooking them up. They reported the flicker during game loading screens and mid-game during background content loading in some RTS titles. Prior to hearing from them, the most I had seen were some conflicting and contradictory reports on various forums (not limited to the Swift, though that is the earliest panel and would therefore see the majority of early reports), but now we had something more solid to go on. That night I fired up my own Swift and immediately got to doing what I do best – trying to break things. We have reproduced the issue and intend to demonstrate it in a measurable way, mostly to put some actual data out there to go along with those trying to describe something that is borderline perceptible for mere fractions of a second.

screen refresh rate-.png

First a bit of misnomer correction / foundation laying:

  • The ‘Screen refresh rate’ option you see in Windows Display Properties is actually a carryover from the CRT days. In terms of an LCD, it is the maximum rate at which a frame is output to the display. It is not representative of the frequency at which the LCD panel itself is refreshed by the display logic.
  • LCD panel pixels are periodically updated by a scan, typically from top to bottom. Newer / higher quality panels repeat this process at a rate higher than 60 Hz in order to reduce the ‘rolling shutter’ effect seen when panning scenes or windows across the screen.
  • In order to engineer faster responding pixels, manufacturers must deal with the side effect of faster pixel decay between refreshes. This is balanced by increasing the frequency of scanning out to the panel.
  • The effect we are going to cover here has nothing to do with motion blur, LightBoost, backlight PWM, or LightBoost combined with G-Sync (not currently a thing; even though Blur Busters has theorized on how it could work, their method would not work with how G-Sync is actually implemented today).

With all of that out of the way, let’s tackle what folks out there may be seeing on their own variable refresh rate displays. Based on our testing so far, the flicker only presents itself when a game enters a 'stalled' state. These are periods where you would see a split-second freeze in the action, like a background level load during game play in some titles. It also appears during some game level load screens, but as those are normally static scenes, they would have gone unnoticed on fixed refresh rate panels. Since we were absolutely able to see that something was happening, we wanted to be able to catch it in the act and measure it, so we rooted around the lab and put together some gear to do so. It’s not a perfect solution by any means, but we only needed to observe differences between smooth gaming and the ‘stalled state’ where the flicker was readily observable. Once the solder dust settled, we fired up a game that we knew could instantaneously swing from a high FPS (144) to a stalled state (0 FPS) and back again. As it turns out, EVE Online does this exact thing while taking an in-game screen shot, so we used that for our initial testing. Here’s what the brightness of a small segment of the ROG Swift does during this very event:

eve ss-2-.png

Measured panel section brightness over time during a 'stall' event.

The relatively small ripple to the left and right of center demonstrates the panel output at just under 144 FPS. Panel redraw is in sync with the frames coming from the GPU at this rate. The center section, however, represents what takes place when the input from the GPU suddenly drops to zero. In the above case, the game briefly stalled, then resumed a few frames at 144, then stalled again for a much longer period of time. Completely stopping the panel refresh would result in all TN pixels bleeding towards white, so G-Sync has a built-in failsafe to prevent this by forcing a redraw every ~33 msec. What you are seeing are the pixels intermittently bleeding towards white and periodically being pulled back down to the appropriate brightness by a scan. The low latency panel used in the ROG Swift does this all of the time, but it is less noticeable at 144 Hz, as you can see on the left and right edges of the graph. An additional thing that’s happening here is an apparent rise in average brightness during the event. We are still researching the cause of this on our end, but this brightness increase certainly helps to draw attention to the flicker event, making it even more perceptible to those who might have not otherwise noticed it.
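A crude way to picture what the graph above shows (purely a toy model with made-up numbers, not a characterization of the actual panel): between scans a pixel drifts toward white, so the longer the gap between redraws, the higher its average brightness climbs.

import math

def avg_brightness(redraw_interval_ms, target=0.50, tau_ms=400.0):
    # A pixel is pulled down to `target` at each scan, then decays toward white
    # (1.0) with a made-up time constant `tau_ms`; return its average brightness
    # over one redraw interval. Toy model only, not measured panel behavior.
    t = redraw_interval_ms
    return target + (1.0 - target) * (1.0 - (tau_ms / t) * (1.0 - math.exp(-t / tau_ms)))

for interval_ms in (6.9, 33.0):   # ~144 Hz scanning vs the ~33 msec failsafe redraw
    print(f"{interval_ms} ms between scans -> average brightness {avg_brightness(interval_ms):.3f}")

With these invented constants the ~33 msec case comes out a couple of percent brighter than the ~144 Hz case, which is in the same ballpark as the brightness variation NVIDIA describes in its statement further down.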

Some of you might be wondering why this same effect is not seen when a game drops to 30 FPS (or even lower) during the course of normal game play. While the original G-Sync upgrade kit implementation simply waited until 33 msec had passed before forcing an additional redraw, this introduced judder at 25-30 FPS. Based on our observations and testing, it appears that NVIDIA has corrected this in the retail G-Sync panels with an algorithm that intelligently re-scans at even multiples of the input frame rate in order to keep the redraw rate relatively high, thereby keeping flicker imperceptible – even at very low continuous frame rates.
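Our reading of that behavior can be sketched roughly as follows (our interpretation of the data, not NVIDIA's actual code): once the frame interval grows past the panel's ~33 msec limit, the same frame is simply scanned out again at a multiple of the input rate.

import math

MAX_WAIT_S = 0.033   # roughly the longest the TN panel can go between scans

def redraws_per_frame(frame_interval_s):
    # How many times the current frame is scanned out before the next one
    # arrives. Our interpretation of retail G-Sync behavior, not NVIDIA's code.
    if frame_interval_s <= MAX_WAIT_S:
        return 1                                     # normal VRR operation
    return math.ceil(frame_interval_s / MAX_WAIT_S)  # repeat at a multiple of the input rate

for fps in (45, 30, 24, 10):
    n = redraws_per_frame(1.0 / fps)
    print(f"{fps} FPS input -> {n} scan(s) per frame -> panel redraws at ~{fps * n} Hz")
# 45 -> 45 Hz, 30 -> 60 Hz, 24 -> 48 Hz, 10 -> 40 Hz: the content updates slowly,
# but the panel itself never waits long enough for the pixels to visibly decay.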

A few final points before we go:

  • This is not limited to the ROG Swift. All variable refresh panels we have tested (including 4K) see this effect to a greater or lesser degree than reported here. Again, this only occurs when games instantaneously drop to 0 FPS, and not when those games dip into low frame rates in a continuous fashion.
  • The effect is less perceptible (both visually and with recorded data) at lower maximum refresh rate settings.
  • The effect is not present at fixed refresh rates (G-Sync disabled or with non G-Sync panels).

This post was primarily meant as a status update and to serve as something for G-Sync users to point to when attempting to explain the flicker they are perceiving. We will continue researching, collecting data, and coordinating with NVIDIA on this issue, and will report back once we have more to discuss.

During the research and drafting of this piece, we reached out to and worked with NVIDIA to discuss this issue. Here is their statement:

"All LCD pixel values relax after refreshing. As a result, the brightness value that is set during the LCD’s scanline update slowly relaxes until the next refresh.

This means all LCDs have some slight variation in brightness. In this case, lower frequency refreshes will appear slightly brighter than high frequency refreshes by 1 – 2%.

When games are running normally (i.e., not waiting at a load screen, nor a screen capture) - users will never see this slight variation in brightness value. In the rare cases where frame rates can plummet to very low levels, there is a very slight brightness variation (barely perceptible to the human eye), which disappears when normal operation resumes."

So there you have it. It's basically down to the physics of how an LCD panel works at varying refresh rates. While I agree that it is a rare occurrence, there are some games that present this scenario more frequently (and noticeably) than others. If you've noticed this effect in some games more than others, let us know in the comments section below. 

(Editor's Note: We are continuing to work with NVIDIA on this issue and hope to find a way to alleviate the flickering with either a hardware or software change in the future.)

Subject: Displays
Manufacturer: Philips

Technical Specifications

From the introduction of the first low cost 4K TVs in the form of the SEIKI SE50UY04, through the wild world of MST 4K monitors from ASUS and others, and finally to the release of single stream low cost 4K panels, PC Perspective has been covering the monitor resolution revolution heavily. Just look at these reviews:

philips1.jpg

Today we bring in another vendor's 4K consumer monitor and put it to the test, pitting it against the formidable options from ASUS, Samsung, Acer and others. The Philips 288P6LJEB 4K 60 Hz monitor closely mirrors many of the specifications and qualities of other low-cost 4K panels, but with a couple of twists that help it stand out.

The Philips display is a 28-in class TN panel with a 60 Hz refresh rate when utilizing the DisplayPort 1.2 connection, but it adds connectivity that most other 4K panels in this price range leave off. Here are the specs from Philips:

Continue reading our review of the Philips 288P6LJEB 4K 60 Hz Monitor!!

Subject: Displays
Manufacturer: Acer

Technical Specifications

Here they come - the G-Sync monitors are finally arriving at our doors! A little over a month ago we got to review the ASUS ROG Swift PG278Q, a 2560x1440 144 Hz monitor that was the first retail-ready display to bring NVIDIA's variable refresh technology to consumers. It was a great first option with a high refresh rate along with support for ULMB (ultra low motion blur) technology, giving users a shot at either feature.

Today we are taking a look at our second G-Sync monitor, which will hit the streets sometime in mid-October with an identical $799 price point. The Acer XB280HK is a 28-in 4K monitor with a maximum refresh rate of 60 Hz and, of course, support for NVIDIA G-Sync.

The Acer XB280HK, first announced at Computex in June, is the first 4K monitor on the market to be announced with support for variable refresh. It isn't that far behind the first low-cost 4K monitors to hit the market, period: the ASUS PB287Q and the Samsung U28D590D both shipped in May of 2014 with very similar feature sets, minus G-Sync. I discussed many of the general usability benefits (and issues) that arose when using a consumer 4K panel with Windows 8.1 in those reviews, so you'll want to be sure you read up on that in addition to the discussion of 4K + G-Sync we'll have today.

02.jpg

While we dive into the specifics on the Acer XB280HK monitor today, I will skip over most of the discussion about G-Sync, how it works and why we want it. In our ASUS PG278Q review I had a good, concise discussion on the technical background of NVIDIA G-Sync technology and how it improves gaming.

The idea of G-Sync is pretty easy to understand, though the implementation method can get a bit more hairy. G-Sync introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating to the PC what rate this refresh occurs at, the graphics card now tells the monitor when to refresh in a properly configured G-Sync setup. This allows a monitor to match the refresh rate of the screen to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.

tearing5.jpg

Continue reading our review of the Acer XB280HK 4K G-Sync Monitor!!

Manufacturer: ASUS

The Waiting Game

NVIDIA G-Sync was announced at a media event held in Montreal way back in October, and promised to revolutionize the way the display and graphics card worked together to present images on the screen. It was designed to remove hitching, stutter, and tearing -- almost completely. Since that fateful day in October of 2013, we have been waiting. Patiently waiting. We were waiting for NVIDIA and its partners to actually release a monitor that utilizes the technology and that can, you know, be purchased.

In December of 2013 we took a look at the ASUS VG248QE monitor, the display for which NVIDIA released a mod kit to allow users that already had this monitor to upgrade to G-Sync compatibility. It worked, and I even came away impressed. I noted in my conclusion that, “there isn't a single doubt that I want a G-Sync monitor on my desk” and, “my short time with the NVIDIA G-Sync prototype display has been truly impressive…”. That was nearly 7 months ago, and I don’t think anyone at that time really believed it would be THIS LONG before the real monitors began to show up in the hands of gamers around the world.

IMG_9328.JPG

Since NVIDIA’s October announcement, AMD has been on a marketing path with a technology they call “FreeSync” that claims to be a cheaper, standards-based alternative to NVIDIA G-Sync. They first previewed the idea of FreeSync on a notebook device during CES in January and then showed off a prototype monitor in June during Computex. Even more recently, AMD has posted a public FAQ that gives more details on the FreeSync technology and how it differs from NVIDIA’s creation; it has raised something of a stir with its claims on performance and cost advantages.

That doesn’t change the product that we are reviewing today of course. The ASUS ROG Swift PG278Q 27-in WQHD display with a 144 Hz refresh rate is truly an awesome monitor. What did change is the landscape, from NVIDIA's original announcement until now.

Continue reading our review of the ASUS ROG Swift PG278Q 2560x1440 G-Sync Monitor!!

Subject: Displays
Manufacturer: ASUS

4K for $649

The growth and adoption of 4K resolution panels (most commonly 3840x2160) has really been the biggest story of the past year or so in the world of PC gaming. After a couple of TVs that ran at 3840x2160 over HDMI at 30 Hz found their way into our offices, the first real 60 Hz 4K monitor that I got some hands-on time with was the ASUS PQ321Q. This monitor was definitely targeted at the professional market with its IGZO display (near IPS quality) and somewhat high price tag of $3500. It has since dropped to $2400 or so, but it remains somewhat complicated by the use of MST technology (multi-stream transport) that was required to hit 60 Hz.

Earlier this month I took a look at the Samsung U28D590D 28-in 4K panel that was capable of 60 Hz refresh rates for just $699. This display used a single-stream transport DisplayPort connection to keep setup simple but used a TN panel rather than IPS/IGZO. This meant viewing angles were not as strong (though better than most TN screens you have seen before) but...that price! 

Today we have our second low cost, SST 4K monitor to evaluate, the ASUS PB287Q. We saw it at CES back in January, and with a launch date of June 10th and an MSRP of $649, ASUS is setting itself up for an impressive release.

So what can you expect if you purchase the ASUS PB287Q 4K monitor? In short, you get an adequate screen that won't live up to IPS standards but is just good enough for the PC gamer and productivity user in all of us. You'll also get a form factor that well exceeds that of the Samsung U28D590D, with a fully adjustable stand and VESA mounting. And a price of $649 for a 3840x2160 screen doesn't hurt either.

asus01.jpg

Read on to the next pages for more details on the user experience in Windows 8.1 as well as while gaming to see if this is the right monitor for you to buy this summer!

Continue reading our review of the ASUS PB287Q 4K 60 Hz 28-in Monitor!!

Subject: Displays
Manufacturer: Samsung

3840x2160 for Cheap!!

It has been just over a year since we first got our hands on a 4K display. At the time, we were using a 50-in Seiki 3840x2160 HDTV that ran at a 30 Hz refresh rate and was disappointing in terms of its gaming experience, but impressive in image quality and price ($1500 at the time). Of course, we had to benchmark graphics cards at 4K resolutions, and the results proved what we expected - you are going to need some impressive hardware to run at 4K with acceptable frame rates.

Since that story was published, we saw progress in the world of 4K displays with the ASUS PQ321Q, a 4K monitor (not a TV) that was built to handle 60 Hz refresh rates. The problem, of course, was the requirement for a multi-stream connection that essentially pushes two distinct streams over a single DisplayPort cable to the monitor, each at 1920x2160. While in theory that wasn't a problem, we saw a lot of configuration and installation headaches as we worked through the growing pains of drivers and firmware. Also, it was priced at $3200 when we first reviewed it, though that number has fallen to $2400 recently.

IMG_0053.jpg

Today we are looking at the Samsung U28D590D, the first 4K panel we have seen that supports a 60 Hz refresh rate with a single stream (single tile) implementation. That means that not only do you get the better experiences associated with a 60 Hz refresh rate over a 30 Hz, you also gain a much more simple and compatible installation and setup. No tricky driver issues to be found here! If you have a DisplayPort 1.2-capable graphics card, it's just plug and play.

The Samsung U28D590D uses a 28-in TN panel, which is obviously of a lower quality in terms of colors and viewing angles than the IGZO screen used on the ASUS PQ321Q, but it's not as bad as you might expect based on previous TN panel implementations. We'll talk a bit more about that below. The best part of course is the price - you can find the Samsung 4K panel for as low as $690!

Continue reading our review of the Samsung U28D590D 28-in 4K 60 HZ Monitor!!

Manufacturer: NVIDIA

Quality time with G-Sync

Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology.  When it was first unveiled in October we were at the event and were able to listen to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers.  This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that offer the smoothness advantages of having V-Sync off, while delivering the tear-free images normally reserved for gamers enabling V-Sync.

IMG_8938.JPG

NVIDIA's Prototype G-Sync Monitor

We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics.  All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.

Also on that day, I published a somewhat detailed editorial that dug into the background of V-sync technology, why the 60 Hz refresh rate existed and why the system in place today is flawed.  This basically led up to an explanation of how G-Sync works, including its integration via extended Vblank signals, and detailed how NVIDIA was enabling the graphics card to retake control over the entire display pipeline.

In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync.  In this video we go through quite a bit of technical explanation of how displays work today, and how the G-Sync technology changes gaming for the better.  It is a 1+ hour long video, but I selfishly believe that it is the most concise and well put together collection of information about G-Sync for our readers.

The story today is more about extensive hands-on testing with the G-Sync prototype monitors.  The displays that we received this week were modified versions of the 144Hz ASUS VG248QE gaming panels, the same ones that will in theory be upgradeable by end users sometime in the future.  These monitors are 1920x1080 TN panels and, though they have incredibly high refresh rates, they aren't usually regarded as the highest image quality displays on the market.  However, the story about what you get with G-Sync is really more about stutter (or lack thereof), tearing (or lack thereof), and a better overall gaming experience for the user.

Continue reading our tech preview of NVIDIA G-Sync!!

Subject: Editorial
Manufacturer: Quakecon

The Densest 2.5 Hours Imaginable

John Carmack again kicked off this year's Quakecon with an extended technical discussion about nearly every topic bouncing around his head.  These speeches are somewhat legendary for the depth of discussion on what are often esoteric topics, but they typically expose some very important sea changes in the industry, both in terms of hardware and software.  John was a bit more organized and succinct this year by keeping things in check with some 300 lines of discussion that he thought would be interesting for us.
 
Next Generation Consoles
 
John cut to the chase and started off the discussion about the upcoming generation of consoles.  John was both happy and sad that we are moving to a new generation of products.  He feels that they really have a good handle on the optimizations of the previous generation of consoles to really extract every ounce of performance and create some interesting content.  The advantages of a new generation of consoles are very obvious, and that is particularly exciting for John.
 
31978_06_pre_orders_for_next_gen_xbox_one_controllers_and_headsets_now_open_full.jpg
 
The two major consoles are very, very similar.  There are of course differences between the two, but the basis for the two is very much the same.  As we well know, the two consoles feature APUs designed by AMD and share a lot of similarities.  The Sony hardware is a bit more robust and has more memory bandwidth, but when all is said and done, the similarities outweigh the differences by a large margin.  John mentioned that this was very good for AMD, as they are still in second place in terms of the performance of their current architectures compared to Intel and their world class process technology.
 
Some years back there was a thought that Intel would in fact take over the next generation of consoles.  Larrabee was an interesting architecture in that it melded x86 CPUs with robust vector units in a high speed fabric on a chip.  Given Intel's prowess in process technology, this seemed a logical move for the console makers.  Time has passed, and Intel did not execute on Larrabee as many had expected.  While the technology has been implemented in the current Xeon Phi product, it has never hit the consumer world.
 
Subject: Displays
Manufacturer: ASUS

Specifications and Overview

Talk to most PC enthusiasts today, be they gamers or developers, and ask them what technology they are most interested in for the next year or so and you will most likely hear about 4K somewhere in the discussion.  While the world of consumer electronics and HDTV has been stuck in the rut of 1080p for quite some time now, computers, smartphones and tablets are racing in the direction of higher resolutions and higher pixel densities.  4K is a developing standard that pushes screen resolutions to 4K x 2K pixels, and if you set aside the competing options discussion (3840x2160 versus 4096x2160 being the most prominent), this move is all good news for the industry.

I first dove into the area of 4K displays when I purchased the SEIKI SE50UY04 50-in 4K TV in April for $1300 after it popped up online.  The TV showed up days later, we did an unboxing and preview of the experience, and I was blown away by the quality difference of moving to a 3840x2160 screen, even with the other caveats to be had.  It was a 30 Hz panel, half the refresh rate of a typical LCD computer display today; it had limited functionality, and it honestly wasn't the best quality TV I had ever used.  But it was 4K, it was inexpensive and it was available.

It was hard to beat at the time but the biggest drawback was the lack of 60 Hz support, the ability for the screen to truly push 60 frames per second to the panel.  This caused some less than desirable results with Windows usage and even in gaming where visual tearing was more prominent when Vsync was disabled.  But a strength of this design was that it only required a single HDMI connection and would work with basically any current graphics systems.  I did some Frame Rating game performance testing at 4K and found that GPU horsepower was definitely a limiting factor. 

IMG_9767.JPG

Today I follow up our initial unboxing and preview of the ASUS PQ321Q 4K monitor with a more thorough review and summary of our usage results.  There is quite a bit that differs between our experience with the SEIKI and the ASUS panels and it is more than just the screen sizes.

Continue reading our review of the ASUS PQ321Q 4K 60 Hz Tiled Monitor!!

Subject: Displays
Manufacturer: ASUS

Some more 4K love!

This morning FedEx dropped off a new product at our offices, one that I was very eagerly awaiting: the ASUS PQ321Q 31.5-in 4K 60 Hz monitor!

pq321-1.jpg

While we are far from ready to post a full review of the display and have lots more game testing to get to, we did host a live stream for the unboxing and initial testing of the PQ321Q that I think is worth sharing.

In this video we do a walk around the $3500 4K display, hook it up to both NVIDIA and AMD test beds at 60 Hz and then proceed to install 3-Way SLI Titans to see how it games!  Enjoy this quick preview before our full review of the ASUS PQ321Q.

UPDATE: This display is now available for purchase if you want to shell out the $3500!

Manufacturer: Oculus

Our first thoughts and impressions

Since first hearing about the Kickstarter project that raised nearly 2.5 million dollars from over 9,500 contributors, I have eagerly been awaiting the arrival of my Oculus Rift development kit.  Not because I plan on quitting the hardware review business to start working on a new 3D, VR-ready gaming project but just because as a technology enthusiast I need to see the new, fun gadgets and what they might mean for the future of gaming.

I have read other users' accounts of their time with the Oculus Rift, including a great write-up in Q&A form from Ben Kuchera over at the Penny Arcade Report, but I needed my own hands-on time with the consumer-oriented VR (virtual reality) product.  Having tried it for very short periods of time at both Quakecon 2012 and CES 2013 (less than 5 minutes), I wanted to see how it performed and, more importantly, how my body reacted to it.

I don't consider myself a person that gets motion sick.  Really, I don't.  I fly all the time, sit in the back of buses, ride roller coasters, watch 3D movies and play fast-paced PC games on large screens.  The only instances I tend to get any kind of unease with motion are on what I call "roundy-round" rides, the kind that simply go in circles over and over.  Think about something like The Scrambler or the Teacups at Disney World.  How would I react to time with the Oculus Rift?  This was my biggest fear...

For now I don't want to get into the politics of the Rift, how John Carmack was initially a huge proponent of the project and then backed off on how close we might be to the higher-quality consumer version of the device.  We'll cover those aspects in a future story.  For now I only had time for some first impressions.

Watch the video above for a walk through of the development kit as well as some of the demos, as best can be demonstrated in a 2D plane! 

Continue on to the full story for some photos and my final FIRST impressions of the Oculus Rift!

Manufacturer: PC Perspective

And Why the Industry Misses the Point

3d_01_title2.png

I am going to take a somewhat unpopular stance: I really like stereoscopic 3D. I also expect to change your mind and get you excited about stereoscopic 3D too - unless of course a circumstance such as monovision interferes with your ability to see 3D at all. I expect to succeed where the industry has failed simply because I will not ignore the benefits of 3D in my explanation.

Firstly - we see a crisp image when our brain is more clearly able to make out objects in a scene.

We typically have two major methods of increasing the crispness of an image: we either increase the resolution or we increase the contrast of the picture. As resolution increases we receive a finer grid of positional information to place and contain the objects in the scene. As contrast increases we receive a wider difference between the brightest points and the darkest points from a scene which prevents objects from blending together in a mess of grey.

We are also able to experience depth information by comparing the parallax effect across both of our eyes. We are able to encapsulate each object into a 3D volume and position each capsule a more defined distance apart. Encapsulated objects appear crisper because we can more clearly see them as sharply defined independent objects.
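To put a rough number on that parallax cue (a simplified similar-triangles model with example viewing distances, nothing specific to the screenshot below), here is how far apart the same point lands in the left- and right-eye views at the screen plane:

EYE_SEPARATION_M = 0.063    # typical interocular distance, example value
SCREEN_DISTANCE_M = 0.60    # viewer sitting 60 cm from the display, example value

def screen_parallax_m(object_depth_m):
    # Horizontal offset between the left- and right-eye projections of a point
    # at `object_depth_m` from the viewer. Positive means the point sits behind
    # the screen plane, negative means it pops out in front of it.
    return EYE_SEPARATION_M * (object_depth_m - SCREEN_DISTANCE_M) / object_depth_m

for depth in (0.4, 0.6, 1.2, 10.0):
    print(f"object at {depth} m -> {screen_parallax_m(depth) * 1000:.1f} mm of on-screen parallax")

Points at the screen distance have zero parallax, while distant objects approach the full eye separation; it is that per-object offset the brain fuses into the sense of crisply separated volumes described above.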

Be careful with this stereoscopic 3D image. To see the 3D effect you must slowly cross your eyes until the two images align in the center. This should only be attempted by adults with fully developed eyes and without prior medical conditions. Also, sit a comfortable distance away so you do not need to cross your eyes too far inward, and rest your eyes until they no longer feel strained. In short - do not pull an eye muscle or something. Use common sense. Also, move your mouse cursor far away from the image, as it will break your focusing lock, and click on the image to make it full sized.

3d_03.png

Again, be careful when crossing your eyes to see stereoscopic 3D and relax them when you are done.

The above image is a scene from Unreal Tournament 3 laid out in a cross-eyed 3D format. If you are safely able to experience the 3D image then I would like you to pay careful attention to how crisp the 3D image appeared. Compare this level of crispness to either the left or right eye image by itself.

Which has the crisper picture quality?

That is basically why 3D is awesome: it makes your picture quality appear substantially better by giving your brain more information about the object. This effect can also play with how the brain perceives the world you present it: similar to how HDR tonal mapping plays with exposure ranges we cannot see and infrared photography plays with colors we cannot see to modify the photograph - which we can see - for surreal effects.

So what goes terribly wrong? Read on to find out.

Subject: Displays
Manufacturer:

From Viewers Like You...

About two months ago, a viewer of the podcast that Ryan co-hosts on the This Week in Tech network, This Week in Computer Hardware, wrote in with some information that immediately excited the staff here at PC Perspective. Ryan has long been of the opinion that the proliferation of 1080p displays, and the prohibitive cost of high resolution monitors, has been holding the industry back as a whole. With talk of 4K displays being introduced for consumers this year, a major topic on the podcast in the weeks prior to this viewer email had centered on why we haven't seen affordable 2560x1440 (or 2560x1600) displays.

22081ba9_1024X768.jpeg

This brings us back to the knowledge which the listener Jeremy bestowed upon us.  Jeremy brought to our attention that various eBay sellers were reselling and exporting generic 27", IPS, LED-backlit, 2560x1440 monitors from South Korea. What is remarkable about these displays, however, is that various models can be found for around, or even under, $350. Everyone listening, including Ryan and his co-host Patrick Norton, became immediately interested in these monitors, and I went into research mode.

Continue reading our review of the 27-in Achieva Shimian 2560x1440 monitor!