Manufacturer: PC Perspective

Overview

We’ve been tracking NVIDIA’s G-Sync for quite a while now. The comments section on Ryan’s initial article erupted with questions, and many of those were answered in a follow-on interview with NVIDIA’s Tom Petersen. The idea was radical – do away with the traditional fixed refresh rate and only send a new frame to the display once the GPU has finished rendering it. There are many benefits here, but the short version is that you get the low-latency benefit of V-SYNC OFF gaming combined with the image quality (lack of tearing) that you would see with V-SYNC ON. Despite the many benefits, there are some potential disadvantages that come from attempting to drive an LCD panel at varying intervals, as opposed to the fixed intervals that have been the norm for over a decade.


As the first round of samples came to us for review, the current leader appeared to be the ASUS ROG Swift. A 144 Hz G-Sync display at 1440p was sure to appeal to gamers who wanted faster response than the 4K 60 Hz G-Sync alternative could offer. Due to what seemed to be large consumer demand, it has taken some time to get these panels into the hands of consumers. As our Storage Editor, I decided it was time to upgrade my home system, placed a pre-order, and waited in anticipation of finally shifting from my trusty Dell 3007WFP-HC to a large panel that can handle >2x the FPS.

Fast forward to last week. My pair of ROG Swifts arrived, and some other folks I knew had also received theirs. Before I could set mine up and get some quality gaming time in, my bro FifthDread and his wife both noted a very obvious flicker on their Swifts within the first few minutes of hooking them up. They reported the flicker during game loading screens and during mid-game background content loading in some RTS titles. Prior to hearing from them, the most I had seen were some conflicting and contradictory reports on various forums (not limited to the Swift, though that is the earliest panel and would therefore see the majority of early reports), but now we had something more solid to go on. That night I fired up my own Swift and immediately got to doing what I do best – trying to break things. We have reproduced the issue and intend to demonstrate it in a measurable way, mostly to put some actual data out there to go along with the reports of those trying to describe something that is borderline perceptible for mere fractions of a second.


First, a bit of misnomer correction and foundation laying:

  • The ‘Screen refresh rate’ option you see in Windows Display Properties is actually a carryover from the CRT days. For an LCD, it is the maximum rate at which a frame is output to the display. It is not representative of the frequency at which the LCD panel itself is refreshed by the display logic.
  • LCD panel pixels are periodically updated by a scan, typically from top to bottom. Newer / higher quality panels repeat this process at a rate higher than 60 Hz in order to reduce the ‘rolling shutter’ effect seen when panning scenes or windows across the screen.
  • In order to engineer faster-responding pixels, manufacturers must deal with the side effect of faster pixel decay between refreshes. This is balanced by increasing the frequency of scanning out to the panel (see the sketch after this list).
  • The effect we are going to cover here has nothing to do with motion blur, LightBoost, backlight PWM, or LightBoost combined with G-Sync (not currently a thing; Blur Busters has theorized on how it could work, but their method would not work with how G-Sync is actually implemented today).
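
To put a rough number on that last point, here is a toy Python model of pixel decay between scans. The exponential decay toward white and the 100 ms time constant are purely illustrative assumptions, not measured panel characteristics – the takeaway is just that longer gaps between scans let pixels drift further before the next refresh pulls them back:

    import math

    TAU_MS = 100.0  # assumed decay time constant toward white (illustrative)
    TARGET = 0.50   # commanded brightness (0 = black, 1 = white)

    def drift(scan_interval_ms):
        """Brightness drift toward white between two consecutive scans."""
        return (1.0 - TARGET) * (1.0 - math.exp(-scan_interval_ms / TAU_MS))

    for hz in (144, 60, 30):
        print(f"{hz:>3} Hz scan rate: pixels drift {drift(1000.0 / hz):.1%} toward white")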

With all of that out of the way, let’s tackle what folks out there may be seeing on their own variable refresh rate displays. Based on our testing so far, the flicker presents itself only when a game enters a 'stalled' state. These are periods where you would see a split-second freeze in the action, like during a background level load in the middle of game play in some titles. It also appears during some game level load screens, but as those are normally static scenes, the flicker would have gone unnoticed on fixed refresh rate panels. Since we were absolutely able to see that something was happening, we wanted to catch it in the act and measure it, so we rooted around the lab and put together some gear to do so. It’s not a perfect solution by any means, but we only needed to observe differences between smooth gaming and the ‘stalled’ state where the flicker was readily observable. Once the solder dust settled, we fired up a game that we knew could instantaneously swing from a high FPS (144) to a stalled state (0 FPS) and back again. As it turns out, EVE Online does this exact thing while taking an in-game screen shot, so we used that for our initial testing. Here’s what the brightness of a small segment of the ROG Swift does during this very event:


Measured panel section brightness over time during a 'stall' event.

The relatively small ripples to the left and right of center demonstrate the panel output at just under 144 FPS. Panel redraw is in sync with the frames coming from the GPU at this rate. The center section, however, represents what takes place when the input from the GPU suddenly drops to zero. In the above case, the game briefly stalled, then resumed a few frames at 144, then stalled again for a much longer period of time. Completely stopping the panel refresh would result in all TN pixels bleeding towards white, so G-Sync has a built-in failsafe to prevent this by forcing a redraw every ~33 msec. What you are seeing are the pixels intermittently bleeding towards white and periodically being pulled back down to the appropriate brightness by a scan. The low latency panel used in the ROG Swift does this all of the time, but it is less noticeable at 144 Hz, as you can see on the left and right edges of the graph. An additional thing happening here is an apparent rise in average brightness during the event. We are still researching the cause of this on our end, but this brightness increase certainly helps draw attention to the flicker event, making it even more perceptible to those who might not otherwise have noticed it.
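
For the curious, the failsafe behavior is simple enough to sketch in a few lines of Python. The ~33 msec timeout matches our measurements, but the logic below is our own reconstruction, not NVIDIA's firmware:

    # Sketch of a variable-refresh scheduler with a forced-redraw failsafe.
    # The ~33 ms timeout is from our measurements; the rest is our own
    # reconstruction, not NVIDIA's actual firmware logic.

    FAILSAFE_MS = 33.0  # max time the panel may go without a scan

    def scan_times(frame_arrivals_ms, duration_ms, step_ms=0.1):
        """Return the times (ms) at which the panel gets scanned."""
        scans = []
        last_scan = 0.0
        pending = sorted(frame_arrivals_ms)
        t = 0.0
        while t < duration_ms:
            if pending and pending[0] <= t:
                last_scan = pending.pop(0)  # new frame: scan it immediately
                scans.append(last_scan)
            elif t - last_scan >= FAILSAFE_MS:
                last_scan = t               # stall: re-scan the old frame
                scans.append(t)
            t += step_ms
        return scans

    # 144 FPS (~6.9 ms frames) for 30 frames, then a 200 ms stall:
    arrivals = [i * 6.9 for i in range(30)]
    print(scan_times(arrivals, 420.0))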

Some of you might be wondering why this same effect is not seen when a game drops to 30 FPS (or even lower) during the course of normal game play. While the original G-Sync upgrade kit implementation simply waited until 33 msec had passed before forcing an additional redraw, this introduced judder from 25-30 FPS. Based on our observations and testing, it appears that NVIDIA has corrected this in the retail G-Sync panels with an algorithm that intelligently re-scans at even multiples of the input frame rate in order to keep the redraw rate relatively high, and therefore keep flicker imperceptible – even at very low continuous frame rates.
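
Here is our best guess at that multiplier logic, again as a speculative Python sketch rather than a confirmed implementation – the 30 Hz threshold below is an assumption inferred from our measurements:

    # Speculative sketch: keep the effective scan rate high at low frame
    # rates by re-scanning each frame an integer number of times.
    # The 30 Hz threshold is an assumption inferred from our measurements.

    MIN_SCAN_HZ = 30  # below this, pixel decay becomes visible as flicker

    def redraws_per_frame(input_fps):
        """How many times to scan each frame out to the panel."""
        multiple = 1
        while input_fps * multiple < MIN_SCAN_HZ:
            multiple += 1
        return multiple

    for fps in (144, 45, 24, 15, 10):
        m = redraws_per_frame(fps)
        print(f"{fps:>3} FPS -> scan each frame {m}x ({fps * m} Hz effective)")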

A few final points before we go:

  • This is not limited to the ROG Swift. All variable refresh panels we have tested (including 4K) see this effect to a greater or lesser degree than reported here. Again, this only occurs when games instantaneously drop to 0 FPS, and not when those games dip into low frame rates in a continuous fashion.
  • The effect is less perceptible (both visually and with recorded data) at lower maximum refresh rate settings.
  • The effect is not present at fixed refresh rates (G-Sync disabled or with non G-Sync panels).

This post was primarily meant as a status update and to serve as something for G-Sync users to point to when attempting to explain the flicker they are perceiving. We will continue researching, collecting data, and coordinating with NVIDIA on this issue, and will report back once we have more to discuss.

During the research and drafting of this piece, we reached out to and worked with NVIDIA to discuss this issue. Here is their statement:

"All LCD pixel values relax after refreshing. As a result, the brightness value that is set during the LCD’s scanline update slowly relaxes until the next refresh.

This means all LCDs have some slight variation in brightness. In this case, lower frequency refreshes will appear slightly brighter than high frequency refreshes by 1 – 2%.

When games are running normally (i.e., not waiting at a load screen, nor a screen capture) - users will never see this slight variation in brightness value. In the rare cases where frame rates can plummet to very low levels, there is a very slight brightness variation (barely perceptible to the human eye), which disappears when normal operation resumes."

So there you have it. It's basically down to the physics of how an LCD panel works at varying refresh rates. While I agree that it is a rare occurrence, there are some games that present this scenario more frequently (and noticeably) than others. If you've noticed this effect in some games more than others, let us know in the comments section below. 

(Editor's Note: We are continuing to work with NVIDIA on this issue and hope to find a way to alleviate the flickering with either a hardware or software change in the future.)

The MSI GTX 980 GAMING 4G and its fancy new fan

Subject: Graphics Cards | December 1, 2014 - 02:52 PM |
Tagged: msi, nvidia, GTX 980, GAMING 4G, factory overclocked, Twin Frozr V

MSI has updated their Twin Frozr V with Torx fans, which are effective at moving a lot of air very quietly, and 'S'-shaped heatpipes which bear the name SuperSU. Connectivity is provided by dual-link DVI-I, HDMI, and three DisplayPort plugs, which ought to provide enough flexibility for anyone. It is clocked at 1216-1331 MHz out of the box with GDDR5 running at an effective 7 GHz. [H]ard|OCP managed to increase that to 1406-1533 MHz with 7.16 GHz on the memory, which is rather impressive for a Maxwell chip with NVIDIA's power limits and shows just how much you can squeeze out of the new chip without needing to up the amount of juice you are providing it. The overclocked card raised full system wattage to 378W, which was much lower than the R9 290X they tested against, and the GPU temperature topped out at 70C when pushed to the limit, again lower than the AMD card, though NVIDIA's selling price is certainly higher. Check out their full review here.


"The MSI GTX 980 GAMING 4G video card has a factory overclock and the new Twin Frozr V cooling system. We'll push it to its highest custom overclock and pit it against the ASUS ROG R9 290X MATRIX Platinum overclocker, and determine the gaming bang for your buck. May the best card win."


Source: [H]ard|OCP
Manufacturer: MSI

Card Overview

It has been a couple of months since the release of the GeForce GTX 970 and the GM204 GPU that it is based on. After the initial wave of stock on day one, NVIDIA admittedly struggled to keep these products available. Couple that with rampant concerns over coil whine on some non-reference designs, and you can see why we were a bit hesitant to spend our time on retail GTX 970 reviews.


These issues appear to be settled for the most part. Finding GeForce GTX 970 cards is no longer a problem and users with coil whine are getting RMA replacements from NVIDIA's partners. Because of that, we feel much more comfortable reporting our results with the various retail cards that we have in house, and you'll see quite a few reviews coming from PC Perspective in the coming weeks.

But let's start with the MSI GeForce GTX 970 4GB Gaming card. Based on user reviews, this is one of the most popular retail cards. MSI's Gaming series combines a custom cooler that typically runs quieter and more efficiently than the reference design with a price tag that is within arm's reach of the lower cost options.

The MSI GeForce GTX 970 4GB Gaming

MSI continues with its Dragon Army branding, and its associated black/red color scheme, which I think is appealing to a wide range of users. I'm sure NVIDIA would like to see a green or neutral color scheme, but hey, there are only so many colors to go around.


Continue reading our review of the MSI GeForce GTX 970 Gaming graphics card!!

ASUS Announces GeForce GTX 970 DirectCU Mini: More Mini-ITX Gaming Goodness

Subject: Graphics Cards | November 29, 2014 - 09:57 AM |
Tagged: pcie, PCI Express, nvidia, mini-itx, GTX 970, graphics card, geforce, directcu mini, DirectCU, asus

ASUS has announced a tiny new addition to their GTX 970 family, and it will be their most powerful mini-ITX friendly card yet with a full GeForce GTX 970 GPU.


Image credit: ASUS

The ASUS 970 DirectCU Mini card will feature a modest factory overclock on the GTX 970 core running at 1088 MHz (stock 1050 MHz) with a 1228 MHz Boost Clock (stock 1178 MHz). Memory is not overclocked and remains at the stock 7 GHz speed.


The GTX 970 DirectCU Mini features a full backplate. Image credit: ASUS

The ASUS GTX 970 DirectCU Mini uses a single 8-pin PCIe power connector in place of the standard dual 6-pin configuration, which shouldn’t be a problem: a single 8-pin is rated for the same 150W as two 6-pins, and combined with the 75W available from the PCIe slot itself that leaves plenty of headroom over the 145W NVIDIA spec of the 970.


Part of this complete mITX gaming breakfast. Image credit: ASUS

The tiny card offers a full array of display outputs including a pair of dual-link DVI connectors, HDMI 2.0, and DisplayPort 1.2. No word yet on pricing or availability, but the product page is up on the ASUS site.

Oak Ridge National Laboratory Chooses IBM and NVIDIA for Two Supercomputers, Summit and Sierra

Subject: General Tech, Systems | November 27, 2014 - 08:53 PM |
Tagged: nvidia, IBM, power9, Volta

The Oak Ridge National Laboratory has been interested in a successor for their Titan Supercomputer. Sponsored by the US Department of Energy, the new computer will be based on NVIDIA's Volta (GPU) and IBM's POWER9 (CPU) architectures. Its official name will be “Summit”, and it will have a little sibling, “Sierra”. Sierra, also based on Volta and POWER9, will be installed at the Lawrence Livermore National Laboratory.


Image Credit: NVIDIA

The main feature of these supercomputers is expected to be “NVLink”, which is said to allow unified memory between CPU and GPU. This means that, if you have a workload that alternates rapidly between serial and parallel tasks, you can avoid the lag of transferring memory at each switch. One example would be a series of for-each loops over a large data set with a bit of logic, checks, and conditional branches between them. Each memory transfer acts like a lag between chunks of work, especially across two banks of memory attached by a slow bus.
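
As a rough illustration (with made-up numbers, not measured PCIe or NVLink figures), here is a back-of-the-envelope cost model of what those per-switch copies do to an alternating workload:

    # Back-of-the-envelope cost model for a workload alternating between
    # serial (CPU) and parallel (GPU) phases. All numbers are illustrative
    # assumptions, not measured PCIe or NVLink figures.

    def total_time_ms(phases, compute_ms, data_mb, bus_gb_per_s, unified):
        # Without unified memory, each phase pays a copy to the GPU and a
        # copy back before the CPU can continue.
        copy_ms = 0.0 if unified else (data_mb / 1024) / bus_gb_per_s * 1000
        return phases * (compute_ms + 2 * copy_ms)

    # 1000 alternating phases, 5 ms of compute each, 256 MB touched per phase:
    print(total_time_ms(1000, 5.0, 256, bus_gb_per_s=16, unified=False))  # explicit copies
    print(total_time_ms(1000, 5.0, 256, bus_gb_per_s=16, unified=True))   # unified memory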

Summit and Sierra are both built by IBM, while Titan, Oak Ridge's previous supercomputer, was developed by Cray. Not much is known about the specifics of Sierra, but Summit will be about 5x-10x faster (peak computational throughput) than its predecessor at less than a fifth of the nodes – which works out to something like 25x-50x the throughput per node. Despite the fewer nodes, it will draw more total power (~10MW, up from Titan's ~9MW).

These two supercomputers are worth $325 million USD (combined). They are expected to go online in 2017. According to Reuters, an additional $100 million USD will go toward research into "extreme" supercomputing.

Source: Anandtech

For those who dare to taunt the honey badger

Subject: General Tech | November 26, 2014 - 02:42 PM |
Tagged: far cry 3, amd, nvidia, gaming

Far Cry 4 uses the same engine as the previous game, Dunia Engine 2, albeit updated and modified for the new features GPUs can handle, especially NVIDIA's GameWorks features. This gives you some idea of how your system will handle the game, but for a definitive look at performance, check out this review at [H]ard|OCP. For their testing they used the GeForce 344.75 WHQL driver on the GTX 980 and 970 and the Catalyst 14.11.2 Beta for the R9 290X and 290. On the Ultra preset running at 1440p, the performance differences between the AMD and NVIDIA cards were negligible; once they started testing the new features such as the enhanced godrays and AA options, there were some significant differences which you should educate yourself about. It is worth noting that even two GTX 980s in SLI at 3600x1920 are not capable of handling 8x MSAA; thankfully SMAA is supported in the game.


"Far Cry 4 is here, and we take an early look at how current video cards stack up in performance, and which quality settings are graphically demanding. We will also look at some image quality comparisons and talk about the state of this game at launch. Will it measure up to Far Cry 3 in terms of graphic fidelity?"


Source: [H]ard|OCP

NVIDIA SHIELD Gets Two New Games for GRID: Psychonauts and Red Faction: Armageddon

Subject: General Tech, Mobile | November 25, 2014 - 07:21 PM |
Tagged: nvidia, shield, grid, shield tablet, Psychonauts, red faction: armageddon

Last Tuesday, NVIDIA launched the November SHIELD update with Android 5.0 Lollipop, The Green Box promotion, and a refreshed GRID service. Regarding the last part, which is a game streaming service, they also committed to adding at least one extra title per week. Now, seven days later, they pushed two titles to the service: Psychonauts and Red Faction: Armageddon.


While I have never played Red Faction: Armageddon, I did purchase Psychonauts for the Xbox and, later, the PC. It is a fun, linear narrative about kids in a summer camp that specializes in telekinetic/telepathic education for gifted individuals. If you have a SHIELD device, and you are able to play it on GRID, try it. Like it or not, it's free and does not require an installation.


As will be the case until June 30th, access to the service is free for owners of the SHIELD and SHIELD Tablet. Future titles are expected to be announced on Twitter via the “#GRIDTuesday” hashtag. We will probably have a news post about them, too.

Source: NVIDIA

Podcast #327 - NVIDIA MFAA, Corsair's Neutron XT SSD, New Dell 4K Monitors

Subject: General Tech | November 20, 2014 - 02:40 PM |
Tagged: podcast, video, msi, am3+, windows 10, Inateck, corsair, Neutron XT, nvidia, mfaa, shield, grid, gigabyte, raptr, Dell 4K

PC Perspective Podcast #327 - 11/20/2014

Join us this week as we discuss NVIDIA MFAA, Corsair's Neutron XT SSD, New Dell 4K Monitors

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Samsung Announces First FreeSync UHD Monitors

Subject: Displays | November 20, 2014 - 10:50 AM |
Tagged: TN, Samsung, nvidia, monitor, ips, g-sync, freesync, amd

We have been teased for the past few months about when we would see the first implementations of AMD’s FreeSync technology, but now we finally have some concrete news about who will actually be producing these products.

Samsung has announced that they will be introducing the world’s first FreeSync enabled Ultra HD monitors.  The first models to include this feature will be the updated UD590 and the new UE850.  These will be introduced to the market in March of 2015.  The current UD590 monitor is a 28” unit with 3840x2160 resolution and up to 1 billion colors.  This looks to be one of those advanced TN panels that are selling from $500 to $900, depending on the model.


AMD had promised journalists some hands-on time by the end of this year, and shipping products in the first half of next year.  It seems that Samsung is the first to jump on the bandwagon, and we would imagine that others will be offering the technology as well.  In theory this technology offers many of the same benefits as NVIDIA’s G-SYNC, but it does not require the same level of hardware.  I can imagine that we will be seeing some interesting comparisons next year with shipping hardware to show how FreeSync stacks up to G-SYNC.

Joe Chan, Vice President of Samsung Electronics Southeast Asia Headquarters commented, “We are very pleased to adopt AMD FreeSync technology to our 2015 Samsung Electronics Visual Display division’s UHD monitor roadmap, which fully supports open standards.  With this technology, we believe users including gamers will be able to enjoy their videos and games to be played with smoother frame display without stuttering or tearing on their monitors.”

Source: Samsung

Testing Updated NVIDIA GRID on SHIELD Tablet with Lollipop

Subject: Mobile | November 18, 2014 - 10:40 AM |
Tagged: tegra, shield tablet, shield, nvidia, grid

In December of last year we took NVIDIA's GRID technology through some testing and discussed our experiences in text and video. At that point you were able to play 8 specific games as part of a beta program. The experience was pretty good and a definite improvement over my first attempt at streaming games (OnLive). Here is what I wrote last year:

Overall my experience with the first beta of GRID was very positive including both latency and image quality.  Yes, there were definitely times when we got a lot of macro-blocking due to bandwidth hiccups, but they were infrequent.  You could tell pretty much anytime there was motion on the screen that you were watching a video rather than native gameplay, but I think the effect is much less apparent now than it was when I first tried services like OnLive.

Input latency is also definitely seen, and was most evident in my testing with Street Fighter IV.  You can even see some of it in our video embedded on this post.  That is something that NVIDIA claims to have really optimized for with their integrated H.264 encoding on the server GPUs, but getting more servers in more locations will help tremendously moving forward.

Today, along with the official roll out of the Android 5.0 Lollipop software update for the SHIELD Tablet, the NVIDIA GRID service goes into official release. What exactly that means is up in the air, as the service is still set to be free to all SHIELD and SHIELD Tablet users through June 2015. What I can tell you is that the quality of the experience has been improved and the game selection has expanded quite a bit, with more to come.


Setup of GRID is much easier now, as long as you have the appropriate hardware to get GRID service up and running. That means a SHIELD Portable or SHIELD Tablet with SHIELD Controller. Beyond that, these are the items that stand out:

  • Internet connection with at least a 10 Mbps download speed
  • Home network with 60 ms or less ping time to a GRID server
  • NVIDIA GameStream-ready 5 GHz Wi-Fi router

I have asked for the geographic locations of the GRID servers, as that will definitely be a factor in your ability to get the appropriate 60 ms or lower ping time. (UPDATE: NVIDIA tells me that the current locations are Oregon and Virginia.) The list of compatible routers has been growing over the last year as well, including models from Netgear, D-Link, Buffalo and ASUS. If you don't already have one of these routers, you can still TRY to use the GRID service, but it won't be officially supported by NVIDIA.
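
If you want to sanity-check your own connection, a rough round-trip-time estimate is easy to script. Note that the hostname below is a placeholder (NVIDIA has not published GRID server addresses) and TCP connect time only approximates a true ping:

    # Rough round-trip-time check for game streaming suitability.
    # "grid.example.com" is a placeholder -- NVIDIA has not published
    # GRID server addresses. TCP connect time only approximates ping.
    import socket, time

    def estimate_rtt_ms(host, port=443, attempts=5):
        best = None
        for _ in range(attempts):
            start = time.perf_counter()
            try:
                with socket.create_connection((host, port), timeout=2):
                    pass
            except OSError:
                continue  # skip failed attempts
            rtt = (time.perf_counter() - start) * 1000
            best = rtt if best is None else min(best, rtt)
        return best

    rtt = estimate_rtt_ms("grid.example.com")
    if rtt is None:
        print("No response")
    else:
        print(f"~{rtt:.0f} ms", "(meets the 60 ms target)" if rtt <= 60 else "(above the 60 ms target)")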


LEGO Batman 2

The list of games available to play on NVIDIA GRID has expanded as well.

  • Alan Wake's American Nightmare
  • Astebreed
  • Batman: Arkham Asylum
  • Batman: Arkham City
  • Borderlands
  • Borderlands 2
  • Brutal Legend
  • Darksiders
  • Darksiders 2
  • Dead Island
  • Dirt 2
  • LEGO Batman 2
  • LEGO Marvel Super Heroes
  • Race Driver GRID
  • Strike Suit Zero
  • Saints Row: The Third
  • Street Fighter X Tekken
  • The Witcher 2: Assassins of Kings
  • Trine 2
  • Ultra Street Fighter IV

There are some great titles in here, including Borderlands, Saints Row, The Witcher 2, the Batman games, etc., and if you haven't played them before, then getting access to them for free is awesome. Even better, NVIDIA has committed to adding one new game each week between now and June of next year. NVIDIA has also upgraded the login / account system to move away from being associated solely with the device; it now uses your Google account login information to register save data.


In terms of game quality and gaming experience, I would say that GRID continues to improve. I spent some time with DiRT 2, LEGO Batman 2, Street Fighter IV and The Witcher 2 and in all cases the games looked great, with very little macro-blocking or stutter. We tested on both our office connection (1.0 Gbps fiber) and my home connection (30 Mbps cable) and the results were pretty much the same.

For those concerned with input latency, there is definitely still some there, most apparent in fighting games like Street Fighter IV. With Borderlands and Borderlands 2 being the only FPS games in the collection, you could likely assume that the twitch-style actions of those types of shooters would be most affected. Titles like Street Fighter IV and DiRT 2, for those of us that don't consider ourselves experts, can be adjusted to; you can train your mind to compensate for the added input latency relative to playing games locally.


DiRT 2

With the SHIELD Tablet, another possible use for GRID is to play these streaming games on your TV. The tablet itself has an HDMI output and is capable of outputting 1080p to your big screen. With the SHIELD Controller you can get a true couch gaming experience with GRID; I am looking forward to showing this to my niece and nephews over the Thanksgiving holiday and getting some reactions and feedback.


The Witcher 2

The other big news today is the release of SHIELD Tablet software update 2.0, which includes Android 5.0 Lollipop, updates for the new GRID release, and an updated NVIDIA Dabbler v2.0 program. We'll have more thoughts on that software update very soon, but you can get more details on the upgrades Lollipop provides for NVIDIA's tablet right here.