Subject: Graphics Cards | December 9, 2014 - 03:08 PM | Jeremy Hellstrom
Tagged: amd, catalyst, driver, omega
With AMD's new leader and restructuring comes a new type of driver update. The Omega driver is intended to deliver a large batch of new features as well as performance updates once a year. It does not replace the current cycle of Beta and WHQL driver updates; the next regular driver will incorporate all of the Omega driver's changes along with whatever new bug fixes or updates that release is meant to address.
Many sites, including The Tech Report, have had at least a small amount of time to test the new driver and have not seen much in the way of installation issues, nor, unfortunately, much in the way of performance improvements on systems not using an AMD APU. As more testing time elapses and more reviews come out we may see improvements on low-end systems, but for now the higher-end machines show little to no improvement in raw FPS. Keep your eyes peeled for an update once we have had time to test the driver's effect on frame pacing, which is far more important than just increasing your FPS.
The main reason to be excited about this release is the long list of new features, headlined by a DSR-like feature called Virtual Super Resolution, which allows you to render above the native resolution of your monitor; for now, 4K super resolution is limited to the R9 285, as it is the only Tonga-based AMD card on the market at the moment. Along with the release of the Omega driver comes news about FreeSync displays, another feature enabled in the new driver, and their availability: we have a release window of January or February, with a 4K model arriving in March.
Check out the link to The Tech Report and the others below to read the full list of new features that this driver brings, and don't forget to click on Ryan's article as well.
"AMD has introduced what may be its biggest graphics driver release ever, with more than 20 new features, 400 bug fixes, and some miscellaneous performance improvements."
Here are some more Graphics Card articles from around the web:
- Introducing the new AMD Catalyst Omega Driver @ [H]ard|OCP
- AMD Catalyst 14.12 Omega Performance Analysis @ techPowerUp
- AMD Catalyst Omega Drivers; Details & Performance @ Hardware Canucks
- AMD Catalyst Omega Launch @ Kitguru
- XFX R9 285 Double Dissipation Black Edition Review @ OCC
- Alpenföhn Peter 2 on GTX 970 @ HardwareOverclock
- Colorful iGame GTX 970 4096 MB @ techPowerUp
- ASUS GTX 980 STRIX OC Review @ Hardware Canucks
- Swiftech Komodo R9 LE GPU Water Block Review @ OCIA.net
There are smart people who work at AMD. A quick look at the company's products, including the APU lineup as well as the discrete GPU field, clearly indicates deep talent in engineering, design, marketing, and business. It's not perfect of course, and very few companies can claim to be, but AMD's strengths are there and easily discernible to those of us on the outside looking in with the right perspective.
Because AMD has smart people working hard to improve the company, they are also aware of its shortcomings. For many years now, the thorn of GPU software has been sticking in AMD's side, tarnishing the name of Radeon and the products it releases. Even though the Catalyst graphics driver has improved substantially year after year, the truth is that NVIDIA's driver team has been keeping ahead of AMD consistently in basically all regards: features, driver installation, driver stability, performance improvements over time.
If knowing is half the battle, acting on that knowledge is at least another 49%. AMD is hoping to address driver concerns now and into the future with the release of the Catalyst Omega driver. This driver sets itself apart from previous releases in several different ways, starting with a host of new features, some incremental performance improvements and a drastically amped up testing and validation process.
AMD considers this a "special edition" driver and plans to repeat it on a yearly basis. That in itself raises an interesting question: is once a year often enough to really change the experience and perception of the Catalyst driver program going forward? Though AMD does include some specific numbers of tested cases for its validation of the Omega driver (441,000+ automated test runs, 11,000+ manual test runs), we don't have side-by-side data from NVIDIA to compare it to. If AMD is only doing a round of testing like this once a year, but NVIDIA does it more often, then AMD might soon find itself back in the same position it has been in.
UPDATE: There has been some confusion based on this story that I want to correct. AMD informed us that it is still planning on releasing other drivers throughout the year that will address performance updates for specific games and bug fixes for applications and titles released between today and the pending update for the next "special edition." AMD is NOT saying that they will only have a driver drop once a year.
But before we worry about what's going to happen in the future, let's look into what AMD has changed and added to the new Catalyst Omega driver released today.
We’ve been tracking NVIDIA’s G-Sync for quite a while now. The comments section on Ryan’s initial article erupted with questions, and many of those were answered in a follow-on interview with NVIDIA’s Tom Petersen. The idea was radical: do away with the traditional fixed refresh rate and only send a new frame to the display the moment the GPU finishes rendering it. There are many benefits here, but the short version is that you get the low-latency benefit of V-SYNC OFF gaming combined with the image quality (lack of tearing) that you would see with V-SYNC ON. Despite the many benefits, there are some potential disadvantages that come from attempting to drive an LCD panel at varying intervals, as opposed to the fixed intervals that have been the norm for over a decade.
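To make the latency argument concrete, here is a minimal simulation, a hypothetical sketch with invented frame times rather than anything from NVIDIA, of the difference between holding a finished frame for the next fixed 60 Hz refresh boundary and scanning it out the instant it completes:

```c
/* Hypothetical sketch: compare when a finished frame can be shown under a
 * fixed 60 Hz refresh (V-SYNC ON) versus a variable refresh model that scans
 * out the moment the GPU completes a frame. Frame render times are invented,
 * and the model is simplified (frames render back to back). */
#include <stdio.h>

int main(void) {
    const double tick = 1000.0 / 60.0;  /* fixed refresh interval, ms */
    const double frame_ms[] = { 10.0, 22.0, 15.0, 30.0, 12.0 };
    double now = 0.0;

    for (int i = 0; i < 5; i++) {
        double done = now + frame_ms[i];          /* frame finishes rendering */
        /* V-SYNC ON: hold the frame until the next fixed refresh boundary. */
        double vsync = ((int)(done / tick) + 1) * tick;
        printf("frame %d ready at %6.1f ms | fixed refresh shows it at %6.1f "
               "(+%4.1f ms) | variable refresh shows it at %6.1f (+0.0 ms)\n",
               i, done, vsync, vsync - done, done);
        now = done;
    }
    return 0;
}
```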
As the first round of samples came to us for review, the clear leader appeared to be the ASUS ROG Swift. A 144 Hz G-Sync display at 1440p was sure to appeal to gamers who wanted faster response than the 4K 60 Hz G-Sync alternative was capable of. Due to what seemed to be large consumer demand, it has taken some time to get these panels into the hands of consumers. As our Storage Editor, I decided it was time to upgrade my home system, placed a pre-order, and waited in anticipation of finally being able to shift from my trusty Dell 3007WFP-HC to a large panel that can handle more than twice the FPS.
Fast forward to last week. My pair of ROG Swifts arrived, and some other folks I knew had also received theirs. Before I could set mine up and get some quality gaming time in, my bro FifthDread and his wife both noted a very obvious flicker on their Swifts within the first few minutes of hooking them up. They reported the flicker during game loading screens and mid-game during background content loading in some RTS titles. Prior to hearing from them, the most I had seen were some conflicting and contradictory reports on various forums (not limited to the Swift, though that is the earliest panel and would therefore see the majority of early reports), but now we had something more solid to go on. That night I fired up my own Swift and immediately got to doing what I do best: trying to break things. We have reproduced the issue and intend to demonstrate it in a measurable way, mostly to put some actual data out there to go along with those trying to describe something that is borderline perceptible for mere fractions of a second.
First a bit of misnomer correction / foundation laying:
- The ‘Screen refresh rate’ option you see in Windows Display Properties is actually a carryover from the CRT days. In terms of an LCD, it is the maximum rate at which a frame is output to the display. It is not representative of the frequency at which the LCD panel itself is refreshed by the display logic.
- LCD panel pixels are periodically updated by a scan, typically from top to bottom. Newer / higher quality panels repeat this process at a rate higher than 60 Hz in order to reduce the ‘rolling shutter’ effect seen when panning scenes or windows across the screen.
- In order to engineer faster-responding pixels, manufacturers must deal with the side effect of faster pixel decay between refreshes. This is balanced by increasing the frequency of scanning out to the panel.
- The effect we are going to cover here has nothing to do with motion blur, LightBoost, backlight PWM, or LightBoost combined with G-Sync (not currently a thing; even though Blur Busters has theorized about how it could work, their method would not work with how G-Sync is actually implemented today).
With all of that out of the way, let’s tackle what folks out there may be seeing on their own variable refresh rate displays. Based on our testing so far, the flicker only presented at times when a game enters a 'stalled' state. These are periods where you would see a split-second freeze in the action, like during a background level load during game play in some titles. It also appears during some game level load screens, but as those are normally static scenes, they would have gone unnoticed on fixed refresh rate panels. Since we were absolutely able to see that something was happening, we wanted to be able to catch it in the act and measure it, so we rooted around the lab and put together some gear to do so. It’s not a perfect solution by any means, but we only needed to observe differences between the smooth gaming and the ‘stalled state’ where the flicker was readily observable. Once the solder dust settled, we fired up a game that we knew could instantaneously swing from a high FPS (144) to a stalled state (0 FPS) and back again. As it turns out, EVE Online does this exact thing while taking an in-game screen shot, so we used that for our initial testing. Here’s what the brightness of a small segment of the ROG Swift does during this very event:
Measured panel section brightness over time during a 'stall' event. Click to enlarge.
The relatively small ripple to the left and right of center demonstrates the panel output at just under 144 FPS. Panel redraw is in sync with the frames coming from the GPU at this rate. The center section, however, represents what takes place when the input from the GPU suddenly drops to zero. In the above case, the game briefly stalled, then resumed for a few frames at 144, then stalled again for a much longer period of time. Completely stopping the panel refresh would result in all TN pixels bleeding towards white, so G-Sync has a built-in failsafe that prevents this by forcing a redraw every ~33 msec. What you are seeing are the pixels intermittently bleeding towards white and periodically being pulled back down to the appropriate brightness by a scan. The low-latency panel used in the ROG Swift does this all of the time, but it is less noticeable at 144 Hz, as you can see on the left and right edges of the graph. An additional thing happening here is an apparent rise in average brightness during the event. We are still researching the cause of this on our end, but the brightness increase certainly helps to draw attention to the flicker event, making it even more perceptible to those who might not otherwise have noticed it.
Some of you might be wondering why this same effect is not seen when a game drops to 30 FPS (or even lower) during the course of normal game play. The original G-Sync upgrade kit implementation simply waited until 33 msec had passed before forcing an additional redraw, which introduced judder at 25-30 FPS. Based on our observations and testing, it appears that NVIDIA has corrected this in the retail G-Sync panels with an algorithm that intelligently re-scans at even multiples of the input frame rate in order to keep the redraw rate relatively high, keeping flicker imperceptible even at very low continuous frame rates.
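NVIDIA has not published the actual module logic, but based purely on that description, a redraw policy like the following hypothetical sketch would explain what we observed: pick the smallest integer multiple of the input frame rate that keeps every hold under the ~33 msec decay limit, and re-scan the last frame at that rate.

```c
/* Hypothetical sketch of the low-frame-rate redraw policy described above.
 * The idea: if the gap between incoming frames would exceed the panel's
 * ~33 msec hold limit, re-scan the previous frame at an integer multiple of
 * the input rate so the effective redraw rate stays high. The real G-Sync
 * algorithm is NVIDIA's and has not been published; this only illustrates
 * the multiple-of-input-rate concept. */
#include <stdio.h>

int main(void) {
    const double max_hold_ms = 33.0;   /* failsafe before TN pixels bleed */
    const double input_fps[] = { 144.0, 60.0, 28.0, 15.0, 5.0 };

    for (int i = 0; i < 5; i++) {
        double interval = 1000.0 / input_fps[i];   /* time between new frames */
        int mult = 1;
        /* Smallest integer multiple that keeps each hold under the limit. */
        while (interval / mult > max_hold_ms)
            mult++;
        printf("%6.1f FPS in -> redraw %dx per frame (effective %6.1f Hz)\n",
               input_fps[i], mult, input_fps[i] * mult);
    }
    return 0;
}
```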
A few final points before we go:
- This is not limited to the ROG Swift. All variable refresh panels we have tested (including 4K) show this effect to a greater or lesser degree than reported here. Again, this only occurs when games instantaneously drop to 0 FPS, and not when those games dip into low frame rates in a continuous fashion.
- The effect is less perceptible (both visually and with recorded data) at lower maximum refresh rate settings.
- The effect is not present at fixed refresh rates (G-Sync disabled or with non G-Sync panels).
This post was primarily meant as a status update and to serve as something for G-Sync users to point to when attempting to explain the flicker they are perceiving. We will continue researching, collecting data, and coordinating with NVIDIA on this issue, and will report back once we have more to discuss.
During the research and drafting of this piece, we reached out to and worked with NVIDIA to discuss this issue. Here is their statement:
"All LCD pixel values relax after refreshing. As a result, the brightness value that is set during the LCD’s scanline update slowly relaxes until the next refresh.
This means all LCDs have some slight variation in brightness. In this case, lower frequency refreshes will appear slightly brighter than high frequency refreshes by 1 – 2%.
When games are running normally (i.e., not waiting at a load screen, nor a screen capture) - users will never see this slight variation in brightness value. In the rare cases where frame rates can plummet to very low levels, there is a very slight brightness variation (barely perceptible to the human eye), which disappears when normal operation resumes."
So there you have it. It's basically down to the physics of how an LCD panel works at varying refresh rates. While I agree that it is a rare occurrence, there are some games that present this scenario more frequently (and noticeably) than others. If you've noticed this effect in some games more than others, let us know in the comments section below.
(Editor's Note: We are continuing to work with NVIDIA on this issue and hope to find a way to alleviate the flickering with either a hardware or software change in the future.)
Subject: General Tech, Graphics Cards | December 2, 2014 - 03:11 AM | Scott Michaud
Tagged: amd, GCN, dice, frostbite
Inverse trigonometric functions are difficult to compute. Their use is often avoided like the plague. If, however, the value is absolutely necessary, it will probably be solved by approximations or, if possible, replacing them with easier functions by clever use of trig identities.
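As a taste of what "approximation" means here, consider a classic polynomial fit for arccos (Abramowitz & Stegun 4.4.45, maximum error under 7e-5 on [0, 1]) that replaces the transcendental call with a square root and a few multiply-adds. This is a standard textbook formula shown as a sketch; it is not the specific code from Lagarde's post:

```c
/* Fast arccos via the Abramowitz & Stegun 4.4.45 polynomial: a cubic in x
 * scaled by sqrt(1 - x), valid on [0, 1] with |error| < 7e-5, mirrored for
 * negative inputs using acos(-x) = pi - acos(x). A textbook approximation,
 * not the code from the Frostbite blog post. */
#include <math.h>
#include <stdio.h>

static float fast_acos(float x) {
    int neg = x < 0.0f;
    x = fabsf(x);
    float r = (((-0.0187293f * x + 0.0742610f) * x - 0.2121144f) * x
               + 1.5707288f) * sqrtf(1.0f - x);
    return neg ? 3.14159265f - r : r;
}

int main(void) {
    for (float x = -1.0f; x <= 1.001f; x += 0.25f)
        printf("x = %5.2f  approx = %8.5f  libm = %8.5f\n",
               x, fast_acos(x), acos(x));
    return 0;
}
```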
If you want to see how the experts approach this problem, Sébastien Lagarde, a senior developer of the Frostbite engine at DICE, goes into detail in a blog post. By detail, I mean you will see some GPU assembly being stepped through by the end of it. What makes this particularly interesting are the diagrams at the end, showing what each method outputs, represented by the shading of a sphere.
If you are feeling brave, take a look.
Subject: Graphics Cards | December 1, 2014 - 02:52 PM | Jeremy Hellstrom
Tagged: msi, nvidia, GTX 980, GAMING 4G, factory overclocked, Twin Frozr V
MSI has updated their Twin Frozr V with Torx fans, which are effective at moving a lot of air very quietly, and 'S'-shaped heatpipes which bear the name SuperSU. Connectivity is provided by dual-link DVI-I, HDMI and three DisplayPort plugs, which ought to provide enough flexibility for anyone. It is clocked at 1216 - 1331MHz out of the box with GDDR5 running at 7GHz effective, which [H]ard|OCP managed to increase to 1406 - 1533MHz with 7.16GHz memory. That is rather impressive for a Maxwell chip with NVIDIA's power limits, and it shows just how much you can squeeze out of the new chip without needing to up the amount of juice you are providing it. The overclocked card brought full system wattage to 378W, much lower than the R9 290X they tested against, and the GPU temperature topped out at 70C when pushed to the limit, again lower than the 290X, though NVIDIA's selling price is certainly higher than AMD's. Check out their full review here.
"The MSI GTX 980 GAMING 4G video card has a factory overclock and the new Twin Frozr V cooling system. We'll push it to its highest custom overclock and pit it against the ASUS ROG R9 290X MATRIX Platinum overclocker, and determine the gaming bang for your buck. May the best card win."
Here are some more Graphics Card articles from around the web:
- NVIDIA Multi-Frame Sampled AA @ [H]ard|OCP
- OcUK GeForce GTX 970 'NVIDIA 970 Cooler Edition' @ Kitguru
- EVGA GTX 980 Classified Video Card Review @ Hardware Asylum
- EVGA GeForce GTX 970 FTW ACX 2.0 Review @ Hardware Canucks
- MSI GTX 980 Gaming 4G Review @ Neoseeker
It has been a couple of months since the release of the GeForce GTX 970 and the GM204 GPU that it is based on. After the initial wave of stock on day one, NVIDIA had admittedly struggled to keep these products available. Couple that with rampant concerns over coil whine from some non-reference designs, and you could see why we were a bit hesitant to focus and spend our time on retail GTX 970 reviews.
These issues appear to be settled for the most part. Finding GeForce GTX 970 cards is no longer a problem and users with coil whine are getting RMA replacements from NVIDIA's partners. Because of that, we feel much more comfortable reporting our results with the various retail cards that we have in house, and you'll see quite a few reviews coming from PC Perspective in the coming weeks.
But let's start with the MSI GeForce GTX 970 4GB Gaming card. Based on user reviews, this is one of the most popular retail cards. MSI's Gaming series combines a custom cooler that typically runs quieter and more efficiently than the reference design with a price tag that is within arm's reach of the lower-cost options.
The MSI GeForce GTX 970 4GB Gaming
MSI continues with its Dragon Army branding, and its associated black/red color scheme, which I think is appealing to a wide range of users. I'm sure NVIDIA would like to see a green or neutral color scheme, but hey, there are only so many colors to go around.
Subject: Graphics Cards | November 29, 2014 - 09:57 AM | Sebastian Peak
Tagged: pcie, PCI Express, nvidia, mini-itx, GTX 970, graphics card, geforce, directcu mini, DirectCU, asus
ASUS has announced a tiny new addition to their GTX 970 family, and it will be their most powerful mini-ITX friendly card yet with a full GeForce GTX 970 GPU.
Image credit: ASUS
The ASUS 970 DirectCU Mini card will feature a modest factory overclock on the GTX 970 core running at 1088 MHz (stock 1050 MHz) with a 1228 MHz Boost Clock (stock 1178 MHz). Memory is not overclocked and remains at the stock 7 GHz speed.
The GTX 970 DirectCU Mini features a full backplate. Image credit: ASUS
The ASUS GTX 970 DirectCU Mini uses a single 8-pin PCIe power connector in place of the standard dual 6-pin configuration, which shouldn’t be a problem considering the 150W spec of the larger connector (and 145W NVIDIA spec of the 970).
Part of this complete mITX gaming breakfast. Image credit: ASUS
The tiny card offers a full array of display outputs including a pair of dual-link DVI connectors, HDMI 2.0, and DisplayPort 1.2. No word yet on pricing or availability, but the product page is up on the ASUS site.
Subject: Graphics Cards | November 20, 2014 - 07:08 PM | Jeremy Hellstrom
Tagged: sli, nvidia, GTX 970
The contestants are lined up in [H]ard|OCP's test bench: at around $700 you have a pair of GTX 970s, and in the same weight class are a pair of R9 290X cards; next, weighing in at just under $550, are two R9 290s; and rounding out the competition are a pair of GTX 780s, which punch in somewhere between $800 and $1000 depending on when you look. The cards are tested for their ability to perform on a 4K stage as well as in the larger 5760x1200 multi-monitor event. After a long and gruelling battle, the extra work the 290X put into trimming itself down and fitting into a lower weight class has proven to be well worth the effort, as the pair managed to show up the 970s in every performance category, although certainly not in power efficiency. Any of these pairings will be powerful, but none can match a pair of GTX 980s, which are also in a price class all by themselves.
"We take 2-Way NVIDIA GeForce GTX 970 SLI for a spin and compare it to R9 290X CF, R9 290 CF, GTX 780 SLI at 4K resolution as well as NV Surround on a triple-display setup. If you want to see how all these video cards compare in these different display configurations we've got just the thing. Find out what $700 SLI gets you."
Here are some more Graphics Card articles from around the web:
- ZOTAC GeForce GTX 970 AMP! Extreme Edition @ Bjorn3d
- MSI GTX 970 Gaming 4G Review @ OCC
- Gigabyte GTX 970 WindForce 3X @ HardwareOverclock
- NVIDIA GeForce GTX 980 SLI Overclocked @ [H]ard|OCP
- MSI GTX 980 Gaming 4G @ Bjorn3D
- Raijintek Morpheus VGA Cooler Review @ Hardware Asylum
- AMD Radeon Gallium3D Is Catching Up & Sometimes Beating Catalyst On Linux @ Phoronix
MFAA Technology Recap
In mid-September NVIDIA took the wraps off of the GeForce GTX 980 and GTX 970 GPUs, the first products based on the GM204 GPU utilizing the Maxwell architecture. Our review of the chip, those products and the package that NVIDIA had put together was incredibly glowing. Not only was performance impressive but they were able to offer that performance with power efficiency besting anything else on the market.
Of course, along with the new GPU came a set of new product features for the ride. Two of the most impressive were Dynamic Super Resolution (DSR) and Multi-Frame Sampled AA (MFAA), but only one was available at launch: DSR. With it, you could take advantage of the extreme power of the GTX 980/970 with older games, rendering at a higher resolution than your panel's and having it filtered down to match your screen in post. The results were great. But NVIDIA spent just as much time talking about MFAA (not mother-fu**ing AA, as it turned out) during the product briefings, and I was shocked to find out the feature wouldn't be ready to test or included at launch.
That changes today with the release of NVIDIA's 344.75 driver, the first to implement support for the new and potentially important anti-aliasing method.
Before we dive into the results of our testing, both in performance and image quality, let's get a quick recap on what exactly MFAA is and how it works.
Here is what I wrote back in September in our initial review:
While most of the deep, architectural changes in GM204 are based around power and area efficiency, there are still some interesting feature additions NVIDIA has made to these cards that depend on some specific hardware implementations. First up is a new antialiasing method called MFAA, or Multi-Frame Sampled AA. This new method alternates the AA sample pattern, which is now programmable via software, in both temporal and spatial directions.
The goal is to change the AA sample pattern in a way that produces near-4x MSAA quality at the effective cost of 2x MSAA (in terms of performance). NVIDIA showed a couple of demos of this in action during the press meetings but the only gameplay we saw was in a static scene. I do have some questions about how this temporal addition is affected by fast motion on the screen, though NVIDIA asserts that MFAA will very rarely ever fall below the image quality of standard 2x MSAA.
That information is still correct but we do have a little bit more detail on how this works than we did before. For reasons pertaining to patents NVIDIA seems a bit less interested in sharing exact details than I would like to see, but we'll work with what we have.
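In the meantime, here's a toy illustration of the general idea with entirely invented sample positions (NVIDIA's actual programmable patterns and temporal filter are not public): alternate two 2-sample patterns across frames, and a blend of two consecutive frames resolves the same edge-coverage gradations as a single 4-sample pattern.

```c
/* Toy illustration of the MFAA concept: two MSAA samples per frame, with the
 * sample offsets alternating each frame, so a temporal blend of two frames
 * covers four distinct positions, giving near-4x gradations at 2x per-frame
 * cost. Sample positions here are invented; NVIDIA's real patterns and
 * temporal filter are not public. */
#include <stdio.h>

#define EDGE 0.7   /* a vertical polygon edge at x = 0.7 crosses this pixel */

/* Fraction of sample x-positions that land inside the polygon. */
static double coverage(const double sx[], int n) {
    int hits = 0;
    for (int i = 0; i < n; i++)
        if (sx[i] < EDGE) hits++;
    return (double)hits / n;
}

int main(void) {
    const double even_frame[2] = { 0.125, 0.625 };  /* pattern A */
    const double odd_frame[2]  = { 0.375, 0.875 };  /* pattern B */
    const double union4[4]     = { 0.125, 0.375, 0.625, 0.875 };

    double a = coverage(even_frame, 2);  /* what 2x sees on even frames */
    double b = coverage(odd_frame, 2);   /* what 2x sees on odd frames  */

    printf("2x, even-frame pattern:  %.2f\n", a);               /* 1.00 */
    printf("2x, odd-frame pattern:   %.2f\n", b);               /* 0.50 */
    printf("temporal blend (MFAA):   %.2f\n", (a + b) / 2);     /* 0.75 */
    printf("true 4x pattern:         %.2f\n", coverage(union4, 4)); /* 0.75 */
    return 0;
}
```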
Subject: Graphics Cards | November 14, 2014 - 11:46 AM | Ryan Shrout
Tagged: sli, nvidia, N980X3WA-4GD, maxwell, GTX 980, gigabyte, geforce, 3-way
Earlier this week, a new product showed up on Gigabyte's website that has garnered quite a bit of attention. The GA-N980X3WA-4GD WaterForce Tri-SLI is a 3-Way SLI system with integrated water cooling powered by a set of three GeForce GTX 980 GPUs.
That. Looks. Amazing.
What you are looking at is a 3-Way closed loop water cooling system with an external enclosure to hold the radiators while providing a display full of information including temperatures, fans speeds and more. Specifications on the Gigabyte site are limited for now, but we can infer a lot from them:
- WATERFORCE: 3-Way SLI Water Cooling System
- Real-Time Display and Control
- Flex Display Technology
- Powered by NVIDIA GeForce GTX 980 GPU
- Integrated with 4GB GDDR5 memory and a 256-bit memory interface (single card)
- Features dual-link DVI-I / DVI-D / HDMI / DisplayPort x3 (single card)
- BASE: 1228 MHz / BOOST: 1329 MHz
- System power supply requirement: 1200W (with six 8-pin external power connectors)
The GPUs on each card are your standard GeForce GTX 980 with 4GB of memory (we reviewed it here) though they are running at overclocked base and boost clock speeds, as you would hope with all that water cooling power behind it. You will need a 1200+ watt power supply for this setup, which makes sense considering the GPU horsepower you'll have access to.
Another interesting feature Gigabyte is listing is called GPU Gauntlet Sorting.
With GPU Gauntlet™ Sorting, the Gigabyte SOC graphics card guarantees the higher overclocking capability in terms of excellent power switching.
Essentially, Gigabyte is going to make sure that the GPUs on the WaterForce Tri-SLI are the best they can get their hands on, with the best chance for overclocking higher than stock.
Setup looks interesting - the radiators and fans will be in the external enclosure with tubing passing into the system through a 5.25-in bay. It will need to have quick connect/disconnect points at either the GPU or radiator to make that installation method possible.
Pricing and availability are still unknown, but don't expect to get it cheap. With the GTX 980 still selling for at least $550, you should expect something in the $2000 range or above with all the custom hardware and fittings involved.
Can I get two please?