CES 2015: Gigabyte GTX 980 WaterForce 3-Way SLI Monster Spotted

Subject: Graphics Cards, Shows and Expos | January 5, 2015 - 07:15 PM |
Tagged: waterforce, GTX 980, gigabyte, ces 2015, CES, 3-way sli

Back in November, Gigabyte asked for all of your money in exchange for a set of three GeForce GTX 980 cards, each running on its own self-contained water-cooling loop. After finally seeing the GTX 980 WaterForce in person, I can tell you that it's big, it's expensive, and it's damned impressive looking.

wf-2.jpg

With a price tag of $2,999, there is a significant markup over simply buying a set of three retail GTX 980 cards, but this design is unique. Each GPU is individually cooled via a 120mm radiator and fan mounted inside a chassis that rests on top of your PC case. On the front you'll find temperature, fan speed, and pump speed indicators, along with knobs and buttons to adjust settings and targets.

wf-3.jpg

Oh, and it ships inside of a suitcase that you can reuse for later travel. Ha! Think we can convince Gigabyte to send us one for testing?

wf-1.jpg

Coverage of CES 2015 is brought to you by Logitech!


Follow all of our coverage of the show at http://pcper.com/ces!

ASUS updates their popular series with the GTX 980 STRIX DC II OC

Subject: Graphics Cards | January 5, 2015 - 05:31 PM |
Tagged: GTX 980 STRIX DirectCU II OC, strix, asus, nvidia, factory overclocked

ASUS' popular STRIX line was recently updated to include NVIDIA's top card, and now [H]ard|OCP has had a chance to benchmark this GTX 980 with its custom quiet cooling.  The DirectCU II cooler can operate at 0dB under all but the heaviest of loads, and the 10-phase power design means you can push beyond the small factory overclock the card ships with.  [H]ard|OCP took the card from a Boost Clock of 1279MHz to 1500MHz and the RAM from 7GHz to 7.9GHz, with noticeable performance improvements, part of the reason it received a Gold Award.  If the ~$130 price difference between this card and the R9 290X does not bother you, it is a great choice for a new GPU.

14192008366QHJaRnDQg_1_1.jpg

"Today we delve into the ASUS GTX 980 STRIX DC II OC, which features custom cooling, 0dB fans and high overclocking potential. We'll experiment with this Maxwell GPU by overclocking it to the extreme. It will perform head to head against the ASUS ROG R9 290X MATRIX-P in today's most demanding games, including Far Cry 4."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

CES 2015: MSI Announces GTX 970 Gaming 100ME - 100 millionth GeForce GPU

Subject: Graphics Cards, Shows and Expos | January 4, 2015 - 03:56 PM |
Tagged: msi, GTX 970, gaming, ces 2015, CES, 100me

To celebrate the shipment of 100 million GeForce GPUs, MSI is launching a new revision of the GeForce GTX 970, the Gaming 100ME (millionth edition). The cooler is identical to that used on the GTX 970 Gaming 4G, but it replaces the red color scheme of the MSI Gaming brand with a green very close to NVIDIA's own.

100me-1.jpg

This will also ship with a "special gift" and will be a limited edition, much like last year's Golden Edition GTX 970.

100me-2.jpg

MSI had some other minor updates to its GPU lineup, including the GTX 970 4GD5T OC with a cool-looking black and white color scheme and an 8GB version of the Radeon R9 290X.

Coverage of CES 2015 is brought to you by Logitech!


Follow all of our coverage of the show at http://pcper.com/ces!

GPU Rumors: AMD Plans 20nm but NVIDIA Waits for 16nm

Subject: General Tech, Graphics Cards | December 28, 2014 - 09:47 PM |
Tagged: radeon, nvidia, gtx, geforce, amd

According to an anonymous source cited by WCCFTech, AMD is preparing a 20nm-based graphics architecture that is expected to launch in April or May. Originally, the site predicted that these graphics devices, which it calls the R9 300 series, would be available in February or March. The reason for this "delay" is massive demand for 20nm production capacity.

nvidia-gtx-vs-amd-gaming-evolved.jpg

The source also claims that NVIDIA will skip 20nm entirely and instead opt for 16nm when it becomes available (said to be mid or late 2016). The expectation is that NVIDIA will answer AMD's new graphics devices with a higher-end Maxwell part that is still built at 28nm. Earlier rumors, based on a leaked SiSoftware entry, point to 3072 CUDA cores clocked between 1.1 GHz and 1.39 GHz. If true, this would give it between 6.75 and 8.54 TeraFLOPs of performance, the higher of which is right around the advertised performance of a GeForce Titan Z (only in a single compute device that does not require distributing work across GPUs the way SLI was created to automate).
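For those curious where those TeraFLOP figures come from, the standard peak-throughput formula is simply shader count, times two floating-point operations (one fused multiply-add) per clock, times clock speed. A quick sketch of the arithmetic, which lands on the quoted 6.75 - 8.54 TFLOPs range give or take rounding:

```python
# Peak single-precision throughput = cores x 2 FLOPs (one FMA) per clock x clock speed.
cuda_cores = 3072
flops_per_clock = 2  # a fused multiply-add counts as two floating-point operations

for clock_ghz in (1.10, 1.39):
    tflops = cuda_cores * flops_per_clock * clock_ghz / 1000.0
    print(f"{clock_ghz:.2f} GHz -> {tflops:.2f} TFLOPs")

# 1.10 GHz -> 6.76 TFLOPs
# 1.39 GHz -> 8.54 TFLOPs
```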

Will this strategy work in NVIDIA's favor? I don't know. 28nm is a fairly stable process at this point, which will probably allow them to get chips that can be bigger and more aggressively clocked. On the other hand, they pretty much need to rely upon chips that are bigger and more aggressively clocked to be competitive with AMD's slightly more modern process. Previous rumors also hint that AMD is looking at water cooling for its reference card, which might place yet another handicap against NVIDIA, although cooling is not an area where NVIDIA struggles.

Source: WCCFTech

Nvidia GeForce 347.09 beta drivers have arrived

Subject: Graphics Cards | December 17, 2014 - 09:19 PM |
Tagged: geforce, nvidia, 347.09 beta

The 347.09 beta driver is out, bringing performance improvements to Elite: Dangerous and Metal Gear Solid V: Ground Zeroes.  If you use GeForce Experience, the drivers will install automatically; otherwise, head to the driver page to install them manually.  Project CARS should also benefit from this new beta, and you will be able to enable 3D on Alien: Isolation, Elite: Dangerous, Escape Dead Island, Far Cry 4, and Middle-earth: Shadow of Mordor.  NVIDIA's new incremental updates, called GeForce Game Ready drivers, mean more frequent releases with fewer changes than we have become accustomed to, but they do benefit those playing the games they were designed to improve.

14-NV-GTX_980_970-668x258-GF-Header-PDP-3A.jpg

As with the previous WHQL driver, GTX 980M SLI and GTX 970M SLI on notebooks do not function, so if you plan on updating your gaming laptop you should disable SLI before installing.  You can catch up on all the changes in this PDF.

Source: NVIDIA

AMD Omega is no longer in Alpha

Subject: Graphics Cards | December 9, 2014 - 03:08 PM |
Tagged: amd, catalyst, driver, omega

With AMD's new leadership and restructuring comes a new type of driver update.  The Omega driver is intended to provide a large number of new features as well as performance updates once a year.  It does not replace the current cycle of Beta and WHQL releases; the next regular driver update will incorporate all of the changes from the Omega driver plus whatever bug fixes or updates it is released to address.

Many sites, including The Tech Report, have had at least a small amount of time to test the new driver and have not seen much in the way of installation issues, nor, unfortunately, performance improvements on systems not using an AMD APU.  As more time for testing elapses and more reviews come out, we may see improvements on low-end systems, but for now higher-end machines show little to no improvement in raw FPS.  Keep your eyes peeled for an update once we have had time to test the driver's effect on frame pacing, which is far more important than simply increasing your FPS.

The main reason to be excited about this release is the long list of new features, starting with a DSR-like feature called Virtual Super Resolution, which renders games above your monitor's native resolution and downsamples the result; for now, 4K super resolution is limited to the R9 285, as it is the only Tonga-based AMD card on the market at the moment.  Along with the release of the Omega driver comes news about FreeSync displays, another feature enabled in the new driver, and their availability: we have a release window of January or February, with a 4K model arriving in March.

Check out the links to The Tech Report below to read the full list of new features this driver brings, and don't forget to click through to Ryan's article as well.

freesync-slide.jpg

"AMD has introduced what may be its biggest graphics driver release ever, with more than 20 new features, 400 bug fixes, and some miscellaneous performance improvements."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Manufacturer: AMD

New Features

There are smart people who work at AMD. A quick look at the company's products, spanning the APU lineup as well as discrete GPUs, clearly indicates deep talent in engineering, design, marketing, and business. It's not perfect of course, and very few companies can claim to be, but AMD's strengths are there, easily discernible to those of us on the outside looking in.

Because AMD has smart people working hard to improve the company, it is also aware of its shortcomings. For many years now, the thorn of GPU software has been sticking in AMD's side, tarnishing the name of Radeon and the products it releases. Even though the Catalyst graphics driver has improved substantially year after year, the truth is that NVIDIA's driver team has consistently stayed ahead of AMD in basically all regards: features, driver installation, driver stability, and performance improvements over time.

slide01.jpg

If knowing is half the battle, acting on that knowledge is at least another 49%. AMD is hoping to address driver concerns now and into the future with the release of the Catalyst Omega driver. This driver sets itself apart from previous releases in several different ways, starting with a host of new features, some incremental performance improvements and a drastically amped up testing and validation process.

slide02.jpg

AMD considers this a "special edition" driver and plans to repeat it on a yearly basis. That note in itself raises an interesting question: is once a year often enough to really change the experience and perception of the Catalyst driver program going forward? Though AMD does cite some specific validation numbers for the Omega driver (441,000+ automated test runs, 11,000+ manual test runs), we don't have side-by-side data from NVIDIA to compare it to. If AMD only does a testing roundup like this once a year while NVIDIA does it more often, then AMD might soon find itself back in the same position it has been in.

UPDATE: There has been some confusion based on this story that I want to correct. AMD informed us that it is still planning on releasing other drivers throughout the year that will address performance updates for specific games and bug fixes for applications and titles released between today and the pending update for the next "special edition." AMD is NOT saying that they will only have a driver drop once a year.

But before we worry about what's going to happen in the future, let's look into what AMD has changed and added to the new Catalyst Omega driver released today.

Continue reading our overview of the new AMD Catalyst Omega driver!!

Manufacturer: PC Perspective

Overview

We’ve been tracking NVIDIA’s G-Sync for quite a while now. The comments section on Ryan’s initial article erupted with questions, and many of those were answered in a follow-on interview with NVIDIA’s Tom Petersen. The idea was radical: do away with the traditional fixed refresh rate and only send a new frame to the display once the GPU has finished rendering it. There are many benefits here, but the short version is that you get the low-latency benefit of V-SYNC OFF gaming combined with the image quality (lack of tearing) that you would see with V-SYNC ON. Despite the many benefits, there are some potential disadvantages that come from attempting to drive an LCD panel at varying intervals, as opposed to the fixed intervals that have been the norm for over a decade.

IMG_9328.JPG

As the first round of samples came to us for review, the current leader appeared to be the ASUS ROG Swift. A 144 Hz G-Sync display at 1440p was sure to appeal to gamers who wanted faster response than the 4K 60 Hz G-Sync alternative was capable of. Due to what seemed to be large consumer demand, it has taken some time to get these panels into the hands of consumers. As our Storage Editor, I decided it was time to upgrade my home system, placed a pre-order, and waited with anticipation of finally being able to shift from my trusty Dell 3007WFP-HC to a large panel that can handle more than twice the FPS.

Fast forward to last week. My pair of ROG Swifts arrived, and some other folks I knew had also received theirs. Before I could set mine up and get some quality gaming time in, my bro FifthDread and his wife both noted a very obvious flicker on their Swifts within the first few minutes of hooking them up. They reported the flicker during game loading screens and mid-game during background content loading in some RTS titles. Prior to hearing from them, the most I had seen were some conflicting and contradictory reports on various forums (not limited to the Swift, though as the earliest panel it naturally sees the majority of early reports), but now we had something more solid to go on. That night I fired up my own Swift and immediately got to doing what I do best: trying to break things. We have reproduced the issue and intend to demonstrate it in a measurable way, mostly to put some actual data out there to go along with those trying to describe something that is borderline perceptible for mere fractions of a second.

screen refresh rate-.png

First a bit of misnomer correction / foundation laying:

  • The ‘Screen refresh rate’ option you see in Windows Display Properties is actually a carryover from the CRT days. In terms of an LCD, it is the maximum rate at which a frame is output to the display. It is not representative of the frequency at which the LCD panel itself is refreshed by the display logic.
  • LCD panel pixels are periodically updated by a scan, typically from top to bottom. Newer / higher quality panels repeat this process at a rate higher than 60 Hz in order to reduce the ‘rolling shutter’ effect seen when panning scenes or windows across the screen.
  • In order to engineer faster-responding pixels, manufacturers must deal with the side effect of faster pixel decay between refreshes. This is balanced by increasing the frequency of scanning out to the panel.
  • The effect we are going to cover here has nothing to do with motion blur, LightBoost, backlight PWM, or LightBoost combined with G-Sync (not currently a thing; even though Blur Busters has theorized on how it could work, their method would not work with how G-Sync is actually implemented today).

With all of that out of the way, let’s tackle what folks out there may be seeing on their own variable refresh rate displays. Based on our testing so far, the flicker only presents itself when a game enters a 'stalled' state. These are periods where you would see a split-second freeze in the action, like during a background level load during game play in some titles. It also appears during some game level load screens, but as those are normally static scenes, it would have gone unnoticed on fixed refresh rate panels. Since we were absolutely able to see that something was happening, we wanted to catch it in the act and measure it, so we rooted around the lab and put together some gear to do so. It’s not a perfect solution by any means, but we only needed to observe differences between smooth gaming and the 'stalled' state where the flicker was readily observable. Once the solder dust settled, we fired up a game that we knew could instantaneously swing from a high FPS (144) to a stalled state (0 FPS) and back again. As it turns out, EVE Online does this exact thing while taking an in-game screen shot, so we used that for our initial testing. Here’s what the brightness of a small segment of the ROG Swift does during this very event:

eve ss-2-.png

Measured panel section brightness over time during a 'stall' event. Click to enlarge.

The relatively small ripple to the left and right of center demonstrates the panel output at just under 144 FPS. Panel redraw is in sync with the frames coming from the GPU at this rate. The center section, however, represents what takes place when the input from the GPU suddenly drops to zero. In the above case, the game briefly stalled, then resumed a few frames at 144, then stalled again for a much longer period of time. Completely stopping the panel refresh would result in all TN pixels bleeding towards white, so G-Sync has a built-in failsafe to prevent this by forcing a redraw every ~33 msec. What you are seeing are the pixels intermittently bleeding towards white and periodically being pulled back down to the appropriate brightness by a scan. The low-latency panel used in the ROG Swift does this all of the time, but it is less noticeable at 144 Hz, as you can see on the left and right edges of the graph. An additional thing happening here is an apparent rise in average brightness during the event. We are still researching the cause of this on our end, but this brightness increase certainly helps draw attention to the flicker event, making it even more perceptible to those who might not have otherwise noticed it.

Some of you might be wondering why this same effect is not seen when a game drops to 30 FPS (or even lower) during the course of normal game play. While the original G-Sync upgrade kit implementation simply waited until 33 msec had passed before forcing an additional redraw, that approach introduced judder from 25-30 FPS. Based on our observations and testing, it appears that NVIDIA has corrected this in the retail G-Sync panels with an algorithm that intelligently re-scans at even multiples of the input frame rate in order to keep the redraw rate relatively high, and therefore keep flicker imperceptible, even at very low continuous frame rates.
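NVIDIA has not published the details of that algorithm, so purely as an illustration of the idea, here is a minimal sketch of the behavior we appear to be observing: pick the smallest whole number of panel scans per incoming frame that keeps every scan-to-scan gap under the ~33 msec failsafe. The function name and exact threshold are our assumptions, not NVIDIA's code.

```python
import math

# Hypothetical sketch of the retail G-Sync low-framerate redraw behavior we observed.
# NVIDIA's actual algorithm is unpublished; this only illustrates re-scanning at
# even multiples of the input frame rate instead of waiting out the failsafe timer.

MAX_SCAN_GAP_MS = 33.0  # assumed failsafe: don't let pixels decay longer than ~33 ms

def scans_per_frame(frame_interval_ms: float) -> int:
    """Smallest integer number of panel scans per input frame that keeps
    each scan-to-scan gap at or below the failsafe interval."""
    return max(1, math.ceil(frame_interval_ms / MAX_SCAN_GAP_MS))

for fps in (144, 60, 30, 25, 20):
    interval_ms = 1000.0 / fps
    n = scans_per_frame(interval_ms)
    print(f"{fps:>3} FPS: {n} scan(s) per frame, {interval_ms / n:.1f} ms apart")

# At 25 FPS (40 ms frames) the panel scans twice per frame, 20 ms apart, rather
# than waiting the full 33 ms and then squeezing in a judder-inducing extra scan.
```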

A few final points before we go:

  • This is not limited to the ROG Swift. All variable refresh panels we have tested (including 4K) exhibit this effect to a greater or lesser degree than reported here. Again, this only occurs when games instantaneously drop to 0 FPS, not when those games dip into low frame rates in a continuous fashion.
  • The effect is less perceptible (both visually and with recorded data) at lower maximum refresh rate settings.
  • The effect is not present at fixed refresh rates (G-Sync disabled or with non G-Sync panels).

This post was primarily meant as a status update and to serve as something for G-Sync users to point to when attempting to explain the flicker they are perceiving. We will continue researching, collecting data, and coordinating with NVIDIA on this issue, and will report back once we have more to discuss.

During the research and drafting of this piece, we reached out to and worked with NVIDIA to discuss this issue. Here is their statement:

"All LCD pixel values relax after refreshing. As a result, the brightness value that is set during the LCD’s scanline update slowly relaxes until the next refresh.

This means all LCDs have some slight variation in brightness. In this case, lower frequency refreshes will appear slightly brighter than high frequency refreshes by 1 – 2%.

When games are running normally (i.e., not waiting at a load screen, nor a screen capture) - users will never see this slight variation in brightness value. In the rare cases where frame rates can plummet to very low levels, there is a very slight brightness variation (barely perceptible to the human eye), which disappears when normal operation resumes."

So there you have it. It's basically down to the physics of how an LCD panel works at varying refresh rates. While I agree that it is a rare occurrence, there are some games that present this scenario more frequently (and noticeably) than others. If you've noticed this effect in some games more than others, let us know in the comments section below. 

(Editor's Note: We are continuing to work with NVIDIA on this issue and hope to find a way to alleviate the flickering with either a hardware or software change in the future.)

Awake Yet? Good! Optimizing Inverse Trig for AMD GPUs.

Subject: General Tech, Graphics Cards | December 2, 2014 - 03:11 AM |
Tagged: amd, GCN, dice, frostbite

Inverse trigonometric functions are difficult to compute, and their use is often avoided like the plague. If the value is absolutely necessary, however, it will probably be obtained by approximation or, if possible, by replacing the function with something cheaper through clever use of trig identities.
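To give a flavor of the approximation approach (this is our own illustration, not code from the blog post), here is the classic Abramowitz & Stegun polynomial fit for acos, a common starting point for GPU-friendly versions since it is branch-light and maps to a handful of multiply-adds:

```python
import math

# A minimal sketch of the "replace acos with a cheap polynomial" technique,
# using the Abramowitz & Stegun 4.4.45 fit (max error roughly 6.7e-5 radians).
# Not Lagarde's code; just an illustration of the general approach, which costs
# a square root plus three multiply-adds instead of a true inverse trig op.

def fast_acos(x: float) -> float:
    negate = x < 0.0
    x = abs(x)
    # Polynomial fit, valid for x in [0, 1] (Horner form)
    p = 1.5707288 + x * (-0.2121144 + x * (0.0742610 + x * -0.0187293))
    result = p * math.sqrt(1.0 - x)
    # Mirror for negative inputs: acos(-x) = pi - acos(x)
    return math.pi - result if negate else result

# Check the approximation against the exact value across [-1, 1]
worst = max(abs(fast_acos(t / 1000.0) - math.acos(t / 1000.0))
            for t in range(-1000, 1001))
print(f"max abs error: {worst:.2e} radians")
```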

arctrig-examples.png

If you want to see how the experts approach this problem, then Sébastien Lagarde, a senior developer of the Frostbite engine at DICE, goes into detail with a blog post. By detail, I mean you will see some GPU assembly being stepped through by the end of it. What makes this particularly interesting is the diagrams at the end, showing what each method outputs as represented by the shading of a sphere.

If you are feeling brave, take a look.

The MSI GTX 980 GAMING 4G and its fancy new fan

Subject: Graphics Cards | December 1, 2014 - 02:52 PM |
Tagged: msi, nvidia, GTX 980, GAMING 4G, factory overclocked, Twin Frozr V

MSI has updated their Twin Frozr V cooler with Torx fans, which move a lot of air very quietly, and 'S'-shaped heatpipes which bear the name SuperSU.  Connectivity is provided by dual-link DVI-I, HDMI, and three DisplayPort outputs, which ought to provide enough flexibility for anyone.  Out of the box it is clocked at 1216 - 1331MHz with GDDR5 running at 7GHz effective; [H]ard|OCP managed to increase that to 1406 - 1533MHz with 7.16GHz memory, which is rather impressive for a Maxwell chip under NVIDIA's power limits and shows just how much you can squeeze out of the new chip without upping the amount of juice you provide it.  The overclocked card brought full system wattage up to 378W, much lower than the R9 290X they tested against, and the GPU temperature topped out at 70C when pushed to the limit, again lower than the 290X, though NVIDIA's selling price is certainly higher than AMD's.  Check out their full review here.

1417398114ROQehtibgG_1_1.jpg

"The MSI GTX 980 GAMING 4G video card has a factory overclock and the new Twin Frozr V cooling system. We'll push it to its highest custom overclock and pit it against the ASUS ROG R9 290X MATRIX Platinum overclocker, and determine the gaming bang for your buck. May the best card win."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP