AMD Announces Radeon R9 Nano Price Cut

Subject: Graphics Cards | January 11, 2016 - 08:32 AM |
Tagged: radeon, r9 nano, R9 Fury X, price cut, press release, amd

AMD has announced a price cut for the Radeon R9 Nano, which will now have a suggested price of $499, a $150 drop from the original $649 MSRP.


VideoCardz had the story this morning, quoting the official press release from AMD:

"This past September, the AMD Radeon™ R9 Nano graphics card launched to rave reviews, claiming the title of the world’s fastest and most power efficient Mini ITX gaming card, powered by the world’s most advanced and innovative GPU with on-chip High-Bandwidth Memory (HBM) for incredible 4K gaming performance. There was nothing like it ever seen before, and today, it remains in a class of its own, delivering smooth, true-to-life, premium 4K and VR gaming in a small form factor PC.

At a peak power of 175W and in a 6-inch form factor, it drives levels of performance that are on par with larger, more power-hungry GPUs from competitors, and blows away Mini ITX competitors with up to 30 percent better performance than the GTX 970 Mini ITX.

As of today, 11 January, this small card will have an even bigger impact on gamers around the world as AMD announces a change in the AMD Radeon™ R9 Nano graphics card’s SEP from $649 to $499. At the new price, the AMD Radeon™ R9 Nano graphics card will be more accessible than ever before, delivering incredible performance and leading technologies, with unbelievable efficiency in an astoundingly small form factor that puts it in a class all of its own."

The R9 Nano (reviewed here) was the most interesting GPU of 2015 to the team at PC Perspective. It was a compelling product for its tiny size, great performance, and high power efficiency, but the dialogue here probably mirrored that of a lot of potential buyers: for the price of a Fury X, did it make sense to buy the Nano? It all depended on need, and as we discovered when testing out small R9 Nano builds, very few enclosures on the market cannot accommodate a full-length GPU.

Now that the price is dropping by $150, it becomes an easier choice: $499 buys you the Nano's fully enabled Fiji core, the same GPU found in the R9 Fury X, for $150 less. The performance of a Fury X is only a few percentage points higher than that of the slightly lower-clocked Nano, so you're now getting most of the way there for much less. We have seen some R9 Fury X cards selling for $599, but even at $100 more, would you buy the Fury X over a Nano? If nothing else, the lower price makes the conversation a lot more interesting.

Source: VideoCardz

Far Cry Primal System Requirements Slightly Lower than 4?

Subject: Graphics Cards, Processors | January 9, 2016 - 07:00 AM |
Tagged: ubisoft, quad-core, pc gaming, far cry primal, dual-core

If you remember back when Far Cry 4 launched, it required a quad-core processor. It would block your attempts to launch the game unless it detected four CPU threads, either native quad-core or dual-core with two SMT threads per core. This has naturally been hacked around by the PC gaming community, but it is not supported by Ubisoft. It's also, apparently, a bad experience.
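For context, that launcher-side gate amounts to little more than counting logical processors before the engine starts. Here is a minimal sketch of such a check on Windows, in C; it is purely illustrative and is not Ubisoft's actual code:

    /* Hypothetical launcher check: refuse to start unless the OS reports at
     * least four logical processors (native cores or SMT threads). */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        SYSTEM_INFO si;
        GetSystemInfo(&si);  /* dwNumberOfProcessors = logical thread count */

        if (si.dwNumberOfProcessors < 4) {
            fprintf(stderr, "A CPU with at least 4 threads is required.\n");
            return 1;  /* block the launch */
        }

        /* ...hand off to the game executable... */
        return 0;
    }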


The follow-up, Far Cry Primal, will be released in late February. Oddly enough, it has similar, but maybe slightly lower, system requirements. I'll list them, and highlight the differences.

Minimum:

  • 64-bit Windows 7, 8.1, or 10 (basically unchanged from 4)
  • Intel Core i3-550 (down from i5-750)
    • or AMD Phenom II X4 955 (unchanged from 4)
  • 4GB RAM (unchanged from 4)
  • 1GB NVIDIA GTX 460 (unchanged from 4)
    • or 1GB AMD Radeon HD 5770 (down from HD 5850)
  • 20GB HDD Space (down from 30GB)

Recommended:

  • Intel Core i7-2600K (up from i5-2400S)
    • or AMD FX-8350 (unchanged from 4)
  • 8GB of RAM (unchanged from 4)
  • NVIDIA GeForce GTX 780 (up from GTX 680)
    • or AMD Radeon R9 280X (down from R9 290X)

While the CPU changes are interesting, the opposing directions of the recommended GPUs are fascinating. Either the parts fall within Ubisoft's QA margin of error, or the studio increased the GPU load but optimized for AMD better than in Far Cry 4, for a net gain in performance (which would also explain the slight bump in CPU power required to feed the extra content). Either way, this is just a guess.

Back on the CPU topic, though, I would be interested to see how Pentium Anniversary Edition parts perform. I wonder whether Ubisoft removed the four-thread lock and, especially if hacks are still required, whether the game is playable on two threads anyway.

That is, in a month and a half.

Source: Ubisoft

CES 2016: AMD Shows Polaris Architecture and HDMI FreeSync Displays

Subject: Graphics Cards, Displays | January 8, 2016 - 02:56 PM |
Tagged: video, Polaris, hdmi, freesync, CES 2016, CES, amd

At its suite at CES this year, AMD was showing off a couple of new technologies. First, we got to see the upcoming Polaris GPU architecture in action, running Star Wars Battlefront with power meters hooked up. This is similar to the demo I saw in Sonoma back in December, comparing an upcoming Polaris GPU against the NVIDIA GTX 950. The result: total system power of just 86 watts with the AMD GPU versus over 150 watts with the NVIDIA GPU.

Another new development from AMD on the FreeSync side of things was HDMI integration. The company took time at CES to showcase a pair of new HDMI-enabled monitors working with FreeSync variable refresh rate technology. 

Coverage of CES 2016 is brought to you by Logitech!

Follow all of our coverage of the show at http://pcper.com/ces!

Source: AMD

Intel Pushes Device IDs of Kaby Lake GPUs

Subject: Graphics Cards, Processors | January 8, 2016 - 02:38 AM |
Tagged: Intel, kaby lake, linux, mesa

Quick post about something that came to light over at Phoronix: someone noticed that Intel published a batch of PCI device IDs for graphics processors to Mesa and libdrm. It will take a few months for the graphics drivers to catch up, but this suggests that Kaby Lake will be releasing relatively soon.


It also gives us hints about what Kaby Lake will be. The published batch spans six performance tiers: GT1 has five IDs, GT1.5 has three, GT2 has six, GT2F has one, GT3 has three, and GT4 has four. Adding them up, we see that Intel plans 22 GPU devices. The Phoronix post lists the actual device IDs, but those are probably not interesting for our readers. Whether some of those devices overlap in performance or numbering is unclear, but it would make sense given how few SKUs Intel usually provides. (I have zero experience in GPU driver development, so take that speculation accordingly.)
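For those curious what publishing device IDs actually involves: Mesa-style drivers keep a table that maps PCI device IDs to a hardware generation, and enabling a new GPU starts with adding rows to that table. Here is a hedged sketch of the pattern in C; the IDs and names are placeholders, not the real Kaby Lake values from the patches:

    /* Sketch of a Mesa-style PCI ID table: each entry maps a device ID to a
     * name/generation. The IDs and names here are hypothetical. */
    #include <stdio.h>

    struct gpu_entry { unsigned device_id; const char *name; };

    #define CHIPSET(id, family, name) { id, name },
    static const struct gpu_entry gpu_table[] = {
        CHIPSET(0x5900, kbl_gt1, "Kaby Lake GT1 (placeholder)")
        CHIPSET(0x5910, kbl_gt2, "Kaby Lake GT2 (placeholder)")
    };
    #undef CHIPSET

    int main(void)
    {
        unsigned probed = 0x5910;  /* pretend the PCI bus reported this ID */
        for (unsigned i = 0; i < sizeof(gpu_table) / sizeof(gpu_table[0]); i++) {
            if (gpu_table[i].device_id == probed)
                printf("matched: %s\n", gpu_table[i].name);
        }
        return 0;
    }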

Source: Phoronix

AMD Radeon Software Crimson Edition 16.1 Hotfix arrives

Subject: Graphics Cards | January 7, 2016 - 06:55 PM |
Tagged: crimson, amd

That's right, ladies and germs: not even a full week into 2016 and AMD has a new driver for you, or at least a hotfix version of Crimson, 16.1. If you are playing Elite: Dangerous, Fallout 4, or Just Cause 3, there are a number of fixes for known issues from the original 15.12 release, which makes it worth picking up as soon as you can. So far in testing, game profiles do seem to survive the update, so if you took advantage of the game-specific overclocking settings you should not lose all your hard work.


They have also included fixes specific to VSR, odd HDMI setups, and flickering on FreeSync displays. As always, there are a few bugs still to be ironed out, which is why you should fill in the bug reporting tool at the bottom of the driver page instead of just throwing things at random passersby ... as fun and entertaining as that may be.

Source: AMD

At just over $200, can the XFX R9 380X Double Dissipation XXX OC 4GB unseat a GTX 960?

Subject: Graphics Cards | January 7, 2016 - 02:36 PM |
Tagged: XFX R9 380X Double Dissipation XXX OC 4GB, xfx, amd, 380x

Take a quick break from reading about the soon-to-be-released technology at CES for a look at a GPU you can buy right now. The XFX DD XXX series has been around for a few generations, and the XFX R9 380X Double Dissipation XXX OC 4GB sports the same custom DD cooler you would expect. The factory overclock is quite modest: 20MHz on the GPU, taking it to 990MHz, while retaining the default 5.7GHz memory clock. Of course, [H]ard|OCP were not going to leave it there; they hit a 1040MHz core and a 6.1GHz memory clock thanks to the card's custom cooling, although with no way to adjust voltage, they felt the card could be capable of more if that feature were added. Read on to see how it compares against the ASUS STRIX GTX 960 DCU II OC in this ~$220 GPU showdown.


"On our test bench today is the XFX R9 380X Double Dissipation XXX OC 4GB video card. It features the latest Ghost Thermal 3.0 cooling technology from XFX and a factory overclock. We will compare it to the ASUS STRIX GTX 960 DCU II OC 4GB in a battle of the $229 price point video cards to determine the better overall value."


Source: [H]ard|OCP

CES 2016: Rise of the Tomb Raider NVIDIA Bundle

Subject: Graphics Cards, Shows and Expos | January 7, 2016 - 02:03 PM |
Tagged: square enix, nvidia, CES 2016, CES

NVIDIA has just announced a new game bundle. If you purchase an NVIDIA GeForce GTX 970, GTX 980 (desktop or notebook), GTX 980 Ti, GTX 980M, or GTX 970M, then you will receive a free copy of Rise of the Tomb Raider. As always, make sure the retailer is selling a participating card; qualifying products are specially marked and include a download code. NVIDIA will not upgrade non-participating stock to the bundle.


Rise of the Tomb Raider goes live on the PC on January 29th. It was originally released in November as an Xbox One timed exclusive. It will also arrive on the PlayStation 4, but not until "holiday," which probably means Q4 (or maybe late Q3).

If you purchase the bundle, then your graphics card will obviously be powerful enough to run the game. At a minimum, the game requires a GeForce GTX 650 (2GB) or an AMD Radeon HD 7770 (2GB). The CPU needs are light, too, requiring just a Sandy Bridge Core i3 (Intel Core i3-2100) or AMD's equivalent. Probably the only concern is the minimum of 6GB of system RAM, which also implies a 64-bit operating system. Now that the Xbox 360 and PlayStation 3 have been deprecated, 32-bit gaming will be increasingly rare for "AAA" titles. That said, we've been ramping up to 64-bit for the last decade; one of the first games to support x86-64 was Unreal Tournament 2004.

The Rise of the Tomb Raider NVIDIA bundle starts today.

Coverage of CES 2016 is brought to you by Logitech!

Follow all of our coverage of the show at http://pcper.com/ces!

Source: NVIDIA

CES 2016: NVIDIA talks SHIELD Updates and VR-Ready Systems

Subject: Graphics Cards, Shows and Expos | January 5, 2016 - 09:39 PM |
Tagged: vr ready, VR, virtual reality, video, Oculus, nvidia, htc, geforce, CES 2016, CES

Other than the in-depth discussion of the Drive PX 2 and its push into autonomous driving, NVIDIA didn't have much news to report. We stopped by the suite and got a few updates on SHIELD and the company's VR Ready program, which certifies systems that meet the minimum recommended specifications for a solid VR experience.

For the SHIELD, NVIDIA is bringing Android 6.0 Marshmallow to the device, with new features like shared storage and the ability to customize the home screen of the Android TV interface. Nothing earth-shattering, and all of it is part of the 6.0 rollout.

The VR Ready program from NVIDIA will validate notebooks, systems, and graphics cards that have enough horsepower to meet the minimum performance levels for a good VR experience. At this point, the specs essentially match what Oculus has put forth: a GTX 970 or better on the desktop, and a GTX 980 (the full chip, not the 980M) on mobile.

Other than that, Ken and I took in some of the more recent VR demos, including Epic's Bullet Train on the final Oculus Rift and Google's Tilt Brush on the latest iteration of the HTC Vive. Both were incredibly impressive, though the Everest demo, which simulates a portion of the mountain climb, was the one that really made me feel like I was somewhere else.

Check out the video above for more impressions!

Coverage of CES 2016 is brought to you by Logitech!

Follow all of our coverage of the show at http://pcper.com/ces!

Source: NVIDIA

CES 2016: ASUS Announces XG Station 2 External Graphics

Subject: Graphics Cards, Mobile, Shows and Expos | January 5, 2016 - 04:47 PM |
Tagged: external graphics, CES 2016, CES, asus

While external graphics has been a concept for quite some time, it has rarely been something you could actually buy. Several companies, such as AMD and Lucid, announced products that were never sold. ASUS had the XG Station for Windows Vista, which allowed laptops to plug into a GeForce 8600 GT, but it was only available in Australia. Only now are we beginning to see widely available options from Alienware, MSI, and even Microsoft.


ASUS is jumping back in, too. Not much is known about the XG Station 2, except that it is "specially designed for ASUS laptops and graphics cards." This suggests it uses a proprietary connector, similar to Alienware's and MSI's, to connect to ASUS laptops. Saying it is specifically for ASUS graphics cards is a bit confusing, though: if it has an open PCIe slot, I'm not sure why or how it would be limited to ASUS cards, and if the graphics cards are pre-installed, then we don't know the list of potential GPUs.

Either way, ASUS states that the dock can be disconnected without shutting down the PC. I'm interested to see how the GPU is supposed to be unplugged, as Alienware's option can only be detached while the system is off, and Microsoft's Surface Book uses a software detach with a hardware latch. The connector will also charge the laptop, which is an interesting addition.

Pricing and availability vary by region, as with the other ASUS announcements.

Coverage of CES 2016 is brought to you by Logitech!

Follow all of our coverage of the show at http://pcper.com/ces!

Source: ASUS

Rumor: Polaris Is the next AMD Radeon Core Architecture

Subject: Graphics Cards | December 31, 2015 - 01:41 PM |
Tagged: rumor, report, radeon, Polaris, graphics card, gpu, GCN, amd

A report claims that Polaris will succeed GCN (Graphics Core Next) as the next AMD Radeon GPU core, which will power the 400-series graphics cards.

Image via VideoCardz.com

As these rumors go, this is about as convoluted as it gets. VideoCardz published the story, sourced from WCCFtech, which was itself reporting on a post of supposedly leaked slides at HardwareBattle. The primary slide in question has since been pulled from the original post, but appears below:

Image via HWBattle.com

Of course, the name does nothing to provide architectural information on this presumptive GCN replacement, and a new core for the 400-series GPUs was expected anyway after the 300-series turned out to be largely a rebranded 200-series (that's a lot of series). Let's hope actual details emerge soon, but for now we can only speculate on mysterious tweets from certain interested parties.

Source: VideoCardz

Tessellation Support Expands for Intel's Open Linux Driver

Subject: Graphics Cards | December 29, 2015 - 07:05 AM |
Tagged: opengl, mesa, linux, Intel

Intel's open-source Linux driver is known to be a little behind. Because Intel does not provide as much support as they should, the driver still does not support OpenGL 4.0, although that is changing. One large chunk of that API is tessellation, a feature generation that arrived with DirectX 11, and recent patches are adding it for supported hardware. Proprietary drivers exist, at least for some platforms, but they have their own issues.


According to the Phoronix article, once the driver succeeds in supporting OpenGL 4.0, it should not take long to open the path to 4.2. Tessellation is a huge hurdle, partially because it involves adding two whole shading stages to the rendering pipeline. Broadwell GPUs gained the feature recently, and a patch committed yesterday expands that to Ivy Bridge and Haswell. On Windows, Intel is far ahead, pushing OpenGL 4.4 for Skylake-based graphics, although that platform only has proprietary drivers. AMD and NVIDIA are up to OpenGL 4.5, which is the latest version.
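For a sense of what the driver now has to handle: the two new stages are the tessellation control and tessellation evaluation shaders, which sit between the vertex and fragment stages and consume patches instead of triangles. Here is a minimal sketch of the OpenGL 4.0 API surface involved, in C (GLEW is an assumed loader; shader sources and error checking are elided):

    /* Sketch: linking a program that includes the two tessellation stages the
     * Mesa patches implement. Compilation details are elided. */
    #include <GL/glew.h>

    GLuint build_tessellated_program(void)
    {
        GLuint prog = glCreateProgram();

        GLuint vs  = glCreateShader(GL_VERTEX_SHADER);
        GLuint tcs = glCreateShader(GL_TESS_CONTROL_SHADER);     /* new stage 1 */
        GLuint tes = glCreateShader(GL_TESS_EVALUATION_SHADER);  /* new stage 2 */
        GLuint fs  = glCreateShader(GL_FRAGMENT_SHADER);

        /* ...glShaderSource() and glCompileShader() for each stage... */

        glAttachShader(prog, vs);
        glAttachShader(prog, tcs);
        glAttachShader(prog, tes);
        glAttachShader(prog, fs);
        glLinkProgram(prog);

        /* Tessellated geometry is submitted as patches, not triangles: */
        glPatchParameteri(GL_PATCH_VERTICES, 3);
        /* ...and later drawn with glDrawArrays(GL_PATCHES, 0, count); */
        return prog;
    }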

While all of this is happening, Valve is working on an open-source Vulkan driver for Intel on Linux. That API will live alongside OpenGL, and is built for high-performance graphics and compute. (Note that OpenCL is more sophisticated than Vulkan "1.0" will be on the compute side of things.) As nice as it would be to get high-end OpenGL support, especially for developers who want a simpler way to talk to GPUs, Vulkan will probably be the API that matters most for high-end video games. But again, that only applies to games that are developed for it.

Source: Phoronix

NVIDIA GameWorks VR 1.1 arrives with support for OpenGL VR SLI and the Oculus SDK

Subject: Graphics Cards | December 21, 2015 - 01:04 PM |
Tagged: GameWorks VR 1.1, nvidia, Oculus, opengl, vive

If you are blessed with the good fortune of already owning a VR headset and happen to be running an NVIDIA GPU, then there is a new driver you will want to grab as soon as you can. It includes a new OpenGL extension that enables NVIDIA SLI support for OpenGL apps that display on an Oculus or Vive headset. NVIDIA's PR suggests you can expect performance to improve by up to 1.7 times; not quite doubling, but certainly a noticeable improvement. The update covers both GeForce and Quadro cards.


NVIDIA describes how the update generates images in its blog post here. The company implies that the code changes needed to benefit from the update are minimal, and that it also reduces the CPU overhead required to render the left- and right-eye images. Read on if you are interested in the developer side of this update; otherwise, download your new driver and keep an eye out for application updates that enable support for SLI in VR.
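To see where the savings come from, consider the naive stereo loop a VR renderer runs today: the same scene is submitted twice, once per eye, with only the view transform changing. Below is a conceptual sketch of that loop in plain C and OpenGL; the helper names and constants are hypothetical, and this deliberately does not use the new NVIDIA extension's entry points:

    /* The per-eye duplication that VR SLI targets: without it, the CPU
     * records two nearly identical command streams every frame. */
    #include <GL/glew.h>

    #define EYE_WIDTH  1512   /* hypothetical per-eye render target size */
    #define EYE_HEIGHT 1680

    typedef struct { float m[16]; } mat4;

    extern GLuint eye_fbo[2];               /* one framebuffer per eye */
    extern mat4 compute_eye_view(int eye);  /* left/right view transform */
    extern void draw_scene(mat4 view);      /* submits all draw calls */

    void render_stereo_frame(void)
    {
        for (int eye = 0; eye < 2; eye++) {  /* 0 = left, 1 = right */
            glBindFramebuffer(GL_FRAMEBUFFER, eye_fbo[eye]);
            glViewport(0, 0, EYE_WIDTH, EYE_HEIGHT);

            /* Only the view transform differs between the two passes. */
            draw_scene(compute_eye_view(eye));
        }
        /* With VR SLI, the driver can broadcast a single command stream to
         * two GPUs, each rendering its own eye, which cuts both GPU time
         * and the duplicated CPU submission cost. */
    }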

Source: NVIDIA

Vulkan API Slips to 2016

Subject: Graphics Cards | December 21, 2015 - 07:25 AM |
Tagged: vulkan, Mantle, Khronos, dx12, DirectX 12

The Khronos Group announced on Friday that the Vulkan API will not ship until next year. The standards body had expected to launch it at some point in 2015; in fact, when I was first briefed on it, they specifically said that 2015 was an "under-promise and over-deliver" estimate. Vulkan is an open graphics and compute standard that was derived from AMD's Mantle. Like OpenCL 2.1, it uses the SPIR-V intermediate language for compute and shading, which can be compiled from subsets of a variety of languages.


I know that most people will be quick to blame The Khronos Group for this, because industry bodies moving slowly is a stereotype, but I don't think it applies. When AMD created Mantle, it bore some significant delays at all levels: its drivers and software were held back, and the public release of its SDK was delayed out of existence. Again, it would be easy to blame AMD for this, but hold on, because we now get to Microsoft. DirectX 12, which is maybe even closer to Mantle than Vulkan is due to its shading language, didn't roll out as aggressively as Microsoft expected either. Software is still pretty much non-existent, despite Microsoft claiming at GDC 2014 that about 50% of PC games would be DX12-compatible by Holiday 2015. We currently have zero (excluding pre-release titles).

Say what you like about the three examples individually, but when all three show problems, there might just be a few issues that took longer than expected to solve. This is, after all, a completely different model for translating commands sent across a PCI Express bus into fancy graphics and GPU compute, and all of the supporting ecosystems need to be created, too.

Speaking of ecosystems, The Khronos Group has also announced that Google has upgraded its membership to "Promoter" to get more involved with Vulkan development. Google has been somewhat hostile towards certain Khronos standards on Android in the past, such as disabling OpenCL on Nexus devices and trying to steer developers into the Android Extension Pack and RenderScript. They seem to want to use Vulkan proper this time, which is always healthy for an API.

I guess look forward to Vulkan in 2016... hopefully early.

Thought you wouldn't see a new Radeon Crimson driver before the New Year, eh?

Subject: Graphics Cards | December 17, 2015 - 05:49 PM |
Tagged: radeon, crimson, amd


That's right, folks: the official AMD Radeon Software Crimson Edition 15.12 has just launched for you to install. It includes the fixes for fan speeds when using AMD OverDrive; your settings will now stick, and the fans will revert to normal when you return to the desktop after an intense gaming session. There are multiple fixes for Star Wars Battlefront and Fallout 4, plus several GUI fixes within the software itself. As always, there are still a few kinks being worked out, but overall it is worth popping over to AMD to grab the new driver. You should also have fewer issues upgrading from within Crimson after this update.

Source: AMD

NVIDIA Updates GeForce Experience with Beta Features

Subject: Graphics Cards | December 16, 2015 - 08:12 AM |
Tagged: nvidia, geforce experience, geforce

A new version of GeForce Experience was published yesterday. It is classified as a beta, which I'm guessing means that NVIDIA wanted to release it before the holidays without having to fix potential post-launch issues during the break; releasing it as a beta means users will just roll back if something doesn't work. On the other hand, NVIDIA is suggesting that betas will be a recurring theme with its new "Early Access" program.

It has a few interesting features, though. First, there is a screenshot function that connects with Imgur. Steam's F12 key is pretty good for almost any title, but sometimes you don't want to register a game with Steam, so a second option is welcome. There is also an overlay to control your stream, rather than just an indicator icon.


They added the ability to choose the Twitch ingest server, which is the server the broadcaster connects to in order to deliver the stream into Twitch's back-end. I haven't used ShadowPlay for a while, but you previously had to use whatever GeForce Experience chose; if that wasn't the best connection (say, across the continent), you basically had to deal with it. OBS and OBS Studio have these features too, of course. I'll be clear: NVIDIA is playing catch-up to open-source software in this area.

The last feature to be mentioned might just be the most interesting, though. A while ago, we mentioned that NVIDIA wants to allow online co-op through a GameStream-like service. It now exists, and it's called GameStream Co-op. The viewer can watch, take over your input, or register as a second gamepad. It requires a GeForce GTX 650 (or 660M) or higher.

GeForce Experience Beta is available now.

Source: NVIDIA

Pushing the ASUS STRIX R9 380X DirectCU II OC to the limit

Subject: Graphics Cards | December 14, 2015 - 03:55 PM |
Tagged: amd, asus, STRIX R9 380X DirectCU II OC, overclock

Out of the box, the ASUS STRIX R9 380X OC has a top GPU speed of 1030MHz and memory at 5.7GHz, enough to outperform a stock GTX 960 4GB at 1440p but not enough to provide satisfactory performance at that resolution. After spending some time with the card, [H]ard|OCP determined that the best overclock they could coax out of this particular GPU was 1175MHz with 6.5GHz memory, so they set about testing performance at 1440p again. To make it fair, they also overclocked their STRIX GTX 960 OC 4GB to 1527MHz and 8GHz. Read the full review for the detailed results; you will see that overclocking your 380X really does increase the value you get for your money.


"We take the new ASUS STRIX R9 380X DirectCU II OC based on AMD's new Radeon R9 380X GPU and overclock this video card to its highest potential. We'll compare performance in six games, including Fallout 4, to a highly overclocked ASUS GeForce GTX 960 4GB video card and find out who dominates 1440p gaming."


Source: [H]ard|OCP

What is this, a GPU for Seattle fans? The MSI GTX 980 Ti SEA HAWK

Subject: Graphics Cards | December 8, 2015 - 03:56 PM |
Tagged: msi, GTX 980 Ti SEA HAWK, 980 Ti, 4k

[H]ard|OCP recently wrapped up a review of the MSI GTX 980 Ti SEA HAWK and are now revisiting the GPU, focusing on performance at 4K. The particular card they have tops out at a 1340MHz base and a 1441MHz boost clock, which results in an in-game frequency of 1567MHz. For comparison, they tested their overclocked card against the SEA HAWK at its factory overclock and a reference 980 Ti, all at 4K. Read the full review to see if this water-cooled 980 Ti is worth the premium price.


"Our second installment with the MSI GTX 980 Ti SEA HAWK focuses in on gameplay at 4K and how viable it is with the fastest overclocked GTX 980 Ti we have seen yet. We will be using a reference GeForce GTX 980 Ti to show the full spectrum of the 980 Ti’s performance capabilities and emphasize the MSI GTX 980 Ti SEA HAWK’s value."


Source: [H]ard|OCP

AMD HSA Patches Hoping for GCC 6

Subject: Graphics Cards, Processors | December 8, 2015 - 08:07 AM |
Tagged: hsa, GCC, amd

Phoronix, the Linux-focused hardware website, highlighted patches for the GNU Compiler Collection (GCC) that implement HSA. These will allow newer APUs, such as AMD's Carrizo, to accelerate chunks of code (mostly loops) that have been tagged with a preprocessor directive as worth running on the GPU. While I have done some GPGPU development, many of the low-level specifics of HSA aren't areas where I have much experience.
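To illustrate what that tagging looks like in practice: GCC's HSA work centers on OpenMP-style offload directives, so a loop marked for the GPU might look something like the following sketch in C. This is a hedged illustration of the directive pattern, not code taken from the actual patches:

    /* Offloading a loop with an OpenMP 4 "target" directive, the kind of
     * annotation the HSA patches aim to compile into GPU kernels on APUs
     * like Carrizo. Illustrative only. */
    #include <stdio.h>

    #define N 1024

    int main(void)
    {
        float a[N], b[N], c[N];
        for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

        /* Ask the compiler to run this loop on an accelerator. On an HSA APU,
         * the CPU and GPU share memory, so the map() clauses need no copies. */
        #pragma omp target teams distribute parallel for map(to: a, b) map(from: c)
        for (int i = 0; i < N; i++)
            c[i] = a[i] + b[i];

        printf("c[42] = %f\n", c[42]);
        return 0;
    }

Built with gcc -fopenmp (plus an HSA-enabled offload target), the annotated loop can run on the GPU; without offload support, it simply falls back to the host.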


The patches have been managed by Martin Jambor of SUSE Labs, and you can see a slideshow presentation of the work on the GNU website. Even though the feature freeze was about a month ago, they are apparently hoping that this will make it into the official GCC 6 release. If so, many developers around the world will be able to target HSA-compatible hardware in the first half of 2016. Technically, anyone can do so regardless, but they would need to specifically use the unofficial branch in the GCC Subversion repository, which probably means compiling the toolchain themselves, and that branch might even lag behind features in other branches that were accepted into GCC 6.

Source: Phoronix

Asetek Sends Cease and Desist for Water-Cooled GPUs

Subject: Graphics Cards | December 6, 2015 - 11:29 PM |
Tagged: gigabyte, cooler master, asetek, amd

AMD and Gigabyte have each received cease and desist letters from Asetek regarding the Radeon R9 Fury X and GeForce GTX 980 Water Force, respectively, for using a Cooler Master-based liquid cooling solution. The Cooler Master Seiden 120M is a self-contained block and water pump which courts have ruled infringes on one of Asetek's patents. Asetek has been awarded 25.375% of Cooler Master's revenue from all affected products sold since January 1st, 2015.


This issue obviously affects NVIDIA less than AMD, since it applies to a single product from just one AIB partner. On AMD's side, however, it affects all Fury X products, though obviously not the air-cooled Fury and R9 Nano cards. It's also possible that future SKUs could be affected, especially since upcoming top-end GPUs will probably be small packages with HBM 2.0 memory mounted adjacent to the die. This dense form factor lends itself well to direct cooling techniques, like closed-loop water.


Even more interesting: we believe Asetek was expecting to get the Fury X contract. We reported on an Asetek press release that claimed their "Largest Ever Design Win" with an undisclosed OEM. We expected it to be the follow-up to the 290X, which we assumed would be called the 390X because, I mean, AMD just chose that branding, right? Then the Fury X launched and it contained a Cooler Master pump. I was confused. No other candidate for "Largest Ever Design Win" ever popped up from Asetek, either. I guess we were right? Question mark? Notably, Asetek's design win press release came out in August 2014, while Asetek won the patent case in December of that year.

Regardless, this patent war has been ongoing for several months now. If it does affect any future products, I'd hope the companies involved have had enough warning by this point.

Source: GamersNexus

NVIDIA Publishes 359.12 Hotfix Driver for GTX 860M

Subject: Graphics Cards, Mobile | December 6, 2015 - 08:10 AM |
Tagged: nvidia, graphics drivers, 860m

Users of notebooks with the GeForce GTX 860M GPU have apparently been experiencing crashes in many new titles. To remedy these issues, NVIDIA has published GeForce Hotfix Driver 359.12. If you do not have the GeForce GTX 860M and all of your games work correctly, then you probably shouldn't install it: a hotfix has not been tested as thoroughly as official releases, by either Microsoft or NVIDIA, so other issues could have been introduced and no one would know.


If you do have that specific GPU, though, and you are having problems running certain titles, then you can install the driver now; otherwise, you can wait for a future WHQL-certified driver. Some users are claiming that the issues were fixed, while others still report crashes in games like Mad Max and Shadow of Mordor.

Source: NVIDIA