AMD R9 390X Confirmed Hawaii Rebrand via Leaked Box Shot

Subject: Graphics Cards | June 7, 2015 - 07:51 PM |
Tagged: r9 390x, leak, hbm, hawaii, GDDR5, Fiji, amd

On the XFX R9 290X Double Dissipation product page something very curious appears when you scroll all the way down to the bottom…

xfx_1.png

What’s this image over here on the right, I wonder…

R9-390X-8DFR_4.JPG

Well, would you look at that. The box is clearly labeled for an AMD Radeon R9 390X with 8GB of GDDR5 memory, further indicating that the upcoming GPU will in fact be a Hawaii rebrand, and that the HBM-based flagship Fiji GPU we keep hearing about (and seeing pictures of) will carry a new name. Whether that ends up being R9 490X or a name like “Fury” we will soon find out. As it is, it looks like we know at least part of what to expect from AMD’s gaming event at E3 on June 16.

e3.png

Hmm. What might this be about??

Of course we will have complete coverage when any official announcement is made, but for now enjoy the accidental product reveal!

Update: XFX has removed the R9 390X images from their R9 290X DD product page, but not before numerous sites took their own screenshots before posting the news as well. There has been some disagreement about what the leaked photos actually reveal, or if anything has genuinely been "confirmed", but it seems likely that the product named 390X will be a rebranded 290X with 8GB of GDDR5.

Source: XFX

NVIDIA VR Headset Patent Found

Subject: Graphics Cards, Mobile | June 6, 2015 - 04:05 PM |
Tagged: VR, nvidia, gameworks vr

So I'm not quite sure what this hypothetical patent device is. According to its application, it is a head-mounted display that contains six cameras (??) and two displays, one for each eye. The usage of these cameras is not defined, but two will point forward, two will point down, and the last two will point left and right. The only clue that we have is in the second patent application photo, where unlabeled hands are gesturing in front of a node labeled “input cameras”.

nvidia-2015-vr-patent-1.png

Image Credit: Declassified

The block diagram declares that the VR headset will have its own CPU, memory, network adapter, and “parallel processing subsystem” (GPU). VRFocus believes that this will be based on the Tegra X1, and that it was supposed to be revealed three months ago at GDC 2015. In its place, NVIDIA announced the Titan X at the Unreal Engine 4 keynote, hosted by Epic Games. GameWorks VR was also announced with the GeForce GTX 980 Ti launch, which was mostly described as a way to reduce rendering cost by dropping resolution in areas that will be warped into a lower final, displayed resolution anyway.
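To put that rendering-cost idea in rough numbers, here is a minimal sketch of the savings when the peripheral regions of each eye's image are shaded at reduced resolution before the lens warp. All of the figures below (render target size, region split, scale factor) are hypothetical, chosen purely for illustration, and are not NVIDIA's actual GameWorks VR parameters:

    # Hypothetical per-eye numbers for illustration only; NVIDIA's actual
    # multi-resolution viewport layout and scale factors are not shown here.
    eye_width, eye_height = 1512, 1680   # example per-eye render target size
    center_fraction = 0.6                # fraction of each axis kept at full resolution
    peripheral_scale = 0.5               # resolution scale applied to the outer regions

    full_pixels = eye_width * eye_height
    center_pixels = (eye_width * center_fraction) * (eye_height * center_fraction)
    # Everything outside the center region is shaded at the reduced scale,
    # cutting its pixel count by peripheral_scale squared.
    peripheral_pixels = (full_pixels - center_pixels) * peripheral_scale ** 2

    multires_pixels = center_pixels + peripheral_pixels
    print(f"Full-resolution shading: {full_pixels / 1e6:.2f} Mpix per eye")
    print(f"Multi-res shading: {multires_pixels / 1e6:.2f} Mpix per eye "
          f"({multires_pixels / full_pixels:.0%} of the original cost)")

With these made-up values the peripheral regions drop to a quarter of their pixel count, cutting total shading work to roughly half per eye, which is the general idea behind the technique as described.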

nvidia-2015-vr-patent-2.png

Image Credit: Declassified

VRFocus suggests that the reveal could happen at E3 this year. The problem with that theory is that NVIDIA has neither a keynote at E3 this year nor even a place at someone else's keynote as far as we know, just a booth and meeting rooms. Of course, they could still announce it through other channels, but that seems less likely. Maybe they will avoid the E3 hype and announce it later (unless something changes behind the scenes of course)?

Source: VRFocus

AMD Carrizo SKUs and Dual Graphics Options Announced

Subject: Graphics Cards, Processors, Mobile | June 4, 2015 - 04:58 PM |
Tagged: amd, carrizo

My discussion of the Carrizo architecture went up a couple of days ago. The post did not include specific SKUs because we did not have them at the time. Now we do, and there will be three products: one A8-branded, one A10-branded, and one FX-branded.

amd-2015-carrizo-skus.jpg

All three will be quad-core parts that can range between 12W and 35W designs, although the A8 processor does not have a 35W mode listed in the AMD Dual Graphics table. The FX-8800P is an APU that has all eight GPU cores, while the A-series APUs have six. The A10-8700P and the A8-8600P are separated by a couple hundred megahertz in base and boost CPU clocks, and by 80 MHz in GPU clock.

amd-2015-carrizo-dualgpu.jpg

Also, we have been given a table of AMD Radeon R5 and R7 M-series GPUs that can be paired with Carrizo in an AMD Dual Graphics setup. These GPUs are the R7 M365, R7 M360, R7 M350, R7 M340, R5 M335, and R5 M330. They cannot be paired with every Carrizo APU, and some pairings only work in certain power envelopes. Thankfully, this table should only be relevant to OEMs, because end-users are receiving pre-configured systems.

Pricing and availability will depend on OEMs, of course.

Source: AMD

AMD's Massive Fiji GPU with HBM Gets Pictured

Subject: Graphics Cards | June 3, 2015 - 08:39 PM |
Tagged: amd, Fiji, radeon, R9, 390x, maybe

Sorry for all of these single-item news posts I keep making, but this is how the information is coming out about AMD's upcoming Fiji GPU and its new HBM (high bandwidth memory) technology. (And make no mistake, this is exactly the way that AMD marketing dreamed it would happen.) Below we have an image of Fiji: the GPU die, the interposer, and the four stacks of HBM.

amdfijidie.jpg

Look familiar?

04-fourstacked.jpg

Quite simply, that chip is massive, measuring about 70mm x 70mm based on the information presented during our HBM technical session last month. That is gigantic compared to other GPU dies on their own, but it is still smaller than the combined footprint of a previous-generation GPU and its required memory chips on the PCB.

In case you missed it earlier today, AMD also released a teaser video of a CG Radeon card using Fiji. We'll know everything (maybe?) about AMD's latest flagship on June 16th.

Source: Imgur

AMD Teases New Radeon Fiji GPU on June 16th with Video

Subject: Graphics Cards | June 3, 2015 - 03:25 PM |
Tagged: radeon, hype train, hbm, Fiji, amd

The AMD Fiji hype train keeps rolling. Here is a several-second-long teaser video that AMD posted to its AMD Radeon Graphics Twitter account.

First and foremost, it looks like all of the leaks about the cooler and card design of Fiji were at least mostly accurate. Also, note that AMD included the #AMD300 tag in the tweet, leading us to believe that the R9 390X is indeed going to be the branding.

Looks like we'll know more on June 16th during E3.

Source: AMD

Computex 2015: Zotac Reveals Custom Cooled GTX TITAN X ArcticStorm Graphics Card

Subject: Graphics Cards | June 2, 2015 - 05:56 PM |
Tagged: zotac, water cooling, titan x, gtx titan x, computex 2015, computex, arcticstorm

NVIDIA’s AIB partners are out in full force at Computex 2015 with new graphics cards and new coolers. Among them is Zotac, with a customized GTX TITAN X card using the ArcticStorm Hybrid cooler, an air cooler that also features a water block and can be integrated into your custom water cooling loop.

Zotac GeForce GTX Titan X ArcticStorm Graphics Card.jpg

Of course, the TITAN X is NVIDIA’s top end Maxwell (GM200) graphics processor built on a 28nm process. It has 3,072 CUDA cores, 192 texture units, 96 ROPs, and a 250W TDP. 

Zotac is factory overclocking this GPU to 1026 MHz base and 1114 MHz boost. The 12GB of GDDR5 memory sits on a 384-bit bus and is also (slightly) factory overclocked to 7010 MHz.

The card itself has the same array of video outputs (three DisplayPort, one HDMI, and one DL-DVI) and the same PCI-E power connectors (6-pin + 8-pin) as the reference card.

The ArcticStorm Hybrid cooler can work as an air cooler or as an air + water cooler similar to ASUS’ Poseidon cards. The Zotac cooler features a copper cold plate paired with heatpipes and a large aluminum fin array cooled by three 90mm shrouded fans.

Zotac GeForce GTX Titan X ArcticStorm Graphics Card PCI-E Power.jpg

This cooler should run quieter than the NVIDIA reference card and offer lower temperatures, especially when connected to your custom liquid cooling loop. Zotac did not opt for two 8-pin PCI-E connectors, so extreme overclocking might be out of the question, but the card should still be heavily overclockable in general should you get a good chip.

Naturally, Zotac is holding off on pricing and availability details of the GTX TITAN X ArcticStorm (ZT-90402-10P) until it is ready to ship which should be soon. Stay tuned to PC Perspective for more details.

Source: Zotac

Rounding up the GTX 980 Ti reviews

Subject: Graphics Cards | June 2, 2015 - 04:40 PM |
Tagged: video, nvidia, maxwell, GTX 980 Ti, gsync, gm200, geforce, gameworks vr, g-sync, dx12, 6Gb

Hopefully by now you have familiarized yourself with Ryan's review of the new GTX 980 Ti and perhaps even some of the other reviews below. One review that you should not miss is the one by Scott over at The Tech Report, as they used an X99 system for benchmarking and covered a slightly different suite of games. The games both sites tested show very similar results, and in the case of BF4 and Crysis 3 they show that the R9 295X2 is still a force to be reckoned with, especially when it is on sale at a price similar to the 980 Ti. In testing The Witcher 3 and Project Cars, the 980 Ti showed smoother performance with impressive minimum frame times. Overall, The Tech Report gives the nod to the new GTX 980 Ti for more fluid gameplay, but does offer the necessary reminder that AMD will be launching their new products very soon and could offer new competition.

card-front.jpg

"You knew it was coming. When Nvidia introduced the GeForce Titan X, it was only a matter of time before a slightly slower, less expensive version of that graphics card hit the market. That's pretty much how it always happens, and this year is no exception."


Computex 2015: EVGA Builds PrecisionX 16 with DirectX 12 Support

Subject: Graphics Cards | June 1, 2015 - 10:58 AM |
Tagged: evga, precisionx, dx12, DirectX 12

Another interesting bit of news surrounding Computex and the new GTX 980 Ti comes from EVGA and its PrecisionX software. This is easily our favorite tool for overclocking and GPU monitoring, so it's great to see the company continuing to push forward with features and capability. EVGA is the first to add full support for DX12 with an overlay.

precisionx16.jpg

What does that mean? It means that as DX12 applications find their way out to consumers and media, we will now have a tool that can help measure performance and monitor GPU speeds and feeds via the PrecisionX overlay. Before this release, we were running in the dark with DX12 demos, so this is great news!

You can download the latest version over on EVGA's website!

Computex 2015: EVGA Shows GTX 980 Ti Variants - Classified, Hybrid and Hydro

Subject: Graphics Cards | June 1, 2015 - 06:00 AM |
Tagged: maxwell, hydro copper, GTX 980 Ti, gm200, evga, computex 2015, computex, classified, acx

With the release of the brand new GeForce GTX 980 Ti from NVIDIA stirring up the week just before Computex in Taipei, you can be sure that all of NVIDIA's partners are going to be out in force showing off their custom graphics card solutions.

980ti.jpg

EVGA has several lined up and they were able to share some information with us. First up is the standard but custom-cooled GTX 980 Ti that uses the ACX 2.0+ cooler. This new version of the ACX 2.0 cooler includes a "memory MOSFET Cooling Plate (MMCP) [that] reduces MOSFET temperatures up to 13%, and optimized Straight Heat Pipes (SHP) additionally reduce GPU temperature by 5C. ACX 2.0+ coolers also feature optimized swept fan blades, double ball bearings and an extreme low power motor, delivering more air flow with less power, unlocking additional power for the GPU." We're looking forward to some hands-on testing with this card when it shows up on Monday morning.

Classified.jpg

Also due for an update is the EVGA Classified line, often considered among the best cards available for overclockers and extreme enthusiasts. Though this card also uses the ACX 2.0+ cooler, it will include additional power delivery improvements on the PCB that help stretch the available performance headroom.

Hybrid.jpg

Following in the footsteps of the recently released Titan X Hybrid comes the GTX 980 Ti version. This card will use a standard blower cooler for the memory and power delivery while attaching a self-contained water cooler for the GPU itself. This should keep the GPU temperature down quite a bit though the benefit to real-world overclocking is debatable with the voltage lock that NVIDIA has kept in place. If only they were to change that...

Hydro.jpg

Finally, for the water cooling fans among us we have the GTX 980 Ti Hydro Copper, using a water block from EK.

Interested in clock speeds?

  • EVGA 980 Ti ACX 2.0
    • Base: 1000 MHz
    • Boost: 1076 MHz
    • Memory: 7010 MHz
  • EVGA 980 Ti Classified
    • Base: 1152 MHz
    • Boost: 1241 MHz
    • Memory: 7010 MHz
  • EVGA 980 Ti Hybrid
    • Base: 1140 MHz
    • Boost: 1228 MHz
    • Memory: 7010 MHz
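For a quick bit of perspective on those numbers (using only the clocks listed above, with the ACX 2.0 model as the reference point), the factory overclocks work out roughly like this:

    # Clocks taken from the list above; percentages are relative to the ACX 2.0 model.
    cards = {
        "ACX 2.0":    (1000, 1076),
        "Classified": (1152, 1241),
        "Hybrid":     (1140, 1228),
    }
    ref_base, ref_boost = cards["ACX 2.0"]

    for name, (base, boost) in cards.items():
        print(f"GTX 980 Ti {name}: base +{base / ref_base - 1:.1%}, "
              f"boost +{boost / ref_boost - 1:.1%}")

In other words, the Classified carries roughly a 15% higher factory clock than the ACX 2.0 card, with the Hybrid just behind it at around 14%.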

I am still waiting for pricing and availability information which we will pass on as soon as we get it!

Computex 2015: ASUS Announces ROG Poseidon GTX 980 Ti

Subject: Graphics Cards | June 1, 2015 - 02:00 AM |
Tagged: computex, Poseidon GTX 980 Ti, GTX 980 Ti, gpu, ASUS ROG, computex 2015

ASUS has already announced a Poseidon version of the new NVIDIA GeForce GTX 980 Ti graphics card, which is part of the company's Republic of Gamers (ROG) lineup.

poseidon980.png

No photo of the GTX 980 Ti available yet, so here's the GTX 980 version for reference

From ASUS:

"ROG Poseidon GTX 980 Ti incorporates the DirectCU H2O hybrid cooling solution with a combined vapor chamber and water channels to give users cooler temperatures along with improved noise reduction for 3x quieter performance. ASUS graphics cards are produced via exclusive Auto-Extreme technology, an industry-first 100% automated process, and feature aerospace-grade Super Alloy Power II components for unsurpassed quality and reliability. ROG Poseidon GTX 980Ti also features GPU Tweak II with XSplit Gamecaster for intuitive performance tweaks and gameplay streaming."

We'll keep you posted on pricing and availability (and actual product photos) once they're available!

Source: ASUS

AMD Catalyst 15.5 Beta Driver Improves Project Cars and Witcher 3 Performance

Subject: Graphics Cards | May 29, 2015 - 11:26 AM |
Tagged: catalyst, amd, 15.5 beta

Last night AMD released a new Catalyst driver, 15.5 Beta, that targets performance improvements in both Project Cars and The Witcher 3: Wild Hunt. You can pick up the new driver for both 32-bit and 64-bit versions of Windows 7 and Windows 8.1 right here. The full display driver version is 14.502.1014.1001.

catalyst.jpg

The specific changes?

Highlights of AMD Catalyst™ 15.5 Beta Windows Driver

Crossfire Profile Update for:

  • The Witcher 3 - Wild Hunt

Performance Improvements for the following:

  • The Witcher 3 - Wild Hunt: Up to 10% performance increase on single GPU Radeon R9 and R7 Series graphics products

  • Project Cars: Up to 17% performance increase on single GPU Radeon R9 and R7 Series graphics products

Users of the Radeon R9 295X2 will be glad to see CrossFire support added for The Witcher 3, and just about everyone will be glad to see some dramatic performance improvements in Project Cars and The Witcher 3. We know from internal testing, as well as from widely reported issues, that AMD Radeon graphics cards struggled with Project Cars, resulting in performance well behind comparable NVIDIA GeForce hardware. With as much as a 17% improvement for Radeon R9 and R7 hardware, hopefully those frame rates increase quite a bit.
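As a rough sketch of what those percentages mean in practice (the baseline frame rates below are hypothetical examples, not measured results; only the 10% and 17% uplifts come from AMD's notes):

    # Hypothetical baseline frame rates for illustration; the uplift percentages
    # are the ones AMD quotes for the Catalyst 15.5 Beta driver.
    uplifts = {
        "The Witcher 3 - Wild Hunt (single GPU)": (45.0, 0.10),
        "Project Cars (single GPU)":              (50.0, 0.17),
    }

    for game, (baseline_fps, uplift) in uplifts.items():
        print(f"{game}: {baseline_fps:.1f} fps -> "
              f"{baseline_fps * (1 + uplift):.1f} fps (+{uplift:.0%})")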

pc.jpg

We'll be doing some testing with this new driver here today, and if anything jumps out at us, we'll be sure to let you know!

Source: AMD

Rumor: AMD To Name Upcoming Flagship Fiji GPU "Radeon Fury"

Subject: Graphics Cards | May 29, 2015 - 11:05 AM |
Tagged: rumors, radeon, hbm, graphics, gpu, Fury, Fiji, amd

Another rumor has emerged about an upcoming GPU from AMD, and this time it's a possible name for the HBM-powered Fiji card a lot of us have been speculating about.

RAGE-FURY-MAXX.jpg

New box art revealed! (Or a product from early 2000.) Image: VideoCardz

The rumor from VideoCardz via Expreview (have to love the multiple layers of reporting here) states that the new card will be named Radeon Fury:

"Radeon Fury would be AMD’s response to growing popularity of TITAN series. It is yet unclear how AMD is planning to adopt Fury naming schema. Are we going to see Fury XT or Fury PRO? Well, let’s just wait and see. This rumor also means that Radeon R9 390X will be a direct rebrand of R9 290X with 8GB memory."

Of course this is completely unsubstantiated, and Fury is a branding scheme from the ATI days, but who knows? I can only hope that if true, AMD will adopt all caps: TITAN! FURY! Feel the excitement. What do you think of this possible name for the upcoming AMD flagship GPU?

Source: VideoCardz

Rumor: Entire Upcoming AMD Radeon 300-Series to Be Rebrands of Existing 200-Series?

Subject: Graphics Cards | May 26, 2015 - 05:29 PM |
Tagged: rumors, leaks, Hawaii XT, Fiji, amd radeon

This sounds like disappointing news, but it might be a little misleading as well. First of all, the report, another from the folks at VideoCardz, details information about these upcoming GPUs gleaned from a leaked driver.

AMD_Radeon_480.png

Here’s the list VideoCardz came up with (200-series name on the right):

  • AMD6658.1 AMD Radeon(TM) R7 360 Graphics Bonaire XTX [Radeon R7 260X]
  • AMD67B0.1 AMD Radeon R9 300 Series Hawaii XT [Radeon R9 290X]
  • AMD67B1.1 AMD Radeon R9 300 Series Hawaii PRO [Radeon R9 290]
  • AMD6810.1 AMD Radeon(TM) R7 370 Graphics Curacao XT [Radeon R9 270X]
  • AMD6810.2 AMD Radeon (TM) R7 300 Series Curacao XT [Radeon R9 270X]
  • AMD6811.1 AMD Radeon (TM) R7 300 Series Curacao PRO [Radeon R9 270/370]
  • AMD6939.1 AMD Radeon R9 300 Series Tonga PRO [Radeon R9 285]

VideoCardz further comments on this list:

“FIJI was not included in this driver release. Radeon R9 Hawaii is neither shown as R9 380 nor as R9 390. Right now it’s just R9 300, just like R9 285-rebrand. Both Hawaii cards will get a small bump in clock speeds and most importantly 8GB memory.”

And this is where the news might be misleading: the “300-series” naming may very well not include the new GPU because AMD is shifting to a new naming scheme for its upcoming flagship, just as NVIDIA has used the “TITAN” and “TITAN X” names for its flagship cards. If the rumored AMD Fiji card (which was recently pictured) is the all-new GPU featuring HBM memory that we’ve all been waiting for, expect iterations of this new card to follow as the technology trickles down to the more affordable segment.

Rumor: NVIDIA GeForce GTX 980 Ti Specifications Leaked

Subject: Graphics Cards | May 26, 2015 - 05:03 PM |
Tagged: rumors, nvidia, leaks, GTX 980 Ti, gpu, gm200

Who doesn’t love rumor and speculation about unreleased products? (Other than the manufacturers of such products, of course.) Today VideoCardz is reporting, via HardwareBattle, a GPU-Z screenshot that reportedly shows specs for an NVIDIA GeForce GTX 980 Ti.

gpuz_screenshot.png

Image credit: HardwareBattle via VideoCardz.com

First off, the HardwareBattle logo conveniently obscures the hardware ID (as well as the ROP/TMU counts). What is visible is the 2816 shader count, which places it between the GTX 980 (2048) and the TITAN X (3072). The 6 GB of GDDR5 memory has a 384-bit interface and a 7 Gbps speed, so bandwidth should be the same 336 GB/s as the TITAN X. As for core clocks on this GPU (which seems likely to be a cut-down GM200), they are identical to those of the TITAN X as well, with 1000 MHz base and 1076 MHz boost clocks shown in the screenshot.
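That 336 GB/s figure falls straight out of the standard peak-bandwidth calculation using only the two numbers visible in the screenshot; a quick sketch:

    # Peak memory bandwidth = (bus width in bits / 8 bits per byte) * effective data rate
    bus_width_bits = 384    # 384-bit GDDR5 interface shown in the screenshot
    data_rate_gbps = 7.0    # 7 Gbps effective memory speed

    bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
    print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")   # prints 336 GB/s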

videocardz_image.PNG

Image credit: VideoCardz.com

We await any official announcement, but from the frequency of the leaks it seems we won’t have to wait too long.

Podcast Pieces: A Discussion about HBM (High Bandwidth Memory) coming for AMD Fiji

Subject: Graphics Cards | May 23, 2015 - 09:46 AM |
Tagged: video, hbm, high bandwidth memory, amd, Fiji

During this week's podcast, Josh and the team went through an in-depth discussion of the new memory technology that AMD will be using on the upcoming Fiji GPU, HBM (high bandwidth memory). In case you don't regularly listen to our amazing PC Perspective Podcast, we have cut out the portion that focuses on HBM so that everyone can be educated on what this new technology will offer for coming GPUs.
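As a rough back-of-the-envelope companion to that discussion, here is a quick comparison assuming the widely reported first-generation HBM figures (a 1024-bit interface at 1 Gbps per pin, per stack) against the R9 290X's 512-bit, 5 Gbps GDDR5 configuration as a familiar reference point; treat these as illustrative assumptions rather than confirmed Fiji specifications:

    # Assumed figures: first-generation HBM as publicly described (1024-bit, 1 Gbps
    # per pin, per stack) versus the R9 290X's GDDR5 setup as a familiar reference.
    def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
        return bus_width_bits / 8 * data_rate_gbps

    gddr5_290x = bandwidth_gb_s(512, 5.0)             # 320 GB/s
    hbm_four_stacks = 4 * bandwidth_gb_s(1024, 1.0)   # 512 GB/s across four stacks

    print(f"R9 290X GDDR5:   {gddr5_290x:.0f} GB/s")
    print(f"4 x HBM1 stacks: {hbm_four_stacks:.0f} GB/s")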

06-tablecomparison.jpg

Enjoy! Be sure to subscribe to the PC Perspective YouTube channel for more videos like this!

Leaked AMD Fiji Card Images Show Small Form Factor, Water Cooler Integration

Subject: Graphics Cards | May 22, 2015 - 09:39 AM |
Tagged: wce, radeon, Fiji, amd, 390x

UPDATE (5/22/15): Johan Andersson tweeted out this photo this morning, with the line: "This new island is one seriously impressive and sweet GPU. wow & thanks @AMDRadeon ! They will be put to good use :)"  Looks like we can confirm that at least one of the parts AMD is releasing does have the design of the images we showed you before, though the water cooling implementation is missing or altered.

repi-amd.jpg

END UPDATE

File this under "rumor" for sure, but a cool one nonetheless...

After yesterday's official tidbit of information surrounding AMD's upcoming flagship graphics card for enthusiasts and its use of HBM (high bandwidth memory), it appears we have another leak on our hands. The guys over at Chiphell have apparently acquired some stock footage of the new Fiji flagship card (whether or not it will be called the 390X has yet to be seen) and it looks...awesome.

390x1.jpg

In that post from yesterday I noted that with an HBM design AMD could, in theory, build an add-in card with a different form factor than anything we have previously seen for a high-end part. Based on the image above, if this turns out to be the high-end Fiji offering, it appears the PCB will indeed be quite small, as it no longer requires memory chips surrounding the GPU itself. You can also see that it will in fact be water cooled, though in this image it looks like it has barb inlets rather than a pre-attached cooler.

390x2.jpg

The second leaked image shows display outputs consisting of three full-size DisplayPort connections and a single HDMI port.

All of this could be faked of course, but if it is, the joker did a damn good job of compiling all the information into one design. If it's real, I think AMD might finally have a match for the look and styling of the high-end GeForce offerings.

What do you think: real or fake? Cool or meh? Let us know!

Source: Chiphell

Could it be a 980 Ti, or does the bill of lading lie?

Subject: Graphics Cards | May 21, 2015 - 07:33 PM |
Tagged: rumour, nvidia, 980 Ti

980ti.PNG

The source of leaks and rumours is often unexpected, such as this import data of a shipment headed from China into India.  Could this 6GB card be the GTX 980 Ti that so many have theorized would be coming sometime around AMD's release of their new cards?  Does the fact that 60,709 Indian Rupees equals 954.447 US Dollars put a damper on your excitement, or could it be that these six lonely cards are simply priced higher overseas than they would be in the US?

We don't know but we do know there is a mysterious card out there somewhere.

Source: Zauba

How about that High Bandwidth Memory

Subject: Graphics Cards | May 19, 2015 - 03:51 PM |
Tagged: memory, high bandwidth memory, hbm, Fiji, amd

Ryan and the rest of the crew here at PC Perspective are excited about AMD's new memory architecture and the fact that they will be first to market with it.  However, as any intelligent reader knows, a second opinion on the topic is worth finding.  Look no further than The Tech Report, who have also been briefed on AMD's new memory architecture.  Read on to see what they learned from Joe Macri, along with their thoughts on this successor to GDDR5 and on HBM2, which is already in the works.

stack-diagram.jpg

"HBM is the next generation of memory for high-bandwidth applications like graphics, and AMD has helped usher it to market. Read on to find out more about HBM and what we've learned about the memory subsystem in AMD's next high-end GPU, code-named Fiji."


NVIDIA Under Attack Again for GameWorks in The Witcher 3: Wild Hunt

Subject: Graphics Cards | May 17, 2015 - 12:04 PM |
Tagged: The Witcher 3, nvidia, hairworks, gameworks, amd

I feel like every few months I get to write more stories focusing on the exact same subject. It's almost as if nothing in the enthusiast market is happening and thus the cycle continues, taking all of us with it on a wild ride of arguments and valuable debates. Late last week I started hearing from some of my Twitter followers that there were concerns surrounding the upcoming release of The Witcher 3: Wild Hunt. Then I found a link to this news post over at Overclock3d.net that put some of the information in perspective.

Essentially, The Witcher 3 uses parts of NVIDIA's GameWorks development tools and APIs, software written by NVIDIA to help game developers take advantage of new technologies and quickly and easily implement them into games. The problem, of course, is that GameWorks is written and developed by NVIDIA. That means that optimizations for AMD Radeon hardware are difficult or impossible, depending on who you want to believe. Clearly it doesn't benefit NVIDIA financially to optimize its software for AMD GPUs, though many in the community would like NVIDIA to make a better effort for the good of said community.

gwlogo.jpg

Specifically, in regard to The Witcher 3, the game implements NVIDIA HairWorks technology to add realism to many of the creatures of the game world. (Actually, the game includes HairWorks, HBAO+, PhysX, Destruction, and Clothing, but our current discussion focuses on HairWorks.) All of the marketing and video surrounding The Witcher 3 has been awesome, and the realistic animal fur simulation has definitely been a part of it. However, it appears that AMD Radeon GPU users are concerned that performance with HairWorks enabled will suffer.

An example of The Witcher 3: Wild Hunt with HairWorks

One of the game's developers has been quoted as follows:

Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.

There are several interpretations of this statement floating around the web. The first, and most inflammatory, is that NVIDIA is not allowing CD Project Red to optimize the feature by withholding source code. Another is that CD Project is choosing not to optimize for AMD hardware due to time considerations. The last is that it simply isn't possible to optimize because of hardware limitations with HairWorks.

I went to NVIDIA with these complaints about HairWorks and Brian Burke gave me this response:

We are not asking game developers do anything unethical.
 
GameWorks improves the visual quality of games running on GeForce for our customers.  It does not impair performance on competing hardware.
 
Demanding source code access to all our cool technology is an attempt to deflect their performance issues. Giving away your IP, your source code, is uncommon for anyone in the industry, including middleware providers and game developers. Most of the time we optimize games based on binary builds, not source code.
 
GameWorks licenses follow standard industry practice.  GameWorks source code is provided to developers that request it under license, but they can’t redistribute our source code to anyone who does not have a license. 
 
The bottom line is AMD’s tessellation performance is not very good and there is not a lot NVIDIA can/should do about it. Using DX11 tessellation has sound technical reasoning behind it, it helps to keep the GPU memory footprint small so multiple characters can use hair and fur at the same time.
 
I believe it is a resource issue. NVIDIA spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don’t prevent them from working with other IHVs. (See also, Project Cars)
 
I think gamers want better hair, better fur, better lighting, better shadows and better effects in their games. GameWorks gives them that.  

Interesting comments for sure. The essential takeaway is that HairWorks depends heavily on tessellation performance, and we have known since the GTX 680 was released that NVIDIA's architecture handles tessellation better than AMD's GCN, often by a significant amount. NVIDIA developed its middleware to utilize the strength of its own GPU technology, not, though it's clear that some disagree, to negatively impact AMD. Did NVIDIA know that would be the case when it was developing the software? Of course it did. Should it have done something to help AMD GPUs fall back more gracefully? Maybe.

Next, I asked Burke directly whether claims that NVIDIA was preventing AMD or the game developer from optimizing HairWorks for other GPUs and platforms were true. I was told that both AMD and CD Project had the ability to tune the game, but in different ways. The developer could change the tessellation density based on the specific GPU detected (lower for a Radeon GPU with less tessellation capability, for example), but that would require dedicated engineering from either CD Project or AMD. AMD, without access to the source code, should be able to make changes in the driver at the binary level, similar to how most other driver optimizations are built. Burke states that in these instances NVIDIA often sends engineers to work with game developers, and that AMD "could have done the same had it chosen to." And again, NVIDIA reiterated that in no way do its agreements with game developers prohibit optimization for AMD GPUs.
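As a purely illustrative sketch of the developer-side option described above (the function, vendor strings, and factor values here are invented for this example and are not CD Project's or NVIDIA's actual code), the idea is simply to pick a lower tessellation density when a GPU with weaker tessellation throughput is detected, while still honoring any user or driver override:

    # Hypothetical per-vendor tessellation tuning; values and detection are invented
    # purely to illustrate the fallback described in the paragraph above.
    def hairworks_max_tessellation(gpu_vendor, user_override=None):
        if user_override is not None:
            return user_override              # e.g. a config- or driver-level cap
        defaults = {"nvidia": 64, "amd": 16}  # lower density where tessellation costs more
        return defaults.get(gpu_vendor.lower(), 16)

    print(hairworks_max_tessellation("NVIDIA"))                 # 64
    print(hairworks_max_tessellation("AMD"))                    # 16
    print(hairworks_max_tessellation("AMD", user_override=8))   # override wins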

It would also be possible for AMD to have pushed for the implementation of TressFX in addition to HairWorks; a similar scenario played out in Grand Theft Auto V where several vendor-specific technologies were included from both NVIDIA and AMD, customized through in-game settings. 

NVIDIA has never been accused of being altruistic; it doesn't often create things and then share them freely with the rest of the hardware community. But it has to be understood that game developers know this as well; they are not oblivious. CD Project knew that HairWorks performance on AMD would be poor but decided to implement the technology into The Witcher 3 anyway. They were willing to accept performance penalties for some users in order to improve the experience of others. You can argue that is not the best choice, but at the very least The Witcher 3 will let you disable the HairWorks feature completely, removing it from the performance debate altogether.

In a perfect world for consumers, NVIDIA and AMD would walk hand-in-hand through the fields and develop hardware and software in tandem, making sure all users get the best possible experience with all games. But that style of work is only helpful (from a business perspective) for the organization attempting to gain market share, not the one with the lead. NVIDIA doesn't have to do it, and chooses not to. If you don't want to support that style, vote with your wallet.

Another similar controversy surrounded the recent release of Project Cars. AMD GPU performance was significantly lower than that of comparable NVIDIA GPUs, even though this game does not implement any GameWorks technologies. In that case, the game's developer directly blamed AMD's drivers, saying that it was a lack of outreach from AMD that caused the issues. AMD has since recanted its stance that the performance delta was "deliberate" and says a pending driver update will address gamers' performance issues.

wicher3-1.jpg

All arguing aside, this game looks amazing. Can we all agree on that?

The only conclusion I can come to from all of this is that if you don't like what NVIDIA is doing, that's your right, and you aren't necessarily wrong. There will be plenty of readers who see the comments made by NVIDIA above and continue to believe that the company is being at best disingenuous and at worst straight-up lying. As I mentioned in my own comments above, NVIDIA is still a for-profit company that is responsible to shareholders for profit and growth. And in today's world that sometimes means working against other companies rather than with them, resulting in impressive new technologies for its customers and pushback from its competitor's customers. It's not fun, but that's how it works today.

Fans of AMD will point to G-Sync, GameWorks, CUDA, PhysX, FCAT and even SLI as indications of NVIDIA's negative impact on open PC gaming. I would argue that more users would look at that list and see improvements to PC gaming, progress that helps make gaming on a computer so much better than gaming on a console. The truth likely rests somewhere in the middle; there will always be those individuals that immediately side with one company or the other. But it's the much larger group in the middle, that shows no corporate allegiance and instead just wants to have as much fun as possible with gaming, that will impact NVIDIA and AMD the most.

So, since I know it will happen anyway, use the comments page below to vent your opinion. But, for the benefit of us all, try to keep it civil!

NVIDIA Releases 352.84 WHQL Drivers for Windows 10

Subject: Graphics Cards | May 15, 2015 - 11:36 PM |
Tagged: windows 10, geforce, graphics drivers, nvidia, whql

The last time NVIDIA released a graphics driver for Windows 10, they added a download category to their website for the pre-release operating system. Since about January, graphics driver updates have been pushed by Windows Update; before that, you needed to use Windows 8.1 drivers. Receiving drivers from Windows Update also meant that add-ons, such as the PhysX runtimes and GeForce Experience, would not be bundled with them. I know that some have installed them separately, but I didn't.

nvidia-geforce.png

The 352.84 release, which is their second Windows 10 driver to be released outside of Windows Update, is also certified by WHQL. NVIDIA has recently been touting Microsoft certification for many of their drivers. Historically, they released a large number of Beta drivers that were stable, but did not wait for Microsoft to vouch for them. For one reason or another, they have put a higher priority on that label, even for “Game Ready” drivers that launch alongside a popular title.

For some reason, the driver is only available via GeForce Experience and NVIDIA.com, but not GeForce.com. I assume NVIDIA will publish it there soon, too.

Source: NVIDIA