Subject: Graphics Cards | July 2, 2015 - 02:03 PM | Ryan Shrout
Tagged: amd, radeon, fury x, pump whine
According to several users on the Anandtech forums and elsewhere, another wave of AMD Fury X cards is making its way out into the world. Opening up the top of the Fury X to reveal the Cooler Master-built water cooler pump shows that two different configurations are in circulation: one has a teal and white Cooler Master sticker, while the second has a shiny CM logo embossed on it.
This is apparently a different pump implementation than we have seen thus far.
You might have read our recent story looking at the review sample as well as two retail purchased Fury X cards where we discovered that the initial pump whine and noise that AMD claimed would be gone, in fact remained to pester gamers. As it turns out, all three of our cards have the teal/white CM logo.
Our three Fury X cards have the same sticker on them.
Based on at least a couple of user reports, this different pump variation does not have the same level of pump whine that we have seen to date. If that's the case, it's great news - AMD has started pushing out Fury X cards to the retail market that don't whine and squeal!
If this sticker/label difference is in fact the indicator of a newer, quieter pump, it does leave us with a few questions. Do current Fury X owners with louder coolers get to exchange them through RMA? Is it possible that these new pump decals are not indicative of a complete pump changeover, and the quieter units are just chance? I have already asked AMD for details on this new information, and in fact have been asking for AMD's input on the issue since the day of retail release. So far, no one has wanted to comment publicly or offer me any direction as to what is changing and when.
I hope for the gamers' sake that this new pump sticker somehow will be the tell-tale sign that you have a changed cooler implementation. Unfortunately for now, the only way to know if you are buying one of these is to install it in your system and listen or to wait for it to arrive and take the lid off the Fury X. (It's a Hex 1.5 screw by the way.)
Though our budget is more than slightly stretched, I'm keeping an eye out for more Fury X cards to show up for sale to get some more random samples in-house!
Tick Tock Tick Tock Tick Tock Tock
A few websites have been re-reporting on a leak from BenchLife.info about Kaby Lake, which is supposedly a second 14nm redesign (“Tock”) to be injected between Skylake and Cannonlake.
UPDATE (July 2nd, 3:20pm ET): It has been pointed out that many hoaxes have come out of the same source, and that I should be more clear in my disclaimer. This is an unconfirmed, relatively easy to fake leak that does not have a second, independent source. I reported on it because (apart from being interesting enough) some details were listed on the images but not highlighted in the leak itself, such as "GT0" and the lack of Iris Pro on -K parts. That suggests the leaker got the images from somewhere but didn't notice those details, which implies either that the original source was hoaxed by an anonymous party who seeded the fake to a single media outlet, or that it was an actual leak.
Either way, enjoy my analysis but realize that this is a single, unconfirmed source who allegedly published hoaxes in the past.
Image Credit: BenchLife.info
If true, this would be a major shift in both Intel's current roadmap as well as how they justify their research strategies. It also includes a rough stack of product categories, from 4.5W up to 91W TDPs, including their planned integrated graphics configurations. This leads to a pair of interesting stories:
How Kaby Lake could affect Intel's processors going forward. Since 2006, Intel has only budgeted a single CPU architecture redesign for any given fabrication process node. Taking two attempts on the 14nm process buys time for 10nm to become viable, but it could also give them more time to build up a better library of circuit elements, allowing them to assemble better processors in the future.
What type of user will be given Iris Pro? Also, will graphics-free options be available in the sub-Enthusiast class? When buying a processor from Intel, the high-end mainstream processors tend to have GT2-class graphics, such as the Intel HD 4600. Enthusiast architectures, such as Haswell-E, cannot be used without discrete graphics -- the extra space is used for more cores, I/O lanes, or other features. As we will discuss later, Broadwell took a step into changing the availability of Iris Pro in the high-end mainstream, but it doesn't seem like Kaby Lake will make any more progress. Also, if I am interpreting the table correctly, Kaby Lake might bring iGPU-less CPUs to LGA 1151.
Keeping Your Core Regular
To the first point, Intel has been on a steady tick-tock cycle since the Pentium 4 architecture reached the 65nm process node, which was a “tick”. The “tock” came with the Conroe/Merom architecture that was branded “Core 2”. This new architecture was a severe departure from the high-clock, relatively low-IPC design that NetBurst was built around, and it almost overnight transformed the processor landscape from AMD dominance to a runaway Intel lead.
After 65nm and Core 2 started the cycle, every new architecture alternated between shrinking the existing architecture to smaller transistors (tick) and creating a new design on the same fabrication process (tock). Even though Intel has been steadily increasing their R&D budget over time, which is now in the range of $10 to $12 billion USD each year, creating smaller, more intricate designs with new process nodes has been getting harder. For comparison, AMD's total revenue (not just profits) for 2014 was $5.51 billion USD.
Retail cards still suffer from the issue
In our review of AMD's latest flagship graphics card, the Radeon R9 Fury X, I noticed and commented on the unique sound that the card was producing during our testing. A high pitched whine, emanating from the pump of the self-contained water cooler designed by Cooler Master, was obvious from the moment our test system was powered on and remained constant during use. I talked with a couple of other reviewers about the issue before the launch of the card and it seemed that I wasn't alone. Looking around other reviews of the Fury X, most make mention of this squeal specifically.
Noise from graphics cards comes in many forms. The most obvious and common is the noise from on-board fans and the air they move. Less frequent, but distinct, is the sound of inductor coil whine. Fan noise spikes when the GPU gets hot, forcing the fans to spin faster and move more air across the heatsink to keep everything running cool. Coil whine changes pitch with the frame rate (and with the frequency of power delivery on the card) and can be alleviated by using higher quality components on the board itself.
But the sound of our Fury X was unique: it was caused by the pump itself and it was constant. The noise it produced did not change as the load on the GPU varied. It was also 'pitchy' - a whine that seemed to pierce through other sounds in the office. A close analog might be the sound of an older, CRT TV or monitor that is left powered on without input.
During our review process, AMD told us the issue had been fixed. In an email sent to the media just prior to the Fury X launch, an AMD rep stated:
In regards to the “pump whine”, AMD received feedback that during open bench testing some cards emit a mild “whining” noise. This is normal for most high speed liquid cooling pumps; Usually the end user cannot hear the noise as the pumps are installed in the chassis, and the radiator fan is louder than the pump. Since the AMD Radeon™ R9 Fury X radiator fan is near silent, this pump noise is more noticeable.
The issue is limited to a very small batch of initial production samples and we have worked with the manufacturer to improve the acoustic profile of the pump. This problem has been resolved and a fix added to production parts and is not an issue.
I would disagree that this is "normal," but even so, taking AMD at its word, I wrote that we heard the noise but also that AMD claimed to have addressed it. Other reviewers noted the same comment from AMD, reporting that the issue was fixed. But very quickly after launch, some users were posting videos on YouTube and on forums with the same (or worse) sounds and noise. We had already started bringing in a pair of additional Fury X retail cards from Newegg for some performance testing, so testing those retail cards for pump noise as well seemed like a logical next step.
First, let's get the bad news out of the way: both of the retail AMD Radeon R9 Fury X cards that arrived in our offices exhibit 'worse' noise, in the form of both whining and buzzing, compared to our review sample. In this write up, I'll attempt to showcase the noise profile of the three Fury X cards in our possession, as well as how they compare to the Radeon R9 295X2 (another water cooled card) and the GeForce GTX 980 Ti reference design - added for comparison.
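A constant-pitch whine like this lends itself to frequency analysis: a tone that doesn't vary with load shows up as a single sharp peak in the spectrum of a recording. The sketch below is illustrative only, not our test methodology; the 6 kHz pitch and the synthesized signal are placeholders, and the peak is located with a naive single-frequency DFT scan rather than a full FFT:

```python
import math
import random

random.seed(42)  # deterministic stand-in signal

# Synthesize a stand-in "recording": low-level broadband noise plus a
# constant tone at a hypothetical pump-whine pitch.
rate = 44100                      # sample rate in Hz
tone_hz = 6000                    # hypothetical whine pitch (placeholder)
samples = [
    0.5 * math.sin(2 * math.pi * tone_hz * n / rate) + 0.05 * random.gauss(0, 1)
    for n in range(rate // 4)     # a quarter-second of audio
]

def magnitude(freq):
    """Correlate the signal against a sine/cosine pair at `freq` (a one-bin DFT)."""
    re = sum(s * math.cos(2 * math.pi * freq * n / rate) for n, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * n / rate) for n, s in enumerate(samples))
    return math.hypot(re, im)

# Scan 1-12 kHz in 500 Hz steps; the constant whine stands out as the peak.
peak = max(range(1000, 12001, 500), key=magnitude)
print(f"dominant tone near {peak} Hz")
```

On a real capture you would record a few seconds of audio near the card, and the pump whine would appear as a narrow spike that stays put regardless of GPU load, unlike fan noise or coil whine.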
Subject: Graphics Cards | June 30, 2015 - 01:16 PM | Sebastian Peak
Tagged: overclock, oc, GTX 980 Ti, DirectCU III, asus
ASUS has announced a new STRIX edition of the GeForce GTX 980 Ti, and this is one massive card, not only in size (measuring 12" x 6" x 1.57") but in potential performance as well.
First off, there is the new DirectCU III cooler, which offers 3 fans and a much larger overall design than that of the existing GTX 980 STRIX card. And there's good reason for the added cooling capacity: this card has one hefty overclock for a GTX 980 Ti, with a 1216 MHz Base and a whopping 1317 MHz Boost clock in "OC mode". The card's default mode is still quite a bit over reference with 1190 MHz Base and 1291 MHz Boost clocks (a reference 980 Ti has a Base of 1000 MHz and Boost clock of 1075 MHz). Memory with the STRIX 980 Ti is also overclocked, with 7200 MHz GDDR5 in both modes.
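For context, those OC-mode numbers work out to a factory overclock of a bit over 20% on both clocks. A quick sketch of the arithmetic (clock figures taken from the paragraph above; percentages are rounded):

```python
# Compare ASUS STRIX 980 Ti "OC mode" clocks against the NVIDIA
# reference GTX 980 Ti, using the figures quoted above (in MHz).
reference = {"base": 1000, "boost": 1075}
oc_mode = {"base": 1216, "boost": 1317}

for clock in ("base", "boost"):
    delta = oc_mode[clock] - reference[clock]
    pct = delta / reference[clock] * 100
    print(f"{clock}: +{delta} MHz ({pct:.1f}%)")
```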
Features for this new card from ASUS:
- 1317MHz GPU boost clock in OC mode with 7200MHz factory-overclocked memory speed for outstanding gaming experience
- DirectCU III with Patented Triple Wing-Blade 0dB Fan Design delivers maximum air flow with 30% cooler and 3X quieter performance
- AUTO-EXTREME Technology with 12+2 phase Super Alloy Power II delivers premium aerospace-grade quality and reliability
- Pulsating STRIX LED makes a statement while adding style to your system
- STRIX GPU-Fortifier relieves physical stress around the GPU in order to protect it
- GPU Tweak II with Xsplit Gamecaster provides intuitive performance tweaking and lets you stream your gameplay instantly
The new DirectCU III cooler
The 0dB fans (zero-RPM mode under less demanding workloads) are back with a new "wing-blade" design that promises greater static pressure. Power delivery is also improved with the 14-phase "Super Alloy Power II" components, which ASUS claims will provide 50% cooler thermals while reducing "component buzzing" by up to 2x under load.
The previous DirectCU II cooler from the STRIX GTX 980
The new ASUS STRIX GTX 980 Ti Gaming card hasn't shown up on Amazon yet, but it should be available soon for what I would expect to be around $699.
Subject: Graphics Cards | June 25, 2015 - 02:42 PM | Jeremy Hellstrom
Tagged: 4GB, amd, Fiji, Fury, fury x, hbm, R9, radeon
[H]ard|OCP used a slightly different configuration to test the new R9 Fury X: an i7-3770K on an ASUS PB287Q as opposed to an i7-3960X and an ASUS P9X79. The SSD is slightly different, but the RAM remains the same at 16GB of DDR3-1600. [H] also used the same driver we did and found similar difficulties using it with R9 200-series cards, which is why those were tested with the Catalyst 15.5 Beta instead. In The Witcher 3 testing, the GTX 980 Ti came out on top overall, but it is worth noting the Fury X's 70% performance increase over the 290X with HairWorks enabled. Their overall conclusions matched what Ryan saw; read them for yourself right here.
"We review AMD's new Fiji GPU comprising the new AMD Radeon R9 Fury X video card with stacked chip technology High Bandwidth Memory. We take this video card through its paces, make comparisons and find out what it can do for us in real world gameplay. Is this $649 video card competitive? Is it truly geared for 4K gaming as AMD says?"
Here are some more Graphics Card articles from around the web:
- AMD's Radeon R9 Fury X @ The Tech Report
- AMD R9 Fury X Review; Fiji Arrives @ Hardware Canucks
- AMD Fury X @ HardwareHeaven
- AMD Radeon R9 Fury X 4 GB @ techPowerUp
- MSI R9 390X GAMING 8G @ [H]ard|OCP
- MSI R7 370 GAMING 2G Review @ Neoseeker
- PowerColor PCS+ R9 390 8GB Review @ OCC
- PowerColor TurboDuo R9 290 4GB OC @ [H]ard|OCP
- EVGA GTX 980 Ti SC+ 6 GB @ techPowerUp
- EVGA GTX 970 SSC @ HardwareHeaven
Subject: General Tech, Graphics Cards | June 24, 2015 - 10:10 PM | Scott Michaud
Tagged: batman, wb games, consolitis, gameworks, pc gaming, nvidia, amd
Over the last few days, the PC version of Batman: Arkham Knight has been receiving a lot of flak. Sites like PC Gamer were unable to review the game because they allege that Warner Brothers would not provide pre-release copies to journalists except for the PS4 version. This is often met with cynicism that can be akin to throwing darts in an unlit room with the assumption that a dartboard is in there somewhere. Other times, it is validated.
Whether or not the lack of PC review copies was related, the consensus is that Arkham Knight is a broken game. After posting a troubleshooting guide on the forums to help users choose the appropriate settings, WB Games has pulled the plug and suspended the game's sales on Steam until the issues are patched.
TotalBiscuit weighs in on the issues with his latest "Port Report".
No one seems to be talking about what the issue actually is. Fortunately or unfortunately, I don't have the game myself, so I cannot look and speculate based on debug information (which they probably disabled in the shipping build anyway). I could wildly speculate about DX11 limits on the number of elements on screen, but that is not based on any actual numbers. They could be really good at instancing and other tricks to keep the chunks of work sent to the GPU as large as possible. I don't know. Whatever the issue is, it sounds pretty bad.
A fury unlike any other...
Officially unveiled by AMD during E3 last week, we are finally ready to show you our review of the brand new Radeon R9 Fury X graphics card. Very few times has a product launch meant more to a company, and to its industry, than the Fury X does this summer. AMD has been lagging behind in the highest-tiers of the graphics card market for a full generation. They were depending on the 2-year-old Hawaii GPU to hold its own against a continuous barrage of products from NVIDIA. The R9 290X, despite using more power, was able to keep up through the GTX 700-series days, but the release of NVIDIA's Maxwell architecture forced AMD to move the R9 200-series parts into the sub-$350 field. This is well below the selling prices of NVIDIA's top cards.
The AMD Fury X hopes to change that with a price tag of $650 and a host of new features and performance capabilities. It aims to once again put AMD's Radeon line in the same discussion with enthusiasts as the GeForce series.
The Fury X is built on the new AMD Fiji GPU, an evolutionary part based on AMD's GCN (Graphics Core Next) architecture. The design adds a lot of compute horsepower (4,096 stream processors), and it is also the first consumer product to integrate HBM (High Bandwidth Memory), with a 4096-bit memory bus!
Of course the question is: what does this mean for you, the gamer? Is it time to start making a place in your PC for the Fury X? Let's find out.
Subject: Graphics Cards | June 19, 2015 - 06:25 PM | Ryan Shrout
Tagged: radeon, r9 390, hawaii, catalyst, amd, 15.15
During the course of our review of the new Sapphire Nitro R9 390 8GB card earlier this week, a question came up on driver support. For testing the R9 300-series as well as the Fury X cards, AMD provided a new Catalyst 15.15 beta driver. The problem is that these drivers would not install on the Radeon R9 200-series cards. That's not totally uncommon on new GPU releases but it does seem a bit odd considering the similarities between the R9 390 and the R9 290, for example.
That meant that in our review we had to use the Catalyst 15.5 beta for the Radeon R9 290X and R9 290 while using the newer Catalyst 15.15 beta for the Sapphire Nitro R9 390. Eyebrows were raised, as you would expect, since any performance differences between the new cards and the old would have to take the driver changes into account as well. But since we couldn't install the new driver on the old hardware, we were stuck, and published what we had.
Since then, a driver with some INI modifications that allows Catalyst 15.15 to be installed on Radeon R9 290X/290 hardware was built and uploaded from the Guru3D Forums. Today I installed that on our XFX Radeon R9 290 4GB card used in our R9 390 review to re-run a few game tests to see what changes we saw, if any. This would help us address any concerns over the updated driver causing performance changes rather than the hardware changes.
(Note: I realize that using an INI hacked driver isn't exactly going to pass QA with AMD, but I think we are seeing results that are close enough.)
First up, let's look at Grand Theft Auto V.
In GTA V we see that the average frame rate at 2560x1440 goes from 39.5 FPS to 40.5 FPS, an increase of about 2-3%. That's minimal, but it is interesting to see how the frame rate consistency changes as we move down the sliding scale; pay attention to the orange and pink lines in the FPS by Percentile graph to see what I am referencing. As you move into the slower frame times in our testing, the gap between the 15.5 and 15.15 drivers begins to widen slightly, indicating a little more frame time consistency in the 15.15 release.
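For readers who want to reproduce this kind of analysis from their own frame-time logs (FRAPS, FCAT, or similar), the FPS by Percentile view boils down to sorting the per-frame times and converting the frame time at each percentile back into an instantaneous FPS. A minimal sketch, using made-up frame times rather than our actual captures:

```python
# Convert a frame-time log (milliseconds per frame) into an
# "FPS by Percentile" summary. Frame times below are illustrative only.
frame_times_ms = [24.0, 25.1, 24.6, 26.2, 31.0, 24.8, 40.5, 25.3, 27.9, 24.2]

sorted_times = sorted(frame_times_ms)          # fastest frames first
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
print(f"average: {avg_fps:.1f} FPS")

for pct in (50, 90, 99):
    # nearest-rank index of the frame time at this percentile
    idx = min(len(sorted_times) - 1, int(round(pct / 100 * len(sorted_times))) - 1)
    fps = 1000 / sorted_times[idx]
    print(f"{pct}th percentile: {fps:.1f} FPS")
```

The higher the percentile, the slower the frame it represents, which is why a widening gap between two driver curves at the far right of the graph points to a difference in frame time consistency rather than in average frame rate.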
But what about BF4 or Metro: Last Light?
Subject: Graphics Cards | June 19, 2015 - 01:58 PM | Jeremy Hellstrom
Tagged: TwinFrozr V, r9 390x, msi, GAMING 8G, factory overclocked, amd
For their R9 390X GAMING 8G card, MSI has introduced the TwinFrozr V cooling solution and built the card with Hi-C solid capacitors on a custom PCB. This particular model is factory overclocked by 50MHz on the GPU and 100MHz on the VRAM, bringing the clocks to 1.1GHz and 6.1GHz. [H]ard|OCP tested the new card and proclaimed it great for 1440p gaming, but not so much for 4K, at least on its own. In a Crossfire configuration the horsepower will be enough to push 4K, and the 8GB of memory will truly show its worth, something it does not have a chance to do at 1440p. They will be revisiting this card in the near future with overclocking results, which could prove very interesting if power consumption and heat production can be kept to reasonable levels.
Also, we have been informed that nobody does FCAT testing anymore, so any evidence to the contrary you see in Ryan's review must therefore be a hallucination.
"We've got an MSI R9 390X GAMING video card with 8GB of VRAM to put up against a Radeon R9 290X and GeForce GTX 980. Find out what the new AMD Radeon R9 390X is made of, and if the MSI R9 390X GAMING 8G video card can compete with GeForce GTX 980 performance, you might be surprised."
Here are some more Graphics Card articles from around the web:
- MSI R9 390X Gaming 8 GB @ techPowerUp
- Radeon R9 380 @ HardwareHeaven
- MSI R9 380 Gaming 2G Review @ OCC
- MSI R9-390 Gaming 8G – AMD 300 series with a custom kick from MSI @ Bjorn3d
- EVGA GTX 980Ti SC ACX 2.0 + Review, Titan X has a Son @ Bjorn3d
The new Radeon R9 300-series
The new AMD Radeon R9 and R7 300-series of graphics cards are coming into the world with a rocky start. We have seen rumors and speculation about what GPUs are going to be included, what changes would be made and what prices these would be shipping at for what seems like months, and in truth it has been months. AMD's Radeon R9 290 and R9 290X based on the new Hawaii GPU launched nearly 2 years ago, while the rest of the 200-series lineup was mostly a transition of existing products in the HD 7000-family. The lone exception was the Radeon R9 285, a card based on a mysterious new GPU called Tonga that showed up late to the game to fill a gap in the performance and pricing window for AMD.
AMD's R9 300-series, and the R7 300-series in particular, follows a very similar path. The R9 390 and R9 390X are still based on the Hawaii architecture. Tahiti is finally retired and put out to pasture, though Tonga lives on as the Radeon R9 380. Below that you have the Radeon R7 370 and 360, the former based on the aging GCN 1.0 Curacao GPU and the latter based on Bonaire. On the surface it's easy to refer to these cards with the dreaded "R-word"...rebrands. And though that largely seems to be the case, there are some interesting performance changes, at least at the high end of the stack, that warrant discussion.
And of course, AMD partners like Sapphire are using this familiarity with the GPU and its properties as an opportunity to release new product stacks. In this case Sapphire is launching the new Nitro brand for a series of cards aimed at what it considers the most common type of gamer: one who is cost conscious and craves performance above everything else.
The result is a stack of GPUs with prices ranging from about $110 up to ~$400 that target the "gamer" group of GPU buyers without the added price tag that some other lines include. Obviously it seems a little crazy to be talking about a line of graphics cards that is built for gamers (aren't they all??) but the emphasis is to build a fast card that is cool and quiet without the additional cost of overly glamorous coolers, LEDs or dip switches.
Today I am taking a look at the new Sapphire Nitro R9 390 8GB card, but before we dive head first into that card and its performance, let's first go over the changes to the R9-level of AMD's product stack.