Subject: Graphics Cards | July 4, 2015 - 02:39 PM | Tim Verry
Tagged: zotac, maxwell, gtx 980ti, factory overclocked
Zotac recently unleashed a monstrous new GTX 980Ti AMP! Extreme graphics card featuring a giant triple slot cooler and a very respectable factory overclock.
Specifically, the Zotac ZT-90505-10P card is a custom card with a factory overclocked NVIDIA GTX 980Ti GPU and GDDR5 memory. The card is a triple slot design that uses a dual fin stack IceStorm heatsink with three 90mm temperature controlled EKO fans. The cooler wraps the fans and HSF in a shroud and also uses a backplate on the bottom of the card. The card is powered by two 8-pin PCI-E power connectors and display outputs include three DisplayPort, one HDMI, and one DL-DVI.
Zotac was able to push the Maxwell GPU with its 2,816 CUDA cores to 1,253 MHz base and 1,355 MHz boost. Further, the 6GB GDDR5 memory also has a factory overclock of 7,220 MHz. These clockspeeds are a decent bump over the reference speeds of 1,000 MHz GPU base, 1,076 MHz GPU boost, and 7,012 MHz memory.
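For a rough sense of how big those bumps are, here's a quick back-of-the-envelope snippet (a hypothetical illustration, not anything from Zotac) that turns the quoted clocks into percentage gains over reference:

```python
# Compare the AMP! Extreme's factory clocks against NVIDIA's
# reference GTX 980 Ti clocks (all figures in MHz, from the article).
reference = {"base": 1000, "boost": 1076, "memory": 7012}
amp_extreme = {"base": 1253, "boost": 1355, "memory": 7220}

for domain in reference:
    gain = (amp_extreme[domain] / reference[domain] - 1) * 100
    print(f"{domain}: +{gain:.1f}%")
```

That works out to roughly a 25% bump on the GPU clocks and about 3% on the memory, which is a healthy factory overclock by any measure.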
We’ll have to wait for reviews to know for sure, but on paper this looks to be a card that should run fast and cool thanks to that triple-fan cooler. The ZT-90505-10P will be available shortly with an MSRP of $700 and a 2 year warranty.
Definitely not a bad price compared to other GTX 980Ti cards on the market.
Subject: Graphics Cards | July 3, 2015 - 08:45 PM | Sebastian Peak
Tagged: strix, rumor, report, Radeon Fury, asus, amd
A report from VideoCardz.com shows three listings for an unreleased ASUS STRIX version of the AMD Radeon Fury (non-X).
Image credit: VideoCardz
The listings are from European sites, and all three list the same model name: ASUS-STRIX R9FURY-DC3-4G-GAMING. You can find the listing from the above photo here at the German site Computer-PC-Shop.
Image credit: VideoCardz
We can probably safely assume that this upcoming air-cooled card will make use of the new DirectCU III cooler introduced with the new STRIX GTX 980 Ti and STRIX R9 390X, and this massive triple-fan cooler should provide an interesting look at what Fury can do without the AIO liquid cooler from the Fury X. Air cooling will of course negate the issue of pump whine that many have complained about with certain Fury X units.
The ASUS STRIX R9 390X Gaming card with DirectCU III cooler
We await official word on this new GPU, and what price we might expect this particular version to sell for here in the U.S.A.
Introduction and Technical Specifications
In our previous article here, we demonstrated how to mod the EVGA GTX 970 SC ACX 2.0 video card to get higher performance and significantly lower running temps. Now we decided to take two of these custom modded EVGA GTX 970 cards to see how well they perform in an SLI configuration. ASUS was kind enough to supply us with one of their newly introduced ROG Enthusiast SLI Bridges for our experiments.
ASUS ROG Enthusiast SLI Bridge
Courtesy of ASUS
Courtesy of ASUS
For the purposes of running the two EVGA GTX 970 SC ACX 2.0 video cards in SLI, we chose to use the 3-way variant of ASUS' ROG Enthusiast SLI Bridge so that we could run the tests with full 16x bandwidth across both cards (with the cards in PCIe 3.0 x16 slots 1 and 3 in our test board). This customized SLI adapter features a powered, red-lit ROG logo embedded in its brushed-aluminum upper surface. The adapter supports 2-way and 3-way SLI in a variety of board configurations.
Courtesy of ASUS
ASUS offers their ROG Enthusiast SLI Bridge in 3 sizes to cover a variety of 2-way, 3-way, and 4-way SLI configurations. All bridges feature the top brushed-aluminum cap with embedded glowing ROG logo.
Courtesy of ASUS
The smallest bridge supports 2-way SLI configurations with either a two or three slot separation. The middle sized bridge supports up to a 3-way SLI configuration with a two slot separation required between each card. The largest bridge supports up to a 4-way SLI configuration, also requiring a two slot separation between each card used.
Technical Specifications (taken from the ASUS website)
| | 2-Way | 3-Way | 4-Way |
| --- | --- | --- | --- |
| Dimensions (L x W x H, mm) | 97 x 43 x 21 | 108 x 53 x 21 | 140 x 53 x 21 |
| Weight | 70 g | 91 g | |
| Compatible GPU set-ups | 2-WAY-S & 2-WAY-M | 2-WAY-L & 3-WAY | |
| Contents | 1 x optional power cable & 2 PCBs included for varying configurations | 1 x optional power cable | 1 x optional power cable |
Subject: Graphics Cards | July 2, 2015 - 02:03 PM | Ryan Shrout
Tagged: amd, radeon, fury x, pump whine
According to a couple of users from the Anandtech forums and others, there is another wave of AMD Fury X cards making their way out into the world. Opening up the top of the Fury X card to reveal the Cooler Master built water cooler pump shows that there are two different configurations in circulation: one has a teal and white Cooler Master sticker, while the second has a shiny CM logo embossed on it.
This is apparently a different pump implementation than we have seen thus far.
You might have read our recent story looking at the review sample as well as two retail purchased Fury X cards where we discovered that the initial pump whine and noise that AMD claimed would be gone, in fact remained to pester gamers. As it turns out, all three of our cards have the teal/white CM logo.
Our three Fury X cards have the same sticker on them.
Based on at least a couple of user reports, this different pump variation does not have the same level of pump whine that we have seen to date. If that's the case, it's great news - AMD has started pushing out Fury X cards to the retail market that don't whine and squeal!
If this sticker/label difference is in fact the indicator for a newer, quieter pump, it does leave us with a few questions. Do current Fury X owners with louder coolers get to exchange them through RMA? Is it possible that these new pump decals are not indicative of a total pump change over and this is just chance? I have asked AMD for details on this new information already, and in fact have been asking for AMD's input on the issue since the day of retail release. So far, no one has wanted to comment on it publicly or offer me any direction as to what is changing and when.
I hope for the gamers' sake that this new pump sticker somehow will be the tell-tale sign that you have a changed cooler implementation. Unfortunately for now, the only way to know if you are buying one of these is to install it in your system and listen or to wait for it to arrive and take the lid off the Fury X. (It's a Hex 1.5 screw by the way.)
Though our budget is more than slightly stretched, I'm keeping an eye out for more Fury X cards to show up for sale to get some more random samples in-house!
Tick Tock Tick Tock Tick Tock Tock
A few websites have been re-reporting on a leak from BenchLife.info about Kaby Lake, which is supposedly a second 14nm redesign (“Tock”) to be injected between Skylake and Cannonlake.
UPDATE (July 2nd, 3:20pm ET): It has been pointed out that many hoaxes have come out of the same source, and that I should be more clear in my disclaimer. This is an unconfirmed, relatively easy to fake leak that does not have a second, independent source. I reported on it because (apart from being interesting enough) some details were listed on the images, but not highlighted in the leak, such as "GT0" and a lack of Iris Pro on -K. That suggests the leaker got the images from somewhere but didn't notice those details, which implies either that the original source was hoaxed by an anonymous party who seeded the fake to a single media outlet, or that it was an actual leak.
Either way, enjoy my analysis but realize that this is a single, unconfirmed source who allegedly published hoaxes in the past.
Image Credit: BenchLife.info
If true, this would be a major shift in both Intel's current roadmap as well as how they justify their research strategies. It also includes a rough stack of product categories, from 4.5W up to 91W TDPs, including their planned integrated graphics configurations. This leads to a pair of interesting stories:
How Kaby Lake could affect Intel's processors going forward. Since 2006, Intel has only budgeted a single CPU architecture redesign for any given fabrication process node. Taking two attempts on the 14nm process buys time for 10nm to become viable, but it could also give them more time to build up a better library of circuit elements, allowing them to assemble better processors in the future.
What type of user will be given Iris Pro? Also, will graphics-free options be available in the sub-Enthusiast class? When buying a processor from Intel, the high-end mainstream processors tend to have GT2-class graphics, such as the Intel HD 4600. Enthusiast architectures, such as Haswell-E, cannot be used without discrete graphics -- the extra space is used for more cores, I/O lanes, or other features. As we will discuss later, Broadwell took a step toward changing the availability of Iris Pro in the high-end mainstream, but it doesn't seem like Kaby Lake will make any more progress. Also, if I am interpreting the table correctly, Kaby Lake might bring iGPU-less CPUs to LGA 1151.
Keeping Your Core Regular
To the first point, Intel has been on a steady tick-tock cycle since the Pentium 4 architecture reached the 65nm process node, which was a “tick”. The “tock” came from the Conroe/Merom architecture that was branded “Core 2”. This new architecture was a severe departure from the high clock, relatively low IPC design that Netburst was built around, which instantaneously changed the processor landscape from a dominant AMD to an Intel runaway lead.
After 65nm and Core 2 started the cycle, every new architecture alternated between shrinking the existing architecture to smaller transistors (tick) and creating a new design on the same fabrication process (tock). Even though Intel has been steadily increasing their R&D budget over time, which is now in the range of $10 to $12 billion USD each year, creating smaller, more intricate designs with new process nodes has been getting harder. For comparison, AMD's total revenue (not just profits) for 2014 was $5.51 billion USD.
Retail cards still suffer from the issue
In our review of AMD's latest flagship graphics card, the Radeon R9 Fury X, I noticed and commented on the unique sound that the card was producing during our testing. A high pitched whine, emanating from the pump of the self-contained water cooler designed by Cooler Master, was obvious from the moment our test system was powered on and remained constant during use. I talked with a couple of other reviewers about the issue before the launch of the card and it seemed that I wasn't alone. Looking around other reviews of the Fury X, most make mention of this squeal specifically.
Noise from graphics cards comes in many forms. There is the most obvious and common noise from on-board fans and the air they move. Less frequently, but distinctly, the sound of inductor coil whine comes up. Fan noise spikes when the GPU gets hot, causing the fans to spin faster and move more air across the heatsink, which keeps everything running cool. Coil whine changes pitch based on the frame rate (and the frequency of power delivery on the card) and can be alleviated by using higher quality components on the board itself.
But the sound of our Fury X was unique: it was caused by the pump itself and it was constant. The noise it produced did not change as the load on the GPU varied. It was also 'pitchy' - a whine that seemed to pierce through other sounds in the office. A close analog might be the sound of an older, CRT TV or monitor that is left powered on without input.
In our review process, AMD told us the solution was fixed. In an email sent to the media just prior to the Fury X launch, an AMD rep stated:
In regards to the “pump whine”, AMD received feedback that during open bench testing some cards emit a mild “whining” noise. This is normal for most high speed liquid cooling pumps; Usually the end user cannot hear the noise as the pumps are installed in the chassis, and the radiator fan is louder than the pump. Since the AMD Radeon™ R9 Fury X radiator fan is near silent, this pump noise is more noticeable.
The issue is limited to a very small batch of initial production samples and we have worked with the manufacturer to improve the acoustic profile of the pump. This problem has been resolved and a fix added to production parts and is not an issue.
I would disagree that this is "normal" but even so, taking AMD at its word, I wrote that we heard the noise but also that AMD had claimed to have addressed it. Other reviewers noted the same comment from AMD, saying the result was fixed. But very quickly after launch some users were posting videos on YouTube and on forums with the same (or worse) sounds and noise. We had already started bringing in a pair of additional Fury X retail cards from Newegg in order to do some performance testing, so it seemed like a logical next step for us to test these retail cards in terms of pump noise as well.
First, let's get the bad news out of the way: both of the retail AMD Radeon R9 Fury X cards that arrived in our offices exhibit 'worse' noise, in the form of both whining and buzzing, compared to our review sample. In this write up, I'll attempt to showcase the noise profile of the three Fury X cards in our possession, as well as how they compare to the Radeon R9 295X2 (another water cooled card) and the GeForce GTX 980 Ti reference design - added for comparison.
Subject: Graphics Cards | June 30, 2015 - 01:16 PM | Sebastian Peak
Tagged: overclock, oc, GTX 980 Ti, DirectCU III, asus
ASUS has announced a new STRIX edition of the GeForce GTX 980 Ti, and this is one massive card in not only size (measuring 12" x 6" x 1.57") but in potential performance as well.
First off, there is the new DirectCU III cooler, which offers 3 fans and a much larger overall design than that of the existing GTX 980 STRIX card. And there's good reason for the added cooling capacity: this card has one hefty overclock for a GTX 980 Ti, with a 1216 MHz Base and a whopping 1317 MHz Boost clock in "OC mode". The card's default mode is still quite a bit over reference with 1190 MHz Base and 1291 MHz Boost clocks (a reference 980 Ti has a Base of 1000 MHz and Boost clock of 1075 MHz). Memory with the STRIX 980 Ti is also overclocked, with 7200 MHz GDDR5 in both modes.
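Put side by side, the two clock profiles make for a quick comparison. Here's a small sketch (my own illustration, not from ASUS) that expresses both modes as gains over the reference 980 Ti clocks quoted above:

```python
# Compare the STRIX 980 Ti's two clock profiles against the
# reference GTX 980 Ti (all figures in MHz, from the article).
reference = {"base": 1000, "boost": 1075}
modes = {
    "Gaming (default)": {"base": 1190, "boost": 1291},
    "OC mode": {"base": 1216, "boost": 1317},
}

for mode, clocks in modes.items():
    for domain, mhz in clocks.items():
        gain = (mhz / reference[domain] - 1) * 100
        print(f"{mode} {domain}: {mhz} MHz (+{gain:.1f}%)")
```

Either way you slice it, that's roughly a 19-22% factory overclock on the GPU, which helps explain the need for the bigger cooler.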
Features for this new card from ASUS:
- 1317MHz GPU boost clock in OC mode with 7200MHz factory-overclocked memory speed for outstanding gaming experience
- DirectCU III with Patented Triple Wing-Blade 0dB Fan Design delivers maximum air flow with 30% cooler and 3X quieter performance
- AUTO-EXTREME Technology with 12+2 phase Super Alloy Power II delivers premium aerospace-grade quality and reliability
- Pulsating STRIX LED makes a statement while adding style to your system
- STRIX GPU-Fortifier relieves physical stress around the GPU in order to protect it
- GPU Tweak II with Xsplit Gamecaster provides intuitive performance tweaking and lets you stream your gameplay instantly
The new DirectCU III cooler
The 0dB fans (zero-RPM mode under less demanding workloads) are back with a new "wing-blade" design that promises greater static pressure. Power delivery is also improved with the 14-phase "Super Alloy Power II" components, which ASUS claims will provide 50% cooler thermals while reducing "component buzzing" by up to 2x under load.
The previous DirectCU II cooler from the STRIX GTX 980
The new ASUS STRIX GTX 980 Ti Gaming card hasn't shown up on Amazon yet, but it should be available soon for what I would expect to be around $699.
Subject: Graphics Cards | June 25, 2015 - 02:42 PM | Jeremy Hellstrom
Tagged: 4GB, amd, Fiji, Fury, fury x, hbm, R9, radeon
[H]ard|OCP used a slightly different configuration to test the new R9 Fury X: an i7-3770K on an ASUS PB287Q as opposed to an i7-3960X and an ASUS P9X79. The SSD is slightly different, but the RAM remains the same at 16GB of DDR3-1600. [H] also used the same driver as we did and found similar difficulties using it with R9-2xx cards, which is why that card was tested with the Catalyst 15.5 Beta. When testing The Witcher 3, the GTX 980 Ti came out on top overall, but it is worth noting the Fury's 70% performance increase over the 290X when HairWorks was enabled. Their overall conclusions matched what Ryan saw, read them for yourself right here.
"We review AMD's new Fiji GPU comprising the new AMD Radeon R9 Fury X video card with stacked chip technology High Bandwidth Memory. We take this video card through its paces, make comparisons and find out what it can do for us in real world gameplay. Is this $649 video card competitive? Is it truly geared for 4K gaming as AMD says?"
Here are some more Graphics Card articles from around the web:
- AMD's Radeon R9 Fury X @ The Tech Report
- AMD R9 Fury X Review; Fiji Arrives @ Hardware Canucks
- AMD Fury X @ HardwareHeaven
- AMD Radeon R9 Fury X 4 GB @ techPowerUp
- MSI R9 390X GAMING 8G @ [H]ard|OCP
- MSI R7 370 GAMING 2G Review @ Neoseeker
- PowerColor PCS+ R9 390 8GB Review @ OCC
- PowerColor TurboDuo R9 290 4GB OC @ [H]ard|OCP
- EVGA GTX 980 Ti SC+ 6 GB @ techPowerUp
- EVGA GTX 970 SSC @ HardwareHeaven
Subject: General Tech, Graphics Cards | June 24, 2015 - 10:10 PM | Scott Michaud
Tagged: batman, wb games, consolitis, gameworks, pc gaming, nvidia, amd
Over the last few days, the PC version of Batman: Arkham Knight has been receiving a lot of flak. Sites like PC Gamer were unable to review the game because they allege that Warner Brothers would not provide pre-release copies to journalists except for the PS4 version. This is often met with cynicism that can be akin to throwing darts in an unlit room with the assumption that a dartboard is in there somewhere. Other times, it is validated.
Whether or not the lack of PC review copies was related, the consensus is that Arkham Knight is a broken game. After posting a troubleshooting guide on the forums to help users choose the appropriate settings, WB Games has pulled the plug and suspended the game's sales on Steam until the issues are patched.
TotalBiscuit weighs in on the issues with his latest "Port Report".
No one seems to be talking about what the issue is. Fortunately or unfortunately, I don't have the game myself, so I cannot look and speculate based on debug information (which they probably disabled in the released game anyway). I could wildly speculate about DX11 limits from the number of elements on screen, but that is not based on any actual numbers. They could be really good at instancing and other tricks to keep the chunks of work being sent to the GPU as large as possible. I don't know. Whatever the issue is, it sounds pretty bad.
A fury unlike any other...
Officially unveiled by AMD during E3 last week, we are finally ready to show you our review of the brand new Radeon R9 Fury X graphics card. Very few times has a product launch meant more to a company, and to its industry, than the Fury X does this summer. AMD has been lagging behind in the highest-tiers of the graphics card market for a full generation. They were depending on the 2-year-old Hawaii GPU to hold its own against a continuous barrage of products from NVIDIA. The R9 290X, despite using more power, was able to keep up through the GTX 700-series days, but the release of NVIDIA's Maxwell architecture forced AMD to move the R9 200-series parts into the sub-$350 field. This is well below the selling prices of NVIDIA's top cards.
The AMD Fury X hopes to change that with a price tag of $650 and a host of new features and performance capabilities. It aims to once again put AMD's Radeon line in the same discussion with enthusiasts as the GeForce series.
The Fury X is built on the new AMD Fiji GPU, an evolutionary part based on AMD's GCN (Graphics Core Next) architecture. This design adds a lot of compute horsepower (4,096 stream processors) and it also is the first consumer product to integrate HBM (High Bandwidth Memory) support with a 4096-bit memory bus!
Of course the question is: what does this mean for you, the gamer? Is it time to start making a place in your PC for the Fury X? Let's find out.