Subject: Graphics Cards | January 8, 2014 - 05:25 PM | Josh Walrath
Tagged: triple fans, R9 290X, r9 290, powercolor, liquid cooling, cooling, CES 2014, amd
The nice folks at PowerColor were foolish enough to invite us into their suite full of video cards. Sadly, we were unable to abscond with any of the items we detail here. PowerColor has a smaller US presence than other manufacturers, but they are not afraid to experiment with unique cooling solutions for their cards.
A sharp looking card that is remarkably heavy.
Cooling is provided by EKWB.
In their suite they were showing off two new products based on the AMD R9 290X chip. The first was actually released back in December 2013: the liquid cooled version of the AMD R9 290X. This little number comes in at a hefty $799, though considering what is included, the price really is not that far out of line. It features a very high end liquid cooling block that is extremely heavy and well built. The PCB looks like it mimics the reference design, so the cooling is certainly the unique aspect of this card.
Three fans are too much!
The display outputs are the same as the reference design, which is not a bad thing.
The second card is probably much more interesting to most users. This is a new cooling solution from PowerColor that attaches to the AMD R9 290X. The PCS+ (Professional Cooling Systems) cooler features three fans and is over two slots wide (we can joke about it being 2.5 slots wide, but I doubt anyone could use the leftover half slot anyway). The board again looks like it is based on the reference PCB, but the cooler is where the magic lies. This product should be able to compete with the other third-party coolers we have seen applied to this chip from AMD, which means it should not only hold a steady clockspeed throughout testing and gaming but also allow a measure of overclocking.
The back is protected/supported by a large and stiff plate. Cooling holes help maximize performance.
This card will be offered at $679 US and will be available on January 15. The number of units shipped will likely be fairly small, so keep a sharp eye out. AMD is ultimately in charge of providing partners with chips to integrate into their respective products, and so far those numbers seem to have been a bit more limited than hoped. It also doesn't help that market prices have been inflated by the coin miners who have been buying up the latest GCN-based AMD cards for the past several months.
There is no denying that this is a large cooler. Hopefully its cooling performance will match or exceed that of the products Ryan has already reviewed.
We also expect to see the R9 290 version of this card in the same timeframe as its bigger, more expensive R9 290X sibling. There should be more PowerColor content at PCPer over the next few months, so please stay tuned!
Follow all of our coverage of the show at http://pcper.com/ces!
Subject: Graphics Cards, Displays | January 8, 2014 - 01:01 AM | Ryan Shrout
Tagged: pq321q, PQ321, nvidia, gsync, g-sync, CES 2014, CES, asus, 4k
Just before CES Allyn showed you the process of modifying the ASUS VG248QE to support NVIDIA G-Sync variable refresh rate technology. It wasn't the easiest mod we have ever done, but even users without a lot of experience should be able to accomplish it.
But at the NVIDIA booth at CES this year the company was truly showing off G-Sync technology to its fullest capability. By taking the 3840x2160 ASUS PQ321Q monitor and modifying it with the same G-Sync module technology we were able to see variable refresh rate support in 4K glory.
Obviously you can't see much from the photo above about the smoothness of the animation, but I can assure you that in person this looks incredible. In fact, 4K might be the perfect resolution for G-Sync to shine, as running games at that resolution will definitely bring your system to its knees, dipping below that magical 60 FPS mark. But when it does on this modified panel, you'll still get smooth gameplay and a tear-free visual experience.
The mod is actually using the same DIY kit that Allyn used in his story though it likely has a firmware update for compatibility. Even with the interesting debate from AMD about the support for VRR in the upcoming DisplayPort 1.3 standard, it's impossible to not see the ASUS PQ321Q in 4K with G-Sync and instantly fall in love with PCs again.
Sorry - there are no plans to offer this upgrade kit for ASUS PQ321Q owners!
Follow all of our coverage of the show at http://pcper.com/ces!
DisplayPort to Save the Day?
During an impromptu meeting with AMD this week, the company's Corporate Vice President for Visual Computing, Raja Koduri, presented me with an interesting demonstration of a technology that allowed the refresh rate of a display on a Toshiba notebook to perfectly match the render rate of the game demo being shown. The result was an image that was smooth and free of tearing. If that sounds familiar, it should: NVIDIA's G-Sync, announced in October of last year, does just that for desktop systems and PC gamers.
Since that October unveiling, I knew that AMD would need to respond in some way. The company had basically been silent since NVIDIA's announcement, but that changed for me today, and the information discussed is quite extraordinary. AMD is jokingly calling the technology demonstration "FreeSync".
Variable refresh rates as discussed by NVIDIA.
During the demonstration AMD's Koduri had two identical systems side by side, each based on a Kabini APU. Both were running a basic graphics demo of a rotating windmill. One was a standard software configuration while the other had a modified driver that communicated with the panel to enable variable refresh rates. As you likely know from our various discussions of variable refresh rates and G-Sync technology from NVIDIA, this setup results in a much better gaming experience: smoother animation on the screen without the horizontal tearing associated with disabling v-sync.
Obviously AMD wasn't using the same controller module that NVIDIA puts in its current G-Sync displays, several of which were announced this week at CES. Instead, the internal connection on the Toshiba notebook was the key factor: Embedded DisplayPort (eDP) apparently includes a feature to support variable refresh rates on LCD panels. The feature was intended for power savings on mobile and integrated devices, since refreshing the screen when there is no new content wastes valuable battery life. But, as AMD's Koduri said, the same mechanism can be used to vary the refresh rate for performance reasons and smooth out gameplay.
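The benefit Koduri demonstrated is easy to model. Below is a toy Python sketch (the frame times and the refresh-on-demand behavior are illustrative assumptions, not measurements from AMD's demo) of why matching the panel's refresh to the render rate smooths animation: with fixed 60 Hz v-sync, a frame that misses a refresh boundary must wait for the next one, so presentation intervals oscillate between 16.7 ms and 33.3 ms, while with variable refresh the interval simply tracks the render time.

```python
import math

# Toy model: when does each rendered frame reach the screen?
# Fixed 60 Hz v-sync quantizes presentation to 16.67 ms boundaries;
# a variable-refresh panel can present as soon as the frame is ready.

REFRESH_MS = 1000 / 60  # 16.67 ms scanout interval at 60 Hz

def present_times(render_ms, variable_refresh):
    t, shown = 0.0, []
    for frame_cost in render_ms:
        t += frame_cost  # frame finishes rendering at time t
        if variable_refresh:
            shown.append(t)  # panel refreshes on demand
        else:
            # wait for the next fixed v-sync boundary
            shown.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)
    return shown

# A game averaging ~45 FPS (about 22 ms/frame) with some variance
workload = [21.0, 23.0, 22.0, 24.0, 21.5]

fixed = present_times(workload, variable_refresh=False)
vrr = present_times(workload, variable_refresh=True)

# Frame-to-frame presentation intervals: variable refresh tracks the
# render cost, while fixed v-sync oscillates between 16.67 and 33.33 ms.
fixed_gaps = [round(b - a, 2) for a, b in zip(fixed, fixed[1:])]
vrr_gaps = [round(b - a, 2) for a, b in zip(vrr, vrr[1:])]
print("fixed v-sync gaps:", fixed_gaps)
print("variable refresh gaps:", vrr_gaps)
```

That oscillation between one and two refresh intervals is exactly the judder that G-Sync, and now this eDP trick, eliminates.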
Subject: Graphics Cards, Cases and Cooling | January 6, 2014 - 01:00 PM | Morry Teitelman
Tagged: water cooling, ROG, nvidia, gtx 780, gtx 770, CES 2014, CES, asus
In keeping with its tradition of pushing innovation and performance boundaries through the ROG product line, ASUS today released NVIDIA-based 700-series video cards as part of its Poseidon series of liquid cooled products. ASUS released both GTX 780 and GTX 770-based products with the hybrid Poseidon cooling solution.
Courtesy of ASUS
All Poseidon series graphics cards come with a hybrid cooling solution, using a combination of fan-based and water-based cooling to propel these cards to new performance heights. The card's GPU is cooled by the DirectCU H2O cooler, with water routed through the integrated barbs to the copper-based block. The water inlets are threaded, accepting G1/4" male fittings. The memory and VRM components are cooled by a massive dual-fan heatpipe cooler that exhausts air through the rear panel port. Both cards feature the red and black ROG coloration with the Poseidon series name displayed prominently along the right front edge of the card.
Courtesy of ASUS
Both Poseidon series graphics cards, the ROG Poseidon GTX 770 and ROG Poseidon GTX 780, include ASUS' DIGI+ VRM and Super Alloy Power circuitry to ensure stability and component life under the most grueling conditions. With the Poseidon cooling in use, ASUS claims the GPU runs 20% cooler and three times quieter than a comparable reference card, with operating temperatures 16°C lower than the same reference solution.
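For the curious, the "20% cooler" and "16°C lower" claims line up arithmetically if the reference card runs at roughly 80°C under load; that baseline is our assumption for the sake of the check, not an ASUS figure.

```python
# Sanity check: "20% cooler" and "16 C lower" are consistent
# if the reference card runs at about 80 C under load.
reference_temp_c = 80  # assumed reference load temperature (not an ASUS number)
poseidon_temp_c = reference_temp_c * (1 - 0.20)  # 20% cooler
print(poseidon_temp_c)                       # Poseidon load temperature
print(reference_temp_c - poseidon_temp_c)    # delta vs. reference
```

An 80°C load temperature is plausible for a reference GTX 780, so the two marketing numbers at least agree with each other.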
Follow all of our coverage of the show at http://pcper.com/ces!
Introduction and Unboxing
We've been covering NVIDIA's new G-Sync tech for quite some time now, and displays so equipped are finally shipping. With all of the excitement going on, I became increasingly interested in the technology, especially since I'm one of those guys who is extremely sensitive to input lag and the inevitable image tearing that results from vsync-off gaming. Increased discussion on our weekly podcast, coupled with the inherent difficulty of demonstrating the effects without seeing G-Sync in action in person, led me to pick up my own ASUS VG248QE panel for the purpose of this evaluation and review. We've generated plenty of other content revolving around the G-Sync tech itself, so let's get straight into what we're after today: evaluating the out-of-box installation process of the G-Sync upgrade kit.
All items are well packed and protected.
Included are installation instructions, a hard plastic spudger for opening the panel, a couple of stickers, and all necessary hardware bits to make the conversion.
Subject: General Tech, Graphics Cards | December 31, 2013 - 02:46 PM | Scott Michaud
Tagged: linux, iris pro, iris, Intel, haswell
'Tis the season to be sharing and giving (unless you are Disney).
According to Phoronix, Intel has shipped (heh heh heh, a "boatload" indeed) over 5,000 pages of documentation and technical specs for their Haswell iGPUs, including the HD, Iris, and Iris Pro product lines. The intricacies of the 3D engine, GPGPU computation, and video acceleration are laid bare for the open source community. Video acceleration is something that is often omitted from the manuals of other companies.
Phoronix believes that Intel is, and has been, the best GPU vendor for open-source drivers. AMD obviously supports its open-source community, but Intel edges them out on speed. The Radeon HD 7000 series driver is just beginning to mature, by their metric, and Hawaii support is far behind that. For NVIDIA, the Nouveau driver is still developed primarily by reverse engineering, although NVIDIA did release some documentation a few months ago.
Of course all of these comparisons are only considering the open-source drivers.
NVIDIA prides itself on its proprietary driver, and AMD pretty much offers both options for the user to choose between. Phoronix claims that Intel employs over two dozen open-source Linux graphics developers, but of course the open-source driver is Intel's only graphics driver for Linux. That is not a bad thing: a vendor-launched open-source GPU driver is essentially identical to what would ship as a proprietary driver, just without slapping the wrists of anyone who tries to tweak it. The approach makes particular sense for Intel, because community support will do nothing but help adoption of its hardware.
If you would like to check out the documentation, it is available at Intel's 01.org.
Sapphire Triple Fan Hawaii
It was mid-December when the very first custom cooled AMD Radeon R9 290X card hit our offices in the form of the ASUS R9 290X DirectCU II. It was cooler, quieter, and faster than the reference model; this is a combination that is hard to pass up (if you could buy it yet). More and more of these custom models, both in the R9 290 and R9 290X flavor, are filtering their way into PC Perspective. Next on the chopping block is the Sapphire Tri-X model of the R9 290X.
Sapphire's triple fan cooler already made quite an impression on me when we tested a version of it on the R9 280X retail round up from October. It kept the GPU cool but it was also the loudest of the retail cards tested at the time. For the R9 290X model, Sapphire has made some tweaks to the fan speeds and the design of the cooler which makes it a better overall solution as you will soon see.
The key targets for any AMD R9 290/290X custom-cooled card are to beat AMD's reference cooler on performance, noise, and clock rate consistency. Does Sapphire meet these goals?
The Sapphire R9 290X Tri-X 4GB
While the ASUS DirectCU II card was taller and more menacing than the reference design, the Sapphire Tri-X cooler is longer and sleeker than the competition thus far. The bright yellow and black color scheme is both attractive and unique, though it lacks the LED lighting that the 280X version showcased.
Sapphire has overclocked this model slightly, to 1040 MHz on the GPU clock, which puts it in good company.
| | AMD Radeon R9 290X | ASUS R9 290X DirectCU II | Sapphire R9 290X Tri-X |
|---|---|---|---|
| Rated Clock | 1000 MHz | 1050 MHz | 1040 MHz |
| Memory Clock | 5000 MHz | 5400 MHz | 5200 MHz |
| TDP | ~300 watts | ~300 watts | ~300 watts |
| Peak Compute | 5.6 TFLOPS | 5.6+ TFLOPS | 5.6+ TFLOPS |
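The peak compute figures fall straight out of the Hawaii XT shader count: 2816 stream processors, each capable of two floating-point operations per clock (a fused multiply-add), times the core clock. A quick check in Python:

```python
# Peak single-precision compute = shaders * 2 ops/clock (FMA) * clock
SHADERS = 2816  # stream processors on the Hawaii XT (R9 290X)

def peak_tflops(clock_mhz):
    """Peak FP32 throughput in TFLOPS for a given core clock in MHz."""
    return SHADERS * 2 * clock_mhz * 1e6 / 1e12

for name, mhz in [("Reference", 1000),
                  ("ASUS DirectCU II", 1050),
                  ("Sapphire Tri-X", 1040)]:
    print(f"{name}: {peak_tflops(mhz):.2f} TFLOPS")
```

At 1040 MHz the Tri-X lands around 5.86 TFLOPS, which is why every card in the table reads as roughly 5.6 TFLOPS or better.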
There are three fans on the Tri-X design, as the name would imply, but all three are the same size, unlike the R9 280X design with its smaller central fan.
Subject: Graphics Cards | December 31, 2013 - 12:13 PM | Jeremy Hellstrom
Tagged: amd, asus, ASUS ROG, MATRIX PLATINUM R9 280X
There is a lot to love about the ASUS ROG Matrix Platinum R9 280X, from its looks to its faster and quieter performance compared to the reference model. Still, [H]ard|OCP doesn't recommend running the fan at 100% except for brief moments when you need to dump heat quickly, as it is quite loud; 50-60% will keep you below 90C and is not very noticeable. The card overclocks very well: they easily hit 1270MHz core and 6.6GHz VRAM and saw real performance improvements when they did so, occasionally pushing past the GTX 770. There is one major disappointment with this card: the price is currently over 50% higher than the base 280X MSRP, so you might want to hold off for a while before thinking of purchasing it.
"Today we take ASUS' elite R9 280X product, the ASUS ROG MATRIX PLATINUM R9 280X video card, and put its advanced power phasing and board components to the test as we harshly overclock it. We compare it to an ASUS GeForce GTX 770 DirectCU II. Then we will put it head to head against the overclocked SAPPHIRE TOXIC R9 280X."
Here are some more Graphics Card articles from around the web:
- Powercolor Radeon R9-290X Review @ Bjorn3D
- Gigabyte R9 270X Windforce OC @ Kitguru
- Sapphire R9 290X Tri-X OC @ Kitguru
- The $109 Console-killer GPU: AMD's Radeon R7 260 Graphics Card Reviewed @ Techgage
- AMD Catalyst 2013 Linux Graphics Driver Year-In-Review @ Phoronix
- Asus GTX 780 Ti DirectCU II OC @ Kitguru
- Zotac GeForce GTX 780 Ti AMP! Review @ Bjorn3D
Subject: Graphics Cards | December 30, 2013 - 11:32 AM | Ryan Shrout
Tagged: amd, Mantle, hawaii, BF4, battlefield 4
If you have been following the mess that has been Battlefield 4 since its release, what with the crashing on both PCs and consoles, you know that EA and DICE have decided that fixing the broken game is the number one priority. Gee, thanks.
While they work on that, though, there is another casualty of development besides the pending DLC packs: AMD's Mantle version of the game. If you remember way back in September of 2013, alongside the announcement of AMD's Hawaii GPUs, AMD and DICE promised a version of BF4 running on Mantle as a free update in December. If you are counting, that is just one day away from being late.
Today we got this official statement from AMD:
After much consideration, the decision was made to delay the Mantle patch for Battlefield 4. AMD continues to support DICE on the public introduction of Mantle, and we are tremendously excited about the coming release for Battlefield 4! We are now targeting a January release and will have more information to share in the New Year.
Well, it's not a surprise but it sure is a bummer. One of the killer new features for AMD's GPUs was supposed to be the ability to use this new low-level API to enhance performance for PC games. As Josh stated in our initial article on the subject, "It bypasses DirectX (and possibly the hardware abstraction layer) and developers can program very close to the metal with very little overhead from software. This lowers memory and CPU usage, it decreases latency, and because there are fewer “moving parts” AMD claims that they can do 9x the draw calls with Mantle as compared to DirectX. This is a significant boost in overall efficiency."
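That "9x the draw calls" claim is ultimately about per-call CPU overhead. A toy model (the microsecond costs below are illustrative assumptions, not AMD or DICE figures) shows how shaving per-call overhead raises the number of draw calls that fit in a 60 FPS frame budget:

```python
# Toy model: how per-draw-call CPU overhead limits draw calls per frame.
# The per-call costs below are illustrative assumptions, not AMD data.
FRAME_BUDGET_US = 16667  # ~16.7 ms of CPU time per frame at 60 FPS

def max_draw_calls(overhead_us_per_call):
    """Draw calls that fit in one frame if each costs this much CPU time."""
    return FRAME_BUDGET_US // overhead_us_per_call

d3d_overhead = 40                   # hypothetical cost through a thick API layer
mantle_overhead = d3d_overhead / 9  # "9x the draw calls" => 1/9 the cost

print(max_draw_calls(d3d_overhead))     # calls per frame, thick API
print(max_draw_calls(mantle_overhead))  # calls per frame, thin API
```

Whatever the real per-call numbers are, the shape of the argument is the same: cut the fixed cost of each call and the engine can issue many more of them, or spend the reclaimed CPU time elsewhere.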
It seems that buyers of AMD's R9 series graphics cards will need to wait at least another month to see what the promise of Mantle is really all about. Will the wait be worth it?
Subject: General Tech, Graphics Cards | December 24, 2013 - 01:15 AM | Scott Michaud
Tagged: R9 290X, r9 290, msi
MSI just announced their two customized Hawaii GPUs. One of the two new boards will be based on the R9 290 and the other based on the R9 290X. The design is based around the Twin Frozr IV Advanced two-fan model found on previous cards.
The specifications of the 290X version include three different modes: OC, Gaming and Silent. The Silent mode will run at 1000 MHz which is the same clock speed as the reference models were set at. In Gaming mode the card will run at 1030 MHz and in OC mode it will clock at 1040 MHz. Obviously there will be some slight noise level variances between them but I am pretty sure that the difference between Gaming and OC mode is going to be negligible.
The R9 290 version has the same three settings, but the clock speeds are 947 MHz, 977 MHz and 1007 MHz respectively.
As a final note: MSI's press release claims "Available Now". The card does not appear to be available at either Amazon or Newegg, but NCIX claims it is estimated to arrive February 26th, 2014 for $700. I seriously hope there are a few typos in there... maybe they meant December 26? And maybe not a more-than-$100 premium?
It is, unfortunately, still a wait-and-see game with these custom AIB cards.