Manufacturer: ASUS

The First Custom R9 290X

It has been a crazy launch for the AMD Radeon R9 series of graphics cards.  When we first reviewed the R9 290X and the R9 290, we came away very impressed with the GPU and the performance it provided, and both reviews earned Gold awards.  The 290X set a new bar for single-GPU performance while the R9 290 nearly matched it at an aggressive $399 price tag.

But there were issues.  Big, glaring issues.  Clock speeds varied wildly depending on the game: a GPU rated to run at "up to 1000 MHz" was dropping to 899 MHz in Skyrim and 821 MHz in Bioshock Infinite.  Those are significant clock deltas, and they translate almost directly into performance deltas.  The speeds also changed based on the "hot" or "cold" status of the graphics card - had it been warmed up and active for 10 minutes prior to testing?  If so, performance was measurably lower than on a "cold" GPU that had just been started.
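
To put those numbers in perspective, here is a quick back-of-the-envelope calculation (our own arithmetic, using the clocks quoted above):

```typescript
// Quick arithmetic on the clocks quoted above (illustrative, not AMD data).
const ratedMHz = 1000; // the "up to" clock AMD advertises

const observed: Array<[string, number]> = [
  ["Skyrim", 899],
  ["Bioshock Infinite", 821],
];

for (const [game, clockMHz] of observed) {
  const deficitPct = ((ratedMHz - clockMHz) / ratedMHz) * 100;
  console.log(`${game}: ${clockMHz} MHz, ${deficitPct.toFixed(1)}% below the rated clock`);
}
// Skyrim: 899 MHz, 10.1% below the rated clock
// Bioshock Infinite: 821 MHz, 17.9% below the rated clock
```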

That issue was not necessarily a deal killer; rather, it just made us rethink how we test GPUs.  The much bigger deal was that many people were seeing lower performance on retail-purchased cards than on the reference cards sent to the press for reviews.  In our testing in November, the retail card we purchased, which used the exact same cooler as the reference model, was running 6.5% slower than we expected.

The obvious hope was that retail cards with custom PCBs and coolers from AMD's partners would be released and somehow fix this whole dilemma.  Today we find out whether that hope was justified.

Continue reading our review of the ASUS R9 290X DirectCU II Graphics Card!!

ASUS Announces GTX 780 Ti DirectCU II Graphics Card

Subject: General Tech, Graphics Cards | December 18, 2013 - 04:25 AM |
Tagged: GeForce GTX 780 Ti, DirectCU II, asus

There have not been many custom coolers for top-end NVIDIA graphics cards as of late.  Starting with the GeForce GTX 690, NVIDIA allegedly demands that AIB partners stick to the reference designs for certain models.  Obviously, this is a problem, as it limits the innovation partners can deliver when they are forced to compete on fewer metrics (although the reference designs were pretty good regardless).  This is especially true because the affected models are at the upper high end, where pricing is more flexible if the product is worth it.

This is apparently not the case for the top-end GTX 780 Ti.  ASUS has just announced the GeForce GTX 780 Ti DirectCU II graphics card.  ASUS claims the card will run 30% cooler with 3x less noise.  A 6% bump in performance (as measured in Battlefield 4) will accompany that cooler and quieter operation, as the full GK110 GPU will boost to 1020 MHz.

ASUS makes custom GPUs for both AMD and NVIDIA. Be sure to check out our review of another high-end DirectCU II card, with 100% less NVIDIA, very soon. It will definitely be a great read and maybe even an excellent podcast topic.

Source: ASUS

GeForce Experience 1.8.1 Released with Twitch Streaming

Subject: General Tech, Graphics Cards | December 17, 2013 - 05:02 PM |
Tagged: nvidia, ShadowPlay, geforce experience

Another update to GeForce Experience brings another anticipated ShadowPlay feature.  The ability to stream live gameplay to Twitch, hardware accelerated by Kepler, was demoed at NVIDIA's event in Montreal in late October.  There, NVIDIA showed Batman: Arkham Origins streaming at 1080p 60 FPS without capping or affecting the in-game output settings.

GeForce Experience 1.8.1 finally brings that feature, in beta of course, to the general public.  Once set up, Alt + F8 will launch the Twitch stream and Alt + F6 will activate your webcam.  Oh, and by the way, one feature they had kept from us (or at least from me) is the ability to overlay your webcam atop your gameplay.

Nice touch, NVIDIA.

Of course, the upstream bandwidth requirements for video are quite high: 3.5 Mbps at the top end, a more common 2 Mbps happy medium, and a 0.75 Mbps minimum.  NVIDIA has been trying to ensure that your machine will not lag, but there is nothing a GPU can do about your internet connection.
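
For a rough sense of what those bitrates mean in practice, here is a quick conversion to data uploaded per hour of streaming (our own arithmetic, not NVIDIA's numbers):

```typescript
// Convert the quoted Twitch streaming bitrates into upload volume per hour.
// Illustrative arithmetic only; the tier names are our own labels.
const bitratesMbps: Record<string, number> = {
  minimum: 0.75,
  typical: 2,
  maximum: 3.5,
};

for (const [tier, mbps] of Object.entries(bitratesMbps)) {
  // megabits per second * 3600 seconds, divided by 8 to get megabytes
  const mbPerHour = (mbps * 3600) / 8;
  console.log(`${tier}: ${mbps} Mbps is about ${mbPerHour.toFixed(0)} MB uploaded per hour`);
}
// minimum: 0.75 Mbps is about 338 MB uploaded per hour
// typical: 2 Mbps is about 900 MB uploaded per hour
// maximum: 3.5 Mbps is about 1575 MB uploaded per hour
```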

GeForce Experience 1.8.1 is available now at the GeForce website.

Manufacturer: ASUS

A slightly smaller MARS

The NVIDIA GeForce GTX 760 was released in June of 2013.  Based on the same GK104 GPU as the GTX 680, GTX 670 and GTX 770, the GTX 760 disabled a couple more of the processor core clusters to offer impressive performance at a lower cost than we had seen previously.  My review of the GTX 760 was very positive, as NVIDIA had priced it aggressively against the competing products from AMD.

As for ASUS, they have a storied history with the MARS brand.  Typically an over-built custom PCB with two of the highest-end NVIDIA GPUs stapled together, the ASUS MARS cards have been limited-edition products with a lot of cachet around them.  The first MARS card was a dual GTX 285 product and the first card to offer 4GB of memory (though 2GB per GPU, of course).  The MARS II took a pair of GTX 580 GPUs and pasted them on a HUGE card, and just 1000 of them were sold worldwide.  It was heavy, expensive and fast; blazing fast.  But at a price of $1200+ it wasn't on the radar of most PC gamers.

Interestingly, a MARS iteration for the GTX 680 never occurred, and why that is the case is still a matter of debate.  Some point the finger at poor sales on ASUS' part, while others think that NVIDIA restricted ASUS' engineers from being as creative as they needed to be.

Today's release of the ASUS ROG MARS 760 is a bit different - this is still a high-end graphics card, but it doesn't utilize the fastest single-GPU option on the market.  Instead, ASUS has gone with a more reasonable design that combines a pair of GTX 760 GK104 GPUs on a single PCB with a PCI Express bridge chip between them.  The MARS 760 is significantly smaller and less power hungry than previous MARS cards, but it can still pack a punch in the performance department, as you'll soon see.

Continue reading our review of the ASUS ROG MARS 760 Dual GPU Graphics Card!!

Monster Madness: A First For Web Standards

Subject: General Tech, Graphics Cards | December 13, 2013 - 05:43 PM |
Tagged: webgl, ue4, UE3, asm.js

asm.js is a strict subset of JavaScript, intended for numerical calculations, that can be heavily optimized and is easily output by compilers from other languages such as C++.  Both Mozilla Firefox and Google Chrome optimize asm.js code, albeit with differences in their implementations.  In both cases, performance comes very close to that of the same applications compiled to native code for the host operating system.

Its shortcoming is that it is difficult and annoying to write by hand (rather than compiling it from another language).  In effect, it drives browser adoption by encouraging the use of web standards while discouraging developers from writing against those standards directly.  You can see where the politics can enter.
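
To make the "annoying to write by hand" point concrete, here is a tiny asm.js-style module of our own (a hypothetical sketch in TypeScript syntax): the "use asm" pragma and the constant |0 coercions are what give the JIT fixed integer types to compile against, and they are also exactly what makes hand-authoring tedious.

```typescript
// Minimal asm.js-style module (illustrative sketch; real modules are usually
// emitted by a compiler such as Emscripten from C/C++ source).
function AsmSum(stdlib: { Int32Array: typeof Int32Array }, foreign: unknown, heap: ArrayBuffer) {
  "use asm";
  var HEAP32 = new stdlib.Int32Array(heap); // typed view over the module's heap

  // Sum `count` 32-bit integers starting at byte offset `ptr`.
  function sum(ptr: number, count: number): number {
    ptr = ptr | 0;      // coerce every value to a 32-bit integer...
    count = count | 0;  // ...so the JIT never has to guess a type
    var total = 0;
    var i = 0;
    for (i = 0; (i | 0) < (count | 0); i = (i + 1) | 0) {
      total = (total + (HEAP32[(ptr + (i << 2)) >> 2] | 0)) | 0;
    }
    return total | 0;
  }

  return { sum: sum };
}

// Hand-written usage, purely for illustration:
const heap = new ArrayBuffer(0x10000);
new Int32Array(heap).set([1, 2, 3, 4]);
console.log(AsmSum({ Int32Array }, null, heap).sum(0, 4)); // 10
```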

Still, it makes for great demos such as the cloth physics applet from James Long of Mozilla or, more amazingly, Unreal Engine 3. The upcoming UE4 is expected to be officially supported by Epic Games on asm.js (and obviously WebGL will be necessary too) but, of course, Epic will not prevent UE3 licensees from doing their own leg-work.

NomNom Games, a group within Trendy Entertainment (Trendy is known for Dungeon Defenders), became the first company to release a commercial 3D title on these standards.  Monster Madness, powered by Unreal Engine 3, runs in web browsers like Mozilla Firefox and Google Chrome without plugins (although it will fall back to Flash 11.6 if your browser is unsupported for the web-native version).  Monster Madness is a top-down, cel-shaded shoot 'em up.

You can play, for free, with an anonymous token here. You can also visit their website to learn more about the closed beta for registered accounts. It is natively supported on Firefox, Chrome, and Opera. I am not entirely sure why IE11 is not supported, now that Microsoft supports WebGL, but there is probably a customer support or performance reason for it.

Source: Mozilla

Video Perspective: GPU Shortages and Litecoin Mining Discussion

Subject: Graphics Cards | December 12, 2013 - 05:20 PM |
Tagged: video, amd, radeon, hawaii, r9 290, R9 290X, bitcoin, litecoin, mining

If you already listened to this week's PC Perspective Podcast, then feel free to disregard this post.  For the rest of you - subscribe to our damned weekly podcast would you already?!?

In any event, I thought it might be interesting to extract this six-minute discussion from last night's live-streamed podcast about how the emergence of Litecoin mining operations is driving up prices of GPUs, particularly the compute-capable R9 290 and R9 290X Hawaii-based cards from AMD.

Check out these prices currently on Amazon!

The price of the GTX 770 is a bit higher than it should be, while the GTX 780 and GTX 780 Ti are priced in the same range they have been for the last month or so.  The same cannot be said for the AMD cards listed here - the R9 280X is selling for $130 more than its expected MSRP at a minimum, and you'll see quite a few going for much higher on Amazon, eBay (thanks TR) and elsewhere.  The Radeon R9 290 has an MSRP of $399 from AMD, but the lowest price we found on Amazon was $499, and anything on Newegg.com is showing at the same price, but sold out.  The R9 290X is even more obnoxiously priced when you can find one.
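
As rough context for how far out of line those prices are, here is the markup math (our own arithmetic; the $299 R9 280X figure is its launch MSRP, and street prices are as of this writing):

```typescript
// Markup on the AMD cards discussed above, relative to their launch MSRPs.
// Illustrative arithmetic only; street prices were observed at the time of writing.
const cards: Array<{ name: string; msrp: number; street: number }> = [
  { name: "R9 280X", msrp: 299, street: 429 }, // roughly $130 over MSRP at a minimum
  { name: "R9 290", msrp: 399, street: 499 },  // lowest Amazon price we found
];

for (const { name, msrp, street } of cards) {
  const markupPct = ((street - msrp) / msrp) * 100;
  console.log(`${name}: $${street} vs $${msrp} MSRP (+${markupPct.toFixed(0)}%)`);
}
// R9 280X: $429 vs $299 MSRP (+43%)
// R9 290: $499 vs $399 MSRP (+25%)
```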

Do you have any thoughts on this?  Do you think Litecoin mining is really causing these price inflations and what does that mean for AMD, NVIDIA and the gamer?

Manufacturer: NVIDIA

Quality time with G-Sync

Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology.  When it was first unveiled in October, we were at the event and were able to listen to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers.  This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that combine the smoothness advantage of disabling V-Sync with the tear-free images normally reserved for gamers who enable V-Sync.

NVIDIA's Prototype G-Sync Monitor

We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics.  All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.

Also on that day, I published a somewhat detailed editorial that dug into the background of V-Sync technology, why the 60 Hz refresh rate exists and why the system in place today is flawed.  That led up to an explanation of how G-Sync works - by extending the Vblank interval - and detailed how NVIDIA is enabling the graphics card to retake control of the entire display pipeline.
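
To make that concrete, here is a small timing sketch of our own (hypothetical numbers, not NVIDIA code): with a fixed 60 Hz refresh and V-Sync on, a finished frame has to wait for the next refresh boundary, while a variable refresh display simply scans out each frame the moment it is done.

```typescript
// Our own illustration of fixed-refresh V-Sync versus G-Sync-style variable refresh.
const REFRESH_MS = 1000 / 60; // a fixed 60 Hz panel scans out every ~16.7 ms

// With V-Sync on, a frame is shown at the first refresh boundary after it finishes.
function displayTimesFixedVsync(frameDoneMs: number[]): number[] {
  return frameDoneMs.map(t => Math.ceil(t / REFRESH_MS) * REFRESH_MS);
}

// With variable refresh, the panel idles in an extended Vblank and scans out immediately.
function displayTimesVariableRefresh(frameDoneMs: number[]): number[] {
  return frameDoneMs.map(t => t);
}

// Frames finishing every 19 ms (~53 FPS), i.e. just slower than the refresh rate:
const done = Array.from({ length: 8 }, (_, i) => (i + 1) * 19);
console.log(displayTimesFixedVsync(done).map(t => t.toFixed(1)));
// ["33.3","50.0","66.7","83.3","100.0","116.7","133.3","166.7"]
//  -> mostly 16.7 ms apart, then a 33.3 ms hitch: the periodic stutter V-Sync causes
console.log(displayTimesVariableRefresh(done));
// [19, 38, 57, 76, 95, 114, 133, 152] -> a steady 19 ms cadence that matches the game
```

This simple model ignores back-pressure from double buffering (with real V-Sync, a full frame queue also stalls the GPU), but it captures why a display that waits for the GPU, rather than the other way around, removes the usual stutter-versus-tearing trade-off.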

In reality, if you want the best explanation of G-Sync - how it works and why it is a stand-out technology for PC gaming - you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync.  In this video we go through quite a bit of technical explanation of how displays work today and how the G-Sync technology changes gaming for the better.  It is an hour-plus video, but I selfishly believe it is the most concise and well-put-together collection of information about G-Sync for our readers.

The story today is more about extensive hands-on testing with the G-Sync prototype monitors.  The displays we received this week were modified versions of the 144 Hz ASUS VG248QE gaming panel, the same model that, in theory, end users will be able to upgrade themselves sometime in the future.  These are 1920x1080 TN panels and, though they have incredibly high refresh rates, they aren't usually regarded as having the best image quality on the market.  However, the story of what you get with G-Sync is really more about stutter (or lack thereof), tearing (or lack thereof), and a better overall gaming experience for the user.

Continue reading our tech preview of NVIDIA G-Sync!!

AMD Expects Beta Eyefinity Fix in January

Subject: General Tech, Graphics Cards | December 11, 2013 - 05:58 PM |
Tagged: frame pacing, frame rating, amd, southern islands, 4k, eyefinity, crossfire, microstutter

The frame pacing issue has been covered on our website for almost a year now.  It stems from the original "microstutter" problem, which dates back more than a year before we could quantify it.  We use the term "Frame Rating" to denote the testing methodology we now apply to our GPU tests.

AMD fared worse in these tests than NVIDIA (although even NVIDIA had some problems in certain configurations).  AMD has dedicated a lot of man-hours to the problem, resulting in driver updates for certain scenarios.  Crossfire combined with Eyefinity or 4K MST was one area those updates did not cover.  The issue has been addressed in Hawaii, and AMD asserted that previous cards would get a software fix soon.

The good news is that we have just received word from AMD that they plan on releasing a beta driver for Southern Islands and earlier GPUs (AMD believes it should work for anything that's not "legacy").  As usual, anything could change until it ships, but it looks good for now.

The beta "frame pacing" driver addressing Crossfire with 4K and Eyefinity, for supported HD-series and Southern Islands-based Rx cards, is expected to be public sometime in January.

Source: AMD

ASUS Teases R9 290X DirectCU II

Subject: General Tech, Graphics Cards | December 9, 2013 - 03:03 AM |
Tagged: R9 290X, DirectCU II, asus

The AMD Radeon R9 290X is a very good graphics processor whose reference design is marred by a few infamous design choices.  AMD specs the GPU to run at a maximum of 95C, perpetually, and to push its frequency up to 1 GHz if it can stay at or under that temperature.  Its cooler, in the typical "Quiet" default setting, is generally unable to hold that frequency for more than a handful of minutes.  This led to countless discussions about what it means to be a default and what a component's actual specifications are.

All along we have noted that custom designs from add-in board (AIB) partners could change everything.

ASUS seems to be the first to tease its custom solution.  This card, based on the DirectCU II design, uses two fans and multiple 10mm nickel-plated heatpipes sitting directly atop the processor.  The two fans should be able to move more air at a slower rotational speed and thus be more efficient per decibel.  The heatsink itself should also be able to pull heat away more quickly.  I am hoping that ASUS provisioned the part to hold a stable 1 GHz at default settings, or perhaps even more!

The real test for Hawaii will come when the wave of custom editions washes ashore.  We know the processor is capable of some pretty amazing performance figures when it can really open up.  This card, along with other partner boards, would make for possibly the most interesting AIB round-up we have ever had.

No word, yet, on pricing or availability.

(Phoronix) Four Monitors on Linux: AMD and NVIDIA

Subject: General Tech, Graphics Cards | December 5, 2013 - 03:17 AM |
Tagged: linux, nvidia surround, eyefinity

Could four 1080p monitors be 4K on the cheap? Probably not... but keep reading.

Phoronix published an article for users interested in quad-monitor gaming on Linux.  Sure, you might think this is a bit excessive, especially considering the bezels that would meet at the center of your screen.  On the other hand, imagine you are playing a four-player split-screen game.  That would definitely get some attention.  Each player would be able to tilt their screen out of the view of their opponents while only using a single computer.

In his 8-page editorial, Michael Larabel tests the official and the popular open-source drivers for both AMD and NVIDIA.  The winner was NVIDIA's proprietary driver, while the open-source solution, Nouveau, seemed to fare the worst of the batch.  This is the typical trade-off with NVIDIA.  It was only just recently that The Green Giant opened up documentation for the other chefs in the kitchen... so these results may change soon.

If you are interested in gaming on Linux, give the article a read.

Source: Phoronix