Author:
Manufacturer: MSI

Lightning Returns

With the GPU landscape mostly settled for 2014, we can really dig in and evaluate the retail models that continue to pop up from NVIDIA and AMD board partners. One of our favorite series of graphics cards over the years is MSI's Lightning brand. These cards push engineering to levels other designers simply won't attempt - and we love it! Obviously the goal of all that capability is additional overclocking headroom and stability, but what happens when the target GPU already has trouble scaling?

That is more or less the premise of the Radeon R9 290X Lightning from MSI. AMD's Radeon R9 290X Hawaii GPU is definitely a hot and power-hungry part, and that caused quite a few issues at its initial release. Since then, though, both AMD and its add-in card partners have worked to improve the coolers installed on these cards, improving performance and reliability while decreasing the LOUD NOISES produced by the stock reference cooler.

Let's dive into the latest to hit our test bench, the MSI Radeon R9 290X Lightning.

The MSI Radeon R9 290X Lightning

IMG_9897.JPG

MSI continues to utilize the yellow and black color scheme found on many of the company's high-end parts, and I love the combination. I know that both NVIDIA and AMD disapprove of the distinct lack of "green" and "red" in the cooler and box designs, but good on MSI for sticking to its own thing.

IMG_9900.JPG

The box for the Lightning card is equal to the prominence of the card itself and you even get a nifty drawer for all of the included accessories.

IMG_9901.JPG

We originally spotted the MSI R9 290X Lightning at CES in January and the design remains the same. The cooler is quite large (and damn heavy) and uses a set of three fans. The yellow fan in the center is smaller and spins a bit faster, creating more noise than I would prefer. All fan speeds can be adjusted with MSI's included fan control software.

Continue reading our review of the MSI Radeon R9 290X Lightning Graphics Card!!

AMD Demonstrates Prototype FreeSync Monitor with DisplayPort Adaptive Sync Feature

Subject: Graphics Cards, Displays | June 4, 2014 - 12:40 AM |
Tagged: gsync, g-sync, freesync, DisplayPort, computex 2014, computex, adaptive sync

AMD FreeSync is a technology - or brand, or term - that is likely going to be used a lot between now and the end of 2014. When NVIDIA introduced variable refresh rate monitor technology to the world in October of last year, one of the immediate topics of conversation was what AMD's response would be. NVIDIA's G-Sync technology is limited to NVIDIA graphics cards, and only a few monitors (actually just one as I write this) have the specialized hardware to support it. In practice though, variable refresh rate monitors fundamentally change the gaming experience for the better.

freesync1.jpg

At CES, AMD went on the offensive and started showing press a hacked up demo of what they called "FreeSync", a similar version of the variable refresh technology working on a laptop. At the time, the notebook was a requirement of the demo because of the way AMD's implementation worked. Mobile displays have previously included variable refresh technologies in order to save power and battery life. AMD found that it could repurpose that technology to emulate the effects that NVIDIA G-Sync creates - a significantly smoother gaming experience without the side effects of Vsync.

Our video preview of NVIDIA G-Sync Technology

Since that January preview, things have progressed for the "FreeSync" technology. AMD took the idea to the VESA group responsible for the DisplayPort standard, and in April we found out that VESA had officially adopted the technology and called it Adaptive Sync.

So now what? AMD is at Computex and of course is taking the opportunity to demonstrate a "FreeSync" monitor with the DisplayPort 1.2a Adaptive Sync feature at work. Though they aren't talking about what monitor it is or who the manufacturer is, the demo is up and running and functions with frame rates wavering between 40 FPS and 60 FPS - the most crucial range of frame rates that can adversely affect gaming experiences. AMD has a windmill demo running on the system, perfectly suited to showing Vsync enabled (stuttering) and Vsync disabled (tearing) issues with a constantly rotating object. It is very similar to the NVIDIA clock demo used to show off G-Sync.
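To illustrate why that 40 to 60 FPS window matters so much, here is a small sketch of our own (not AMD's demo code, and the refresh range is an assumption based on the demo) comparing how long a frame stays on screen with a fixed 60 Hz Vsync'd display versus an adaptive one. With Vsync, a frame that misses a refresh boundary waits for the next one; with adaptive sync, the panel simply refreshes when the frame is ready.

```python
import math

# Hypothetical illustration of frame display timing. A fixed 60 Hz
# refresh quantizes display times to multiples of ~16.67 ms, while an
# adaptive panel (assumed 40-60 Hz range here) tracks the GPU directly.
REFRESH_INTERVAL_MS = 1000.0 / 60.0  # one 60 Hz refresh, ~16.67 ms

def vsync_display_time(render_ms):
    """With Vsync, a finished frame waits for the next refresh boundary."""
    refreshes = math.ceil(render_ms / REFRESH_INTERVAL_MS)
    return refreshes * REFRESH_INTERVAL_MS

def adaptive_display_time(render_ms, min_hz=40, max_hz=60):
    """With adaptive sync, the panel refreshes when the frame is ready,
    clamped to the panel's supported refresh interval range."""
    fastest = 1000.0 / max_hz  # ~16.67 ms floor
    slowest = 1000.0 / min_hz  # 25 ms ceiling
    return min(max(render_ms, fastest), slowest)

# A 22 ms frame (about 45 FPS) gets held for two full refreshes
# (~33.3 ms) under Vsync, but is shown after 22 ms on an adaptive panel.
print(vsync_display_time(22.0))
print(adaptive_display_time(22.0))
```

The jump from one refresh to two is exactly the stutter the windmill demo makes visible: inside the 40-60 FPS band, every frame either just makes or just misses a refresh.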

freesync2.jpg

The demo system is powered by an AMD FX-8350 processor and Radeon R9 290X graphics card. The monitor is running at 2560x1440 and is the very first working prototype of the new standard. Even more interesting, this is a pre-existing display that has had its firmware updated to support Adaptive Sync. That's potentially exciting news! Monitors COULD BE UPGRADED to support this feature, but AMD warns us: "...this does not guarantee that firmware alone can enable the feature, it does reveal that some scalar/LCD combinations are already sufficiently advanced that they can support some degree of DRR (dynamic refresh rate) and the full DPAS (DisplayPort Adaptive Sync) specification through software changes."

freesync3.jpg

The time frame for retail availability of monitors using DP 1.2a is up in the air, but AMD has told us that the end of 2014 is entirely reasonable. Based on the painfully slow release of G-Sync monitors into the market, AMD has less of a time hole to dig out of than we originally thought, which is good. What is not good news is that this feature isn't going to be supported on the full range of AMD Radeon graphics cards. Only the Radeon R9 290/290X and R7 260/260X (and the R9 295X2, of course) will actually be able to support the "FreeSync" technology. Compare that to NVIDIA's G-Sync, which is supported by NVIDIA's entire GTX 700 and GTX 600 series of cards.

freesync4.jpg

All that aside, seeing the first official prototype of "FreeSync" is awesome and is getting me pretty damn excited about the variable refresh rate technologies once again! Hopefully we'll get some more hands on time (eyes on, whatever) with a panel in the near future to really see how it compares to the experience that NVIDIA G-Sync provides. There is still the chance that the technologies are not directly comparable and some in-depth testing will be required to validate.

Richard Huddy Departs Intel, Rejoins AMD

Subject: Graphics Cards, Processors | June 3, 2014 - 02:10 PM |
Tagged: Intel, amd, richard huddy

Interesting news is crossing the ocean today as we learn that Richard Huddy, who has previously had stints at NVIDIA, ATI, AMD and, most recently, Intel, is teaming up with AMD once again. Richard brings with him years of experience and innovation in the world of developer relations and graphics technology. By hiring the man often called "the Godfather" of DirectX, AMD wants to prove to the community it is taking PC gaming seriously.

richardhuddy.jpg

The official statement from AMD follows:

AMD is proud to announce the return of the well-respected authority in gaming, Richard Huddy. After three years away from AMD, Richard returns as AMD's Gaming Scientist in the Office of the CTO - he'll be serving as a senior advisor to key technology executives, like Mark Papermaster, Raja Koduri and Joe Macri. AMD is extremely excited to have such an industry visionary back. Having spent his professional career with companies like NVIDIA, Intel and ATI, and having led the worldwide ISV engineering team for over six years at AMD, Mr. Huddy has a truly unique perspective on the PC and Gaming industries.

Mr. Huddy rejoins AMD after a brief stint at Intel, where he had a major impact on their graphics roadmap.  During his career Richard has made enormous contributions to the industry, including the development of DirectX and a wide range of visual effects technologies.  Mr. Huddy’s contributions in gaming have been so significant that he was immortalized as ‘The Scientist’ in Max Payne (if you’re a gamer, you’ll see the resemblance immediately). 

Kitguru has a video from Richard Huddy explaining his reasoning for the move back to AMD.

Source: Kitguru.net

This move points AMD in a very interesting direction going forward. The creation of the Mantle API and the debate around AMD's developer relations programs are going to be hot topics as we move into the summer and I am curious how quickly Huddy thinks he can have an impact.

I have it on good authority we will find out very soon.

Computex 2014: ASUS Announces ROG ARES III Water-Cooled Gaming Graphics Card

Subject: Graphics Cards | June 2, 2014 - 11:41 PM |
Tagged: computex, radeon, r9 295x2, Hawaii XT, dual gpu, computex 2014, ASUS ROG, asus, Ares, amd

The latest installment in the ASUS ARES series of ultra-powerful, limited-edition graphics cards has been announced, and the ARES III is set to be the “world’s fastest” video card.

IMG_20140602_190930.jpg

The ARES III features a full EK water block

The dual-GPU powerhouse is driven by two “hand-selected” Radeon Hawaii XT GPUs (R9 290X cores) with 8GB of GDDR5 memory. The card is overclockable according to ASUS, and will likely arrive factory overclocked, as the company claims it will be faster out of the box than the reference R9 295X2. The ARES III features a custom-designed EK water block, so unlike the R9 295X2, the end user will need to supply the liquid cooling loop.

ASUS claims that the ARES III will “deliver 25% cooler performance than reference R9 295X designs”, but to achieve this ASUS “highly” recommends a high-flow-rate loop with at least a 3x120mm radiator “to extract maximum performance from the card,” and the company “will provide a recommended list of water cooling systems at launch”.

Only 500 of the ARES III will be made, each individually numbered. No pricing has been announced, but ASUS says to expect it to cost more than an R9 295X2 ($1499) but less than a TITAN Z ($2999). The ASUS ROG ARES III will be available in Q3 2014.

For more Computex 2014 coverage, please check out our feed!

Source: ASUS

NVIDIA Launches GeForce Experience 2.1

Subject: General Tech, Graphics Cards | June 2, 2014 - 05:52 PM |
Tagged: nvidia, geforce, geforce experience, ShadowPlay

NVIDIA has just launched another version of their GeForce Experience, incrementing the version to 2.1. This release allows video capture at up to "2500x1600", which I assume means 2560x1600, as well as better audio-video synchronization in Adobe Premiere. Also, because why stop going after FRAPS once you start, this release adds an in-game framerate indicator, along with push-to-talk recording for the microphone.

nvidia-geforce-experience.png

Another note: when GeForce Experience 2.0 launched, it introduced streaming of the user's desktop. This allowed recording of OpenGL and windowed-mode games by simply capturing an entire monitor. This mode was not capable of "Shadow Mode", which I believed was because they thought users didn't want a constant rolling video to be taken of their desktop in the event that they wanted to save a few minutes of it at some point. Turns out that I was wrong; the feature was coming and it arrived with GeForce Experience 2.1.

GeForce Experience 2.1 is now available at NVIDIA's website, unless it already popped up a notification for you.

Source: NVIDIA

AMD Catalyst Driver 14.6 BETA released

Subject: Graphics Cards | May 28, 2014 - 07:17 PM |
Tagged: driver, Catalyst 14.4 beta, amd

Get the latest Catalyst for your Radeon!

images.jpg

  • Starting with AMD Catalyst 14.6 Beta, AMD will no longer support Windows 8.0 (and the WDDM 1.2 driver). Windows 8.0 users should upgrade (for free) to Windows 8.1 to take advantage of the new features found in the AMD Catalyst 14.6 Beta.
  • AMD Catalyst 14.4 will remain available for users who wish to remain on Windows 8.0. A future AMD Catalyst release will allow the WDDM 1.1 (Windows 7) driver to be installed under Windows 8.0 for those users unable to upgrade to Windows 8.1.
  • The AMD Catalyst 14.6 Beta Driver can be downloaded from the following link: AMD Catalyst 14.6 Beta Driver for Windows
  • NOTE! This Catalyst Driver is provided "AS IS" and under the terms and conditions of the End User License Agreement provided with it.

Featured Improvements

  • Performance improvements
    • Watch Dogs performance improvements
      • AMD Radeon R9 290X - 1920x1080 4x MSAA – improves up to 25%
      • AMD Radeon R9 290X - 2560x1600 4x MSAA – improves up to 28%
      • AMD Radeon R9 290X CrossFire configuration (3840x2160, Ultra settings, 4x MSAA) - 92% scaling
    • Murdered: Soul Suspect performance improvements
      • AMD Radeon R9 290X – 2560x1600 4x MSAA – improves up to 16%
      • AMD Radeon R9 290X CrossFire configuration (3840x2160, Ultra settings, 4x MSAA) - 93% scaling
  • AMD Eyefinity enhancements: Mixed Resolution Support
    • A new architecture providing brand new capabilities
    • Display groups can be created with monitors of different resolution (including difference sizes and shapes)
    • Users have a choice of how surface is created over the display group
      • Fill – legacy mode, best for identical monitors
      • Fit – create the Eyefinity surface using best available rectangular area with attached displays.
      • Expand – create a virtual Eyefinity surface using desktops as viewports onto the surface.
  • Eyefinity Display Alignment
    • Enables control over alignment between adjacent monitors
    • One-Click Setup: the driver detects the layout of extended desktops
    • Can create Eyefinity display group using this layout in one click!
    • New user controls for video color and display settings
  • Greater control over Video Color Management:
    • Controls have been expanded from a single slider for controlling Boost and Hue to per color axis
    • Color depth control for Digital Flat Panels (available on supported HDMI and DP displays)
    • Allows users to select different color depths per resolution and display
  • AMD Mantle enhancements
    • Mantle now supports AMD Mobile products with Enduro technology
    • Battlefield 4: AMD Radeon HD 8970M (1366x768; high settings) – 21% gain
    • Thief: AMD Radeon HD 8970M (1920x1080; high settings) – 14% gain
    • Star Swarm: AMD Radeon HD 8970M (1920x1080; medium settings) – 274% gain
    • Enables support for Multi-GPU configurations with Thief (requires the latest Thief update)

... and much more, grab it here.

Source: AMD
Author:
Manufacturer: Various

The AMD Argument

Earlier this week, a story was posted on a Forbes.com blog that dove into the idea of NVIDIA GameWorks and how it was doing a disservice not just to the latest Ubisoft title, Watch_Dogs, but to PC gamers in general. Using quotes from AMD directly, the author claims that NVIDIA is actively engaging in methods to prevent game developers from optimizing games for AMD graphics hardware. This is an incredibly bold statement and one that I hope AMD is not making lightly. Here is a quote from the story:

Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products. . . . Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.

The example cited in the Forbes story is the recently released Watch_Dogs title, which appears to show favoritism towards NVIDIA GPUs, with the performance of the GTX 770 ($369) coming close to the performance of a Radeon R9 290X ($549).

It's evident that Watch Dogs is optimized for Nvidia hardware but it's staggering just how un-optimized it is on AMD hardware.

watch_dogs_ss9_99866.jpg

Watch_Dogs is the latest GameWorks title released this week.

I decided to get in touch with AMD directly to see exactly what stance the company was attempting to take with these kinds of claims. No surprise, AMD was just as forward with me as they appeared to be in the Forbes story originally.

The AMD Stance

Central to AMD’s latest annoyance with the competition is the NVIDIA GameWorks program. First unveiled last October during a press event in Montreal, GameWorks combines several NVIDIA built engine functions into libraries that can be utilized and accessed by game developers to build advanced features into games. NVIDIA’s website claims that GameWorks is “easy to integrate into games” while also including tutorials and tools to help quickly generate content with the software set. Included in the GameWorks suite are tools like VisualFX which offers rendering solutions like HBAO+, TXAA, Depth of Field, FaceWorks, HairWorks and more. Physics tools include the obvious like PhysX while also adding clothing, destruction, particles and more.

Continue reading our editorial on the verbal battle between AMD and NVIDIA about the GameWorks program!!

Futuremark Announces 3DMark Sky Diver Benchmark

Subject: General Tech, Graphics Cards | May 28, 2014 - 01:49 PM |
Tagged: benchmarking, 3dmark

HELSINKI, FINLAND – May 28, 2014 – Futuremark today announced 3DMark Sky Diver, a new DirectX 11 benchmark test for gaming laptops and mid-range PCs. 3DMark Sky Diver is the ideal test for benchmarking systems with mainstream DirectX 11 graphics cards, mobile GPUs, or integrated graphics. A preview trailer for the new benchmark shows a wingsuited woman skydiving into a mysterious, uncharted location. The scene is brought to life with tessellation, particles and advanced post-processing effects. Sky Diver will be shown in full at Computex from June 3-7, or find out more on the Futuremark website.

Jukka Mäkinen, Futuremark CEO said, "Some people think that 3DMark is only for high-end hardware and extreme overclocking. Yet millions of PC gamers rely on 3DMark to choose systems that best balance performance, efficiency and affordability. 3DMark Sky Diver complements our other tests by providing the ideal benchmark for gaming laptops and mainstream PCs."

3DMark - The Gamer's Benchmark for all your hardware
3DMark is the only benchmark that offers a range of tests for different classes of hardware:

  • Fire Strike, for high performance gaming PCs (DirectX 11, feature level 11)
  • Sky Diver, for gaming laptops and mid-range PCs (DirectX 11, feature level 11)
  • Cloud Gate, for notebooks and typical home PCs (DirectX 11 feature level 10)
  • Ice Storm, for tablets and entry level PCs (DirectX 11 feature level 9)

With 3DMark, you can benchmark the full performance range of modern DirectX 11 graphics hardware. Where Fire Strike is like a modern game on ultra high settings, Sky Diver is closer to a DirectX 11 game played on normal settings. This makes Sky Diver the best choice for benchmarking entry level to mid-range systems and Fire Strike the perfect benchmark for high performance gaming PCs.

See 3DMark Sky Diver in full at Computex
3DMark Sky Diver will be on display on the ASUS, MSI, GIGABYTE, Galaxy, Inno3D, and G-Skill booths at Computex, June 3-7.

S.Y. Shian, ASUS Vice President & General Manager of Notebook Business Unit said,

"We are proud to partner with Futuremark to show 3DMark Sky Diver at Computex. Sky Diver helps PC gamers choose systems that offer great performance and great value. We invite everyone to visit our stand to experience 3DMark Sky Diver on a range of new ASUS products."

unnamed.jpg

Sky Diver will be released as an update for all editions of 3DMark, including the free 3DMark Basic Edition. 

Source: Futuremark

NVIDIA Finally Launches GeForce GTX Titan Z Graphics Card

Subject: Graphics Cards | May 28, 2014 - 11:19 AM |
Tagged: titan z, nvidia, gtx, geforce

Though delayed by a month, today marks the official release of NVIDIA's Titan Z graphics card, the dual-GK110 beast with the $3000 price tag. The massive card was shown for the first time in March at NVIDIA's GPU Technology Conference, and our own Tim Verry was on the grounds to get the information.

The details remain the same:

Specifically, the GTX TITAN Z is a triple slot graphics card that marries two full GK110 (big Kepler) GPUs for a total of 5,760 CUDA cores, 448 TMUs, and 96 ROPs with 12GB of GDDR5 memory on a 384-bit bus (6GB on a 384-bit bus per GPU). For the truly adventurous, it appears possible to SLI two GTX Titan Z cards using the single SLI connector. Display outputs include two DVI, one HDMI, and one DisplayPort connector.

The difference now of course is that all the clock speeds and pricing are official. 

titanzspecs.png

A base clock speed of 705 MHz with a Boost rate of 876 MHz places it well behind the individual GPU performance of a GeForce GTX 780 Ti or GTX Titan Black (rated at 889/980 MHz). The memory clock speed remains the same at 7.0 Gbps and you are still getting a massive 6GB of memory per GPU.
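As a quick back-of-the-envelope check (our arithmetic, not anything from NVIDIA's materials), that 7.0 Gbps effective memory rate on a 384-bit bus works out to 336 GB/s of bandwidth per GPU, identical to the GTX 780 Ti and Titan Black:

```python
# Rough per-GPU memory bandwidth from the quoted Titan Z specs.
effective_rate_gbps = 7.0   # GDDR5 effective data rate per pin (Gb/s)
bus_width_bits = 384        # memory bus width per GPU

# Multiply rate by bus width, divide by 8 to convert bits to bytes.
bandwidth_gb_s = effective_rate_gbps * bus_width_bits / 8
print(bandwidth_gb_s)  # 336.0 GB/s per GPU
```

So while the core clocks take a haircut versus the single-GPU cards, the memory subsystem per GPU is untouched.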

Maybe most interesting with the release of the GeForce GTX Titan Z is that NVIDIA seems to have completely fixated on non-DIY consumers with the card. We did not receive a sample of the Titan Z (nor did we get one of the Titan Black) and when I inquired as to why, NVIDIA PR stated that they were "only going to CUDA developers and system builders."

geforce-gtx-titan-z-3qtr.png

I think it is more than likely that after the release of AMD's Radeon R9 295X2 dual GPU graphics card on April 8th, with a price tag of $1500 (half of the Titan Z), the target audience was redirected. NVIDIA already had its eye on the professional markets that weren't willing to dive into the Quadro/Tesla lines (CUDA developers will likely drop $3k at the drop of a hat to get this kind of performance density). But a side benefit of creating the best flagship gaming graphics card on the planet was probably part of the story - and promptly taken away by AMD.

geforce-gtx-titan-z-bracket.png

I still believe the Titan Z will be an impressive graphics card to behold both in terms of look and style and in terms of performance. But it would take the BIGGEST NVIDIA fans to be able to pass up buying a pair of Radeon R9 295X2 cards for a single GeForce GTX Titan Z. At least that is our assumption until we can test one for ourselves.

I'm still working to get my hands on one of these for some testing as I think the ultra high end graphics card coverage we offer is incomplete without it. 

Several of NVIDIA's partners are going to be offering the Titan Z, including EVGA, ASUS, MSI and Zotac. Maybe the most interesting, though, is EVGA's water-cooled option!

evgatitanz.jpg

So, what do you think? Anyone lining up for a Titan Z when they show up for sale?

AMD Catalyst 14.6 Beta Driver Now Available, Adds Mixed Resolution Eyefinity

Subject: General Tech, Graphics Cards | May 27, 2014 - 12:00 AM |
Tagged: radeon, R9, R7, eyefinity, amd

AMD has just launched their Catalyst 14.6 Beta drivers for Windows and Linux. This driver will contain performance improvements for Watch Dogs, launching today in North America, and Murdered: Soul Suspect, which arrives next week. On Linux, the driver now supports Ubuntu 14.04 and its installation process has been upgraded for simplicity and user experience.

amd-146-eyefinity.png

Unless performance improvements are more important to you, the biggest feature is the support for Eyefinity with mixed resolutions. With Catalyst 14.6, you no longer need a grid of identical monitors. One example use case, suggested by AMD, is a gamer who purchases an ultra-wide 2560x1080 monitor. They will be able to add a pair of 1080p monitors on either side to create a 6400x1080 viewing surface.

amd-146-eyefinity2.png

If the monitors are very mismatched, the driver will allow users to letterbox to the largest rectangle contained by every monitor, or "expand" to draw the largest possible rectangle (which will lead to some assets drawing outside of any monitor). A third mode, fill, behaves like Eyefinity currently does. I must give AMD a lot of credit for leaving the choice to the user.
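A simplified model of our own (this is not AMD's driver logic, just a sketch for a single row of monitors) shows how the three surface modes differ: "expand" uses the bounding box of every display, while "fit" keeps only the largest rectangle that every monitor in the row can fully show.

```python
# Simplified sketch (not AMD's actual implementation) of Eyefinity
# surface sizing for monitors arranged side by side in one row.

def expand_surface(monitors):
    """Bounding box: total width by the tallest monitor. Shorter
    monitors become viewports onto the larger virtual surface."""
    return sum(w for w, h in monitors), max(h for w, h in monitors)

def fit_surface(monitors):
    """Largest rectangle every monitor can fully display: total width
    by the shortest monitor, letterboxing the taller panels."""
    return sum(w for w, h in monitors), min(h for w, h in monitors)

# AMD's example: a 2560x1080 ultra-wide flanked by two 1080p panels.
row = [(1920, 1080), (2560, 1080), (1920, 1080)]
print(fit_surface(row))     # (6400, 1080) - the 6400x1080 surface
print(expand_surface(row))  # identical here, since all heights match
```

With mismatched heights the two modes diverge, which is exactly the letterbox-versus-overdraw choice AMD is handing to the user.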

Returning to performance with actual figures, AMD claims "up to" 25% increases in Watch Dogs at 1080p or 28% at 1600p, compared to the previous version. The new CrossFire profile also claims up to 99% scaling in that game, at 2560x1600 with 8x MSAA. Murdered: Soul Suspect will see "up to" 16% improvements on a single card, and "up to" 93% scaling. Each of these results was provided by AMD, which tested on Radeon R9 290X cards. If these CrossFire profiles (well, first, are indicative of actual performance, and) see 99% scaling across two cards, that is pretty remarkable.

amd-146-jpeg.png

A brief mention: AMD has also expanded their JPEG decoder to Kabini. Previously, it was available on Kaveri, as of Catalyst 14.1. This allows using the GPU to display images, with their test showing a series of images being processed in about half the time. While not claimed by AMD, I expect that the GPU will also be more power-efficient (as the processor can go back to its idle state much quicker, despite activating another component to do so). Ironically, the three images I used for this news post are encoded in PNG. You might find that amusing.

AMD Catalyst 14.6 Beta Drivers should be now available at their download site.

Source: AMD