AMD To Release FreeSync Ready Driver on March 19th

Subject: Graphics Cards, Displays | March 5, 2015 - 11:46 AM |
Tagged: freesync, amd

Hey everyone, we just got a quick note from AMD with an update on the FreeSync technology rollout we have been expecting since CES in January (and honestly, even before that). Here is the specific quote:

AMD is very excited that monitors compatible with AMD FreeSync™ technology are now available in select regions in EMEA (Europe, Middle East and Africa). We know gamers are excited to bring home an incredibly smooth and tearing-free PC gaming experience powered by AMD Radeon™ GPUs and AMD A-Series APUs. We’re pleased to announce that a compatible AMD Catalyst™ graphics driver to enable AMD FreeSync™ technology for single-GPU configurations will be publicly available starting March 19, 2015. Support for AMD CrossFire™ configurations will be available the following month in April 2015.

A couple of interesting things: first, it appears that FreeSync monitors are already shipping in EMEA regions, and that is the reason for this news blast to the media. If you are buying a monitor with a FreeSync logo on it and can't use the technology, that is clearly a bit frustrating. You now have just another two weeks to wait for the software that will enable your display's variable refresh rate.


That also might be a clue as to when you can expect review embargoes and/or the release of FreeSync monitors in North America. The end is in sight!

GDC 15: Intel and Raptr Partner for PC Gaming Optimization Software

Subject: Graphics Cards | March 4, 2015 - 09:26 PM |
Tagged: GDC, gdc 15, Intel, raptr, GFE, geforce experience

One of NVIDIA's biggest achievements in the past two years has been the creation and improvement of GeForce Experience. The program started with one goal: to help PC gamers optimize game settings to match their hardware, making sure they get the highest quality settings their hardware can handle and thus the best gaming experience. AMD followed suit shortly after through a partnership with Raptr, a company that crowd-sources data to achieve the same goal: optimal game settings for all users of AMD hardware.

Today Intel is announcing a partnership with Raptr as well, bringing the same feature set to users of systems with Intel HD Graphics. High-end users might chuckle at the news, but I actually think this feature is going to be more important for gamers that utilize integrated graphics. Where GPU horsepower is at a premium compared to discrete graphics cards, tuning in-game settings to extract all available performance will likely produce the biggest improvement in experience of any of the three major vendors' tools.


Raptr will continue to include game streaming capability, and it will also alert users when updated Intel graphics drivers are available - a very welcome change to how Intel distributes them.


Intel announced a partnership to deliver an even better gaming experience on Intel Graphics. Raptr, a leading PC gaming utility now available on Intel Graphics for the first time, delivers one-button customized optimizations that improve performance on existing hardware and the games being played, even older games. With just a little tweaking of their PC settings a user may be able to dial up the frame rate and details or even play a game they didn’t think possible.

The Raptr software scans the user’s PC and compares a given game’s performance across tens of millions of other gamers’ PCs, finding and applying the best settings for their Raptr Record system. And Raptr’s gameplay recording tools leverage the video encoding in Intel® Quick Sync technology to record and stream gameplay with virtually no impact on system performance. Driver updates are a snap too, more on Raptr for Intel available here.

Hopefully we'll see this application pre-installed on notebooks going forward (can't believe I'm saying that) but I'd like to see as many PC gamers as possible, even casual ones, get access to the best gaming experience their hardware can provide.

GDC 15: Intel shows 3DMark API Overhead Test at Work

Subject: Graphics Cards, Processors | March 4, 2015 - 08:46 PM |
Tagged: GDC, gdc 15, API, dx12, DirectX 12, dx11, Mantle, 3dmark, Futuremark

It's probably not a surprise to most that Futuremark is working on a new version of 3DMark around the release of DirectX 12. What might be new for you is that this version will include an API overhead test, used to evaluate how many draw calls a hardware configuration can sustain under the Mantle, DX11 and DX12 APIs.


While we don't have any results quite yet (those are pending and should arrive very soon), Intel was showing the feature test running at an event at GDC tonight. In what looks like a simple cityscape being rendered over and over, the goal is to see how many draw calls per frame the CPU, API and hardware can sustain - essentially, how quickly the CPU can feed work from a game engine to the GPU.
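To give a rough sense of what such a test measures (this is my own illustration, not Futuremark's methodology), a draw-call overhead benchmark can be thought of as submitting more and more draw calls each frame until the CPU can no longer keep the frame within a time budget. A minimal Python sketch, with a placeholder standing in for the real API call:

import time

# Minimal conceptual sketch of a draw-call overhead test (illustrative only,
# not Futuremark's actual code). submit_draw_call() is a placeholder for the
# real DX11/DX12/Mantle submission; the test keeps adding draw calls per
# frame until the CPU can no longer keep the frame inside the time budget.

FRAME_BUDGET = 1.0 / 30            # target: hold at least 30 frames per second

def submit_draw_call():
    pass                           # stand-in for the CPU cost of one API call

def max_draw_calls_per_frame(step=10000):
    draws = step
    while True:
        start = time.perf_counter()
        for _ in range(draws):
            submit_draw_call()
        if time.perf_counter() - start > FRAME_BUDGET:
            return draws - step    # last draw count that still fit the budget
        draws += step

print(max_draw_calls_per_frame())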

The test was being showcased on an Intel-powered notebook using a 5th Generation Core processor, code-named Broadwell. This points to the upcoming support for DX12 (though obviously not Mantle) that Intel's integrated GPUs will provide.

It should be very interesting to see how much of an advantage DX12 offers over DX11, even on Intel's wide range of consumer and enthusiast processors.

GDC 15: PhysX Is Now Shared Source to UE4 Developers

Subject: General Tech, Graphics Cards, Shows and Expos | March 4, 2015 - 05:52 PM |
Tagged: GDC, gdc 15, nvidia, epic games, ue4, unreal engine 4, PhysX, apex

NVIDIA and Epic Games have just announced that Unreal Engine 4 developers can view and modify the source of PhysX. This also includes the source for APEX, which is NVIDIA's cloth and destruction library. It does not include any of the other libraries that are under the GameWorks banner, but Unreal Engine 4 does not use them anyway.


This might even mean that capable developers can write their own support for third-party platforms, like OpenCL. That would probably be a painful process, but it should be possible now. Of course, that support would only extend to their own title and to anyone they share their branch with.

If you are having trouble finding it, you will need to switch to a branch that has been updated to PhysX 3.3.3 with source, which is currently just “Master”. “Promoted” and earlier seem to be back at PhysX 3.3.2, which is still binary-only. It will probably take a few months to trickle down to an official release. If you are still unable to find it, even though you are on the “Master” branch, the path to NVIDIA's source code is: “Unreal Engine/Engine/Source/ThirdParty/PhysX/”. From there you can check out the various subdirectories for PhysX and APEX.

NVIDIA will be monitoring pull requests sent to that area of Unreal Engine. Enhancements might make it back upstream to PhysX proper, which would then be included in future versions of Unreal Engine and anywhere else that PhysX is used.

In other news, Unreal Engine 4 is now free of its subscription fee. The only time Epic will ask for money is when you ship a game and royalties are due. The royalty is currently 5% of gross revenue, with the first $3,000 (per product, per calendar quarter) exempt. This means that you can now make a legitimately free (no price, no ads, no subscription, no microtransactions, no Skylander figurines, etc.) game in UE4 for free!
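To make the math concrete, here is a tiny sketch (my own illustration, not Epic's code) of how those royalty terms would work out in practice:

# Rough illustration of the royalty terms described above: 5% of gross
# revenue per product per calendar quarter, with the first $3,000 exempt.
def ue4_royalty(gross_quarterly_revenue, rate=0.05, exemption=3000):
    return max(0.0, (gross_quarterly_revenue - exemption) * rate)

print(ue4_royalty(2500))    # 0.0   -> under the exemption, nothing owed
print(ue4_royalty(10000))   # 350.0 -> 5% of the $7,000 above the exemption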

Source: Epic Games

Pushing the GTX 980 to the limits; the ASUS ROG Poseidon

Subject: Graphics Cards | March 4, 2015 - 03:02 PM |
Tagged: asus, ROG Poseidon GTX 980, GTX 980, factory overclocked

On the box the ASUS ROG Poseidon GTX 980 Platinum states a base clock of 1178MHz and a boost clock of 1279MHz, but in testing [H]ard|OCP saw the card sitting at 1328MHz in game while on air cooling.  They then proceeded to hook up a Koolance Exos-2 V2 and pushed the card to 1580MHz in game, though the RAM would only increase by 1.1GHz to 8.1GHz.  As you would expect, this had a noticeable impact on performance. At $640 the card might not compete with the just-announced Titan X, but it is also far less expensive, though still $200 more than the Sapphire Vapor-X 290X it was tested against and $90 more than the 8GB version of that card.  If you have the budget, this GTX 980 is the fastest single-GPU card on the planet right now.


"The highest overclocked GeForce GTX 980 based video card just landed. If the ASUS ROG Poseidon GTX 980 Platinum video card with a hybrid air and liquid cooling system doesn't impress you, we are not sure what will when it comes to GPU. We push the Poseidon clocks and pit it against the AMD Radeon R9 290X for an ultimate showdown."


Source: [H]ard|OCP

GDC 15: NVIDIA Shows TITAN X at Epic Games Keynote

Subject: Graphics Cards | March 4, 2015 - 01:10 PM |
Tagged: titan x, nvidia, maxwell, gtx, geforce, gdc 15, GDC

For those of you worried that GDC would sneak by without any new information for NVIDIA's GeForce fans, Jen-Hsun Huang surprised everyone by showing up at the Epic Games keynote with Tim Sweeney to hijack it.

The result: the first showing of the upcoming GeForce TITAN X, based on the Maxwell GM200 GPU.


JHH stated that it would have a 12GB frame buffer and was built using 8 billion transistors! There wasn't much more information than that, but I was promised that the details would be revealed sooner rather than later.


Any guesses on performance or price?



Jen-Hsun signs the world's first TITAN X for Tim Sweeney.

Kite Demo running on TITAN X

UPDATE: I ran into the TITAN X again at the NVIDIA booth and was able to confirm a couple more things. First, the GPU will only require 6-pin and 8-pin power connections, indicating that NVIDIA is still pushing power efficiency with GM200.


Also, as you would expect, the TITAN X will support 3-way and 4-way SLI, or at the very least has the SLI bridges to support it.


Manufacturer: NVIDIA

Finally, a SHIELD Console

NVIDIA is filling out the SHIELD brand family today with the announcement of SHIELD, a set-top box powered by the Tegra X1 processor. SHIELD will run Android TV and act as a game playing, multimedia watching, GRID streaming device. Selling for $199 and available in May of this year, it gives us a lot to discuss.

Odd naming scheme aside, the SHIELD looks to be an impressive little device, sitting on your home theater or desk and bringing a ton of connectivity and performance to your TV. Running Android TV means the SHIELD will have access to the entire library of Google Play media, including music, movies and apps. SHIELD supports 4K video playback at 60 Hz thanks to an HDMI 2.0 connection and fully supports H.265/HEVC decode thanks to the Tegra X1 processor.


Here is a full breakdown of the device's specifications.

  NVIDIA SHIELD Specifications
Processor NVIDIA® Tegra® X1 processor with 256-core Maxwell™ GPU with 3GB RAM
Video Features 4K Ultra-HD Ready with 4K playback and capture up to 60 fps (VP9, H265, H264)
Audio 7.1 and 5.1 surround sound pass through over HDMI
High-resolution audio playback up to 24-bit/192kHz over HDMI and USB
High-resolution audio upsample to 24-bit/192kHz over USB
Storage 16 GB
Wireless 802.11ac 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi
Bluetooth 4.1/BLE
Interfaces Gigabit Ethernet
HDMI 2.0
Two USB 3.0 (Type A)
Micro-USB 2.0
MicroSD slot (supports 128GB cards)
IR Receiver (compatible with Logitech Harmony)
Gaming Features NVIDIA GRID™ streaming service
NVIDIA GameStream™
SW Updates SHIELD software upgrades directly from NVIDIA
Power 40W power adapter
Weight and Size Weight: 23oz / 654g
Height: 5.1in / 130mm
Width: 8.3in / 210mm
Depth: 1.0in / 25mm
OS Android TV™, Google Cast™ Ready
Bundled Apps PLEX
In the Box NVIDIA SHIELD controller
HDMI cable (High Speed), USB cable (Micro-USB to USB)
Power adapter (Includes plugs for North America, Europe, UK)
Requirements TV with HDMI input, Internet access
Options SHIELD controller, SHIELD remote, SHIELD stand

Obviously the most important feature is the Tegra X1 SoC, built around an 8-core 64-bit ARM processor and a 256 CUDA core Maxwell architecture GPU. This gives the SHIELD set-top more performance than basically any other mobile part on the market, and demos showing Doom 3 and Crysis 3 running natively on the hardware drive the point home. With integrated HEVC decode support, the console is the first Android TV device to offer support for 4K video content at 60 FPS.

Even though storage comes in at only 16GB, the inclusion of a MicroSD card slot enables expansion by as much as 128GB more for content and local games.


The first choice for networking will be the Gigabit Ethernet port, but the 2x2 dual-band 802.11ac wireless controller means that even those of us that don't have hardwired Internet going to our TV will be able to utilize all the performance and features of SHIELD.

Continue reading our preview of the NVIDIA SHIELD set-top box!!

GDC 15: Native versions of Doom 3, Crysis 3 running on Android, Tegra X1

Subject: Graphics Cards, Mobile | March 3, 2015 - 10:43 PM |
Tagged: Tegra X1, tegra, nvidia, gdc 15, GDC, Doom 3, Crysis 3

Impressively, NVIDIA just showed the new SHIELD powered by Tegra X1 running versions of both Doom 3 and Crysis 3 natively on Android! The games were running at impressive quality and performance levels.

I have included some videos of these games being played on the SHIELD, but don't judge the visual quality of the games by these videos. They were recorded with a Panasonic GH2 off a 4K TV in a dimly lit room.

Doom 3 is quoted to run at full 1920x1080 and 60 FPS while Crysis 3 is much earlier in its development. Both games looked amazing considering we are talking about a system that has a total power draw of only 15 watts!

While these are just examples of the power that Tegra X1 can offer, it's important to note that this type of application is the exception, not the rule, for Android gaming. Just as we saw with Half-Life 2 and Portal, NVIDIA did most of the legwork to get this version of Doom 3 up and running. Crysis 3 is more of an effort from Crytek itself - hopefully the finished port is as gorgeous as this first look.

Manufacturer: AMD

Liquid...get it?

As GDC progresses here in San Francisco, AMD took the wraps off of a new SDK for game developers to use to improve experiences with virtual reality (VR) headsets. Called LiquidVR, its goal is to provide a smooth and stutter-free VR experience that is universal across all headset hardware and to keep the wearer, be it a gamer or a professional user, immersed.


AMD's CTO of Graphics, Raja Koduri, spoke with us about the three primary tenets of the LiquidVR initiative. The 'three Cs', as they are being called, are Comfort, Compatibility and Compelling Content. Ignoring the fact that we have four C's in that phrase, the premise is straightforward. Comfortable use of VR means there are few to no issues with nausea, and that is addressed with ultra-low latency between motion (of your head) and photons (hitting your eyes). For compatibility, AMD would like to assure that all VR headsets are treated equally and all provide the best experience; Oculus, HTC and others should operate in a simple, plug-and-play style. Finally, the content story is easy to grasp, with a focus on solid games and software to utilize VR, but AMD also wants to ensure that rendering is scalable across different hardware and multiple GPUs.


To address these tenets AMD has built four technologies into LiquidVR: late data latching, asynchronous shaders, affinity multi-GPU, and direct-to-display.


The idea behind late data latching is to get the absolute most recent raw data from the VR engine to the user's eyes. Rather than asking for the head position of a gamer at the beginning of a render job, LiquidVR allows the game to ask for it at the end of the rendering pipeline, which might seem counter-intuitive. A late latch means the user's head movement is tracked until the end of the frame render rather than just until the beginning, saving potentially 5-10ms of delay.
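Purely as a conceptual illustration (with made-up function names, not the actual LiquidVR API), compare how stale the head pose is at display time in the two approaches:

import time

# Conceptual sketch of late data latching (illustrative names only, not the
# LiquidVR API). With a traditional pipeline the pose fed to the display is
# as old as the entire frame; with a late latch the pose is re-read at the
# end of rendering, so only the final warp and scan-out add latency.

def sample_head_pose():
    return time.perf_counter()     # stand-in for tracked pose + timestamp

def render_frame():
    time.sleep(0.011)              # pretend rendering takes ~11 ms

# Traditional: latch the pose before rendering begins.
pose_time = sample_head_pose()
render_frame()
print(f"early latch, pose age at display: {(time.perf_counter() - pose_time) * 1000:.1f} ms")

# Late latch: render first, then grab the freshest pose for the final warp.
render_frame()
pose_time = sample_head_pose()
print(f"late latch, pose age at display: {(time.perf_counter() - pose_time) * 1000:.1f} ms")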


Continue reading our first impressions of the new AMD LiquidVR SDK for virtual reality!!

GDC 15: Khronos Acknowledges Mantle's Start of Vulkan

Subject: General Tech, Graphics Cards, Shows and Expos | March 3, 2015 - 03:37 PM |
Tagged: vulkan, Mantle, Khronos, glnext, gdc 15, GDC, amd


Neil Trevett, the current president of Khronos Group and a vice president at NVIDIA, made an on-the-record statement to acknowledge the start of the Vulkan API. The quote came to me via Ryan, but I think it is a copy-paste of an email, so it should be verbatim.

Many companies have made great contributions to Vulkan, including AMD who contributed Mantle. Being able to start with the Mantle design definitely helped us get rolling quickly – but there has been a lot of design iteration, not the least making sure that Vulkan can run across many different GPU architectures. Vulkan is definitely a working group design now.

So in short, the Vulkan API definitely started with Mantle and grew from there as more stakeholders added their input. Vulkan is obviously different from Mantle in significant ways now, such as its use of SPIR-V for its shading language (rather than HLSL). To see a bit more information, check out our article on the announcement.

Update: AMD has released a statement independently, but related to Mantle's role in Vulkan.