Subject: Graphics Cards | March 5, 2015 - 04:46 PM | Ryan Shrout
Tagged: GDC, gdc 15, amd, radeon, R9, 390x, VR, Oculus
Don't get too excited about this news, but AMD tells me that its next flagship Radeon R9 graphics card is up and running at GDC, powering an Oculus-based Epic "Showdown" demo.
Inside the box...
During my meeting with AMD today I was told that inside that little PC sits the "upcoming flagship Radeon R9 graphics card" but, of course, no other information was given. The following is an approximate transcript of the exchange:
Ryan: Can I see it?
Ryan: I can't even take the side panel off it?
Ryan: How can I know you're telling the truth, then? Can I open up the driver or anything?
Ryan: GPU-Z? Anything?
Well, I tried.
Is this the rumored R9 390X with the integrated water cooler? Is it something else completely? AMD wouldn't even let me behind the system to look for a radiator so I'm afraid that is where my speculation will end.
Hooked up to the system was a Crescent Bay Oculus headset running the well-received Epic "Showdown" demo. The experience was smooth, though of course there were no frame rate indicators or the like while it was running. After our discussion with AMD earlier in the week about its LiquidVR SDK, it's clear the company is taking the VR transition seriously. NVIDIA's GPUs might be dominating the show-floor demos, but AMD wanted to be sure it wasn't left out of the discussion.
Can I just get this Fiji card already??
Subject: Graphics Cards, Displays | March 5, 2015 - 11:46 AM | Ryan Shrout
Tagged: freesync, amd
Hey everyone, we just got a quick note from AMD with an update on the FreeSync technology rollout we have been expecting since CES in January (and honestly, even before that). Here is the specific quote:
AMD is very excited that monitors compatible with AMD FreeSync™ technology are now available in select regions in EMEA (Europe, Middle East and Africa). We know gamers are excited to bring home an incredibly smooth and tearing-free PC gaming experience powered by AMD Radeon™ GPUs and AMD A-Series APUs. We’re pleased to announce that a compatible AMD Catalyst™ graphics driver to enable AMD FreeSync™ technology for single-GPU configurations will be publicly available on AMD.com starting March 19, 2015. Support for AMD CrossFire™ configurations will be available the following month in April 2015.
A couple of interesting things: first, it appears that FreeSync monitors are already shipping in EMEA regions, which is the reason for this news blast to the media. If you buy a monitor with a FreeSync logo on it and can't use the technology, that is clearly a bit frustrating. You have just another two weeks to wait for the software to enable your display's variable refresh rate.
That also might be a clue as to when you can expect review embargoes and/or the release of FreeSync monitors in North America. The end is in sight!
Subject: Graphics Cards | March 4, 2015 - 09:26 PM | Ryan Shrout
Tagged: GDC, gdc 15, Intel, raptr, GFE, geforce experience
One of NVIDIA's biggest achievements in the past two years has been the creation and improvement of GeForce Experience. The program started with one goal: to help PC gamers optimize game settings for their hardware, ensuring they get the highest quality settings the hardware can handle and thus the best gaming experience. AMD followed suit shortly after with a partnership with Raptr, a company that crowd-sources data to achieve the same goal: optimal game settings for all users of AMD hardware.
Today Intel is announcing a partnership with Raptr as well, bringing the same feature set to users of systems with Intel HD Graphics. High-end users might chuckle at the news, but I actually think this feature is going to be more important for gamers on integrated graphics. Where GPU horsepower is at a premium compared to discrete graphics cards, tuning in-game settings to extract all available performance will likely yield the biggest improvement in experience of any of the three major vendors' user bases.
Raptr will continue to include game streaming capability, and it will also alert users when updated Intel graphics drivers are available - a very welcome change to how Intel distributes them.
Intel announced a partnership to deliver an even better gaming experience on Intel Graphics. Raptr, a leading PC gaming utility now available on Intel Graphics for the first time, delivers one-button customized optimizations that improve performance on existing hardware and the games being played, even older games. With just a little tweaking of their PC settings a user may be able to dial up the frame rate and details or even play a game they didn’t think possible.
The Raptr software scans the user’s PC and compares a given game’s performance across tens of millions of other gamers’ PCs, finding and applying the best settings for their Raptr Record system. And Raptr’s gameplay recording tools leverage the video encoding in Intel® Quick Sync technology to record and stream gameplay with virtually no impact on system performance. Driver updates are a snap too, more on Raptr for Intel available here.
Hopefully we'll see this application pre-installed on notebooks going forward (can't believe I'm saying that); I'd like to see as many PC gamers as possible, even casual ones, get access to the best gaming experience their hardware can provide.
Subject: Graphics Cards, Processors | March 4, 2015 - 08:46 PM | Ryan Shrout
Tagged: GDC, gdc 15, API, dx12, DirectX 12, dx11, Mantle, 3dmark, Futuremark
It's probably not a surprise to most that Futuremark is working on a new version of 3DMark around the release of DirectX 12. What might be new for you is that this version will include an API overhead test, used to evaluate how efficiently a hardware configuration handles draw call submission under the Mantle, DX11, and DX12 APIs.
While we don't have any results quite yet (those are pending and should arrive very soon), Intel was showing the feature test running at an event at GDC tonight. In what looks like a simple cityscape being rendered over and over, the goal is to see how many draw calls the API and hardware can sustain - in other words, how fast the CPU can service requests from a game engine.
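The principle of such a feature test is easy to sketch. The toy benchmark below is not Futuremark's test - it never touches a real graphics API, and all names in it are mine - but it shows the CPU-side measurement idea: issue as many (here, no-op) draw submissions as possible in a fixed time budget and report the rate. A lower per-call CPU cost, which is exactly what DX12 and Mantle promise over DX11, shows up directly as a higher number.

```python
import time

def submit_draw_call(cmd_buffer, mesh_id):
    """Stand-in for a graphics-API draw submission; the cost being
    measured is the CPU-side work per call, not any GPU rendering."""
    cmd_buffer.append(mesh_id)

def draw_calls_per_second(budget_s=0.1):
    """Issue no-op 'draw calls' for budget_s seconds and report the rate,
    mirroring the idea behind an API overhead feature test."""
    cmd_buffer, calls = [], 0
    deadline = time.perf_counter() + budget_s
    while time.perf_counter() < deadline:
        submit_draw_call(cmd_buffer, calls)
        calls += 1
    return calls / budget_s
```

Run against two APIs (or two drivers), the same loop would simply report a higher calls-per-second figure for the one with less CPU overhead per submission.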
The test was being showcased on an Intel-powered notebook using a 5th Generation Core processor, code-named Broadwell. This points to the upcoming support for DX12 (though obviously not Mantle) that Intel's integrated GPUs will provide.
It should be very interesting to see how much of an advantage DX12 offers over DX11, even across Intel's wide range of consumer and enthusiast processors.
Subject: General Tech, Graphics Cards, Shows and Expos | March 4, 2015 - 05:52 PM | Scott Michaud
Tagged: GDC, gdc 15, nvidia, epic games, ue4, unreal engine 4, PhysX, apex
NVIDIA and Epic Games have just announced that Unreal Engine 4 developers can view and modify the source of PhysX. This also includes the source for APEX, which is NVIDIA's cloth and destruction library. It does not include any of the other libraries that are under the GameWorks banner, but Unreal Engine 4 does not use them anyway.
This might even mean that good developers can write their own support for third-party platforms, like OpenCL. That would probably be a painful process, but it should be possible now. Of course, that support would only extend to their own title and anyone they share their branch with.
If you are having trouble finding it, you will need to switch to a branch that has been updated to PhysX 3.3.3 with source, which is currently just “Master”. “Promoted” and earlier seem to be back at PhysX 3.3.2, which is still binary-only. It will probably take a few months to trickle down to an official release. If you are still unable to find it, even though you are on the “Master” branch, the path to NVIDIA's source code is: “Unreal Engine/Engine/Source/ThirdParty/PhysX/”. From there you can check out the various subdirectories for PhysX and APEX.
NVIDIA will be monitoring pull requests sent to that area of Unreal Engine. Enhancements might make it back upstream to PhysX proper, which would then be included in future versions of Unreal Engine and anywhere else that PhysX is used.
In other news, Unreal Engine 4 is now free of its subscription. The only time Epic will ask for money is when you ship a game and royalties are due. This is currently 5% of gross revenue, with the first $3000 (per product, per calendar quarter) exempt. This means that you can now make a legitimately free (no price, no ads, no subscription, no microtransactions, no Skylander figurines, etc.) game in UE4 at no cost!
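For illustration, the royalty terms above reduce to a one-line calculation (the function name is mine, not Epic's):

```python
def ue4_royalty(gross_revenue, exempt=3000.0, rate=0.05):
    """Royalty owed for one product in one calendar quarter:
    5% of gross revenue beyond the first $3,000."""
    return max(0.0, gross_revenue - exempt) * rate
```

So a product grossing $2,500 in a quarter owes nothing, while one grossing $13,000 owes 5% of the $10,000 beyond the exemption, or $500.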
Subject: Graphics Cards | March 4, 2015 - 03:02 PM | Jeremy Hellstrom
Tagged: asus, ROG Poseidon GTX 980, GTX 980, factory overclocked
On the box, the ASUS ROG Poseidon GTX 980 Platinum states a base clock of 1178MHz and a boost clock of 1279MHz, but in testing [H]ard|OCP saw the card sitting at 1328MHz in game on air cooling alone. They then hooked up a Koolance Exos-2 V2 and pushed the card to 1580MHz in game, though the RAM would only increase by 1.1GHz, to 8.1GHz effective. As you would expect, this had a noticeable impact on performance. At $640 it might not compete with the just-announced Titan X, but it is also far less expensive, though still $200 more than the Sapphire Vapor-X 290X it was tested against and $90 more than the 8GB version of that card. If you have the budget, this GTX 980 is the fastest single-GPU card on the planet right now.
"The highest overclocked GeForce GTX 980 based video card just landed. If the ASUS ROG Poseidon GTX 980 Platinum video card with a hybrid air and liquid cooling system doesn't impress you, we are not sure what will when it comes to GPU. We push the Poseidon clocks and pit it against the AMD Radeon R9 290X for an ultimate showdown."
Here are some more Graphics Card articles from around the web:
- Gigabyte WaterForce 3 Way SLI Review: A Force to Recon With @ Modders-Inc
- Gainward Phantom GLH GeForce GTX 960 2GB @ eTEknix
- MSI GTX960 Gaming 2G OC Edition @ Kitguru
- NVIDIA GTX 960 5-Way Roundup @ Hardware Canucks
- NVIDIA Game Ready 347.52 Driver Analysis @ eTeknix
- Sapphire Tri-x Radeon R9 290x 8GB @ eTeknix
Subject: Graphics Cards | March 4, 2015 - 01:10 PM | Ryan Shrout
Tagged: titan x, nvidia, maxwell, gtx, geforce, gdc 15, GDC
For those of you worried that GDC would sneak by without any new information for NVIDIA's GeForce fans, Jen-Hsun Huang surprised everyone by showing up at the Epic Games keynote with Tim Sweeney to hijack it.
The result: the first showing of the upcoming GeForce TITAN X based on the upcoming Maxwell GM200 GPU.
JHH stated that it would have a 12GB frame buffer and was built using 8 billion transistors! There wasn't much more information than that, but I was promised that the details would be revealed sooner rather than later.
Any guesses on performance or price?
Jen-Hsun signs the world's first TITAN X for Tim Sweeney.
Kite Demo running on TITAN X
UPDATE: I ran into the TITAN X again at the NVIDIA booth and was able to confirm a couple more things. First, the GPU will only require 6-pin + 8-pin power connections, indicating that NVIDIA is still pushing power efficiency with GM200.
Also, as you would expect, the TITAN X will support 3-way and 4-way SLI, or at the very least has the SLI bridges to support it.
Finally, a SHIELD Console
NVIDIA is filling out the family of the SHIELD brand today with the announcement of SHIELD, a set-top box powered by the Tegra X1 processor. SHIELD will run Android TV and act as a game-playing, multimedia-watching, GRID-streaming device. Selling for $199 and available in May of this year, there is a lot to discuss.
Odd naming scheme aside, the SHIELD looks to be an impressive little device, sitting in your home theater or on your desk and bringing a ton of connectivity and performance to your TV. Running Android TV means the SHIELD will have access to the entire library of Google Play media, including music, movies, and apps. SHIELD supports 4K video playback at 60 Hz thanks to an HDMI 2.0 connection and fully supports H.265/HEVC decode thanks to the Tegra X1 processor.
Here is a full breakdown of the device's specifications.
**NVIDIA SHIELD Specifications**

| Category | Specification |
|---|---|
| Processor | NVIDIA® Tegra® X1 processor with 256-core Maxwell™ GPU and 3GB RAM |
| Video Features | 4K Ultra-HD Ready with 4K playback and capture up to 60 fps (VP9, H.265, H.264) |
| Audio | 7.1 and 5.1 surround sound pass-through over HDMI; high-resolution audio playback up to 24-bit/192kHz over HDMI and USB; high-resolution audio upsampling to 24-bit/192kHz over USB |
| Wireless and I/O | 802.11ac 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi; two USB 3.0 (Type A); MicroSD slot (supports 128GB cards); IR receiver (compatible with Logitech Harmony) |
| Gaming Features | NVIDIA GRID™ streaming service |
| SW Updates | SHIELD software upgrades directly from NVIDIA |
| Power | 40W power adapter |
| Weight and Size | Weight: 23oz / 654g; height: 5.1in / 130mm; width: 8.3in / 210mm; depth: 1.0in / 25mm |
| OS | Android TV™, Google Cast™ Ready |
| In the box | NVIDIA SHIELD; SHIELD controller; HDMI cable (High Speed); USB cable (Micro-USB to USB); power adapter (includes plugs for North America, Europe, UK) |
| Requirements | TV with HDMI input, Internet access |
| Options | SHIELD controller, SHIELD remote, SHIELD stand |
Obviously the most important feature is the Tegra X1 SoC, built on an 8-core 64-bit ARM processor and a 256 CUDA core Maxwell architecture GPU. This gives the SHIELD set-top more performance than basically any other mobile part on the market, and demos showing Doom 3 and Crysis 3 running natively on the hardware drive the point home. With integrated HEVC decode support, the console is the first Android TV device to offer support for 4K video content at 60 FPS.
Even though storage only comes in at 16GB, the inclusion of a MicroSD card slot enables expansion by as much as 128GB more for content and local games.
The first choice for networking will be the Gigabit Ethernet port, but the 2x2 dual-band 802.11ac wireless controller means that even those of us who don't have hardwired Internet running to our TVs will be able to utilize all the performance and features of SHIELD.
Subject: Graphics Cards, Mobile | March 3, 2015 - 10:43 PM | Ryan Shrout
Tagged: Tegra X1, tegra, nvidia, gdc 15, GDC, Doom 3, Crysis 3
Impressively, NVIDIA just showed the new SHIELD powered by Tegra X1 running versions of both Doom 3 and Crysis 3 natively on Android! The games were running at impressive quality and performance levels.
I have included some videos of these games being played on the SHIELD, but don't judge the visual quality of the games by these videos. They were recorded with a Panasonic GH2 off a 4K TV in a dimly lit room.
Doom 3 is quoted to run at full 1920x1080 and 60 FPS while Crysis 3 is much earlier in its development. Both games looked amazing considering we are talking about a system that has a total power draw of only 15 watts!
While these are just examples of the power that Tegra X1 can offer, it's important to note that this type of application is the exception, not the rule, for Android gaming. Just as we saw with Half-Life 2 and Portal, NVIDIA did most of the legwork to get this version of Doom 3 up and running. Crysis 3 is more of an effort from Crytek specifically - hopefully the finished port is as gorgeous as this first look.
As GDC progresses here in San Francisco, AMD took the wraps off of a new SDK for game developers to use to improve experiences with virtual reality (VR) headsets. Called LiquidVR, its goal is to provide a smooth, stutter-free VR experience that is universal across all headset hardware and keeps the wearer, be it a gamer or professional user, immersed.
AMD's CTO of Graphics, Raja Koduri, spoke with us about the three primary tenets of the LiquidVR initiative. The 'three Cs,' as it is being called, are Comfort, Compatibility, and Compelling Content. Ignoring the fact that we have four C's in that phrase, the premise is straightforward. Comfortable use of VR means there are few to no issues with nausea, which can be addressed with ultra-low latency between motion (of your head) and photons (hitting your eyes). For compatibility, AMD wants to ensure that all VR headsets are treated equally and all provide the best experience; Oculus, HTC, and others should operate in a simple, plug-and-play style. Finally, the content story is easy to grasp, with a focus on solid games and software that utilize VR, but AMD also wants to ensure that rendering is scalable across different hardware and multiple GPUs.
To address these tenets AMD has built four technologies into LiquidVR: late data latching, asynchronous shaders, affinity multi-GPU, and direct-to-display.
The idea behind late data latching is to get the absolute most recent raw data from the VR engine to the user's eyes. Rather than asking for the head position of a gamer at the beginning of a render job, LiquidVR allows the game to ask for it at the end of the rendering pipeline, which might seem counter-intuitive. Late latching means the user's head movement is tracked until the end of the frame render, rather than just until the beginning, saving potentially 5-10ms of delay.
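The difference between the two approaches can be sketched in a toy simulation. The function names and numbers below are mine, not part of the LiquidVR API; the 11 ms frame time and the 10 deg/s head turn are illustrative assumptions:

```python
import time

def head_pose(t):
    """Hypothetical tracker: head yaw (degrees) at time t,
    for a head turning at a steady 10 deg/s."""
    return 10.0 * t

def render_frame(frame_s=0.011):
    """Stand-in for roughly 11 ms of CPU/GPU render work."""
    time.sleep(frame_s)

def frame_early_latch(t_start, frame_s=0.011):
    """Conventional approach: sample the pose before rendering begins,
    so the displayed pose is roughly one frame stale."""
    pose = head_pose(t_start)
    render_frame(frame_s)
    return pose

def frame_late_latch(t_start, frame_s=0.011):
    """Late latching: render first, then fetch the freshest pose at the
    end of the pipeline, trimming most of the motion-to-photon lag."""
    render_frame(frame_s)
    return head_pose(t_start + frame_s)
```

With those numbers, the early-latched pose lags the late-latched one by about 0.11 degrees every frame - exactly the kind of stale-pose error late latching is designed to trim.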