Subject: Graphics Cards | March 14, 2015 - 02:12 PM | Ryan Shrout
Tagged: nvidia, quadro, m6000, deadmau5, gtc 2015
Sometimes information comes from the least likely of sources. Deadmau5, one of the world's biggest names in house music, posted an interesting picture to his Instagram feed a couple of days ago.
Well, that's interesting. A quick hunt on Google for the NVIDIA M6000 reveals rumors of it being a GM200-based 12GB graphics card. Sound familiar? NVIDIA announced the GeForce GTX TITAN X, based on an identical configuration, at GDC just last week.
A backup of the Instagram image...in case it gets removed.
With NVIDIA's GPU Technology Conference starting this Tuesday, it would appear we have more than one version of GM200 incoming.
Subject: Graphics Cards | March 12, 2015 - 11:13 PM | Sebastian Peak
Tagged: nvidia, maxwell, GTX 960M, GTX 950M, gtx 860m, gtx 850m, gm107, geforce
NVIDIA has announced new GPUs to round out their 900-series mobile lineup, and the new GTX 960M and GTX 950M are based on the same GM107 core as the previous 860M/850M parts.
Both GPUs feature 640 CUDA Cores and are separated by Base clock speed, with the GTX 960M operating at 1096 MHz and the GTX 950M at 914 MHz. Both have unlisted maximum Boost frequencies that will likely vary based on thermal constraints. The memory interface is the other differentiator, with the GTX 960M sporting dedicated GDDR5 memory, while the GTX 950M can be implemented with either DDR3 or GDDR5. Both the GTX 960M and 950M use the same 128-bit memory interface and support up to 4GB of memory.
As reported by multiple sources, the core powering the 960M/950M is a GM107 Maxwell GPU, which means we are essentially talking about rebadged 860M/850M products, though the unlisted Boost frequencies could be higher on these parts thanks to improved silicon on a mature 28nm process. In contrast, the previously announced GTX 965M is based on a cut-down Maxwell GM204 GPU, with its 1024 CUDA Cores representing half of the GPU core introduced with the GTX 980.
New notebooks featuring the GTX 960M have already been announced by NVIDIA's partners, so we will soon see if there is any performance improvement to these refreshed GM107 parts.
Subject: Graphics Cards | March 10, 2015 - 07:35 PM | Sebastian Peak
Tagged: nvidia, msi, gtx 960, geforce, 960 Gaming, 4GB GTX 960
Announcements of 4GB versions of the GeForce GTX 960 have become a regular occurrence of late, and today MSI joined in with their own, adding a model with this higher memory capacity to the popular MSI Gaming graphics card lineup.
The GTX 960 Gaming 4GB features an overclocked core in addition to the doubled frame buffer, with a 1241MHz Base and 1304MHz Boost clock (compared to the stock GTX 960's 1127MHz Base and 1178MHz Boost). The card also features MSI's proprietary Twin Frozr V (5 for non-Romans) cooler, which they claim surpasses previous generations of Twin Frozr "by a large margin", with a new design featuring SuperSU heat pipes and a pair of 100mm Torx fans with alternating standard/dispersion fan blades.
The card is set to be shown at the Intel Extreme Masters gaming event in Poland later this week, but pricing and availability have not yet been announced.
Subject: Graphics Cards | March 10, 2015 - 05:48 PM | Jeremy Hellstrom
Tagged: strix, gtx 960, factory overclocked, DirectCU II, asus, 4GB
The ASUS Strix Series is popular around PC Perspective thanks to the hefty factory overclocks and the quiet and efficient DirectCU II cooling. We have given away 2GB versions of the GTX 960 and Josh recently wrapped up a review of the tiny GTX 750 Ti for SFF builds.
Today ASUS announced the STRIX-GTX960-DC2OC-4GD5, a 4GB version of the Strix GTX 960 with a base clock 165MHz higher than the default at 1291MHz, a 1317MHz boost clock, and memory clocked at 7010MHz. The DirectCU II cooling solution has proven to live up to the hype that surrounds it: the cooler is whisper quiet even under load, and when heavily overclocked it is much less noticeable than other solutions, especially when attached to Maxwell.
The outputs are impressive: DVI, HDMI, and three DisplayPort connections will have you gaming on a variety of monitors, with support for 4K resolutions, at reasonable graphics settings of course. Along with the card you get the familiar GPU Tweak utility for tweaking your card, and for a limited time the card will come with a free one-year XSplit Premium license to let you share your best and worst moments with the world. So far only the 2GB model is showing up at Amazon, Newegg, and B&H, so you might want to hold off for a few days; it is also worth noting that these cards will get you a free pre-order copy of The Witcher 3.
Subject: Graphics Cards, Mobile, Shows and Expos | March 7, 2015 - 07:00 AM | Scott Michaud
Tagged: vulkan, PowerVR, Khronos, Imagination Technologies, gdc 15, GDC
Possibly the most important feature of the upcoming graphics APIs, albeit the least interesting for enthusiasts, is how much easier driver development will become. So many decisions and tasks that once lay on the shoulders of AMD, Intel, NVIDIA, and the rest will now be given to game developers or made obsolete. Of course, you might think that game developers would oppose this burden, but (from what I understand) it is a weight they already bear, just while dealing with the symptoms instead of the root problem.
This also helps other hardware vendors become competitive. Imagination Technologies is definitely not new to the field: their graphics hardware powers the PlayStation Vita, many earlier Intel graphics processors, and the last couple of iPhones. Despite how abruptly the API came about, they had a proof-of-concept driver present at GDC. The unfinished driver was running an OpenGL ES 3.0 demo that had been converted to the Vulkan API.
A screenshot of the CPU usage was also provided, which is admittedly heavily cropped and hard to read. The one on the left claims 1.2% CPU load, with a fairly flat curve, while the one on the right claims 5% and seems to waggle more. Granted, the wobble could be partially explained by differences in the time window they chose to profile.
According to Tom's Hardware, source code will be released “in the near future”.
Subject: Graphics Cards | March 5, 2015 - 04:46 PM | Ryan Shrout
Tagged: GDC, gdc 15, amd, radeon, R9, 390x, VR, Oculus
Don't get too excited about this news, but AMD tells me that its next flagship Radeon R9 graphics card is up and running at GDC, powering an Oculus-based Epic "Showdown" demo.
Inside the box...
During my meeting with AMD today I was told that inside that little PC sits the "upcoming flagship Radeon R9 graphics card" but, of course, no other information was given. The following is an estimated transcript of the event:
Ryan: Can I see it?
Ryan: I can't even take the side panel off it?
Ryan: How can I know you're telling the truth, then? Can I open up the driver or anything?
Ryan: GPU-Z? Anything?
Well, I tried.
Is this the rumored R9 390X with the integrated water cooler? Is it something else completely? AMD wouldn't even let me behind the system to look for a radiator, so I'm afraid that is where my speculation will end.
Hooked up to the system was a Crescent Bay Oculus headset running the well-received Epic "Showdown" demo. The experience was smooth, though of course there were no indications of frame rate, etc. while it was going on. After our discussion with AMD earlier in the week about its LiquidVR SDK, it is clear AMD is taking the VR transition seriously. NVIDIA's GPUs might be dominating the show-floor demos, but AMD wanted to be sure it wasn't left out of the discussion.
Can I just get this Fiji card already??
Subject: Graphics Cards, Displays | March 5, 2015 - 11:46 AM | Ryan Shrout
Tagged: freesync, amd
Hey everyone, we just got a quick note from AMD with an update on the FreeSync technology rollout we have been expecting since CES in January (and honestly, even before that). Here is the specific quote:
AMD is very excited that monitors compatible with AMD FreeSync™ technology are now available in select regions in EMEA (Europe, Middle East and Africa). We know gamers are excited to bring home an incredibly smooth and tearing-free PC gaming experience powered by AMD Radeon™ GPUs and AMD A-Series APUs. We’re pleased to announce that a compatible AMD Catalyst™ graphics driver to enable AMD FreeSync™ technology for single-GPU configurations will be publicly available on AMD.com starting March 19, 2015. Support for AMD CrossFire™ configurations will be available the following month in April 2015.
A couple of interesting things here: first, it appears that FreeSync monitors are already shipping in EMEA regions, which is the reason for this news blast to the media. If you are buying a monitor with a FreeSync logo on it and can't use the technology, that is clearly a bit frustrating. You have just another two weeks to wait for the software to enable your display's variable refresh rate.
That also might be a clue as to when you can expect review embargoes and/or the release of FreeSync monitors in North America. The end is in sight!
Subject: Graphics Cards | March 4, 2015 - 09:26 PM | Ryan Shrout
Tagged: GDC, gdc 15, Intel, raptr, GFE, geforce experience
One of NVIDIA's biggest achievements of the past two years has been the creation and steady improvement of GeForce Experience. The program started with one goal: to help PC gamers optimize game settings to match their hardware, ensuring they get the best quality settings their hardware can handle and thus the best gaming experience. AMD followed suit shortly after through a partnership with Raptr, a company that crowd-sources data to achieve the same goal: optimal game settings for all users of AMD hardware.
Today Intel is announcing a partnership with Raptr as well, bringing the same feature set to machines with Intel HD Graphics. High-end users might chuckle at the news, but I actually think this feature will be more important for gamers using integrated graphics. Where GPU horsepower is at a premium compared to discrete graphics cards, tuning in-game settings to extract all available performance will likely yield the biggest improvement in experience of any of the three major vendors.
Raptr will continue to include game streaming capability, and it will also alert users when updated Intel graphics drivers are available - a very welcome change to how Intel distributes them.
Intel announced a partnership to deliver an even better gaming experience on Intel Graphics. Raptr, a leading PC gaming utility now available on Intel Graphics for the first time, delivers one-button customized optimizations that improve performance on existing hardware and the games being played, even older games. With just a little tweaking of their PC settings a user may be able to dial up the frame rate and details or even play a game they didn’t think possible.
The Raptr software scans the user's PC and compares a given game's performance across tens of millions of other gamers' PCs, finding and applying the best settings for their system. And Raptr's gameplay recording tools leverage the video encoding in Intel® Quick Sync technology to record and stream gameplay with virtually no impact on system performance. Driver updates are a snap too; more on Raptr for Intel available here.
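If you're curious how a crowd-sourced optimizer like this works in principle, here is a minimal sketch. The data and function names are entirely hypothetical illustrations of the concept, not Raptr's actual code:

```python
# Hypothetical crowd-sourced database mapping a hardware profile
# (GPU name, RAM in GB) to the game settings that delivered the
# best measured experience for gamers with matching hardware.
CROWD_SETTINGS = {
    ("Intel HD 4400", 4): {"resolution": "1366x768", "detail": "low"},
    ("Intel HD 5500", 8): {"resolution": "1600x900", "detail": "medium"},
}

def optimal_settings(gpu, ram_gb):
    # Reuse settings reported by gamers with matching hardware,
    # falling back to a conservative default for unknown profiles.
    return CROWD_SETTINGS.get(
        (gpu, ram_gb), {"resolution": "1280x720", "detail": "low"}
    )

print(optimal_settings("Intel HD 5500", 8))
```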
Hopefully we'll see this application pre-installed on notebooks going forward (can't believe I'm saying that), but I'd like to see as many PC gamers as possible, even casual ones, get access to the best gaming experience their hardware can provide.
Subject: Graphics Cards, Processors | March 4, 2015 - 08:46 PM | Ryan Shrout
Tagged: GDC, gdc 15, API, dx12, DirectX 12, dx11, Mantle, 3dmark, Futuremark
It's probably not a surprise to most that Futuremark is working on a new version of 3DMark around the release of DirectX 12. What might be new to you is that this version will include an API overhead test, used to evaluate how well a hardware configuration handles draw call throughput under the Mantle, DX11, and DX12 APIs.
While we don't have any results quite yet (those are pending and should arrive very soon), Intel was showing the feature test running at a GDC event tonight. In what looks like a simple cityscape being rendered over and over, the goal is to measure how many draw calls the CPU, API, and hardware can process each frame; in effect, how quickly the CPU can feed a game engine's commands through the API to the hardware.
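To illustrate the concept (this is my own sketch, not Futuremark's methodology), an API overhead test essentially counts how many minimal submissions the CPU-side driver path can push within one frame's time budget. The `submit_draw_call` stub below is a hypothetical placeholder; a real test would issue a tiny draw through DX11, DX12, or Mantle:

```python
import time

def submit_draw_call():
    # Hypothetical stand-in for a minimal API submission; in a real
    # overhead test this would be one small draw sent to the driver.
    pass

def draw_calls_per_frame(frame_budget_s=1.0 / 30.0):
    # Count how many submissions fit inside one frame's time budget.
    # The score reflects CPU-side submission throughput, not GPU
    # rendering speed, which is exactly what an overhead test isolates.
    calls = 0
    start = time.perf_counter()
    while time.perf_counter() - start < frame_budget_s:
        submit_draw_call()
        calls += 1
    return calls

print(draw_calls_per_frame())  # higher = lower per-call API overhead
```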
The test was being showcased on an Intel-powered notebook using a 5th Generation Core processor, code-named Broadwell. This points to the upcoming DX12 support (though not Mantle, of course) that Intel's integrated GPUs will provide.
It should be very interesting to see how much of an advantage DX12 offers over DX11, even across Intel's wide range of consumer and enthusiast processors.
Subject: General Tech, Graphics Cards, Shows and Expos | March 4, 2015 - 05:52 PM | Scott Michaud
Tagged: GDC, gdc 15, nvidia, epic games, ue4, unreal engine 4, PhysX, apex
NVIDIA and Epic Games have just announced that Unreal Engine 4 developers can view and modify the source of PhysX. This also includes the source for APEX, which is NVIDIA's cloth and destruction library. It does not include any of the other libraries that are under the GameWorks banner, but Unreal Engine 4 does not use them anyway.
This might even mean that good developers can write their own support for third-party platforms, like OpenCL. That would probably be a painful process, but it should be possible now. Of course, that support would only extend to their own title and anyone they share their branch with.
If you are having trouble finding it, you will need to switch to a branch that has been updated to PhysX 3.3.3 with source, which is currently just “Master”. “Promoted” and earlier seem to be back at PhysX 3.3.2, which is still binary-only. It will probably take a few months to trickle down to an official release. If you are still unable to find it, even though you are on the “Master” branch, the path to NVIDIA's source code is: “Unreal Engine/Engine/Source/ThirdParty/PhysX/”. From there you can check out the various subdirectories for PhysX and APEX.
NVIDIA will be monitoring pull requests sent to that area of Unreal Engine. Enhancements might make it back upstream to PhysX proper, which would then be included in future versions of Unreal Engine and anywhere else that PhysX is used.
In other news, Unreal Engine 4 is now free of its subscription. The only time Epic will ask for money is when you ship a game and royalties are due. This is currently 5% of gross revenue, with the first $3000 (per product, per calendar quarter) exempt. This means that you can make a legitimately free (no price, no ads, no subscription, no microtransactions, no Skylander figurines, etc.) game in UE4 for free now!
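To put the royalty terms in concrete form, here is a minimal sketch of the math as announced; the function itself is my own illustration, not an Epic-provided calculator:

```python
def quarterly_royalty(gross_revenue):
    # Epic's announced terms: 5% of gross revenue, with the first
    # $3,000 per product, per calendar quarter exempt.
    exemption = 3000.0
    rate = 0.05
    return max(0.0, gross_revenue - exemption) * rate

print(quarterly_royalty(10000))  # 350.0 owed on a $10,000 quarter
print(quarterly_royalty(2500))   # 0.0 -- under the exemption
```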