Watch NVIDIA Reveal the GTX TITAN X at GTC 2015

Subject: Graphics Cards, Shows and Expos | March 17, 2015 - 10:31 AM |
Tagged: nvidia, video, GTC, gtc 2015

NVIDIA is streaming today's keynote from the GPU Technology Conference (GTC) on Ustream, and we have the embed below for you to take part. NVIDIA CEO Jen-Hsun Huang will reveal the details about the new GeForce GTX TITAN X, but there are going to be other announcements as well, including one featuring Tesla CEO Elon Musk.

Should be interesting!

Source: NVIDIA

PCPer Live! GeForce GTX TITAN X Live Stream and Giveaway!

Subject: Graphics Cards | March 16, 2015 - 07:13 PM |
Tagged: video, tom petersen, titan x, nvidia, maxwell, live, gtx titan x, gtx, gm200, geforce

UPDATE 2: If you missed the live stream, we now have the replay available below!

UPDATE: The winner has been announced: congrats to Ethan M. for being selected as the random winner of the GeForce GTX TITAN X graphics card!!

Get yourself ready, it’s time for another GeForce GTX live stream hosted by PC Perspective’s Ryan Shrout! This time the focus is going to be NVIDIA's brand-new GeForce GTX TITAN X graphics card, first teased a couple of weeks back at GDC. NVIDIA's Tom Petersen will be joining us live from the GPU Technology Conference show floor to discuss the GM200 GPU and its performance, and to show off some demos of the hardware in action.


And what's a live stream without a prize? One lucky live viewer will win a GeForce GTX TITAN X 12GB graphics card of their very own! That's right - all you have to do is tune in for the live stream tomorrow afternoon and you could win a Titan X!!


NVIDIA GeForce GTX TITAN X Live Stream and Giveaway

1pm PT / 4pm ET - March 17th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

The event will take place Tuesday, March 17th at 1pm PT / 4pm ET on our PC Perspective Live! page. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience. To win the prize you will have to be watching the live stream, with exact details of the methodology for handing out the goods coming at the time of the event.

Tom has a history of being both informative and entertaining, and these live streaming events are always full of fun and technical information that you can get literally nowhere else.

If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Tom or me?

So join us! Set your calendar for this coming Tuesday at 1pm PT / 4pm ET and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!

Huge thanks to ASUS for supplying a new G751JY notebook, featuring an Intel Core i7-4710HQ and a GeForce GTX 980M 4GB GPU to power our live stream from GTC!!

NVIDIA Quadro M6000 Leaks via Deadmau5

Subject: Graphics Cards | March 14, 2015 - 02:12 PM |
Tagged: nvidia, quadro, m6000, deadmau5, gtc 2015

Sometimes information comes from the least likely of sources. Deadmau5, one of the world's biggest names in house music, posted an interesting picture to his Instagram feed a couple of days ago.

Well, that's interesting. A quick hunt on Google for the NVIDIA M6000 reveals rumors of it being a GM200-based 12GB graphics card. Sound familiar? NVIDIA announced the GeForce GTX TITAN X, based on an identical configuration, at GDC last week.


A backup of the Instagram post, in case it gets removed.

With NVIDIA's GPU Technology Conference starting this Tuesday, it would appear we have more than one version of GM200 incoming.

Source: Deadmau5

NVIDIA Releases GeForce GTX 960M and GTX 950M Mobile Graphics

Subject: Graphics Cards | March 12, 2015 - 11:13 PM |
Tagged: nvidia, maxwell, GTX 960M, GTX 950M, gtx 860m, gtx 850m, gm107, geforce

NVIDIA has announced new GPUs to round out their 900-series mobile lineup, and the new GTX 960M and GTX 950M are based on the same GM107 core as the previous 860M/850M parts.


Both GPUs feature 640 CUDA Cores and are separated by Base clock speed, with the GTX 960M operating at 1096 MHz and the GTX 950M at 914 MHz. Both have unlisted maximum Boost frequencies that will likely vary based on thermal constraints. The memory interface is the other differentiator between the GPUs: the GTX 960M sports dedicated GDDR5 memory, while the GTX 950M can be implemented with either DDR3 or GDDR5. Both the GTX 960M and 950M use the same 128-bit memory interface and support up to 4GB of memory.

As reported by multiple sources, the core powering the 960M/950M is a GM107 Maxwell GPU, which means we are essentially talking about rebadged 860M/850M products, though the unlisted Boost frequencies could potentially be higher thanks to improved silicon on a mature 28nm process. In contrast, the previously announced GTX 965M is based on a cut-down Maxwell GM204 GPU, with its 1024 CUDA Cores representing half of the GPU core introduced with the GTX 980.

New notebooks featuring the GTX 960M have already been announced by NVIDIA's partners, so we will soon see if there is any performance improvement to these refreshed GM107 parts.

Source: NVIDIA

MSI Announces 4GB Version of NVIDIA GeForce GTX 960 Gaming Graphics Card

Subject: Graphics Cards | March 10, 2015 - 07:35 PM |
Tagged: nvidia, msi, gtx 960, geforce, 960 Gaming, 4GB GTX 960

Manufacturers announcing 4GB versions of the GeForce GTX 960 has become a regular occurrence of late, and today MSI has announced their own 4GB GTX 960, adding a model with this higher memory capacity to their popular MSI Gaming graphics card lineup.


The GTX 960 Gaming 4GB features an overclocked core in addition to the doubled frame buffer, with 1241MHz Base and 1304MHz Boost clocks (compared to the stock GTX 960's 1127MHz Base and 1178MHz Boost). The card also features their proprietary Twin Frozr V (5 for non-Romans) cooler, which they claim surpasses previous generations of their Twin Frozr coolers "by a large margin", with a new design featuring their SuperSU heat pipes and a pair of 100mm Torx fans with alternating standard/dispersion fan blades.


The card is set to be shown at the Intel Extreme Masters gaming event in Poland later this week, and pricing/availability have not been announced.

Source: MSI

The new 4GB ASUS Strix GeForce GTX 960

Subject: Graphics Cards | March 10, 2015 - 05:48 PM |
Tagged: strix, gtx 960, factory overclocked, DirectCU II, asus, 4GB

The ASUS Strix Series is popular around PC Perspective thanks to the hefty factory overclocks and the quiet and efficient DirectCU II cooling.  We have given away 2GB versions of the GTX 960 and Josh recently wrapped up a review of the tiny GTX 750 Ti for SFF builds. 


Today ASUS announced the STRIX-GTX960-DC2OC-4GD5, a 4GB version of the Strix GTX 960 with a base clock 165MHz higher than the default at 1291MHz, a 1317MHz boost clock, and memory clocked at 7010MHz.  The DirectCU II cooling solution has proven to live up to the hype that surrounds it; the cooler is whisper quiet even under load, and when heavily overclocked it is much less noticeable than other solutions, especially when attached to Maxwell.


The outputs are impressive: DVI, HDMI and three DisplayPort connections will have you gaming on a variety of monitors, and the card supports 4K resolutions, at reasonable graphics settings of course.  Along with the card you get the familiar GPU Tweak utility for tweaking your card, and for a limited time the card will come with a free one-year XSplit Premium license to allow you to share your best and worst moments with the world.  So far only the 2GB model is showing up at Amazon, NewEgg and B&H, so you might want to hold off for a few days; it is also worth noting that these cards will get you a free pre-ordered copy of The Witcher 3.


Source: ASUS

GDC 15: Imagination Technologies Shows Vulkan Driver

Subject: Graphics Cards, Mobile, Shows and Expos | March 7, 2015 - 07:00 AM |
Tagged: vulkan, PowerVR, Khronos, Imagination Technologies, gdc 15, GDC

Possibly the most important feature of upcoming graphics APIs, albeit the least interesting for enthusiasts, is how much easier driver development will become. So many decisions and tasks that once lay on the shoulders of AMD, Intel, NVIDIA, and the rest will now be given to game developers or made obsolete. Of course, you might think that game developers would oppose this burden, but (from what I understand) it is a weight they already bear, just when dealing with the symptoms instead of the root problem.


This also helps other hardware vendors become competitive. Imagination Technologies is definitely not new to the field: their graphics hardware powers the PlayStation Vita, many earlier Intel graphics processors, and the last couple of iPhones. Despite how abruptly the API came about, they had a proof-of-concept driver present at GDC. The unfinished driver was running an OpenGL ES 3.0 demo that had been converted to the Vulkan API.

A screenshot of the CPU usage was also provided, which is admittedly heavily cropped and hard to read. The one on the left claims 1.2% CPU load, with a fairly flat curve, while the one on the right claims 5% and seems to waggle more. Granted, the wobble could be partially explained by differences in the time they chose to profile.

According to Tom's Hardware, source code will be released “in the near future”.

GDC 15: Upcoming Flagship AMD Radeon R9 GPU Powering VR Demo

Subject: Graphics Cards | March 5, 2015 - 04:46 PM |
Tagged: GDC, gdc 15, amd, radeon, R9, 390x, VR, Oculus

Don't get too excited about this news, but AMD tells me that its next flagship Radeon R9 graphics card is up and running at GDC, powering an Oculus-based Epic "Showdown" demo.


Inside the box...

During my meeting with AMD today I was told that inside that little PC sits the "upcoming flagship Radeon R9 graphics card" but, of course, no other information was given. The following is an estimated transcript of the event:

Ryan: Can I see it?

AMD: No.

Ryan: I can't even take the side panel off it?

AMD: No.

Ryan: How can I know you're telling the truth then? Can I open up the driver or anything?

AMD: No.

Ryan: GPU-Z? Anything?

AMD: No.

Well, I tried.

Is this the rumored R9 390X with the integrated water cooler? Is it something else completely? AMD wouldn't even let me behind the system to look for a radiator so I'm afraid that is where my speculation will end.

Hooked up to the system was a Crescent Bay Oculus headset running the well-received Epic "Showdown" demo. The experience was smooth, though of course there were no indications of frame rate while it was going on. After our discussion with AMD earlier in the week about its LiquidVR SDK, AMD is clearly taking the VR transition seriously. NVIDIA's GPUs might be dominating the show-floor demos, but AMD wanted to be sure it wasn't left out of the discussion.

Can I just get this Fiji card already??

AMD To Release FreeSync Ready Driver on March 19th

Subject: Graphics Cards, Displays | March 5, 2015 - 11:46 AM |
Tagged: freesync, amd

Hey everyone, we just got a quick note from AMD with an update on the FreeSync technology rollout we have been expecting since CES in January (and honestly, even before that). Here is the specific quote:

AMD is very excited that monitors compatible with AMD FreeSync™ technology are now available in select regions in EMEA (Europe, Middle East and Africa). We know gamers are excited to bring home an incredibly smooth and tearing-free PC gaming experience powered by AMD Radeon™ GPUs and AMD A-Series APUs. We’re pleased to announce that a compatible AMD Catalyst™ graphics driver to enable AMD FreeSync™ technology for single-GPU configurations will be publicly available starting March 19, 2015. Support for AMD CrossFire™ configurations will be available the following month in April 2015.

A couple of interesting things: first, it appears that FreeSync monitors are already shipping in the EMEA regions, and that is the cause for this news blast to the media. If you are buying a monitor with a FreeSync logo on it and can't use the technology, that is clearly a bit frustrating. You have just another two weeks to wait for the software to enable your display's variable refresh rate.


That also might be a clue as to when you can expect review embargoes and/or the release of FreeSync monitors in North America. The end is in sight!

GDC 15: Intel and Raptr Partner for PC Gaming Optimization Software

Subject: Graphics Cards | March 4, 2015 - 09:26 PM |
Tagged: GDC, gdc 15, Intel, raptr, GFE, geforce experience

One of NVIDIA's biggest achievements in the past two years has been the creation and improvement of GeForce Experience. The program started with one goal: to help PC gamers optimize game settings to match their hardware and make sure they are getting the top quality settings the hardware can handle and thus the best gaming experience. AMD followed suit shortly after with a partnership with Raptr, a company that crowd-sources data to achieve the same goal: optimal game settings for all users of AMD hardware.

Today Intel is announcing a partnership with Raptr as well, bringing the same great feature set to users of machines with Intel HD Graphics. High-end users might chuckle at the news, but I actually think this feature is going to be more important for those gamers that utilize integrated graphics. Where GPU horsepower is at a premium compared to discrete graphics cards, tuning in-game settings to extract all available performance will likely yield the biggest improvement in experience of any of the three major vendors.


Raptr will continue to include game streaming capability, and it will also alert users when updated Intel graphics drivers are available - a very welcome change to how Intel distributes them.


Intel announced a partnership to deliver an even better gaming experience on Intel Graphics. Raptr, a leading PC gaming utility now available on Intel Graphics for the first time, delivers one-button customized optimizations that improve performance on existing hardware and the games being played, even older games. With just a little tweaking of their PC settings a user may be able to dial up the frame rate and details or even play a game they didn’t think possible.

The Raptr software scans the user’s PC and compares a given game’s performance across tens of millions of other gamers’ PCs, finding and applying the best settings for their system. And Raptr’s gameplay recording tools leverage the video encoding in Intel® Quick Sync technology to record and stream gameplay with virtually no impact on system performance. Driver updates are a snap too; more on Raptr for Intel is available here.
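The crowd-sourcing idea is simple enough to sketch in a few lines of Python. This is purely illustrative - the names and data structures are my own assumptions, not Raptr's actual API: match the user's hardware against profiles reported by other players, then pick the richest settings that still hit a playable frame rate.

    from dataclasses import dataclass

    @dataclass
    class Profile:
        gpu: str        # e.g. "Intel HD Graphics 5500" (illustrative)
        cpu: str        # e.g. "Core i5-5200U" (illustrative)
        settings: dict  # in-game options reported by this player
        avg_fps: float  # measured performance with those settings

    def recommend(profiles, gpu, cpu, target_fps=30.0):
        """Return crowd-sourced settings from matching hardware that
        clear the frame rate target, preferring the profile closest
        to (but above) the target, since it likely carries the
        richest visual settings."""
        playable = [p for p in profiles
                    if p.gpu == gpu and p.cpu == cpu
                    and p.avg_fps >= target_fps]
        if not playable:
            return None  # no crowd data for this configuration yet
        return min(playable, key=lambda p: p.avg_fps).settings

The real service obviously does far fuzzier hardware matching across driver versions and game patches, but the basic lookup-and-apply loop is the "one-button" optimization being described.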

Hopefully we'll see this application pre-installed on notebooks going forward (can't believe I'm saying that) but I'd like to see as many PC gamers as possible, even casual ones, get access to the best gaming experience their hardware can provide.

GDC 15: Intel shows 3DMark API Overhead Test at Work

Subject: Graphics Cards, Processors | March 4, 2015 - 08:46 PM |
Tagged: GDC, gdc 15, API, dx12, DirectX 12, dx11, Mantle, 3dmark, Futuremark

It's probably not a surprise to most that Futuremark is working on a new version of 3DMark around the release of DirectX 12. What might be new for you is that this version will include an API overhead test, used to evaluate how many draw calls a hardware configuration can handle under the Mantle, DX11, and DX12 APIs.


While we don't have any results quite yet (those are pending and should arrive very soon), Intel was showing the feature test running at an event at GDC tonight. In what looks like a simple cityscape being rendered over and over, the goal is to see how many draw calls per second the CPU, API, and hardware can sustain - essentially, how fast the CPU can feed the game engine.
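The methodology is straightforward to sketch: issue more and more draw calls per frame until the frame time exceeds a fixed budget, then report calls per second at that point. Below is a minimal Python illustration of that measurement logic; submit_draw_call is a hypothetical stand-in for a real API submission, and this is a conceptual sketch, not Futuremark's actual implementation.

    import time

    def measure_frame(num_calls, submit_draw_call):
        """Time one simulated frame that issues num_calls draw calls."""
        start = time.perf_counter()
        for _ in range(num_calls):
            submit_draw_call()  # stand-in for a real API draw submission
        return time.perf_counter() - start

    def api_overhead_test(submit_draw_call, target_fps=30, calls=1000):
        """Ramp up draw calls per frame until the frame budget is
        exceeded, then report sustained draw calls per second."""
        budget = 1.0 / target_fps
        while True:
            frame_time = measure_frame(calls, submit_draw_call)
            if frame_time > budget:
                return calls / frame_time
            calls = int(calls * 1.25)  # increase the load and repeat

Calling api_overhead_test(lambda: None) would measure pure call overhead; the real test plugs in actual DX11, DX12, or Mantle submissions, which is where the API differences show up.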

The test was being showcased on an Intel-powered notebook using a 5th Generation Core processor, code named Broadwell. Obviously this points to the upcoming support for DX12 (though not Mantle) that Intel's integrated GPUs will provide.

It should be very interesting to see how much of an advantage DX12 offers over DX11, even on Intel's wide range of consumer and enthusiast processors.

GDC 15: PhysX Is Now Shared Source to UE4 Developers

Subject: General Tech, Graphics Cards, Shows and Expos | March 4, 2015 - 05:52 PM |
Tagged: GDC, gdc 15, nvidia, epic games, ue4, unreal engine 4, PhysX, apex

NVIDIA and Epic Games have just announced that Unreal Engine 4 developers can view and modify the source of PhysX. This also includes the source for APEX, which is NVIDIA's cloth and destruction library. It does not include any of the other libraries that are under the GameWorks banner, but Unreal Engine 4 does not use them anyway.


This might even mean that good developers can write their own support for third-party platforms, like OpenCL. That would probably be a painful process, but it should be possible now. Of course, that support would only extend to their own title and to anyone they share their branch with.

If you are having trouble finding it, you will need to switch to a branch that has been updated to PhysX 3.3.3 with source, which is currently just “Master”. “Promoted” and earlier seem to be back at PhysX 3.3.2, which is still binary-only. It will probably take a few months to trickle down to an official release. If you are still unable to find it, even though you are on the “Master” branch, the path to NVIDIA's source code is: “Unreal Engine/Engine/Source/ThirdParty/PhysX/”. From there you can check out the various subdirectories for PhysX and APEX.

NVIDIA will be monitoring pull requests sent to that area of Unreal Engine. Enhancements might make it back upstream to PhysX proper, which would then be included in future versions of Unreal Engine and anywhere else that PhysX is used.

In other news, Unreal Engine 4 is now free of its subscription. The only time Epic will ask for money is when you ship a game and royalties are due. This is currently 5% of gross revenue, with the first $3000 (per product, per calendar quarter) exempt. This means that you can make a legitimately free (no price, no ads, no subscription, no microtransactions, no Skylander figurines, etc.) game in UE4 for free now!
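To make the royalty terms concrete, here is the math as a small Python function - my own worked example of the quoted terms, not anything from Epic:

    def ue4_royalty(gross_revenue, rate=0.05, exempt=3000):
        """Royalty owed for one product in one calendar quarter:
        5% of gross revenue, with the first $3,000 exempt."""
        return max(0.0, gross_revenue - exempt) * rate

    print(ue4_royalty(0))      # 0.0   - a free game owes nothing
    print(ue4_royalty(3000))   # 0.0   - still within the exemption
    print(ue4_royalty(10000))  # 350.0 - 5% of the $7,000 above it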

Source: Epic Games

Pushing the GTX 980 to the limits; the ASUS ROG Poseidon

Subject: Graphics Cards | March 4, 2015 - 03:02 PM |
Tagged: asus, ROG Poseidon GTX 980, GTX 980, factory overclocked

On the box, the ASUS ROG Poseidon GTX 980 Platinum states a base clock of 1178MHz and a boost clock of 1279MHz, but in testing [H]ard|OCP saw the card sitting at 1328MHz in game while on air cooling.  They then hooked up a Koolance Exos-2 V2 and pushed the card to 1580MHz in game, though the RAM would only increase by 1.1GHz to 8.1GHz.  As you would expect, this had a noticeable impact on performance. While it might not compete with the just-announced TITAN X, at $640 it is also far less expensive, though still $200 more than the Sapphire Vapor-X 290X it was tested against and $90 more than the 8GB version of that card.  If you have the budget, this GTX 980 is the fastest single-GPU card on the planet right now.


"The highest overclocked GeForce GTX 980 based video card just landed. If the ASUS ROG Poseidon GTX 980 Platinum video card with a hybrid air and liquid cooling system doesn't impress you, we are not sure what will when it comes to GPU. We push the Poseidon clocks and pit it against the AMD Radeon R9 290X for an ultimate showdown."



Source: [H]ard|OCP

GDC 15: NVIDIA Shows TITAN X at Epic Games Keynote

Subject: Graphics Cards | March 4, 2015 - 01:10 PM |
Tagged: titan x, nvidia, maxwell, gtx, geforce, gdc 15, GDC

For those of you worried that GDC would sneak by without any new information for NVIDIA's GeForce fans, Jen-Hsun Huang surprised everyone by showing up at Epic Games' keynote with Tim Sweeney to hijack it.

The result: the first showing of the upcoming GeForce TITAN X, based on the Maxwell GM200 GPU.


JHH stated that it would have a 12GB frame buffer and was built using 8 billion transistors! There wasn't much more information than that, but I was promised that the details would be revealed sooner rather than later.


Any guesses on performance or price?



Jen-Hsun signs the world's first TITAN X for Tim Sweeney.

Kite Demo running on TITAN X

UPDATE: I ran into the TITAN X again at the NVIDIA booth and was able to confirm a couple more things. First, the GPU will only require 6+8-pin power connections, indicating that NVIDIA is still pushing power efficiency with GM200.


Also, as you would expect, the TITAN X will support 3-way and 4-way SLI, or at the very least has the SLI bridge connectors to support it.


GDC 15: Native versions of Doom 3, Crysis 3 running on Android, Tegra X1

Subject: Graphics Cards, Mobile | March 3, 2015 - 10:43 PM |
Tagged: Tegra X1, tegra, nvidia, gdc 15, GDC, Doom 3, Crysis 3

Impressively, NVIDIA just showed the new SHIELD, powered by Tegra X1, running versions of both Doom 3 and Crysis 3 natively on Android! The games were running at impressive quality and performance levels.

I have included some videos of these games being played on the SHIELD, but don't judge the visual quality of the games by these videos; they were recorded with a Panasonic GH2 off a 4K TV in a dimly lit room.

Doom 3 is quoted to run at full 1920x1080 and 60 FPS while Crysis 3 is much earlier in its development. Both games looked amazing considering we are talking about a system that has a total power draw of only 15 watts!

While these are just examples of the power that Tegra X1 can offer, it's important to note that this type of application is the exception, not the rule, for Android gaming. Just as we saw with Half-Life 2 and Portal, NVIDIA did most of the legwork to get this version of Doom 3 up and running. Crysis 3 is more of an effort from Crytek explicitly - hopefully the finished port is as gorgeous as this first look.

GDC 15: Khronos Acknowledges Mantle's Start of Vulkan

Subject: General Tech, Graphics Cards, Shows and Expos | March 3, 2015 - 03:37 PM |
Tagged: vulkan, Mantle, Khronos, glnext, gdc 15, GDC, amd


Neil Trevett, the current president of Khronos Group and a vice president at NVIDIA, made an on-the-record statement to acknowledge the start of the Vulkan API. The quote came to me via Ryan, but I think it is a copy-paste of an email, so it should be verbatim.

Many companies have made great contributions to Vulkan, including AMD who contributed Mantle. Being able to start with the Mantle design definitely helped us get rolling quickly – but there has been a lot of design iteration, not the least making sure that Vulkan can run across many different GPU architectures. Vulkan is definitely a working group design now.

So in short, the Vulkan API was definitely started with Mantle and grew from there as more stakeholders added their opinion. Vulkan is obviously different than Mantle in significant ways now, such as its use of SPIR-V for its shading language (rather than HLSL). To see a bit more information, check out our article on the announcement.

Update: AMD has released a statement independently, but related to Mantle's role in Vulkan.

EVGA and Inno3D Announce the First 4GB NVIDIA GeForce GTX 960 Cards

Subject: Graphics Cards | March 3, 2015 - 02:44 PM |
Tagged: video cards, nvidia, gtx 960, geforce, 4GB

They said it couldn't be done, but where there are higher density chips there's always a way. Today EVGA and Inno3D have both announced new versions of GTX 960 graphics cards with 4GB of GDDR5 memory, placing the cards in a more favorable mid-range position depending on the launch pricing.


EVGA's new 4GB NVIDIA GTX 960 SuperSC

Along with the expanded memory capacity, EVGA's card features their ACX 2.0+ cooler, which promises low noise and better cooling. The SuperSC is joined by a standard ACX and the higher-clocked FTW variant, which pushes Base/Boost clocks to 1304/1367MHz out of the box.


Inno3D's press release provides fewer details, and the company appears to be launching a single new model featuring 4GB of memory, which looks like a variant of their existing GTX 960 OC card.


The existing Inno3D GTX 960 OC card

The current 2GB version of the GTX 960 can be found starting at $199, so expect these expanded versions to include a price bump. The GTX 960, with only 1024 CUDA cores (half the count of a GTX 980) and a 128-bit memory interface, has been a very good performer nonetheless with much better numbers than last year's GTX 760, and is very competitive with AMD's R9 280/285. (It's a great overclocker, too.) The AMD/NVIDIA debate rages on, and NVIDIA's partners adding another 4GB offering to the mix will certainly add to the conversation, particularly as an upcoming 4GB version of the GTX 960 was originally said to be unlikely.

Source: EVGA

ARM and Geomerics Show Enlighten 3 Lighting, Integrate with Unity 5

Subject: Graphics Cards, Mobile | March 3, 2015 - 12:00 PM |
Tagged: Unity, lighting, global illumination, geomerics, GDC, arm

Back in 2013 ARM picked up a company called Geomerics, responsible for one of the industry’s most advanced dynamic lighting engines, used in games ranging from mobile to console to PC. Called Enlighten, it is the lighting engine in many major games across a variety of markets: Battlefield 3 uses it, Need for Speed: The Run does as well, and The Bureau: XCOM Declassified and Quantum Conundrum mark another pair of major games that depend on Geomerics technology.


Great, but what does that have to do with ARM and why would the company be interested in investing in software that works with such a wide array of markets, most of which are not dominated by ARM processors? There are two answers, the first of which is directional: ARM is using the minds and creative talent behind Geomerics to help point the Cortex and Mali teams in the correct direction for CPU and GPU architecture development. By designing hardware to better address the advanced software and lighting systems Geomerics builds, Cortex and Mali will have some semblance of an advantage in specific gaming titles as well as a potential “general purpose” advantage. NVIDIA employs hundreds of gaming and software developers for this exact reason: what better way to make sure you are always at the forefront of the gaming ecosystem than getting high-level gaming programmers to point you to that edge? Qualcomm also started employing game and engine developers in-house (back in 2012) with the same goals.

ARM also believes it will be beneficial to bring publishers, developers and middleware partners to the ARM ecosystem through deployment of the Enlighten engine. It would be feasible to think console vendors like Microsoft and Sony would be more willing to integrate ARM SoCs (rather than the x86 used in the PS4 and Xbox One) when shown the technical capabilities brought forward by technologies like Geomerics Enlighten.


It’s best to think of the Geomerics acquisition as a kind of insurance program for ARM, making sure both its hardware and software roadmaps are in line with industry goals and directives.

At GDC 2015 Geomerics is announcing the release of the Enlighten 3 engine, a new version that brings cinematic-quality real-time global illumination to market. Some of the biggest new features include additional accuracy on indirect lighting, color separated directional output (enables individual RGB calculations), better light map baking for higher quality output, and richer material properties to support transparency and occlusion.

All of this technology will be showcased in a new Subway demo that includes real-time global illumination simulation, dynamic transparency and destructible environments.

Geomerics Enlighten 3 Subway Demo

Enlighten 3 will also ship with Forge, a new lighting editor and pipeline tool for content creators looking to streamline the building process. Forge will allow imports from Autodesk 3ds Max and Maya, making interoperability easier. Forge uses a technology called YEBIS 3 to show estimated final quality without the time-consuming final-build processing.


Finally, maybe the biggest news for ARM and Geomerics is that the Unity 5 game engine will be using Enlighten as its default lighting engine, giving ARM/Mali a potential advantage for gaming experiences in the near term. Of course Enlighten is available as an option for Unreal Engine 3 and 4 for developers using that engine in mobile, console and desktop projects as well as in an SDK form for custom integrations.

GDC 15: AMD Mantle Might Be Dead as We Know It: No Public SDK Planned

Subject: Graphics Cards | March 2, 2015 - 02:31 PM |
Tagged: sdk, Mantle, dx12, API, amd

The Game Developers Conference in San Francisco starts today, and you can expect to see more information about DirectX 12 than you could ever possibly want, so be prepared. But what about the original low-level API, AMD Mantle? Utilized in Battlefield 4 and Thief, integrated into the Crytek engine (announced last year), and launched alongside the Radeon R9 290X/290, Mantle was truly the instigator that pushed Microsoft into moving DX12's development along at a faster pace.

Since DX12's announcement, AMD has claimed that Mantle would live on, bringing performance advantages to AMD GPUs and acting as the sounding board for new API features for AMD and its game development partners. And, as was trumpeted from the very beginning of Mantle, it would become an open API, available to all once it outgrew the beta phase that it (still) resides in.


Something might have changed there.

A post over on the AMD Gaming blog from Robert Hallock has some news about Mantle to share as GDC begins. First, the good news:

AMD is a company that fundamentally believes in technologies unfettered by restrictive contracts, licensing fees, vendor lock-ins or other arbitrary hurdles to solving the big challenges in graphics and computing. Mantle was destined to follow suit, and it does so today as we proudly announce that the 450-page programming guide and API reference for Mantle will be available this month (March 2015). This documentation will provide developers with a detailed look at the capabilities we’ve implemented and the design decisions we made, and we hope it will stimulate more discussion that leads to even better graphics API standards in the months and years ahead.

That's great! We will finally be able to read about the API and how it functions, getting access to the detailed information we have wanted from the beginning. But then there is this portion:

AMD’s game development partners have similarly started to shift their focus, so it follows that 2015 will be a transitional year for Mantle. Our loyal customers are naturally curious what this transition might entail, and we wanted to share some thoughts with you on where we will be taking Mantle next:

AMD will continue to support our trusted partners that have committed to Mantle in future projects, like Battlefield™ Hardline, with all the resources at our disposal.

  1. Mantle’s definition of “open” must widen. It already has, in fact. This vital effort has replaced our intention to release a public Mantle SDK, and you will learn the facts on Thursday, March 5 at GDC 2015.
  2. Mantle must take on new capabilities and evolve beyond mastery of the draw call. It will continue to serve AMD as a graphics innovation platform available to select partners with custom needs.
  3. The Mantle SDK also remains available to partners who register in this co-development and evaluation program. However, if you are a developer interested in Mantle "1.0" functionality, we suggest that you focus your attention on DirectX® 12 or GLnext.

Essentially, AMD's Mantle API in its "1.0" form is at the end of its life, only supported for current partners, and the public SDK will never be posted. Honestly, at this point, this isn't so much a letdown as it is a necessity. DX12 and GLnext have already superseded Mantle in terms of market share and mind share with developers, and any more work AMD put into getting devs on board with Mantle would be wasted effort.


Battlefield 4 is likely to be the only major title to use AMD Mantle

AMD claims to have future plans for Mantle, though it will continue to be available only to select partners with "custom needs." I would imagine this could expand outside the world of PC games; game consoles could be the target, where developers are only concerned with AMD GPU hardware.

So - from our perspective, Mantle as we know it is pretty much gone. It served its purpose, making NVIDIA and Microsoft pay attention to the CPU bottlenecks in DX11, but it appears the dream was a bit bigger than the product could become. AMD shouldn't be chastised for this shift, nor for its lofty goals that we kind-of-always knew were too steep a hill to climb. Just revel in the news that pours from GDC this week about DX12.

Source: AMD

So Long Adware, and Thanks for All the Fish!

Subject: Graphics Cards | March 1, 2015 - 07:30 AM |
Tagged: superfish, Lenovo, bloatware, adware

Obviously, this does not erase the controversy that Lenovo got themselves into, but it is certainly the correct response (if they act as they imply). Adware and bloatware are common on consumer PCs, making the slowest of devices even more sluggish as demos and sometimes straight-up advertisements claim their share of your resources. That does not even begin to touch the security issues that some of these hitchhikers drag in. Again, I refer you to the aforementioned controversy.


In response, albeit a delayed one, Lenovo has announced that, by the launch of Windows 10, they will only pre-install the OS and “related software”. Lenovo classifies this related software as drivers, security software, Lenovo applications, and applications for “unique hardware” (ex: software for an embedded 3D camera).

It looks to be a great step, but I need to call out “security software”. Windows 10 should ship with Microsoft's security applications in many regions, which really raises the question of why a laptop provider would include an alternative. If the problem is that people expect McAfee or Symantec, then advertise pre-loaded Microsoft anti-malware and keep it clean. Otherwise, it feels like keeping a single finger in the adware take-a-penny dish.

At least it is not as bad as trying to install McAfee every time you update Flash Player. I consider Adobe's tactic the greater of two evils on that one. I mean, unless Adobe just thinks that Flash Player is so insecure that you would be crazy to install it without a metaphorical guard watching over your shoulder.

And then of course we reach the divide between “saying” and “doing”. We will need to see Lenovo's actual Windows 10 devices to find out if they kept their word and followed its implications to a tee.

Source: Lenovo