GTC 19: NVIDIA Announces GameWorks RTX; Unreal Engine and Unity Gain DXR Support

Subject: General Tech | March 18, 2019 - 10:00 PM |
Tagged: gameworks, unreal engine, Unity, rtx, ray tracing, nvidia, GTC 19, GTC, dxr, developers

Today at GTC, NVIDIA announced GameWorks RTX and the implementation of real-time ray tracing in the upcoming Unreal Engine 4.22 and in Unity’s 2019.03 preview release.

NVIDIA Announces GameWorks RTX

While Pascal and non-RTX Turing support for real-time ray tracing is something of a bombshell from NVIDIA, the creation of GameWorks tools for such effects is not surprising.

“NVIDIA GameWorks RTX is a comprehensive set of tools that help developers implement real time ray-traced effects in games. GameWorks RTX is available to the developer community in open source form under the GameWorks license and includes plugins for Unreal Engine 4.22 and Unity’s 2019.03 preview release.”

NVIDIA lists these components of GameWorks RTX:


  • RTX Denoiser SDK – a library that enables fast, real-time ray tracing by providing denoising techniques to lower the required ray count and samples per pixel. It includes algorithms for ray traced area light shadows, glossy reflections, ambient occlusion and diffuse global illumination.
  • Nsight for RT – a standalone developer tool that saves developers time by helping to debug and profile graphics applications built with DXR and other supported APIs.

Unreal Engine and Unity Gaining Real-Time Ray Tracing Support


And while not specific to NVIDIA hardware, news of more game engines offering integrated DXR support was also announced during the keynote:

“Unreal Engine 4.22 is available in preview now, with final release details expected in Epic’s GDC keynote on Wednesday. Starting on April 4, Unity will offer optimized, production-focused, realtime ray tracing support with a custom experimental build available on GitHub to all users with full preview access in the 2019.03 Unity release. Real-time ray tracing support from other first-party AAA game engines includes DICE/EA’s Frostbite Engine, Remedy Entertainment’s Northlight Engine and engines from Crystal Dynamics, Kingsoft, Netease and others.”

RTX may have been off to a slow start, but this will apparently be the year of real-time ray tracing after all - especially with the upcoming NVIDIA driver update adding support to the GTX 10-series and new GTX 16-series GPUs.

Source: NVIDIA

NVIDIA to Add Real-Time Ray Tracing Support to Pascal GPUs via April Driver Update

Subject: Graphics Cards | March 18, 2019 - 09:41 PM |
Tagged: unreal engine, Unity, turing, rtx, ray tracing, pascal, nvidia, geforce, GTC 19, GTC, gaming, developers

Today at GTC NVIDIA announced a few things of particular interest to gamers, including GameWorks RTX and the implementation of real-time ray tracing in upcoming versions of both Unreal Engine and Unity (we already posted the news that CRYENGINE will be supporting real-time ray tracing as well). But there is something else... NVIDIA is bringing ray tracing support to GeForce GTX graphics cards.


This surprising turn means that real-time ray tracing support won’t be limited to RTX cards after all, as the install base of NVIDIA ray-tracing GPUs “grows to tens of millions” with a simple driver update next month, adding the feature to both previous-gen Pascal and the new Turing GTX GPUs.

How is this possible? It’s all about the programmable shaders:

“NVIDIA GeForce GTX GPUs powered by Pascal and Turing architectures will be able to take advantage of ray tracing-supported games via a driver expected in April. The new driver will enable tens of millions of GPUs for games that support real-time ray tracing, accelerating the growth of the technology and giving game developers a massive installed base.

With this driver, GeForce GTX GPUs will execute ray traced effects on shader cores. Game performance will vary based on the ray-traced effects and on the number of rays cast in the game, along with GPU model and game resolution. Games that support the Microsoft DXR and Vulkan APIs are all supported.

However, GeForce RTX GPUs, which have dedicated ray tracing cores built directly into the GPU, deliver the ultimate ray tracing experience. They provide up to 2-3x faster ray tracing performance with a more visually immersive gaming environment than GPUs without dedicated ray tracing cores.”

An important caveat is the “2-3x faster ray tracing performance” figure for GeForce RTX graphics cards in the last paragraph of that quote: expectations will need to be tempered, as RT effects run less efficiently on shader cores (Pascal and Turing GTX) than on dedicated RT cores, as these charts demonstrate:




It's going to be a busy April.

Source: NVIDIA

NVIDIA Announces PhysX 4.0: Open Source (3-Clause BSD)!

Subject: General Tech | December 3, 2018 - 08:46 PM |
Tagged: pc gaming, PhysX, nvidia, physx 4.0, Unity, unreal engine 4

NVIDIA has just announced a new major version of their popular physics middleware: PhysX 4.0. They also announced that it (both 4.0 and 3.4) will be re-licensed as 3-clause BSD. In terms of open-source licenses, this is about as permissive as you can get. You are basically free to do whatever you want – commercial, modified, unmodified, whatever – provided you follow the conditions (which are things like “no warranty”, “don’t sue us for liability”, “give us credit by leaving a copy of the license in all binary and source releases”, and “we’re not endorsing your product, so don’t pretend that we are”).

For gamers? It will take a little while before this comes around to you. Unity is currently preparing to update to PhysX 3.4 with their upcoming 2018.3 release; that was the first major PhysX update since Unity 5.0 upgraded from PhysX 2.x to PhysX 3.3 back in March 2015. Epic Games seems to be a little quicker to update to a new PhysX version, but there’s nothing announced on their side either as far as I can tell.

On the technical side: this release of PhysX is interesting.

As mentioned, Unity 5.0 was the point when their PhysX implementation jumped from 2.x to 3.3. This was not a clean transition. NVIDIA changed the way many of the solvers worked, making them much faster but also less stable (as in simulation stability – so, like, oscillating and breaking apart). While this was acceptable for most developers (most simulations are cosmetic and, where it mattered, the performance headroom let you increase the physics tick rate to compensate), it upset those who relied upon the stability of PhysX 2.x, forcing them to work around the glitches.

According to NVIDIA’s promotional video, this version is both more stable and faster. That should make it less work to set up things like ragdolls and ball-and-chain systems. To demonstrate stability, they intentionally showed a simulation of three balls and chains with varying masses. In PhysX 3.x, this tends to be a degenerate case where joints freak out and split apart (unless you compensate with smaller physics time steps). Even if performance is merely on par with PhysX 3.x, this is a huge win for indie game developers.
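The bit about smaller time steps is worth unpacking: a physics solver integrates motion in discrete steps, and too large a step can inject energy until joints oscillate and fly apart. This toy Python sketch (explicit Euler on a stiff spring, purely illustrative – it is not PhysX’s actual solver) shows how the same simulated second behaves at two step sizes:

```python
def simulate_spring(dt, steps, k=100.0, m=1.0):
    """Integrate an undamped spring (a = -k/m * x) with explicit Euler,
    the simplest (and least stable) integrator. Returns the peak |x|;
    a stable simulation stays near the initial amplitude of 1.0."""
    x, v = 1.0, 0.0
    peak = abs(x)
    for _ in range(steps):
        a = -(k / m) * x   # acceleration from the *old* position
        x += v * dt
        v += a * dt
        peak = max(peak, abs(x))
    return peak

# One simulated second, two different step sizes.
coarse = simulate_spring(dt=0.1, steps=10)      # large steps: amplitude explodes
fine = simulate_spring(dt=0.001, steps=1000)    # small steps: stays near 1.0
print(coarse, fine)
```

With the coarse step the spring gains energy every iteration and the peak amplitude balloons far past its starting value; with the fine step it barely drifts. That is the trade the article describes: you can rescue a twitchy solver by shrinking the step, but you pay for it in CPU time.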

PhysX 4.0 will be available for developers on December 20th. It’s unclear when any given engine will integrate it, however.

Source: NVIDIA

Unite LA 2018: Unity Presents MegaCity Demo

Subject: General Tech | October 24, 2018 - 08:04 PM |
Tagged: Unity, pc gaming

Near the end of their keynote at the Unite LA conference, Unity showed off “MegaCity”. This scene, created by their internal Demo team, contains about 4.5 million rendered objects and plays at 60 FPS. About 5,000 moving vehicles are present in the environment as well. They also added 100,000 audio sources, because why not. Spoiler: They then pulled out a (high-end) phone and launched it there too.

This demo was designed to show off two things: Prefab Workflows and ECS.

The Prefab Workflows portion showed attendees, who are developers of Unity-based apps and games, how to cleanly maintain large scenes. The prefab editor allows components to be manipulated in isolation. Nesting allows that “isolation” to be tiered into a hierarchy. Variants allow the artist to override parts of a prefab and tweak it without starting from scratch. The punchline is that the entire scene was made in about two months by just two artists.

The ECS side, on the other hand, showed that Unity’s new framework will soon make it a serious performance contender. The programming paradigm departs from object-oriented principles, instead operating on combinations of lists of thin slices of data that, taken together, represent your system. This is good for CPUs because it allows linear memory access and massive parallelism, including vectorization, which keeps your processor at peak efficiency.

Note that, in terms of draw calls, the system does a lot of instancing to submit them to the GPU together, so the takeaway isn't "Unity does millions of draw calls!", because that's not true. Distinct objects in the scene are indexed and sent to the driver in groups. That said, it's still a strong point that ECS is fast enough to effectively batch, LOD, and cull millions of objects into something the driver can handle; driver overhead, after all, got a lot of attention with Mantle, Vulkan, and DirectX 12. (And yes, that's important too!)
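To make the "lists of thin slices of data" idea concrete, here is a minimal Python sketch of the data-oriented layout ECS frameworks use. It is not Unity's actual C# API – the `World`, `spawn`, and `movement_system` names are invented for illustration – but it shows the core move: components live in flat parallel arrays, an entity is just an index, and a "system" is a tight loop over exactly the arrays it needs.

```python
from array import array

class World:
    """Components stored as contiguous parallel arrays (structure-of-arrays),
    rather than one object per entity (array-of-structures)."""
    def __init__(self):
        self.pos_x = array("f")
        self.pos_y = array("f")
        self.vel_x = array("f")
        self.vel_y = array("f")

    def spawn(self, x, y, vx, vy):
        self.pos_x.append(x); self.pos_y.append(y)
        self.vel_x.append(vx); self.vel_y.append(vy)
        return len(self.pos_x) - 1  # an entity id is just an index

def movement_system(world, dt):
    """Touches only the four arrays it needs, in index order -- the linear
    memory access pattern that makes real ECS implementations fast."""
    for i in range(len(world.pos_x)):
        world.pos_x[i] += world.vel_x[i] * dt
        world.pos_y[i] += world.vel_y[i] * dt

w = World()
w.spawn(0.0, 0.0, 1.0, 2.0)
w.spawn(5.0, 5.0, -1.0, 0.0)
movement_system(w, dt=0.5)
print(list(w.pos_x), list(w.pos_y))  # [0.5, 4.5] [1.0, 5.0]
```

In a real engine that loop would be vectorized and split across worker threads, which is exactly what the contiguous layout enables; Python's interpreter overhead hides the win here, but the memory-access shape is the point.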

Realtime Raytracing Commit Spotted in Unity GitHub

Subject: General Tech | September 14, 2018 - 10:32 PM |
Tagged: rtx, Unity, ray tracing, directx raytracing, DirectX 12

As Ken wrote up his take in a separate post, NVIDIA has made Turing architecture details public, which will bring real-time ray tracing to PC gaming later this month. When it was announced, NVIDIA had some demos in Unreal Engine 4, and a few partnered games (Battlefield V, Shadow of the Tomb Raider, and Metro Exodus) showed off their implementations.

As we expected, Unity is working on supporting it too.


Not ray tracing, but from the same project at Unity.

The first commit showed up on Unity’s GitHub for their Scriptable Render Pipelines project, dated earlier today. Looking through the changes, it appears to just generate the acceleration structure based on the objects of type renderer in the current scene (as well as define the toggle properties of course). It looks like we are still a long way out.

I’m looking forward to ray tracing implementations, though. I tend to like art styles with anisotropic metal trims and soft shadows, which is difficult to get right with rasterization alone due to the reliance on other objects in the scene. In the case of metal, reflections dominate the look and feel of the material. In the case of soft shadows, you really need to keep track of how much of a light has been blocked between the rendered fragment and the non-point light.
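That "how much of the light is blocked" bookkeeping is exactly what ray tracing makes tractable. As a hedged sketch (my own toy geometry, not anything from Unity's repository): sample points across an area light, trace a segment from the shaded point to each sample, and the fraction of unblocked segments gives the penumbra.

```python
import random

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def segment_blocked(p0, p1, center, radius):
    """True if the segment p0->p1 passes through a sphere occluder
    (closest point on the segment is inside the sphere)."""
    d = sub(p1, p0)
    t = dot(sub(center, p0), d) / dot(d, d)
    t = max(0.0, min(1.0, t))              # clamp to the segment
    closest = (p0[0]+t*d[0], p0[1]+t*d[1], p0[2]+t*d[2])
    off = sub(center, closest)
    return dot(off, off) < radius * radius

def light_visibility(fragment, light_center, light_size, occluder, radius,
                     samples=1000):
    """Fraction of a square area light visible from a surface point."""
    rng = random.Random(0)
    visible = 0
    for _ in range(samples):
        # jitter a sample point across the light's surface (XY plane)
        lp = (light_center[0] + (rng.random() - 0.5) * light_size,
              light_center[1] + (rng.random() - 0.5) * light_size,
              light_center[2])
        if not segment_blocked(fragment, lp, occluder, radius):
            visible += 1
    return visible / samples

# Sphere of radius 1 halfway between the fragment and a 4x4 light:
# some shadow rays are blocked, some are not -> a soft, partial shadow.
print(light_visibility((0, 0, 0), (0, 0, 10), 4.0, (0, 0, 5), 1.0))
```

A rasterizer only sees one triangle at a time, so it has no cheap way to ask every occluder in the scene this question per fragment; a ray tracer answers it directly, which is why soft shadows fall out so naturally.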

And yes, it will depend on the art style, but mine just happens to be computationally expensive.

Unity 2018.2 Released

Subject: General Tech | July 10, 2018 - 10:35 PM |
Tagged: Unity, pc gaming

The second Unity update of 2018 has been published to their website today. This version continues their work on Scriptable Render Pipelines, including their own Lightweight Render Pipeline (LWRP) and High Definition Render Pipeline (HDRP) implementations. Both are still considered previews, but the aim is to replace the standard shader with two optimized graphics pipelines, one tuned for performance (mobile, VR, and extra performance on higher-end devices) and one tuned for high-end effects (multiple aligned transparent objects, reflections, etc.).


This splits Unity’s customer base from “one-size-fits-all-sort-of” to two sizes, although developers can also create their own scriptable render pipeline. This will let them tune the graphics pipeline to whatever their game needs, although it seems to mean that they will need to build a lot of their own graphics technologies if they do. (This seems clearly targeted at mid- to large-sized studios, but that’s just my opinion.) Of course, they can also continue to use the standard shader, but some Unity talks have already suggested that not all new features will come to the old pipeline.

2018.2 also continues development of the C# Job System, ECS design pattern, and their Burst compiler. A separate announcement was made about the Burst compiler – that it is now available as an official preview package.

Source: Unity

Unity 2018.1 Released

Subject: General Tech | May 2, 2018 - 08:13 PM |
Tagged: Unity, video games

The first entry in Unity’s 2018.x line of releases has just been published to their website. Developers can now choose to migrate their projects to it and expect official support until 2018.2 arrives, or they can stick with 2017.4, which is supported for two years – but will not get new engine features. That said, if you have a big project that is expected to ship within a handful of months, then you may just want things to stay constant and stable until you ship (and maybe publish a wave of DLC).


There are a few big additions in this version, but the ones I care about most are still in preview. The first is, of course, the ECS design pattern with the C# Job System. It is still super early days for this one, but I’m very interested in a rigid, optimized, data-driven workflow that makes it easy to batch tasks together. Give me constraints – it’s okay – if I can get value from them.

Then we get to the Scriptable Render Pipeline and its two presets: High-Definition Render Pipeline and Lightweight Render Pipeline. This allows the developer to control how their content is rendered, including how it is culled, how materials and lights are processed, what materials and lights can do, and so forth. They also say that some features will only come to the High-Definition Render Pipeline to get people off the standard workflow and into the new render path… but I wonder how that will affect developers who create their own scriptable render pipeline. It’s reasonable to assume that a developer who makes their own path will need to do some level of tweaking to get new features, but I wonder how much effort Unity will put into helping those developers.

There is also a new, beta update to the Post Processing stack. This should be familiar to users of Unreal Engine 4. Unity has continued to push a bunch of effects, like color grading, bloom, reflection, ambient occlusion, certain antialiasing techniques, and so forth, into a canonical suite of filters. They have also added volumes that developers can place in their scene to add a hierarchy for smooth transitions between effects.

From a practical standpoint, the new package manager also looks very interesting. There’s not much to write about for it, in an enthusiast PC hardware site at least, but it could be a nice way of delivering features to users. Instead of waiting for a whole new Unity release, you can fiddle with new features on a one-by-one basis. Maybe even third-party content, typically found in the asset store, can find its way on there – with a network of dependencies that it just sorts out for you.

Check it out on Unity’s website.

Source: Unity

zSpace and Unity Announce XR Resources for Education

Subject: General Tech | November 5, 2017 - 08:14 PM |
Tagged: Unity, zspace, xr, AR, VR

The Unity Educator Toolkit was created by Unity3D to integrate learning game development into the K-12 public curriculum. Now zSpace, which we’ve mentioned a few times, is joining the initiative with their mixed-reality platform. The company is known for creating displays that, when viewed with their glasses, track where you are and make the object appear to be in front of you. They also have a stylus that lets you interact with the virtual object.


They are focused on the educational side of VR and AR.

It’s not entirely clear what this means, because a lot of the details are behind a sign-up process. That said, if you’re an educator, then check out the package to see if it’s relevant for you. Creating games is an interesting, albeit challenging and somewhat daunting, method of expressing oneself. Giving kids the tools to make little game jam-style expressions, or even using the technology in your actual lessons, will reach a new group of students.

Source: zSpace

OTOY Discussed AI Denoising at Unite Austin

Subject: General Tech | October 4, 2017 - 08:59 PM |
Tagged: 3D rendering, otoy, Unity, deep learning

When ray tracing images, sample count has a massive impact on both quality and rendering performance. It corresponds to the number of rays cast within a pixel; averaged over many, many rays, the result eventually converges on what the pixel should be. Think of it this way: if your first ray bounces directly into a bright light, and the second ray bounces into the vacuum of space, should the color be white? Black? Half-grey? Who knows! However, if you send 1,000 rays in some randomized pattern, then the average is probably a lot closer to what it should be (which depends on how big the light is, what it bounces off of, etc.).
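The averaging argument above is just Monte Carlo estimation, and it is easy to see the noise-versus-samples trade in a toy Python sketch (my own simplification, not OTOY's algorithm): model each ray as either hitting a light or escaping, then watch how the spread of repeated estimates shrinks as the per-pixel sample count grows.

```python
import random

def shade_pixel(samples, p_light=0.3, rng=None):
    """Monte Carlo shading in miniature: each ray either bounces into a
    light (brightness 1.0) with probability p_light or escapes into
    darkness (0.0). The true pixel value is p_light."""
    rng = rng or random.Random(0)
    hits = sum(1 for _ in range(samples) if rng.random() < p_light)
    return hits / samples

def spread(samples, trials=200, seed=1):
    """Range of estimates across many renders of the same pixel --
    a rough measure of the noise at a given sample count."""
    rng = random.Random(seed)
    vals = [shade_pixel(samples, rng=rng) for _ in range(trials)]
    return max(vals) - min(vals)

for n in (4, 64, 4096):
    print(f"{n:5d} samples per pixel: noise spread {spread(n):.3f}")
```

The spread shrinks roughly with the square root of the sample count, which is why halving the noise costs four times the rays; a denoiser that tolerates noisy input lets you skip most of that cost, which is the pitch behind OTOY's approach.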

At Unite Austin, which started today, OTOY showed off an “AI temporal denoiser” algorithm for raytraced footage. Typically, an artist chooses a sample rate that looks good enough to the end viewer. In this case, the artist only needs to choose enough samples that an AI can create a good-enough video for the end user. While I’m curious how much performance is required in the inferencing stage, I do know how much a drop in sample rate can affect render times, and it’s a lot.

Check out OTOY’s video, embedded above.

Unity 2017.2.0f1 Released

Subject: General Tech | September 23, 2017 - 12:22 PM |
Tagged: pc gaming, Unity

While it’s not technically released yet, Unity has flipped the naming scheme of Unity 2017.2 to Unity 2017.2.0f1. The “f” stands for final, so we will probably see a blog post on it soon. This version has a handful of back-end changes, such as improved main-thread performance when issuing commands to graphics APIs, but the visible changes are mostly in two areas: XR (VR + AR) and baked lighting.


From the XR standpoint, a few additions stand out. First, this version now supports Google Tango and Windows Mixed Reality, the latter of which is tied to the Windows 10 Fall Creators Update, so it makes sense that Unity would have support in the version before that gets released (October 17th). In terms of features, the editor now supports emulating a Vive headset, so you can test some VR elements without having a headset. I expect this will mostly be good for those who want to do a bit of development in places where they don’t have access to their headset, although that’s blind speculation from my standpoint.

The other area that got a boost is baked global illumination. Unity started introducing their new Progressive Lightmapping feature in Unity 5.6, and it bakes lighting into the scenes in the background as you work. This update allows you to turn shadows on and off on a per-object basis, and it supports double-sided materials. You cannot have independent lighting calculations for the front and back of a triangle... if you want that, then you will need to give some volume to your models. This is mostly for situations like the edge of a level, so you don’t need to create a second wall facing away from the playable area to block light coming in from outside the playable area.

I’m not sure when the official release is, but it looks like the final, supported build is out now.

Source: Unity