GTC 19: NVIDIA Announces GameWorks RTX; Unreal Engine and Unity Gain DXR Support

Subject: General Tech | March 18, 2019 - 10:00 PM |
Tagged: gameworks, unreal engine, Unity, rtx, ray tracing, nvidia, GTC 19, GTC, dxr, developers

Today at GTC NVIDIA announced GameWorks RTX and the implementation of real-time ray tracing in the upcoming Unreal Engine 4.22 and in Unity's 2019.03 preview release.

NVIDIA Announces GameWorks RTX

While Pascal and non-RTX Turing support for real-time ray tracing is something of a bombshell from NVIDIA, the creation of GameWorks tools for such effects is not surprising.

“NVIDIA GameWorks RTX is a comprehensive set of tools that help developers implement real time ray-traced effects in games. GameWorks RTX is available to the developer community in open source form under the GameWorks license and includes plugins for Unreal Engine 4.22 and Unity’s 2019.03 preview release.”

NVIDIA lists these components of GameWorks RTX:

GW_RTX.PNG

  • RTX Denoiser SDK – a library that enables fast, real-time ray tracing by providing denoising techniques to lower the required ray count and samples per pixel. It includes algorithms for ray traced area light shadows, glossy reflections, ambient occlusion and diffuse global illumination.
  • Nsight for RT – a standalone developer tool that saves developers time by helping to debug and profile graphics applications built with DXR and other supported APIs.

Unreal Engine and Unity Gaining Real-Time Ray Tracing Support

DXR_GAME_ENGINES.png

And while not specific to NVIDIA hardware, news of more game engines offering integrated DXR support was also announced during the keynote:

“Unreal Engine 4.22 is available in preview now, with final release details expected in Epic’s GDC keynote on Wednesday. Starting on April 4, Unity will offer optimized, production-focused, realtime ray tracing support with a custom experimental build available on GitHub to all users with full preview access in the 2019.03 Unity release. Real-time ray tracing support from other first-party AAA game engines includes DICE/EA’s Frostbite Engine, Remedy Entertainment’s Northlight Engine and engines from Crystal Dynamics, Kingsoft, Netease and others.”

RTX may have been off to a slow start, but this will apparently be the year of real-time ray tracing after all - especially with the upcoming NVIDIA driver update adding support to the GTX 10-series and new GTX 16-series GPUs.

Source: NVIDIA

NVIDIA to Add Real-Time Ray Tracing Support to Pascal GPUs via April Driver Update

Subject: Graphics Cards | March 18, 2019 - 09:41 PM |
Tagged: unreal engine, Unity, turing, rtx, ray tracing, pascal, nvidia, geforce, GTC 19, GTC, gaming, developers

Today at GTC NVIDIA announced a few things of particular interest to gamers, including GameWorks RTX and the implementation of real-time ray tracing in upcoming versions of both Unreal Engine and Unity (we already posted the news that CRYENGINE will be supporting real-time ray tracing as well). But there is something else... NVIDIA is bringing ray tracing support to GeForce GTX graphics cards.

DXR_GPUs.png

This surprising turn means that real-time ray tracing support won’t be limited to RTX cards after all, as the install base of NVIDIA ray-tracing-capable GPUs “grows to tens of millions” with a simple driver update next month, adding the feature to both previous-gen Pascal and the new Turing GTX GPUs.

How is this possible? It’s all about the programmable shaders:

“NVIDIA GeForce GTX GPUs powered by Pascal and Turing architectures will be able to take advantage of ray tracing-supported games via a driver expected in April. The new driver will enable tens of millions of GPUs for games that support real-time ray tracing, accelerating the growth of the technology and giving game developers a massive installed base.

With this driver, GeForce GTX GPUs will execute ray traced effects on shader cores. Game performance will vary based on the ray-traced effects and on the number of rays cast in the game, along with GPU model and game resolution. Games that support the Microsoft DXR and Vulkan APIs are all supported.

However, GeForce RTX GPUs, which have dedicated ray tracing cores built directly into the GPU, deliver the ultimate ray tracing experience. They provide up to 2-3x faster ray tracing performance with a more visually immersive gaming environment than GPUs without dedicated ray tracing cores.”

An important caveat is the “2-3x faster ray tracing performance” for GeForce RTX graphics cards mentioned in the last paragraph, so expectations will need to be tempered: RT features will be less efficient running on shader cores (Pascal and GTX Turing) than they are with dedicated RT cores, as demonstrated by these charts:

BFV_CHART.png

METRO_EXODUS_CHART.png

SOTTR_CHART.png

It's going to be a busy April.

Source: NVIDIA

Microsoft Converts Unreal Engine 4 to UWP

Subject: General Tech | July 27, 2016 - 08:47 PM |
Tagged: microsoft, epic games, unreal engine, unreal engine 4, ue4, uwp

The head of Epic Games, Tim Sweeney, doesn't like UWP too much, at least as it exists today (and for noble reasons). He will not support the new software (app) platform unless Microsoft makes some clear changes that guarantee perpetual openness. There really isn't anything, technically or legally, to prevent Microsoft (or an entity with authority over Microsoft, like governments, activist groups that petition governments, and so forth) from undoing their changes going forward. If Microsoft dropped support for Win32, then, apart from applications converted using Project Centennial or something similar, their catalog would be tiny.

Ridiculously tiny.

SteamOS would kick its butt levels of tiny, let alone OSX, Android, and countless others.

As a result, Microsoft keeps it around, despite its unruliness. Functionality that is required by legitimate software makes it difficult to prevent malware, and, even without an infection, it can make the system just get junked up over time.

microsoft-2016-uwp-logo.png

UWP, on the other hand, is slimmer, contained, and authenticated with keys. This is theoretically easier to maintain, but at the expense of user control and freedom: the freedom to develop and install software anonymously and without oversight. The first iteration was with Windows RT, which was basically iOS, right down to the “you cannot ship a web browser unless it is a reskin of Internet Explorer (replace that with Safari in iOS' case)” and “content above ESRB M and PEGI 16 is banned from the OS” levels of control.

Since then, content restrictions have been relaxed, sideloading has been added, and so forth. That said, unlike the technical hurdles of Win32, there's nothing to prevent Microsoft from, in the future, saying “Okay, we have enough software for lock-in. Sideloading is being removed in Windows 10 version 2810” or something. I doubt that the current administration wants to do this, especially executives like Phil Spencer, but their unwillingness to make it impossible in the future is frustrating. This could be a few clauses in the EULA that make it easy for users to sue Microsoft if a feature is changed, and/or some chunks of code that break compatibility if certain openness features are removed.

Some people complain that he wasn't this concerned about iOS, but he already said that it was a bad decision in hindsight. Apple waved a shiny device around, and it took a few years for developers to think “Wait a minute, what did I just sign away?” iOS is, indeed, just as bad as UWP could turn into, if not worse.

Remember folks, once you build a tool for censorship, they will come. They may also have very different beliefs about what should be allowed or disallowed than you do. This is scary stuff, albeit based on good intentions.

That rant aside, Microsoft's Advanced Technology Group (ATG) has produced a fork of Unreal Engine 4, which builds UWP content. It is based upon Unreal Engine 4.12, and they have apparently merged changes up to version 4.12.5. This makes sense, of course, because that version is required to use Visual Studio 2015 Update 3.

If you want to make a game in Unreal Engine 4 for the UWP platform, then you might be able to use Microsoft's version. That said, it is provided without warranty, and there might be some bugs that cropped up, which Epic Games will probably not help with. I somehow doubt that Microsoft will have a dedicated team that merges all fixes going forward, and I don't think this will change Tim's mind (although concrete limitations that guarantee openness might...). Use at your own risk, I guess, especially if you don't care about potentially missing out on whatever is added for 4.13 and on (unless you add it yourself).

The fork is available on Microsoft's ATG GitHub, with lots of uppercase typing.

Epic Games Releases Unreal Engine 4.12

Subject: General Tech | June 1, 2016 - 06:26 PM |
Tagged: unreal engine, ue4, unreal engine 4, epic, epic games

Epic Games has released Unreal Engine 4.12, which adds quite a bit, especially cinematic tools. Those who created games or mods in Unreal Engine 3 or 4 will know Matinee, the interface for animating objects in a scene. It has finally been replaced with Sequencer, which is designed to feel more like Adobe After Effects or Adobe Premiere. They also added a bunch of features for DirectX 12 and Vulkan, but both are still in experimental preview. Vulkan, for instance, only implements rendering features for mobile, not desktop.

epic-2016-412-sequencer.jpg

Beyond Sequencer, mentioned above, Epic has also added a bunch of new rendering technologies for high-end graphics. This includes High Quality Reflections, Planar Reflections, Grass and Foliage Scalability, and Twist Corrective Animation Node. These are quite interesting for someone like me, who has been getting back into pre-rendered animation recently, but finds that typical, production renderers (such as Cycles) are quite heavy, slowing me down. Epic was interested in bringing Unreal Engine into a video production workflow, even back in Unreal Engine 3, and it's good to see a lot of attention in this area. It might be enough to move me over at some point, especially for animations that don't have a hyper-realistic style. Even better -- this level of visual quality should land in some games, too.

Unreal Engine 4.12 is now available on Epic's Launcher.

Source: Epic Games

Epic Games Releases Unreal Engine 4.11

Subject: General Tech | March 31, 2016 - 02:52 PM |
Tagged: epic games, unreal engine, unreal engine 4

It has been in preview since December, but Epic Games has finally released Unreal Engine 4.11 for developers to create awesome things with. This version focused on performance and the features that were added for Paragon, which entered early access two weeks ago. DirectX 12 is still considered experimental, and Vulkan is missing officially (although John Alcatraz has a tutorial to add it to Unreal Engine built from source), but the rendering back-end has received significant changes to accommodate the new graphics APIs in the future.

epicgames-2015-paragon-hair3.jpg

The three features that I'm most interested in, apart from free performance, are lighting channels, capsule shadows, and improved building of static light. Light channels are very difficult to implement in a deferred renderer, but Epic managed. This means that you can have dynamic lights only affect certain objects in the scene, either for performance, if enough lights are ignored to justify the cost of the channels themselves, or for special effects, like making a specific object stand out in a scene. They also added new shading models for eyes, hair, skin, and cloth, and added a bunch of interesting audio features.

Unreal Engine 4.11 is available now from Epic's Launcher. It's free to use, but Epic takes a royalty on certain revenues.

Source: Epic Games

Realistic Landscapes in Unreal Engine 4

Subject: General Tech | March 29, 2016 - 01:55 PM |
Tagged: unreal engine, epic games, art by rens

This work was featured in Unreal Engine's GDC sizzle video as "Photogrammetry", but I've only just now found out where it came from. The creator, Rense de Boer, goes by the consistent online branding of Art by Rens. He worked at DICE from 2011 through 2014 on the Battlefield franchise, and now he seems to be doing his own thing.

The environment work is stunning. The snow, slightly thawed and refrozen, covers the rocks and leaves in a way that looks absolutely real. Some of the rocks look partially moss-covered, with the plant seemingly breaking them down and coming up through the cracks. There's only so many ways that I can describe it, but it's definitely worth a look. He targets four Titan-class video cards, since he's aiming for 4K.

epic-2016-artbyrens-leaves.jpg

He hasn't announced any product yet, so we're not really sure why he's doing it. He did receive a grant from Epic Games, though. I'm not sure exactly how much, just that $500,000 USD was split 30 ways, but not uniformly (some received more than others).

Source: Art by Rens

Epic Games Releases Unreal Engine 4.9

Subject: General Tech | September 1, 2015 - 04:24 PM |
Tagged: unreal engine 4, unreal engine, ue4.9, ue4, epic games, dx12

For an engine that was released in late-March, 2014, Epic has been updating it frequently. Unreal Engine 4.9 is, as the number suggests, the tenth release (including 4.0) in just 17 months, which is less than two months per release on average. Each release is fairly sizable, too. This one has about 232 pages of release notes, plus a page and a half of credits, and includes changes for basically every system that I can think of.

The two most interesting features, for me, are Area Shadows and Full Scene Particle Collision.

Area Shadows simulates lights that are physically big and relatively close. At the edges of a shadow, the object that casts the shadow is blocking only part of the light, so wherever the shadow falls will be partially lit by the fraction of the light that can still see it. The further the shadow falls from the shadow caster, the larger that soft edge gets.

pcper-2015-softshadows.png

On paper, you can calculate this by drawing rays from either edge of each shadow-casting light to either edge of each shadow-casting object, continued to the objects that receive the shadows. If both sides of the light can see the receiver? No shadows. If both sides of the light cannot see the receiver? That light is blocked, which is a shadow. If some percent of a uniform light can see the receiver, then it will be shadowed by 100% minus that percentage. This is costly to do, unless neither the light nor any of the affected objects move. In that case, you can just store the result, which is how “static lighting” works.
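That per-receiver visibility test can be sketched in a few lines. Here is a hypothetical 2D illustration, sampling points along a segment-shaped area light and testing each ray against a disk occluder; the scene setup and function names are mine, not Epic's:

```python
import math

# Hypothetical 2D scene: a segment area light above a disk occluder.
# The shadow amount at a receiver point is the fraction of light samples
# whose ray to the receiver is blocked by the disk.

LIGHT_A = (-1.0, 5.0)    # endpoints of the segment area light
LIGHT_B = (1.0, 5.0)
OCCLUDER_C = (0.0, 2.5)  # disk occluder centre
OCCLUDER_R = 0.5         # disk occluder radius

def _segment_point_distance(a, b, p):
    """Distance from point p to the line segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))            # clamp to the segment
    cx, cy = ax + t * dx, ay + t * dy    # closest point on the segment
    return math.hypot(px - cx, py - cy)

def shadow_fraction(receiver, samples=64):
    """Fraction of the area light blocked by the occluder, as seen from receiver."""
    blocked = 0
    for i in range(samples):
        t = i / (samples - 1)
        light_pt = (LIGHT_A[0] + t * (LIGHT_B[0] - LIGHT_A[0]),
                    LIGHT_A[1] + t * (LIGHT_B[1] - LIGHT_A[1]))
        # The ray is occluded if it passes within the disk's radius of its centre.
        if _segment_point_distance(light_pt, receiver, OCCLUDER_C) < OCCLUDER_R:
            blocked += 1
    return blocked / samples
```

A receiver directly under the occluder sees the whole light blocked (umbra), a receiver off to the side sees only part of it blocked (penumbra), and a receiver far away sees none of it blocked; this is exactly the result you could cache for static lighting.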

Another interesting feature is Full Scene Particle Collision with Distance Fields. While GPU-computed particles, which are required for extremely high particle counts, already collide, distance fields allow them to collide with objects off screen. Since the user will likely be able to move the camera, this allows for longer simulations, as the user cannot cause them to glitch out by, well, playing the game. It requires SM 5.0, though, which limits it to higher-end GPUs.
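The core collision idea is straightforward to sketch. Here is a hypothetical Python illustration against the signed distance field of a sphere; the names and setup are mine, not Unreal's. A negative distance means the particle is inside the collider, whether or not the collider is on screen:

```python
import math

# A sphere collider expressed as a signed distance field (SDF).
SPHERE_C = (0.0, 0.0, 0.0)
SPHERE_R = 1.0

def sdf(p):
    """Signed distance from point p to the sphere surface (negative = inside)."""
    d = math.sqrt(sum((p[i] - SPHERE_C[i]) ** 2 for i in range(3)))
    return d - SPHERE_R

def collide(p, v):
    """If particle p penetrates the SDF, project it back to the surface and
    remove the velocity component along the (numerically estimated) normal."""
    dist = sdf(p)
    if dist >= 0.0:
        return p, v                       # not penetrating; nothing to do
    eps = 1e-4
    # Estimate the SDF gradient by finite differences; it points outward.
    n = [(sdf((p[0] + eps, p[1], p[2])) - dist) / eps,
         (sdf((p[0], p[1] + eps, p[2])) - dist) / eps,
         (sdf((p[0], p[1], p[2] + eps)) - dist) / eps]
    nl = math.sqrt(sum(c * c for c in n)) or 1.0
    n = [c / nl for c in n]
    p = tuple(p[i] - dist * n[i] for i in range(3))   # push out of the surface
    vn = sum(v[i] * n[i] for i in range(3))
    if vn < 0:                            # moving into the surface: cancel that
        v = tuple(v[i] - vn * n[i] for i in range(3))
    return p, v
```

Because the test only needs the distance field, not the depth buffer, it works identically for geometry behind the camera, which is the whole point of the feature.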

epic-2015-ue4-dx12.jpg

This is also the first release to support DirectX 12. That said, when I used a preview build, I noticed a net performance loss with my 9,000-draw-call map (which is a lot of draw calls) on my GeForce GTX 670. Epic calls it “experimental” for a reason, and I expect that a lot of work must be done to deliver tasks from an existing engine to the new, queue-based system. I will try it again just in case something changed from the preview builds. I mean, I know something did -- it had a different command line parameter before.

UPDATE (Sept 1st, 10pm ET): An interesting question was raised in the comments that we feel could be a good aside for the news post.

Anonymous asked: I don't have any experience with game engines. I am curious as to how much of a change there is for the game developer with the switch from DX11 to DX12. It seems like the engine would hide the underlying graphics APIs. If you are using one of these engines, do you actually have to work directly with DX, OpenGL, or whatever the game engine is based on? With moving to DX12 or Vulcan, how much is this going to change the actual game engine API?

Modern, cross-platform game engines are basically an API and a set of tools atop it.

For instance, I could want the current time in seconds to a very high precision. As an engine developer, I would make a function -- let's call it "GetTimeSeconds()". If the engine is running on Windows, this would likely be ((PerformanceCounter - Initial) / PerformanceFrequency) where PerformanceCounter is grabbed from QueryPerformanceCounter() and PerformanceFrequency is grabbed from QueryPerformanceFrequency(). If the engine is running on Web standards, this would be window.performance.now() / 1000, because it is provided in milliseconds.

Regardless of where GetTimeSeconds() pulls its data from, the engine's tools and the rest of its API would use GetTimeSeconds() -- unless the developer is low on performance or development time and made a block of platform-dependent junk in the middle of everything else.
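The timing example above can be sketched in Python. The function name comes from the article, and the backend selection is illustrative, though the Windows branch wraps the same Win32 calls named in the text:

```python
import sys
import time

# A minimal sketch of the engine abstraction described above: each platform
# supplies a raw clock, and the rest of the "engine" only ever calls
# get_time_seconds().

if sys.platform == "win32":
    # Windows backend: (PerformanceCounter - Initial) / PerformanceFrequency,
    # via ctypes, using the Win32 calls named in the article.
    import ctypes
    _kernel32 = ctypes.windll.kernel32
    _freq = ctypes.c_int64()
    _kernel32.QueryPerformanceFrequency(ctypes.byref(_freq))

    def _raw_counter():
        count = ctypes.c_int64()
        _kernel32.QueryPerformanceCounter(ctypes.byref(count))
        return count.value / _freq.value
else:
    # Elsewhere, time.perf_counter() is the platform's best monotonic clock.
    _raw_counter = time.perf_counter

_initial = _raw_counter()

def get_time_seconds():
    """What the rest of the engine calls, regardless of platform."""
    return _raw_counter() - _initial
```

Callers never see which backend is underneath, which is exactly why engine users rarely touch the platform (or graphics) APIs directly.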

The same is true for rendering. The engines should abstract all the graphics API stuff unless you need to do something specific. There is usually even a translation for the shader code, be it an intermediate language (or visual/flowchart representation) that's transpiled into HLSL and GLSL, or written in HLSL and transpiled into GLSL (eventually SPIR-V?).

One issue is that DX12 and Vulkan are very different from DX11 and OpenGL. Fundamentally. The latter says "here's the GPU, bind all the attributes you need and call draw" while the former says "make little command messages and put them in the appropriate pipe".

Now, for people who license an engine like Unity and Unreal, they probably won't need to touch that stuff. They'll just make objects and place them in the level using the engine developer's tools, and occasionally call various parts of the engine API that they need.

Devs with a larger budget might want to dive in and tweak stuff themselves, though.

Unreal Engine 4.9 is now available. It is free to use until your revenue triggers the royalty clauses.

Source: Epic Games

Epic Games Announces "Unreal Dev Grants"

Subject: General Tech | February 21, 2015 - 07:00 AM |
Tagged: unreal engine 4, unreal engine, epic games

On Thursday, Tim Sweeney joined the Unreal Engine 4 Twitch broadcast to announce “Unreal Dev Grants”. In short, Epic Games has set aside 5 million dollars to pass out in grants ranging from five thousand dollars ($5,000 USD) to fifty thousand dollars ($50,000 USD), with no strings attached. If you are doing something cool in, with, or involving Unreal Engine 4, you are eligible and can use the money in any way. You keep all your “intellectual property” and equity, and you do not even have any accountability requirements.

epic-ue4-dev-grants.jpg

It's free money that you can apply for, or they will even approach you with if they see you doing something awesome (you can even nominate other people's projects). The only “catch” is that your work needs to be relevant to Unreal Engine. From there, it could be anything from congratulating an awesome pull request for the engine on GitHub, to giving an indie (or even AAA) game a little bit of a financial boost. Tim Sweeney was telling stories about mowing lawns for the $3000 it took for him to launch ZZT. He mowed lawns so you don't have to.

For more information, or to nominate yourself or someone else, check out their website.

Source: Epic Games

Epic Games is disappointed in the PS4 and Xbox One?

Subject: Editorial, General Tech, Graphics Cards, Systems | May 23, 2013 - 06:40 PM |
Tagged: xbox one, xbox, unreal engine, ps4, playstation 4, epic games

Unreal Engine 4 was presented at the PlayStation 4 announcement conference through a new Elemental Demo. We noted how the quality seemed to have dropped in the eight months following E3 while the demo was being ported to the console hardware. The most noticeable differences were in the severely reduced particle counts and the non-existent fine lighting details; of course, Epic pumped the contrast in the PS4 version which masked the lack of complexity as if it were a stylistic choice.

Still, the demo was clearly weakened. The immediate reaction was to assume that Epic Games simply did not have enough time to optimize the demo for the hardware. That is true to some extent, but there are theoretical limits on how much performance you can push out of hardware at 100% perfect utilization.

Now that we know the specifications of both the PS4 and, as of recently, the Xbox One, it is time to dissect more carefully.

A recent LinkedIn post from EA Executive VP and CTO, Rajat Taneja, claims that the Xbox One and PS4 are a generation ahead of the highest-end PC on the market. While there are many ways to interpret that statement, in terms of raw performance that statement is not valid.

To the best of our current knowledge, the PlayStation 4 contains an eight-core AMD "Jaguar" CPU with an AMD GPU containing 18 GCN compute units, for a total of 1152 shader units. Without knowing operating frequencies, this chip should be somewhat faster than the Xbox One's 768 shader units across 12 GCN compute units. Sony claims a total theoretical 2 teraFLOPs of performance for the PS4, and the Xbox One is almost certainly behind that.

Back in 2011, the Samaritan Demo was created by Epic Games to persuade console manufacturers. This demo was how Epic considered the next generation of consoles to perform. They said, back in 2011, that this demo would theoretically require 2.5 teraFLOPs of performance for 30FPS at true 1080p; ultimately their demo ran on the PC with a single GTX 680, approximately 3.09 teraFLOPs.

This required performance, (again) approximately 2.5 teraFLOPs, is higher than what is theoretically possible for the consoles, which is less than 2 teraFLOPs. The PC may have more overhead than consoles, but the PS4 and Xbox One would be too slow even with zero overhead.
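The back-of-the-envelope math behind those teraFLOP figures works out as follows. Peak single-precision throughput for these GPUs is shader units x 2 FLOPs per clock (one fused multiply-add) x clock speed; note that the console clock speeds below are my assumptions, since they were not officially published at the time:

```python
# Peak theoretical single-precision throughput, in teraFLOPs.
def peak_tflops(shader_units, clock_ghz, flops_per_clock=2):
    # units * 2 ops/clock * GHz gives GFLOPS; divide by 1000 for TFLOPS.
    return shader_units * flops_per_clock * clock_ghz / 1000.0

ps4      = peak_tflops(1152, 0.800)  # 18 GCN CUs; 800 MHz is my assumption
xbox_one = peak_tflops(768, 0.853)   # 12 GCN CUs; 853 MHz is my assumption
gtx_680  = peak_tflops(1536, 1.006)  # NVIDIA's published 1006 MHz base clock
```

With those assumed clocks, the PS4 lands around 1.84 TFLOPs and the Xbox One around 1.31 TFLOPs, both under the ~2.5 TFLOPs Samaritan target, while the GTX 680 comes out at the 3.09 TFLOPs cited above.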

Now, of course, this does not account for reducing quality where it will be the least noticeable and other cheats. Developers are able to reduce particle counts and texture resolutions in barely-noticeable places; they are also able to render below 1080p or even below 720p, as was the norm for our current console generation, to save performance for more important things. Perhaps developers might even use different algorithms which achieve the same, or better, quality for less computation at the expense of more sensitivity to RAM, bandwidth, or what-have-you.

But, in the end, Epic Games did not get the ~2.5 teraFLOPs they originally hoped for when they created the Samaritan Demo. This likely explains, at least in part, why the Elemental Demo looked a little sad at Sony's press conference: it was a little FLOP.

Update, 5/24/2013: Mark Rein of Epic Games responds to the statement made by Rajat Taneja of EA. While we do not know his opinion on consoles... we know his opinion on EA's opinion:

Unreal Engine 3 compiled to asm.js

Subject: Editorial, Mobile | May 7, 2013 - 12:07 AM |
Tagged: unreal engine, firefox, asm.js

Over the weekend we published a post which detailed Javascript advancements that position the web browser as a respectable replacement for native code. Asm.js allows C-like languages to be compiled into easily optimized script that executes at near-native performance on asm.js-aware browsers, but still functions as plain Javascript otherwise. If you wish to see a presentation about asm.js and compiling native code into web code, check out an online slideshow from Alon Zakai of Mozilla.

If, on the other hand, you wish to see an example of a large application compiled for the browser: would Unreal Engine 3 suffice?

UnrealHTML5.jpg

Clearly a computer hardware website would take the effort required to run a few benchmarks, and we do not disappoint. Epic Citadel was run in its benchmark mode in Firefox 20.0.1, Firefox 22.0a2, and Google Chrome; true, it was not run for long on Chrome before the tab crashed, but you cannot blame me for trying.

Each benchmark was run at full-screen 1080p "High Performance" settings on a PC with a Core i7 3770, a GeForce GTX 670, and more available RAM than the browser could possibly even allocate. The usual Firefox framerate limit was removed; the browser was the only tab open on the same fresh profile; the setting layout.frame_rate.precise was tested in both positions, because I cannot keep up with what the state of the requestAnimationFrame callback delay is; and each scenario was performed twice and averaged.

Firefox 20.0.1

  • layout.frame_rate.precise true: 54.7 FPS
  • layout.frame_rate.precise false: 53.2 FPS

Firefox 22.0a2 (asm.js)

  • layout.frame_rate.precise true: 147.05 FPS
  • layout.frame_rate.precise false: 144.8 FPS

Google Chrome 26.0.1410.64

  • Crashy-crashy

For Unreal Engine 3 compiled into Javascript we notice an almost 3-fold improvement in average framerate with asm.js and the few other tweaks to rendering, Javascript, and WebGL performance between Firefox 20 and 22. I would say that is pretty enticing for developers who are considering compiling into web standards.

It is very enticing for Epic as well. A little over a month ago, Mark Rein and Tim Sweeney of Epic were interviewed by Gamasutra about HTML5 support for Unreal Engine. Due in part to the removal of UnrealScript in favor of game code being scripted in C++, Unreal Engine 4 will support HTML5. They are working with Mozilla to make the browser a reasonable competitor to consoles; write once, run on Mac, Windows, Linux, or anywhere compatible browsers can be found. Those familiar with my past editorials know this excites me greatly.

So what do our readers think? Comment away!