Unreal Engine 4.22 Preview 1 Published: Initial DXR Support

Subject: Graphics Cards | February 12, 2019 - 03:56 PM |
Tagged: pc gaming, ue4, epic games, dxr, DirectX 12, microsoft

The upcoming version of Unreal Engine, 4.22, will include several new features.

The most interesting addition for our audience is probably “Early Access” support for DirectX Raytracing (DXR) on DirectX 12. This includes the low-level framework to cast and evaluate rays in shaders (although they don’t clarify whether that means hand-written shaders, nodes for graph-based shaders, or both) as well as higher-level features that use DXR, such as area lights, soft shadows, and reflections. They have also added a denoiser for shadows, reflections, and ambient occlusion, which should improve image quality at lower sample counts.

[Image: Epic's Star Wars “Reflections” ray tracing demo]

If you remember NVIDIA’s RTX announcement, many of their first-party demos were built using Unreal Engine 4. This includes the Star Wars demo with the two Stormtroopers putting their feet in their mouths on an elevator with their boss. It makes sense that Epic would be relatively far along in RTX support, especially just before GDC.

A few other additions include Visual Studio 2019 support (although Visual Studio 2017 remains the default). The new Unreal Audio Engine, the complete rewrite of the original system that started a few years ago, is now enabled by default for new projects. The old audio system was a bit of a mess and, worse, varied from platform to platform.

Unreal Engine 4.22 also (experimentally) opts in to the much longer file and path names that were introduced with the Windows 10 Anniversary Update. The previous limit was 260 characters for a full path, defined as MAX_PATH in Win32. I’m not sure what the new limit is, but I think it’s 32,767 characters after expansion. I could be wrong, though.
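
Since the Anniversary Update, an application can opt in to long paths with a longPathAware element in its manifest (alongside the system-wide LongPathsEnabled registry switch); before that, the usual escape hatch was the \\?\ extended-length prefix on the Unicode file APIs. Below is a minimal sketch of the prefix approach; the path itself is hypothetical:

```cpp
// Minimal sketch (not from the article): opening a file whose full path
// exceeds the classic Win32 limit. MAX_PATH (260) caps ordinary paths;
// prefixing with \\?\ and using the wide-character API raises the ceiling
// to roughly 32,767 characters.
#include <windows.h>
#include <iostream>

int main() {
    std::wcout << L"Classic limit (MAX_PATH): " << MAX_PATH << L" characters\n";

    // The \\?\ prefix tells the Unicode file APIs to skip normalization
    // and hand the long path straight to the file system.
    const wchar_t* longPath = L"\\\\?\\C:\\some\\very\\deep\\directory\\file.txt";

    HANDLE file = CreateFileW(longPath, GENERIC_READ, FILE_SHARE_READ,
                              nullptr, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL,
                              nullptr);
    if (file == INVALID_HANDLE_VALUE) {
        std::wcout << L"Open failed (error " << GetLastError() << L")\n";
        return 1;
    }
    CloseHandle(file);
    return 0;
}
```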

If you have the Epic Launcher installed, whether it’s for Unreal Engine, Fortnite, something from the Epic Store, Unreal Tournament 4, or whatever, then you can check out Unreal Engine 4.22 for free. (Royalties apply under certain circumstances… but, at that point, you are making money off of it.)

Source: Epic Games

February 12th Update for Battlefield V Adds DLSS

Subject: Graphics Cards | February 12, 2019 - 02:53 PM |
Tagged: pc gaming, battlefield V, ea, dice, nvidia, DLSS, dxr

The Battlefield V Tides of War Chapter 2: Lightning Strikes Update #3 patch, beyond sounding like a Final Fantasy title, has quite a few major improvements. The headlining feature is improved RTX support, which we will discuss shortly, but fans of the game may appreciate the other bullet points, too.

[Image: Battlefield V]

But first, because we are a computer hardware site, the RTX stuff. DLSS, which was recently added to 3DMark (where it greatly improved image quality), has been added to Battlefield V. This setting uses machine learning to produce a best guess at an antialiased image, rather than calculating one with a direct algorithm (such as TXAA or FXAA). Now that MSAA is somewhat uncommon, because it is incompatible with certain rendering techniques (deferred shading in particular), we’re stuck with either post-process antialiasing or super-sampling. Super-sampling is expensive, so it’s usually either FXAA, which tries to find edges and soften them, or TXAA, which gives neighboring frames different sub-pixel positions and blends them. Both approaches have issues. TXAA is considered the “higher-end” option, although it can smear and ghost when objects move quickly. Because DLSS is basically a shortcut to something that looks like super-sampling, it should avoid many of these issues.
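
To make the temporal half of that concrete, here is a small illustrative sketch, not DICE’s or NVIDIA’s actual code, of how a temporal AA pass typically picks a different sub-pixel offset each frame. A low-discrepancy Halton(2,3) sequence is a common choice; the offsets nudge the projection matrix so each frame samples slightly different positions, which are then blended with reprojected history:

```cpp
// Illustrative temporal AA jitter: one sub-pixel offset per frame, cycling
// through a Halton(2,3) sequence. Real engines apply these offsets to the
// projection matrix before rendering each frame.
#include <cstdio>

// Radical inverse in a given base: the core of the Halton sequence.
static double radicalInverse(int index, int base) {
    double result = 0.0, fraction = 1.0 / base;
    while (index > 0) {
        result += (index % base) * fraction;
        index /= base;
        fraction /= base;
    }
    return result;
}

int main() {
    // Print the first 8 jitter offsets, remapped from [0,1) to [-0.5, 0.5)
    // pixels; one offset per frame.
    for (int frame = 1; frame <= 8; ++frame) {
        double jx = radicalInverse(frame, 2) - 0.5;
        double jy = radicalInverse(frame, 3) - 0.5;
        std::printf("frame %d: jitter = (%+.3f, %+.3f) px\n", frame, jx, jy);
    }
    return 0;
}
```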

DXR raytracing performance was also improved.

Okay, now the tech enthusiasts can stop reading – it’s time for the fans.

Vaultable object detection is said to be much improved in this release. DICE acknowledges that Battlefield V movement wasn’t as smooth as it should be: there were a lot of waist-high barriers that players could get stuck behind, which the vaulting system should propel them over. It should be much easier to move around the map after this update, which is good for people like me who like to sneak around and flank.

DICE has also discussed several netcode changes, such as adding more damage updates per packet and fixing cases where damage should have been ignored but wasn’t, or where healing should have occurred but was ignored, and so forth. Basically, all of the netcode improvements relate to health or damage in some way, which is a good area to focus on.

Also, the Rush game mode, introduced in the Battlefield Bad Company sub-franchise, will return on March 7th "for a limited time"... whatever they mean by that.

The update should be available now.

Source: EA / DICE

Podcast #522 - Intel i9-9980XE, Secure HDDs, RTX in BFV, and more!

Subject: General Tech | November 16, 2018 - 01:52 PM |
Tagged: RTX 4000, quadro, podcast, nvidia, Intel, i9-9980XE, dxr, BFV, 7980xe, 2990wx

PC Perspective Podcast #522 - 11/15/18

Join us this week for discussion on Intel's new i9-9980XE, hardware-encrypted HDDs, RTX in BFV, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Allyn Malventano, Jeremy Hellstrom, Josh Walrath, Ken Addison, and Sebastian Peak

Peanut Gallery: Alex Lustenberg

Program length: 1:12:08

Podcast topics of discussion:

  1. Week in Review:
  2. News items of interest:
  3. Picks of the Week:
    1. Allyn: A book on retro console stuff - all disassembled (‘The Game Console’)
    2. Jeremy: A robot arm named Dexter, what could go wrong?
  4. Closing/outro

Podcast #492 - MyDigitalSSD, CalDigit Tuff Drive, and more!

Subject: General Tech | March 22, 2018 - 12:37 PM |
Tagged: winml, vive pro, video, Tobii, SBX, rtx, qualcomm, podcast, pny, MyDigitalSSD, logitech, htc, G560, G513, dxr, CS900, corsair, caldigit, AX1600i

PC Perspective Podcast #492 - 03/22/18

Join us this week for MyDigitalSSD, CalDigit Tuff Drive, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Jim Tanous, Jeremy Hellstrom, Josh Walrath

Peanut Gallery: Alex Lustenberg

Program length: 1:08:16

Podcast topics of discussion:
  1. Week in Review:
  2. RX Bar
  3. News items of interest:
  4. Picks of the Week:
    1. 1:00:55 Josh: My kid loves them.
    2. 1:03:30 Jim: Xbox Game Pass
  5. Closing/outro
 
Manufacturer: Microsoft

O Rayly? Ya Rayly. No Ray!

Microsoft has just announced a raytracing extension to DirectX 12, called DirectX Raytracing (DXR), at the 2018 Game Developers Conference in San Francisco.

[Image: DirectX 12 logo]

The goal is not to completely replace rasterization… at least not yet. The technology will mostly be used for effects that require supplementary datasets, such as reflections, ambient occlusion, and refraction. Rasterization, the typical way that 3D geometry gets drawn on a 2D display, converts triangle coordinates into screen coordinates, and then a point-in-triangle test runs across every sample. This will likely occur once per AA sample (skipping pixels that the triangle can’t possibly cover, such as those outside the triangle’s bounding box, but that’s just optimization).
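
To make that coverage test concrete, here is a minimal sketch using edge functions, the standard formulation of the point-in-triangle test. It is illustrative only; real rasterizers evaluate this in fixed-point hardware across many samples in parallel:

```cpp
// Point-in-triangle coverage testing via edge functions. A sample is
// covered if it lies on the interior side of all three triangle edges.
#include <cstdio>

struct Vec2 { float x, y; };

// Signed area of the parallelogram spanned by (b-a) and (p-a): positive on
// one side of edge ab, negative on the other.
static float edge(Vec2 a, Vec2 b, Vec2 p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

static bool insideTriangle(Vec2 v0, Vec2 v1, Vec2 v2, Vec2 p) {
    // Counter-clockwise winding assumed.
    return edge(v0, v1, p) >= 0 && edge(v1, v2, p) >= 0 && edge(v2, v0, p) >= 0;
}

int main() {
    Vec2 v0{0, 0}, v1{8, 0}, v2{0, 8};
    Vec2 samples[] = {{2, 2}, {7, 7}};
    for (Vec2 s : samples)
        std::printf("(%g, %g): %s\n", s.x, s.y,
                    insideTriangle(v0, v1, v2, s) ? "covered -> run pixel shader"
                                                  : "not covered");
    return 0;
}
```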

[Image: rasterized triangle coverage on a sample grid]

For rasterization, each triangle is laid on a 2D grid corresponding to the draw surface.
If any sample is in the triangle, the pixel shader is run.
This example shows the rotated grid MSAA case.

A program called a pixel shader is then run with a set of data that the GPU gathers for every valid pixel in the triangle. This data typically includes things like world coordinates, screen coordinates, texture coordinates, nearby vertices, and so forth. It still lacks a lot of information, though, especially about things that are not visible to the camera. The application is free to provide other sources of data for the shader to crawl… but what?

  • Cubemaps are useful for reflections, but they don’t necessarily match the scene.
  • Voxels are useful for lighting, as seen with NVIDIA’s VXGI and VXAO.

This is where DirectX Raytracing comes in. There are quite a few components to it, but it’s basically a new pipeline that handles how rays are cast into the environment. After being queued, it starts out with a ray-generation stage, and then, depending on what happens to the ray in the scene, closest-hit, any-hit, and miss shaders run. Ray generation lets the developer set up how rays are cast via an HLSL intrinsic, TraceRay (which is a clever way of invoking them, by the way). This function takes an origin and a direction, so you could choose to, for example, cast rays only in the direction of lights if your algorithm were approximating partially occluded soft shadows from a non-point light. (There are better algorithms for that, but it’s the first example off the top of my head.) The closest-hit, any-hit, and miss shaders occur at the point where the traced ray ends.
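
As a rough illustration of that origin-plus-direction setup, here is a sketch of aiming shadow feeler rays at jittered points on an area light. It is plain C++ rather than the HLSL a real DXR ray-generation shader would use, and the scene values are hypothetical:

```cpp
// Hypothetical sketch: given a shaded point, build the origin and direction
// that a TraceRay-style call takes, aiming at jittered points on a square
// area light. The fraction of rays that reach the light unoccluded
// approximates the soft shadow term.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

int main() {
    std::mt19937 rng{42};
    std::uniform_real_distribution<float> u(-0.5f, 0.5f);

    Vec3 surfacePoint{1.0f, 0.0f, 2.0f};  // point being shaded
    Vec3 lightCenter{0.0f, 5.0f, 0.0f};   // center of the area light

    for (int i = 0; i < 4; ++i) {
        // Pick a jittered point on the light and aim a ray at it.
        Vec3 target{lightCenter.x + u(rng), lightCenter.y, lightCenter.z + u(rng)};
        Vec3 dir = normalize({target.x - surfacePoint.x,
                              target.y - surfacePoint.y,
                              target.z - surfacePoint.z});
        // Conceptually: TraceRay(origin = surfacePoint, direction = dir)
        std::printf("ray %d: dir = (%.3f, %.3f, %.3f)\n", i, dir.x, dir.y, dir.z);
    }
    return 0;
}
```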

To connect this with current technology, imagine that ray-generation is like a vertex shader in rasterization, where it sets up the triangle to be rasterized, leading to pixel shaders being called.

microsoft-2018-gdc-directx12raytracing-multibounce.png

Even more interesting: the closest-hit, any-hit, and miss shaders can call TraceRay themselves, which enables multi-bounce and other recursive algorithms (see the figure above). The obvious use case is reflections, which is the headline of the GDC talk, but Microsoft wants the pipeline to be as general as possible, aligning with the evolution of GPUs. Looking at NVIDIA’s VXAO implementation, ambient occlusion also seems like a natural fit for a raytracing algorithm.
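
To show how the stages compose, here is a conceptual CPU-side sketch; it is an analogy only, not the DXR API (the real stages are HLSL shaders that the GPU invokes through TraceRay, and the any-hit stage, used for things like alpha-tested transparency, is omitted). A generated ray hits a mirror floor, the closest-hit logic traces a second bounce, and rays that hit nothing fall through to the miss logic:

```cpp
// Conceptual analogy only: ray generation, closest-hit, and miss expressed
// as plain functions, with the closest-hit tracing a second (reflection)
// bounce. The scene is a single mirror "floor" at y == 0.
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, direction; };

static Vec3 traceRay(const Ray& ray, int depth); // forward declaration

// Miss "shader": the ray left the scene, so return a sky color.
static Vec3 missShader(const Ray&) {
    return {0.2f, 0.3f, 0.8f};
}

// Pretend intersection test against the floor plane.
static bool intersectScene(const Ray& r, Vec3* hitPoint) {
    if (r.direction.y >= 0.0f) return false;      // never reaches the floor
    float t = -r.origin.y / r.direction.y;
    *hitPoint = {r.origin.x + t * r.direction.x, 0.0f,
                 r.origin.z + t * r.direction.z};
    return true;
}

// Closest-hit "shader": shade the surface and, as DXR allows, call the
// trace function again for a mirror bounce (multi-bounce recursion).
static Vec3 closestHitShader(const Ray& ray, Vec3 hit, int depth) {
    if (depth >= 2) return {0.1f, 0.1f, 0.1f};    // recursion cut-off
    Ray bounce{hit, {ray.direction.x, -ray.direction.y, ray.direction.z}};
    Vec3 r = traceRay(bounce, depth + 1);
    return {0.5f * r.x, 0.5f * r.y, 0.5f * r.z};  // 50% reflective floor
}

static Vec3 traceRay(const Ray& ray, int depth) {
    Vec3 hit;
    return intersectScene(ray, &hit) ? closestHitShader(ray, hit, depth)
                                     : missShader(ray);
}

int main() {
    // "Ray generation": normally one ray per pixel; here, a single ray
    // aimed down at the mirror floor.
    Ray primary{{0.0f, 1.0f, 0.0f}, {0.3f, -1.0f, 0.2f}};
    Vec3 c = traceRay(primary, 0);
    std::printf("color = (%.2f, %.2f, %.2f)\n", c.x, c.y, c.z);
    return 0;
}
```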

Speaking of data structures, Microsoft also detailed what it calls the acceleration structure, which is composed of two levels. The top level contains per-object metadata, like its transformation and whatever other data the developer wants to attach. The bottom level contains the geometry. The briefing states “essentially vertex and index buffers”, so we asked for clarification. DXR requires that triangle geometry be specified as vertex positions in either 32-bit or 16-bit float3 values. There is also a stride property, so developers can tweak data alignment and reuse their rasterization vertex buffer, as long as the positions are HLSL float3, either 16-bit or 32-bit.
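
For reference, here is a sketch of those two levels using struct names from the DXR API as it later shipped in d3d12.h (Windows 10 SDK 1809 or newer). The GPU addresses and the stride are placeholders, and a real application would follow these descriptors with an actual build call on the command list:

```cpp
// Sketch of the two-level acceleration structure layout described above.
// Descriptors only; not a complete build.
#include <d3d12.h>

void describeAccelerationStructure(D3D12_GPU_VIRTUAL_ADDRESS vertexBufferVA,
                                   D3D12_GPU_VIRTUAL_ADDRESS blasResultVA) {
    // Bottom level: the geometry itself, "essentially vertex and index
    // buffers". Positions must be float3; the stride lets you point DXR at
    // an existing rasterization vertex buffer.
    D3D12_RAYTRACING_GEOMETRY_DESC geometry = {};
    geometry.Type = D3D12_RAYTRACING_GEOMETRY_TYPE_TRIANGLES;
    geometry.Flags = D3D12_RAYTRACING_GEOMETRY_FLAG_OPAQUE;
    geometry.Triangles.VertexBuffer.StartAddress = vertexBufferVA;
    geometry.Triangles.VertexBuffer.StrideInBytes = 32; // e.g. interleaved pos/normal/uv
    geometry.Triangles.VertexFormat = DXGI_FORMAT_R32G32B32_FLOAT; // 32-bit float3
    geometry.Triangles.VertexCount = 3;

    D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_INPUTS bottomLevel = {};
    bottomLevel.Type = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_TYPE_BOTTOM_LEVEL;
    bottomLevel.DescsLayout = D3D12_ELEMENTS_LAYOUT_ARRAY;
    bottomLevel.NumDescs = 1;
    bottomLevel.pGeometryDescs = &geometry;

    // Top level: per-object metadata -- a transform, an ID the shaders can
    // read, and a reference to the bottom-level structure it instances.
    D3D12_RAYTRACING_INSTANCE_DESC instance = {};
    instance.Transform[0][0] = 1.0f; // identity 3x4 transform
    instance.Transform[1][1] = 1.0f;
    instance.Transform[2][2] = 1.0f;
    instance.InstanceID = 0;
    instance.InstanceMask = 0xFF;
    instance.AccelerationStructure = blasResultVA;

    (void)bottomLevel; (void)instance; // a real app would now call
    // BuildRaytracingAccelerationStructure on a DXR-capable command list.
}
```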

As for the tools to develop this in…

[Image: Microsoft PIX]

Microsoft announced PIX back in January 2017. It is a debugging and performance analyzer for 64-bit DirectX 12 applications. Microsoft will upgrade it to support DXR as soon as the API is released (specifically, “Day 1”). This includes the API calls, the raytracing pipeline resources, the acceleration structure, and so forth. As usual, you can expect Microsoft to support its APIs with quite decent (not perfect, but decent) documentation and tools. They do it well, and they want to make sure the tools are available when the API is.

[Image: Example of DXR via EA's in-development SEED engine]

In short, raytracing is here, but it’s not taking over rasterization. It doesn’t need to. Microsoft is just giving game developers another, standardized mechanism to gather supplementary data for their games. Several game engines have already announced support for the technology, including the usual suspects of top-tier game technology:

  • Frostbite (EA/DICE)
  • SEED (EA)
  • 3DMark (Futuremark)
  • Unreal Engine 4 (Epic Games)
  • Unity Engine (Unity Technologies)

They also said, “and several others we can’t disclose yet”, so this list is not even complete. But, yeah, if you have Frostbite, Unreal Engine, and Unity, then you have a sizeable market as it is. There is always a question of how deeply each of these engines will support the technology. Currently, raytracing code is not portable outside of DirectX 12, because the API is literally being announced today, and each of these engines intends to support more than just Windows 10 and Xbox.

Still, we finally have a standard for raytracing, which should drive vendors to optimize in a specific direction. From there, it's just a matter of someone taking the risk to actually use the technology for a cool work of art.

If you want to read more, check out Ryan's post about the also-announced RTX, NVIDIA's raytracing technology.

NVIDIA RTX Technology Accelerates Ray Tracing for Microsoft DirectX Raytracing API

Subject: Graphics Cards | March 19, 2018 - 01:00 PM |
Tagged: rtx, nvidia, dxr

The big news from the Game Developers Conference this week was Microsoft’s reveal of its work on a new ray tracing API for DirectX called DirectX Raytracing. As the name would imply, this is a new initiative to bring the image quality improvements of ray tracing to consumer hardware with the push of Microsoft’s DX team. Scott already has a great write-up on that news and the current and future implications for PC gamers, so I highly encourage you all to read that over before diving into this NVIDIA-specific news.

Ray tracing has long been the holy grail of real-time rendering. It is the gap between movies and games: though ray tracing continues to improve in performance, it still takes the power of offline server farms to render the images for your favorite flicks. Modern game engines continue to use rasterization, an efficient method for rendering graphics, but one that depends on tricks and illusions to recreate the intended image. Ray tracing inherently solves the problems that rasterization works around, including shadows, transparency, refraction, and reflection, but it does so at a prohibitive performance cost. That will be changing with Microsoft’s enablement of ray tracing through a common API and with technology like what NVIDIA has built to accelerate it.


Alongside support and a verbal commitment to DXR, NVIDIA is announcing RTX Technology. This is a combination of hardware and software advances to improve the performance of ray tracing algorithms on its hardware, and it works hand in hand with DXR. NVIDIA believes this is the culmination of 10 years of development on ray tracing, much of which we have covered on this site from the world of professional graphics systems. Think Iray, OptiX, and more.

RTX will run only on Volta GPUs today, which limits its usefulness to gamers. With the only graphics card on the market even close to a gaming product being the $3,000 TITAN V, RTX is more of a forward-looking technology announcement for the company. We can safely assume that RTX technology will be integrated into future consumer gaming graphics cards, be that a revision of Volta or something completely different. (NVIDIA refused to acknowledge plans for any pending Volta consumer GPUs during our meeting.)

The idea I get from NVIDIA is that today’s RTX is meant as a developer enablement platform: getting developers used to the idea of adding ray tracing effects to their games and engines, and convincing them that NVIDIA provides the best hardware to get that done.

I’ll be honest with you: NVIDIA was light on the details of what RTX exactly IS and how it accelerates ray tracing. One very interesting example was first seen with the AI-powered ray tracing optimizations for OptiX at last year’s GDC. There, NVIDIA demonstrated that, using the Volta Tensor cores, it could run an AI-powered denoiser on the ray-traced image, effectively improving the quality of the resulting image and emulating much higher ray counts than are actually processed.

By using the Tensor cores with RTX for the DXR implementation on the TITAN V, NVIDIA will be able to offer image quality and performance for ray tracing well ahead of even the TITAN Xp or GTX 1080 Ti, as those GPUs do not have Tensor cores on board. Does this mean that all (or at least flagship) consumer graphics cards from NVIDIA will include Tensor cores to enable RTX performance? Obviously, NVIDIA wouldn’t confirm that, but to me it makes sense that we will see them in future generations. The scale of Tensor core integration might change based on price points, but if NVIDIA and Microsoft truly believe in the future of ray tracing to augment, and eventually significantly replace, rasterization methods, then it will be necessary.

Though that is one example of hardware-specific features being used for RTX on NVIDIA hardware, it’s not the only one in Volta. But NVIDIA wouldn’t share more.

The relationship between Microsoft DirectX Raytracing and NVIDIA RTX is a bit confusing, but it’s easiest to think of RTX as the umbrella brand for the ability to ray trace on NVIDIA GPUs. The DXR API is still the interface between the game developer and the hardware, but RTX is what gives NVIDIA the advantage over AMD and its Radeon graphics cards, at least according to NVIDIA.

DXR will still run on NVIDIA GPUs that don’t use the Volta architecture; Microsoft says that any board that supports DX12 Compute will be able to run the new API. But NVIDIA did point out that, in its mind, even with a high-end SKU like the GTX 1080 Ti, ray tracing performance will limit the ability to integrate ray tracing features and enhancements into real-time game engines in the immediate timeframe. That’s not to say it is impossible, or that some engine devs won’t spend the time to build something unique, but it is interesting to hear NVIDIA imply that only future products will benefit from ray tracing in games.

It’s also likely that we are months, if not a year or more, from seeing good integration of DXR in games at retail. And it is also possible that NVIDIA is downplaying the importance of DXR performance today in case it happens to be slower than the Vega 64 in the upcoming Futuremark benchmark release.


Alongside the RTX announcement comes GameWorks Ray Tracing, a collection of turnkey modules based on DXR. GameWorks has its own reputation, and we aren’t going to get into that here, but NVIDIA wants to position this addition as a way to “turbo charge enablement” of ray tracing effects in games.

NVIDIA believes that developers are incredibly excited about the implementation of ray tracing into game engines, and that the demos being shown at GDC this week will blow us away. I am looking forward to seeing them and to getting the reactions of major game devs on the release of Microsoft’s new DXR API. The performance impact of ray tracing will still be a hindrance to larger-scale implementations, but with DXR driving the direction with a unified standard, I still expect to see some games with revolutionary image quality by the end of the year.

Source: NVIDIA