For an engine that was released in late March 2014, Epic has been updating it frequently. Unreal Engine 4.9 is, as the number suggests, the tenth release (counting 4.0) in just 17 months, which works out to less than two months per release on average. Each release is fairly sizable, too. This one has about 232 pages of release notes, plus a page and a half of credits, and it includes changes for basically every system that I can think of.

The two most interesting features, for me, are Area Shadows and Full Scene Particle Collision.


Area Shadows simulates lights that are physically large and relatively close. At the edges of a shadow, the object that casts the shadow only blocks part of the light. Wherever that partial shadow falls will still be lit by the fraction of the light that can see it, and the further the shadow falls behind its caster, the larger that soft edge gets.

On paper, you can calculate this by drawing rays from either edge of each shadow-casting light to either edge of each shadow-casting object, and continuing them on to the objects that receive the shadows. If both sides of the light can see the receiver? No shadow. If neither side of the light can see the receiver? That light is fully blocked, which is a hard shadow. If some percentage of a uniform light can see the receiver, then the receiver is shadowed by 100% minus that percentage. This is costly to do in real time, unless neither the light nor any of the affected objects move. In that case, you can just store the result once, which is how “static lighting” works.
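If you want a feel for the geometry, the classic similar-triangles estimate gives the width of that soft edge (the penumbra) from the size of the light and the distances involved. The snippet below is just a back-of-the-envelope sketch of that relationship, not Epic's implementation, and the numbers in it are made up for illustration.

```cpp
// Toy estimate of penumbra (soft-edge) width using similar triangles.
// Not Epic's algorithm, just the geometric relationship described above.
#include <cstdio>

// All distances are measured along the light direction, starting at the light.
double PenumbraWidth(double lightSize, double blockerDistance, double receiverDistance)
{
    // The further the receiver is behind the blocker, the wider the soft edge.
    return lightSize * (receiverDistance - blockerDistance) / blockerDistance;
}

int main()
{
    // A 0.5 m wide light, a caster 2 m away, and a wall 6 m away (hypothetical numbers).
    std::printf("penumbra: %.2f m\n", PenumbraWidth(0.5, 2.0, 6.0)); // prints 1.00 m
    return 0;
}
```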


Another interesting feature is Full Scene Particle Collision with Distance Fields. GPU-computed particles, which are required for extremely high particle counts, can already collide, but distance fields allow them to collide with objects that are off screen. Since the user will likely be able to move the camera, this allows for longer simulations because the user cannot cause them to glitch out by, well, playing the game. It requires Shader Model 5.0, though, which limits it to higher-end GPUs.
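To give a rough idea of how a distance field is used for collision: each particle samples the field at its position, and if the signed distance is smaller than the particle's radius, it gets pushed back out along the field's gradient, which approximates the surface normal. The sketch below is a generic, hypothetical version of that idea, not Unreal's actual implementation; SampleSDF() here is a hard-coded sphere standing in for whatever field the engine would bake from the scene.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Hypothetical signed distance field: a sphere of radius 1 at the origin.
// A real engine would sample a 3D texture built from the scene's meshes.
float SampleSDF(Vec3 p)
{
    return std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z) - 1.0f;
}

// Approximate the field's gradient (the surface normal) with central differences.
Vec3 SDFNormal(Vec3 p)
{
    const float e = 0.01f;
    Vec3 n = {
        SampleSDF({p.x + e, p.y, p.z}) - SampleSDF({p.x - e, p.y, p.z}),
        SampleSDF({p.x, p.y + e, p.z}) - SampleSDF({p.x, p.y - e, p.z}),
        SampleSDF({p.x, p.y, p.z + e}) - SampleSDF({p.x, p.y, p.z - e}),
    };
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return n * (1.0f / len);
}

// If the particle has sunk inside the surface, push it back out along the normal.
void ResolveParticleCollision(Vec3& position, float radius)
{
    float d = SampleSDF(position);
    if (d < radius)
        position = position + SDFNormal(position) * (radius - d);
}

int main()
{
    Vec3 p = {0.0f, 0.9f, 0.0f}; // slightly inside the unit sphere
    ResolveParticleCollision(p, 0.05f);
    std::printf("resolved y: %.3f\n", p.y); // pushed back out to ~1.050
    return 0;
}
```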

This is also the first release to support DirectX 12. That said, when I used a preview build, I noticed a net-negative performance change with my 9,000 draw call (which is a lot) map on my GeForce GTX 670. Epic calls it “experimental” for a reason, and I expect that a lot of work must be done to map tasks from an existing engine onto the new, queue-based submission model. I will try it again in case something changed since the preview builds. I know something did, at least: it used a different command line parameter before.

UPDATE (Sept 1st, 10pm ET): An interesting question was raised in the comments that we feel could be a good aside for the news post.

Anonymous asked: I don't have any experience with game engines. I am curious as to how much of a change there is for the game developer with the switch from DX11 to DX12. It seems like the engine would hide the underlying graphics APIs. If you are using one of these engines, do you actually have to work directly with DX, OpenGL, or whatever the game engine is based on? With moving to DX12 or Vulkan, how much is this going to change the actual game engine API?

Modern, cross-platform game engines are basically an API and a set of tools atop it.

For instance, I could want the current time in seconds to a very high precision. As an engine developer, I would make a function, let's call it "GetTimeSeconds()". If the engine is running on Windows, this would likely be ((PerformanceCounter - Initial) / PerformanceFrequency), where PerformanceCounter is grabbed from QueryPerformanceCounter() and PerformanceFrequency is grabbed from QueryPerformanceFrequency(). If the engine is running on Web standards, this would be window.performance.now() / 1000, because that value is provided in milliseconds.
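Here is roughly what that abstraction might look like in C++. The function name GetTimeSeconds() comes from the example above, but the rest is a hypothetical sketch rather than Unreal's actual platform layer.

```cpp
#include <cstdio>

#ifdef _WIN32
#include <windows.h>

// Windows path: QueryPerformanceCounter() ticks divided by the counter frequency.
double GetTimeSeconds()
{
    static LARGE_INTEGER frequency = [] {
        LARGE_INTEGER f;
        QueryPerformanceFrequency(&f);
        return f;
    }();
    static LARGE_INTEGER initial = [] {
        LARGE_INTEGER c;
        QueryPerformanceCounter(&c);
        return c;
    }();

    LARGE_INTEGER now;
    QueryPerformanceCounter(&now);
    return double(now.QuadPart - initial.QuadPart) / double(frequency.QuadPart);
}

#else
#include <chrono>

// Portable fallback: the C++ standard library's monotonic clock.
double GetTimeSeconds()
{
    static const auto initial = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(std::chrono::steady_clock::now() - initial).count();
}
#endif

int main()
{
    // The rest of the engine only ever sees seconds, regardless of platform.
    std::printf("elapsed: %f s\n", GetTimeSeconds());
    return 0;
}
```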

Regardless of where GetTimeSeconds() pulls its data from, the engine's tools and the rest of its API would use GetTimeSeconds() — unless the developer is low on performance or development time and made a block of platform-dependent junk in the middle of everything else.

The same is true for rendering. The engines should abstract all of the graphics API stuff unless you need to do something specific. There is usually even a translation layer for the shader code, be it an intermediate language (or a visual/flowchart representation) that is transpiled into HLSL and GLSL, or shaders written in HLSL and transpiled into GLSL (and eventually SPIR-V?).

One issue is that DX12 and Vulkan are very different from DX11 and OpenGL. Fundamentally. The latter say "here's the GPU, bind all the attributes you need and call draw", while the former say "make little command messages and put them in the appropriate pipe".
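As a caricature of that difference, here is a toy sketch in C++. The types and calls are made up for illustration (they are not real D3D or Vulkan APIs); the point is just the shape of each model: immediate "bind and draw" versus recording commands and submitting them to a queue.

```cpp
#include <vector>
#include <cstdio>

// Hypothetical DX11/OpenGL-style model: one implicit context, bind state, then draw.
struct ImmediateContext {
    void BindVertexBuffer(int id) { std::printf("bind vb %d\n", id); }
    void BindShader(int id)       { std::printf("bind shader %d\n", id); }
    void Draw(int vertexCount)    { std::printf("draw %d\n", vertexCount); }
};

// Hypothetical DX12/Vulkan-style model: record commands, then submit them to a queue.
struct Command { const char* name; int arg; };

struct CommandList {
    std::vector<Command> commands;
    void Record(const char* name, int arg) { commands.push_back({name, arg}); }
};

struct Queue {
    void Submit(const CommandList& list) {
        for (const Command& c : list.commands)
            std::printf("execute %s %d\n", c.name, c.arg);
    }
};

int main()
{
    // Old model: the driver schedules work behind one big mutable state machine.
    ImmediateContext ctx;
    ctx.BindVertexBuffer(1);
    ctx.BindShader(2);
    ctx.Draw(36);

    // New model: the application builds command lists (possibly on many threads)
    // and explicitly hands them to a queue for execution.
    CommandList list;
    list.Record("bind vb", 1);
    list.Record("bind shader", 2);
    list.Record("draw", 36);

    Queue queue;
    queue.Submit(list);
    return 0;
}
```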

Now, people who license an engine like Unity or Unreal probably won't need to touch that stuff. They'll just make objects and place them in the level using the engine developer's tools, and occasionally call whatever parts of the engine API they need.

Devs with a larger budget might want to dive in and tweak stuff themselves, though.

Unreal Engine 4.9 is now available. It is free to use until your revenue reaches the point where Epic's royalty clauses kick in.