Benchmark Overview

We finally get a real-world way to evaluate DX12 performance with the new Ashes of the Singularity benchmark.

I knew that the move to DirectX 12 was going to be a big shift for the industry. Since the introduction of the AMD Mantle API alongside the Hawaii GPU architecture, we have been inundated with game developers and hardware vendors talking about the potential benefits of lower-level APIs, which give game developers and game engines more direct access to GPU hardware and more flexible CPU threading. The results, we were told, would mean that your current hardware could take you further, and that future games and applications could fundamentally change how they are built to dramatically enhance gaming experiences.
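
To make that threading claim concrete, below is a minimal C++ sketch of the DX12 submission model the vendors have been touting: each CPU thread records its own command list in parallel, and the main thread submits everything in a single batch. This is a hypothetical illustration, not code from the Nitrous engine or any shipping title.

```cpp
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create the device on the default adapter (error handling omitted).
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // One direct command queue receives the work from every thread's list.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    const int kThreads = 4;
    ComPtr<ID3D12CommandAllocator>    allocators[kThreads];
    ComPtr<ID3D12GraphicsCommandList> lists[kThreads];
    ID3D12CommandList*                submit[kThreads];

    // Command allocators are not thread-safe, so each recording thread
    // gets its own allocator and command list.
    for (int i = 0; i < kThreads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        submit[i] = lists[i].Get();
    }

    // Record in parallel. A real engine would bind pipeline state and issue
    // thousands of draw calls per list here; the lists are left empty so the
    // sketch stays valid without a full rendering setup.
    std::vector<std::thread> workers;
    for (int i = 0; i < kThreads; ++i)
        workers.emplace_back([&, i] { lists[i]->Close(); });
    for (auto& t : workers)
        t.join();

    // One submission for all of the work recorded above. DX11 has no true
    // equivalent: its immediate context serializes submission on one thread.
    queue->ExecuteCommandLists(kThreads, submit);

    // Wait for the GPU to finish before tearing everything down.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    queue->Signal(fence.Get(), 1);
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);
    return 0;
}
```

DX11 did offer deferred contexts for multi-threaded recording, but driver-side serialization limited the benefit in practice; DX12 makes this parallel path a first-class part of the API.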

I knew that reader interest in DX12 was outstripping my expectations when I did a live blog of the official DX12 unveil by Microsoft at GDC. In a format that consisted simply of my text commentary and photos of the slides being shown (no video at all), we had more than 25,000 live readers who stayed engaged the whole time. Comments and questions flew into the event, more than my staff or I could possibly handle in real time. It turned out that gamers were indeed very interested in what DirectX 12 might offer them with the release of Windows 10.

Today we are taking a look at the first real-world gaming benchmark that utilizes DX12. Back in March I was able to do some early testing with Futuremark's 3DMark API Overhead test, an API-specific benchmark that evaluates the overhead implications of DX12, DX11 and even AMD Mantle. That first look at DX12 was interesting and painted an impressive picture of the potential benefits of Microsoft's new API, but it wasn't built on a real game engine. In our Ashes of the Singularity benchmark testing today, we finally get an early look at what a real implementation of DX12 looks like.

And as you might expect, not only are the results interesting, but there is a significant amount of controversy around what those results actually tell us. AMD has one story, NVIDIA another, and Stardock and the Nitrous engine developers yet another. It’s all incredibly intriguing.

Game and Engine Overview

I don’t claim to run a game review site, but the Ashes of the Singularity benchmark is important because it is based on a real game of the same name, coming out later this year. Built on the Oxide Nitrous engine that AMD featured during Mantle development and that has now been ported over to DX12, Ashes is a massive-scale real-time strategy game for lovers of the genre.

From the developer’s site:

What is Ashes of the Singularity?

Ashes of the Singularity is a real-time strategy game set in the far future that redefines the possibilities of RTS with the unbelievable scale provided by Oxide Games’ groundbreaking Nitrous engine.

What makes Ashes of the Singularity different from other RTS games?

Until now, terrestrial strategy games have had to substantially limit the number of units on screen. As a result, these RTS's could be described as battles.

Thanks to recent technological improvements such as multi-core processors and 64-bit computing combined with the invention of a new type of 3D engine called Nitrous, Ashes of the Singularity games can be described as a war across an entire world without abstraction. Thousands or even tens of thousands of individual actors can engage in dozens of battles simultaneously.

This level of on-screen units is something that Oxide claims is made possible through new low-level APIs like Vulkan and DirectX 12. Though I will go into more detail on the following page, several scenes in this benchmark (about a third of them) exceed 20,000 draw calls per frame, an amount that DX11 simply struggles to handle in a smooth fashion. An RTS in particular, by its inherent design, seems especially apt to take advantage of the improved threading and scheduling in this new generation of graphics APIs.
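
Some quick arithmetic shows why that per-frame count matters. Note that the DX11 throughput figure below is my own rough, order-of-magnitude assumption drawn from API overhead tests of the era, not a measurement from this benchmark.

```cpp
#include <cstdio>

int main()
{
    const double drawCallsPerFrame = 20000.0; // heavy Ashes scenes, per the article
    const double targetFps         = 60.0;

    // Submission rate the API and driver must sustain for a smooth frame rate.
    const double required = drawCallsPerFrame * targetFps;   // 1.2M calls/s

    // Single-threaded DX11 submission is commonly measured in the very rough
    // vicinity of ~1M simple draw calls/s (assumption), so 20,000-call frames
    // leave essentially no headroom, while DX12 can spread recording and
    // submission across CPU cores.
    const double dx11Budget = 1.0e6;

    std::printf("Required: %.2fM calls/s, rough DX11 budget: %.2fM calls/s\n",
                required / 1e6, dx11Budget / 1e6);
    return 0;
}
```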

What's a new benchmark without some controversy?

Just a couple of days before publication of this article, NVIDIA sent out an informational email to the media detailing its “perspective” on the Ashes of the Singularity benchmark. First, NVIDIA claims that the MSAA implementation in the game engine currently has an application-side bug that the developer is working to address, and thus any testing done with AA enabled is invalid. (I happened to get wind of this complaint early and did all testing without AA to avoid the issue.) Oxide and Stardock dispute the characterization of this as a “game bug” and instead chalk it up to early drivers and a new API.

Secondly, and much more importantly, NVIDIA makes the claim that Ashes of the Singularity, in its current form, “is [not] a good indicator of overall DirectX 12 gaming performance.”

What’s odd about this claim is that NVIDIA is usually the one publicly touting the benefits of real-world game testing, arguing for actual applications and gaming scenarios in benchmarks and comparisons. Given the results you’ll see in our story, though, NVIDIA appears to be on the offensive, trying to dissuade media and gamers from viewing the Ashes test as indicative of future performance.

NVIDIA is correct that the Ashes of the Singularity benchmark is “primarily useful to understand how your system runs a series of scenes from the alpha version of Ashes of Singularity”, but that is true of literally every game benchmark. The Metro: Last Light benchmark only tells you how well hardware performs in that game. The same is true of Grand Theft Auto V, Crysis 3, etc. Our job in the media is to take that information in aggregate and combine it with more data points to paint an overall picture of any new or existing product. It just happens that this is the first DX12 game benchmark available, so we have a sample size of exactly one: and it’s potentially frightening for the company on the wrong side.

Do I believe that Ashes’ performance will tell you how the next DX12 game, and the one after that, will perform when comparing NVIDIA and AMD graphics hardware? I do not. But until we get Fable Legends in our hands, and whatever comes after that, we are left with this single target for our testing.
