A New Frontier

Taking a look at the performance of the current console generation

Console game performance has always been an area of interest for us here at PC Perspective, but it has mostly been out of our reach to evaluate with any scientific rigor. Our Frame Rating methodology for PC-based game analysis relies on an overlay application running during screen capture, with the captured footage later analyzed by a series of scripts. Obviously, we cannot take this approach with consoles, as we cannot install our own code on them to run that overlay.

A few other publications, such as Eurogamer with its Digital Foundry subsite, have done fantastic work developing internal toolsets for evaluating console games, but this type of technology has mostly remained out of reach of the everyman.

Recently, we came across an open source project that aims to address this. Trdrop is open source software built upon OpenCV, a stalwart library in the world of computer vision. Using OpenCV, trdrop analyzes the frames of ordinary gameplay footage (no overlay required), detecting differences between consecutive frames and looking for dropped frames and screen tears to compute a real-time frame rate.
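To make that approach concrete, here is a minimal Python/OpenCV sketch of the frame-differencing idea. To be clear, this is our own illustration rather than trdrop's actual code, and the `diff_threshold` value is an arbitrary placeholder you would tune for your footage:

```python
# Minimal sketch of frame-differencing FPS analysis (not trdrop's code):
# capture at a fixed rate, then count how many frames in each one-second
# window actually differ from their predecessor.
import cv2

def estimate_fps(video_path, diff_threshold=1.0):
    """Estimate a game's real frame rate from fixed-rate capture footage.

    A captured frame that is (nearly) identical to the previous frame is
    treated as a duplicate: the game did not present a new frame in time.
    """
    cap = cv2.VideoCapture(video_path)
    capture_fps = int(cap.get(cv2.CAP_PROP_FPS))  # e.g. 60 for a 1080p60 capture

    prev_gray = None
    new_frames = 0       # frames that differed from their predecessor
    frames_seen = 0      # captured frames in the current one-second window
    fps_per_second = []  # one effective-FPS reading per second of footage

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Mean absolute pixel difference against the previous frame
        if prev_gray is None or cv2.absdiff(gray, prev_gray).mean() > diff_threshold:
            new_frames += 1
        prev_gray = gray
        frames_seen += 1
        if frames_seen == capture_fps:  # one second of capture elapsed
            fps_per_second.append(new_frames)
            new_frames = 0
            frames_seen = 0

    cap.release()
    return fps_per_second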

This means that trdrop can analyze gameplay footage from any source, be it console, PC, or anything in between from which you can get a direct video capture feed. Now that PC capture cards capable of 1080p60, and even 4K60, are coming down in price, software like this lets more gamers peek at the performance of their games, which we think is always a good thing.

It's worth noting that trdrop is still listed as "alpha" software on its GitHub repo, but we have found it to be very stable and flexible in its current iteration.

|                  | Xbox One S              | Xbox One X            | PS4                 | PS4 Pro             |
|------------------|-------------------------|-----------------------|---------------------|---------------------|
| CPU              | 8x Jaguar @ 1.75 GHz    | 8x Jaguar @ 2.3 GHz   | 8x Jaguar @ 1.6 GHz | 8x Jaguar @ 2.1 GHz |
| GPU CUs          | 12x GCN @ 914 MHz       | 40x Custom @ 1172 MHz | 18x GCN @ 800 MHz   | 36x GCN @ 911 MHz   |
| GPU Compute      | 1.4 TFLOPS              | 6.0 TFLOPS            | 1.84 TFLOPS         | 4.2 TFLOPS          |
| Memory           | 8 GB DDR3 + 32 MB ESRAM | 12 GB GDDR5           | 8 GB GDDR5          | 8 GB GDDR5          |
| Memory Bandwidth | 219 GB/s                | 326 GB/s              | 176 GB/s            | 218 GB/s            |
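
As a sanity check on the table, the GPU compute figures follow directly from the CU counts and clock speeds: each GCN compute unit contains 64 shader ALUs, and each ALU can execute two floating-point operations per clock via fused multiply-add. A quick sketch:

```python
# GPU compute (TFLOPS) = CUs x 64 ALUs per CU x 2 FLOPs per clock x clock
consoles = {
    "Xbox One S": (12, 914e6),
    "Xbox One X": (40, 1172e6),
    "PS4":        (18, 800e6),
    "PS4 Pro":    (36, 911e6),
}

for name, (cus, clock_hz) in consoles.items():
    tflops = cus * 64 * 2 * clock_hz / 1e12
    print(f"{name}: {tflops:.2f} TFLOPS")
# Xbox One S: 1.40, Xbox One X: 6.00, PS4: 1.84, PS4 Pro: 4.20
```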

Now that the Xbox One X is out, we figured it would be a good time to take a look at the current generation of consoles and their performance in a few games as a way to get our feet wet with this new software and method. We are only testing at 1080p here, but we now have our hands on a 4K HDMI capture card capable of 60 Hz for some future testing! (More on that soon.)

This initial article will mostly focus on the original PS4 vs. the Xbox One S, and the PS4 Pro vs. the Xbox One X, in three recent titles: Assassin's Creed Origins, Wolfenstein II, and Hitman.

In our time with Assassin's Creed Origins, we noticed that the cutscenes are some of the most difficult scenes for the consoles to render. 

During the opening cinematic of the game, the base PS4 was able to achieve a mostly solid 30 FPS, while the Xbox One S struggled to hit the 30 FPS mark at times.

The Xbox One X and PS4 Pro, on the other hand, both rendered this same section at a full 30 FPS.

Since Assassin's Creed Origins implements dynamic resolution, it becomes more difficult to make a direct comparison of the GPU power of these four consoles from these results, but from a gameplay smoothness perspective, it's clear that the Xbox One S falls behind the other three consoles on a 1080p TV.

On the PC side, Wolfenstein II has been hailed as a highly optimized title (just as Doom was before it), building upon the work that id Software put into the id Tech 6 engine for Doom last year. On consoles, we see a similar story.

Even on the base-model consoles, we were able to hit a stable 60 FPS at 1080p.

The latest Hitman is an excellent example of a title that lends itself to console performance testing. With the option for an "unlocked" frame rate, as well as optimized patches for both the PS4 Pro and Xbox One X, we can get a good look at console-to-console performance.

With the frame rate unlocked, the PS4 and Xbox One S both max out at about 45 FPS, but on average the PS4 ran 10-15% faster than the Xbox in the scenes we tested.

When we move to the PS4 Pro vs. the Xbox One X, however, the story changes. The Xbox One X version of Hitman adds a setting that lets users choose between High Quality and High Framerate modes.

The High Quality option renders the game at a native 2160p targeting 30 FPS, downsampled to our 1080p display, while the High Framerate option renders at 1440p (the same resolution as the PS4 Pro) and targets 60 FPS.
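
A bit of back-of-the-envelope math shows why these two modes are plausible targets for the same GPU: they push a roughly similar number of pixels per second, with the caveat that raw pixel throughput is only a crude proxy for actual rendering cost.

```python
# Rough pixel throughput of Hitman's two Xbox One X display modes
modes = {
    "High Quality (2160p @ 30 FPS)":   (3840, 2160, 30),
    "High Framerate (1440p @ 60 FPS)": (2560, 1440, 60),
}
for name, (w, h, fps) in modes.items():
    print(f"{name}: {w * h * fps / 1e6:.0f} Mpixels/s")
# High Quality: 249 Mpixels/s; High Framerate: 221 Mpixels/s
```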

In our testing, we found that the Xbox One X does indeed hit these frame rate targets. Comparing the High Framerate option to the PS4 Pro, we see the Xbox render at a solid 60 FPS while the PS4 Pro hovers closer to 45 FPS, a big difference between the two consoles.

Overall, it's still far too early to make any definitive performance statements about the Xbox One X and PS4 Pro. While the Xbox One X is clearly more powerful on paper, it remains to be seen whether developers will genuinely take advantage of the available horsepower.

Xbox One X owners do get one added benefit: every game on the console can take advantage of the extra GPU horsepower, whereas PS4 Pro enhancements require per-title support. Sony has recently implemented "Boost Mode" on the PS4 Pro to try to remedy this, but it's riddled with compatibility issues and seems to be mostly a dud. (This could change if Sony puts resources toward that goal, of course.)

We are eager for feedback on this new console testing and would love to hear what our readers are looking for in potential future testing. Personally, I'd like to compare these new consoles to a similarly priced PC in the same titles, although matching image quality settings across all platforms presents a challenge.