Console Frame Rating - Measuring performance of the new Xbox One X

A New Frontier

Console game performance has always been an area of interest for us here at PC Perspective, but it has mostly been out of our reach to evaluate with any kind of scientific rigor. Our Frame Rating methodology for PC-based game analysis relies on running an overlay application during screen capture, with the captured video later analyzed by a series of scripts. Obviously, we cannot take this approach with consoles, as we cannot install our own code on them to run that overlay.
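
For those unfamiliar, the overlay technique colors a thin bar along the edge of every rendered frame, and analysis scripts then walk the captured video watching where those colors change. Below is a minimal Python/OpenCV sketch of that style of color-bar extraction; it is our own illustration of the general FCAT-style approach, not our actual Frame Rating scripts.

```python
# Minimal sketch of FCAT-style overlay color-bar extraction (our own
# illustration of the general technique, not our production scripts).
# Requires: pip install opencv-python numpy
import cv2
import numpy as np

def overlay_color_runs(path, bar_width=16):
    """For each captured frame, return the runs of overlay colors seen in
    the left-edge color bar, top to bottom. A color change partway down a
    frame marks where a new rendered frame began scanning out (or a tear)."""
    cap = cv2.VideoCapture(path)
    per_frame = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Average the leftmost columns so each scanline yields one BGR color.
        bar = frame[:, :bar_width].mean(axis=1).astype(np.uint8)
        runs = []  # list of (quantized_color, scanline_count)
        for scanline in bar:
            key = tuple(scanline // 32)  # quantize to tolerate capture noise
            if runs and runs[-1][0] == key:
                runs[-1] = (key, runs[-1][1] + 1)
            else:
                runs.append((key, 1))
        per_frame.append(runs)
    cap.release()
    return per_frame
```

Frame times then fall out of how many scanlines each color occupies from one captured frame to the next.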

A few other publications, such as Eurogamer with their Digital Foundry subsite, have done fantastic work developing internal toolsets for evaluating console games, but this type of technology has mostly remained out of reach for the everyman.

Recently, we came across an open-source project that aims to address this. trdrop is open-source software built upon OpenCV, a stalwart library in the world of computer vision. Using OpenCV, trdrop analyzes the frames of ordinary gameplay footage (no overlay required), detecting differences between consecutive frames and looking for dropped frames and tears to come up with a real-time frame rate.
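
The core of that difference-based approach is simple enough to sketch. The following is our own minimal Python/OpenCV illustration (not trdrop's actual code), and the difference threshold is an arbitrary value you would tune per capture:

```python
# Minimal sketch of difference-based frame rate measurement, in the spirit
# of trdrop (our own illustration, not trdrop's code).
# Requires: pip install opencv-python
import cv2

def effective_fps(path, diff_threshold=1.0):
    """Count unique frames in each one-second window of a capture by
    comparing consecutive frames; duplicates indicate dropped frames."""
    cap = cv2.VideoCapture(path)
    capture_fps = int(round(cap.get(cv2.CAP_PROP_FPS)))  # e.g. 60 for 1080p60
    prev_gray, unique_flags = None, []

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Mean absolute pixel difference; near zero means the console
        # repeated the previous frame, i.e. the game dropped one.
        is_unique = (prev_gray is None or
                     cv2.absdiff(gray, prev_gray).mean() > diff_threshold)
        unique_flags.append(is_unique)
        prev_gray = gray
    cap.release()

    # Unique frames per one-second window = effective frame rate over time.
    return [sum(unique_flags[i:i + capture_fps])
            for i in range(0, len(unique_flags) - capture_fps + 1, capture_fps)]

# e.g. effective_fps("hitman_xboxonex_1080p60.mp4")  # hypothetical capture file
```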

This means that trdrop can analyze gameplay footage from any source, be it console, PC, or anything in between, so long as you can get a direct video capture feed. Now that PC capture cards capable of 1080p60, and even 4K60, are coming down in price, software like this is allowing more gamers to peek at the performance of their games, which we think is always a good thing.

It's worth noting that trdrop is still listed as "alpha" software on its GitHub repo, but we have found it to be very stable and flexible in its current iteration.

                    Xbox One S                     Xbox One X              PS4                  PS4 Pro
CPU                 8x Jaguar @ 1.75 GHz           8x Jaguar @ 2.3 GHz     8x Jaguar @ 1.6 GHz  8x Jaguar @ 2.1 GHz
GPU CUs             12x GCN @ 914 MHz              40x Custom @ 1172 MHz   18x GCN @ 800 MHz    36x GCN @ 911 MHz
GPU Compute         1.4 TFLOPS                     6.0 TFLOPS              1.84 TFLOPS          4.2 TFLOPS
Memory              8 GB DDR3 + 32 MB ESRAM        12 GB GDDR5             8 GB GDDR5           8 GB GDDR5
Memory Bandwidth    68 GB/s (DDR3) + 219 GB/s (ESRAM)  326 GB/s            176 GB/s             218 GB/s

Now that the Xbox One X is out, we figured it would be a good time to take a look at the current generation of consoles and their performance in a few games as a way to get our feet wet with this new software and method. We are only testing at 1080p here, but we now have our hands on a 4K HDMI capture card capable of 60 Hz for some future testing! (More on that soon.)

This initial article will mostly focus on the original PS4 vs. the Xbox One S, and the PS4 Pro vs. the Xbox One X, in three recent titles: Assassin's Creed Origins, Wolfenstein II, and Hitman.

In our time with Assassin's Creed Origins, we noticed that the cutscenes are some of the most difficult scenes for the consoles to render. 

During the opening cinematic of the game, the base PS4 was able to achieve a mostly solid 30 FPS, while the Xbox One S struggled at times to hit the 30 FPS mark.

Both the Xbox One X and the PS4 Pro were able to render at a full 30 FPS in this same section.

Since Assassin's Creed Origins implements dynamic resolution, it is difficult to make a direct comparison of the GPU power of these four consoles from these results, but from a gameplay-smoothness perspective, it's clear that the Xbox One S falls behind the other consoles on a 1080p TV.

On the PC side, Wolfenstein II has been hailed as a highly optimized title (just as Doom was before it), building upon the work that id Software put into the id Tech 6 engine for Doom last year. On consoles, we see a similar story.

Even on the base-model consoles, we were able to hit a stable 60 FPS at 1080p.

The latest Hitman is an excellent example of a console title that lends itself to performance testing. With the option for an "unlocked" frame rate, as well as optimized patches for both the PS4 Pro and the Xbox One X, we can get a good look at console-to-console performance.

With the frame rate unlocked, the PS4 and Xbox One S both top out at around 45 FPS, but the PS4 runs 10-15% faster than the Xbox One S in the scenes we tested.

When we move to the PS4 Pro vs. the Xbox One X, however, the story changes. The Xbox One X version of Hitman adds a setting that lets users choose between High Quality and High Framerate modes.

The High Quality option renders the game at a native 2160p targeting 30 FPS, downsampled to our 1080p display, while the High Framerate option renders at 1440p (the same resolution as the PS4 Pro) and targets 60 FPS.

In our testing, we found that the Xbox One X does indeed hit these frame rate targets. Comparing the High Framerate option to the PS4 Pro, we see the Xbox render at a solid 60 FPS while the PS4 Pro hovers more in the 45 FPS region, a big difference between the two consoles.

Overall, it's still far too early to make any definitive performance statements about the Xbox One X and PS4 Pro. While it's clear from the specs that the Xbox One X is more powerful on paper, it remains to be seen whether developers will genuinely take advantage of the available horsepower.

Xbox One X owners do get one added benefit: every game can take advantage of the extra GPU horsepower, whereas PS4 Pro enhancements require per-game patches. Sony has recently implemented "Boost Mode" on the PS4 Pro to attempt to remedy this, but it's riddled with compatibility issues and seems to be mostly a dud. (This could change if Sony puts resources toward that goal, of course.)

We are eager for feedback on this new console testing and would love to hear what our readers are looking for in potential future testing. Personally, I am eager to compare these new consoles to a similarly priced PC in the same titles, although there is a challenge there in trying to match image quality settings across all platforms.


November 22, 2017 | 02:58 PM - Posted by odizzido2 (not verified)

This is one of the big reasons I don't like console games. The frame rate is just so bad. I know this isn't a popular opinion, but I thought GoldenEye on the N64 was garbage because the game had such bad graphics that you couldn't see anything, and such low frame rates that even if you did, doing something about it was really hard.

Having no mouse support doesn't help either. FPS games on console are just so bad for me.

November 23, 2017 | 05:42 AM - Posted by Power (not verified)

I feel exactly the same when it comes to early 3D games.
I strongly disagree on playing with a mouse. I am not a cat ;-)

November 27, 2017 | 11:54 AM - Posted by Thretosix (not verified)

Microsoft is allowing keyboard and mouse support on the Xbox One, but I believe it is up to the developers to implement it. I didn't know 60 FPS was so bad. Sounds more like an elitist pro gamer whose only concern is performance. Many people still play games for fun; a balance of visuals, performance, and gameplay matters to most casual gamers. This article was about the performance of consoles. Why even read the article, or comment, if you knew the outcome before reading and it has nothing to do with PC gaming? This article was bad for you because it had nothing to do with you.

November 22, 2017 | 03:57 PM - Posted by Xabiss (not verified)

Yes, this is a very interesting topic to discuss, and I am highly interested in how your findings compare to DF's.

November 25, 2017 | 06:03 AM - Posted by moo (not verified)

DF is full of nonsense, comparing the Xbox One X to a GTX 1070... seriously.

November 22, 2017 | 04:19 PM - Posted by MiniPCsGameBetter (not verified)

I'd buy an XBONE-X right now if I could hook up a keyboard and a mouse and run Blender 3D on it. Currently, I think the XBONE-X can only run UWP apps, not Win32 applications. Intel's NUCs with that EMIB/MCM and Radeon semi-custom discrete GPU die are going to eat the XBONE-X's lunch if Microsoft does not turn the XBONE-X into a mini-PC-like platform.

Intel's NUC and that EMIB/MCM Intel/Radeon GPU mash-up are going to be popular for a while, until the desktop Raven Ridge APU offerings come fully online in some mini-PC form factors, maybe with options for Radeon discrete GPUs for explicit integrated/discrete multi-adapter DX12/Vulkan-aware games.

November 23, 2017 | 12:14 AM - Posted by James

Such a NUC is going to be very expensive for the amount of performance it will deliver. You will be paying a lot just for a small form factor. If you don’t care that much about size, a regular desktop GPU with HBM will be a lot better.

November 23, 2017 | 11:27 AM - Posted by DesktopAPUsInLaptopsHopefully (not verified)

AMD needs to create its own "NUC"-style reference design that makes use of its desktop Raven Ridge APUs inside a mini desktop. That could compete with Intel's NUC.

What I really want is a laptop with a desktop Raven Ridge APU inside, and if ASUS can get a Ryzen 7 1700 into a laptop form factor at 65 watts, then getting a desktop Raven Ridge APU into a laptop at 45-65 watts should be possible as well.

Anandtech also says that ASUS will be making a lower-cost laptop SKU with the six-core desktop Ryzen 5 1600, but I'm hoping that ASUS can deliver a Vega 11 discrete GPU based laptop offering next year that comes with a desktop eight-core Ryzen 7 or six-core Ryzen 5 CPU.

November 23, 2017 | 01:59 AM - Posted by Bnonymous (not verified)

"software like this is allowing more gamers to peak at the performance of their games"
Do you mean the software is allowing gamers to reach a peak in their gaming performance or get peak performance out of their hardware?

November 23, 2017 | 04:40 AM - Posted by Anonymousdfadstah6n5467a5e (not verified)

What he means is that by using this software you can analyze (get a peak at) the frames, etc., that your PC is outputting.

November 23, 2017 | 07:25 AM - Posted by Bnonymous (not verified)

So, like get only the very tops of frames?

November 23, 2017 | 09:56 AM - Posted by psuedonymous

Or just a typo of 'peek'.

November 23, 2017 | 11:54 AM - Posted by Ken Addison

Whoops... homonyms strike again! I did, in fact, mean "peek" instead of "peak" in this case. Thanks for pointing it out!

November 23, 2017 | 03:35 AM - Posted by Anonymous#16299 (not verified)

lol sub 30fps trash #KillYourConsole #PCMR

November 23, 2017 | 09:58 AM - Posted by psuedonymous

I wonder if a 2D FFT module could be added to dynamically monitor rendering resolution per frame? The use of post-AA makes it a bit less obvious than the super-sharp aliasing cutoff you'd otherwise see, but it should still be pretty clear where the HF rolloff lies, which would tell you the rendering resolution.
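
As a rough illustration of what that could look like, here is our own sketch, reduced from a 2D FFT to a per-scanline 1D FFT, with an arbitrary energy cutoff that would need tuning per game and per AA method:

```python
# Rough sketch of the spectral-rolloff idea above, simplified to a 1D
# horizontal variant (our own illustration; the 0.99 energy cutoff is an
# arbitrary starting point, not a validated threshold).
import numpy as np

def estimate_render_width(gray_frame, energy_cutoff=0.99):
    """Estimate the internal horizontal render resolution of an upscaled
    frame from where its high-frequency spectral energy rolls off."""
    rows = gray_frame.astype(np.float64)
    # Magnitude spectrum of each scanline, averaged over all scanlines.
    spectrum = np.abs(np.fft.rfft(rows, axis=1)).mean(axis=0)
    spectrum = spectrum[1:]               # drop DC, which dominates energy
    energy = np.cumsum(spectrum ** 2)
    energy /= energy[-1]
    rolloff_bin = int(np.searchsorted(energy, energy_cutoff)) + 1
    # Bin k holds k cycles across the frame width; content rendered w pixels
    # wide carries at most ~w/2 cycles, so estimated width ~= 2 * rolloff bin.
    return min(2 * rolloff_bin, gray_frame.shape[1])
```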

November 23, 2017 | 04:38 PM - Posted by ppi (not verified)

It’s great to see PCPer opening a new frontier.

However, with consoles, actual performance is not the only part of the equation. The PS4's GPU is roughly 50% more powerful than the XB1's, and the XB1X's is roughly 50% more powerful than the PS4 Pro's, yet games typically target a specific framerate (usually 30 or 60 fps) that is frequently limited by CPU speed (where the differences between consoles are rather small) rather than by the GPU. So the key differentiator is image quality.

Therefore, without an image quality comparison, your analysis cannot be complete.

Now, if I look at Digital Foundry, there is always an image quality comparison. But frankly, even on my 4K monitor, I frequently do not see any obvious difference between the competing consoles until they zoom in on some specific detail.

Therefore, it would be great if you could do some sort of unscientific, subjective, but still relevant testing: play the same game on two identical screens (or swap inputs), XB1 vs. PS4 and XB1X vs. PS4 Pro, shortly after one another, maybe even interchanging them. Can you subjectively perceive the difference in image quality and performance?

November 23, 2017 | 06:30 PM - Posted by Cyclops

The memory bandwidth listed for the Xbox One S is incorrect. It should be listed as 68 GB/s.

November 24, 2017 | 12:41 AM - Posted by WayneJetSki (not verified)

Is it true that the Xbox One X has FreeSync support? Are there any issues with that support?

November 24, 2017 | 03:32 PM - Posted by Ken Addison

The hardware is supposed to support FreeSync, but I don't believe it has been enabled in software yet. We tried a few FreeSync displays on the Xbox One X but found no evidence of it actually using the FreeSync functionality.

November 25, 2017 | 04:08 PM - Posted by WayneJetSki (not verified)

Thanks for the reply Ken.
