Sony PS4 and Microsoft Xbox One Already Hitting a Performance Wall

Subject: General Tech, Graphics Cards | October 27, 2014 - 04:50 PM |
Tagged: xbox one, sony, ps4, playstation 4, microsoft, amd

A couple of weeks back, a developer on Ubisoft's Assassin's Creed Unity was quoted as saying that the team had decided to run both the Xbox One and the PlayStation 4 versions of the game at 1600x900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded with theories about why that would be the case: were they paid off by Microsoft?

For those of us who focus more on the world of PC gaming, however, an email sent in to the weekly podcast the following week from an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In it, in addition to addressing the value of pixel counts and the stunning visuals of the game, the developer asserted that we may have already peaked on the graphical compute capability of these two new gaming consoles. Here is a portion of the email:

The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. ...With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.

What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.

We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts.


So, if we take this anonymous developer's information as true (and this whole story is based on that assumption), then we have learned some interesting things.

  1. The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920x1080 resolution with Assassin's Creed Unity.
  2. The Xbox One (after giving developers access to more compute cycles previously reserved to Kinect) is within a 1-2 FPS mark of the PS4.
  3. The Ubisoft team sees Unity as "crazily optimized" for the architecture and consoles, even as we only now approach the one-year anniversary of their release.
  4. Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game is limited by the remaining 50% that must power the AI and everything else.

It would appear that, just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the PlayStation 4 and Xbox One undershoots the needs of game developers who want to build truly "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have already reached its performance limits, that's a bad sign for game developers who really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't built on custom cores or a Cell architecture: we are talking about very standard x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that the more advanced development teams have already hit peak performance.


If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team is completely off its rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:

                  PlayStation 4          Xbox One
  Processor       8-core Jaguar APU      8-core Jaguar APU
  Motherboard     Custom                 Custom
  Memory          8GB GDDR5              8GB DDR3
  Graphics Card   1152 Stream Unit APU   768 Stream Unit APU
  Peak Compute    1,840 GFLOPS           1,310 GFLOPS
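The peak-compute numbers in the table fall straight out of the shader counts and GPU clocks, since each stream processor can retire two floating-point operations per cycle via a fused multiply-add. A quick sanity check, assuming the commonly reported GPU clocks of 800 MHz for the PS4 and 853 MHz for the Xbox One:

```python
# Peak single-precision compute = stream processors x clock (GHz) x 2 FLOPs/cycle (FMA)
def peak_gflops(stream_processors, clock_ghz):
    return stream_processors * clock_ghz * 2

ps4 = peak_gflops(1152, 0.800)  # ~1843 GFLOPS, quoted as 1,840
xb1 = peak_gflops(768, 0.853)   # ~1310 GFLOPS, quoted as 1,310
print(f"PS4: {ps4:.0f} GFLOPS, Xbox One: {xb1:.0f} GFLOPS")
```

Run the same formula against the R7 260X (896 stream processors at up to 1.1 GHz) and you get roughly 1,970 GFLOPS, which is why a $120 desktop card from early 2013 can out-muscle the PS4's GPU.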

The custom-built parts from AMD both feature an 8-core Jaguar x86 architecture and either 1152 or 768 stream processors. The Jaguar CPU cores aren't high-performance parts: single-threaded performance of Jaguar trails Intel's Silvermont/Bay Trail designs by as much as 25%, and Bay Trail is powering lots of super-low-cost tablets today, including the $179 ECS LIVA palm-sized mini-PC we reviewed this week. The 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4's GPU, and the Radeon R7 250X is faster than what resides in the Xbox One.


If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).

Also note that if the developer is using 50% of the CPU resources for rendering computation and the remaining 50% isn't able to hold up its duties on AI, etc., we likely have hit performance walls on the x86 cores as well.
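To put that 50% figure in time terms, here is a back-of-the-envelope sketch. It assumes the developer's stated split and a 30 FPS target; the exact millisecond numbers are an illustration, not Ubisoft's figures:

```python
# Rough per-frame CPU budget at a 30 FPS target
target_fps = 30
frame_budget_ms = 1000 / target_fps               # ~33.3 ms of CPU time per frame
render_assist_ms = 0.5 * frame_budget_ms          # half reportedly spent unpacking pre-baked rendering data
gameplay_ms = frame_budget_ms - render_assist_ms  # ~16.7 ms left for AI, physics, and everything else
print(f"{gameplay_ms:.1f} ms per frame for everything that isn't rendering")
```

Squeezing the AI, animation, physics, and crowd simulation of a game like Unity into roughly 16.7 ms per frame on low-clocked Jaguar cores is exactly the kind of wall the developer describes.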

Even if this developer's quote is 100% correct, that doesn't mean the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on the performance efficiency of current-generation hardware, will be coming to the Xbox One, and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next when it arrives. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.


But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then look at the minimum and recommended specifications for the game on the PC, there is an enormous discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, creating a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 now share the same basic architecture as the PC.

Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?

UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case: regardless of which vendor's hardware is inside the consoles, had Microsoft and Sony still targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher-performance hardware, selling the consoles at a loss out of the gate and preparing each platform for the next 7-10 years properly. And again, the console manufacturers could have done that with higher-end AMD hardware, Intel hardware, or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.


October 31, 2014 | 10:52 AM - Posted by Ohwhereohwhereishalo (not verified)

@previous poster! No, reviewers aren't exactly wrong per se with the info we have today. That said, the issue devs have (though they'll likely never say it) is this: they either don't want to rely on Microsoft's solution to build their game, or they were already too far along with these games to adopt Microsoft's approach on the Xbox One. But I will say this again: until customers have experienced the Halo meant for Xbox One, nothing is settled. No matter how much everybody trolls them, remember this: HALO IS NOT OUT! Nobody knows how the Xbox will perform, and certainly not the devs. And if they do? They chose to ignore Microsoft (as usual, with the results people are experiencing now).

November 1, 2014 | 11:01 PM - Posted by praack

Oh, one more thing, guys: tone it down a bit. We are beginning to sound like the rabid fanboys we abhor!

Ryan and the guys just gave their opinion; like it or not, it is based on hardware knowledge. Sure, there is still the wiggle room of coding, but Sony is having issues with Driveclub: it is beginning to look like the social experience it expected cannot be met with the hardware.

I think you will still get good games, but when the same IP hits PC we will get better settings, and either the new consoles will get a hardware refresh early or a new model will come out sooner.

People are not willing to wait around another 10 years for upgrades.

November 2, 2014 | 11:16 AM - Posted by Ohwhereohwhereishalo (not verified)

Is it possible (for the Xbox One) that all devs (including the Halo team) misunderstood the Xbox team? Is it possible the Xbox One works the way we do at the office? For example, most companies give workers low-capability, almost brainless screens; everything is sent from the corporate server. Is it possible the Xbox One is the same thing, just a more powerful one? If that's the case, all devs would have coded it wrongly from the start, right? I'm sure if Ubisoft jumped the gun, they aren't likely to trash the start; they'll all adapt and do it properly on the next project. If I'm right, wouldn't the devs pull every handbrake and yell at Microsoft, given it is a very different way of coding from what devs are used to, be it on PC or PS4? (Sony couldn't adopt this since they don't have enough bandwidth on the cloud.) Is it possible the Xbox One is just a 2013 version of the brainless fetcher terminals every corporation had in the '80s and '90s? If I'm right, this would be overkill, right? In that case, Ultra HD is not only thinkable, it would actually be possible with Halo 5! Is it even possible to pre-chew everything, including the most basic rendering? I mean, if all the coordinates are pre-rendered, the only thing left for the GPU is to put it together.

November 5, 2014 | 02:03 AM - Posted by hahmed330 (not verified)

People are pretty mental about Mantle.

November 7, 2014 | 12:36 PM - Posted by Drb (not verified)

Microsoft has a tendency to put big delays in its OSes (typically 200 to 400 ms). If they did the same here, with three OSes, all virtualized by the look of it, we're probably talking about seconds of delay. Yet the refresh rate is still aimed at 16.7 ms per frame, so you have potentially 1000 ms of delay in a place where the user needs 16.7 ms of delay at most, and you get the idea. I doubt Microsoft's engineers, masters or doctors, even noticed this. If it affects my computer (which it does, in WoW), I don't see how it wouldn't affect a three-OS virtualized setup (Hyper-V and/or Azure). Yes, it's fixable, since it's a timing issue; as usual from Microsoft, they keep having timing problems. Sometimes it's the universal clock, other times it's HPET, when they should use the invariant TSC. And now it's the Xbox One. Sadly this isn't the core Microsoft team, it's the Xbox team, so I doubt the Xbox team can fix this on their own. Hopefully Satya Nadella will notice and send help to his engineers on the Xbox team.

November 8, 2014 | 02:53 PM - Posted by Drb (not verified)

PS: also, Data Execution Prevention is still a nightmare for the majority of devs, be it on PC or console (yep, old school dies hard), and it's easy to understand why so many gamers have issues. Imagine: you disabled UPnP on your NAT router, enabled full DEP, etc., pretty much locked down security, and the gamer still expects the dev to know about, or have taken notice of, all these security settings. Lol!

November 12, 2014 | 08:56 PM - Posted by Anonymous (not verified)

TC is a hopeless idiot, comparing SoC architectures to an open platform that wasn't designed for game production. For a programming-illiterate airhead who only learned how to put PC parts together: perhaps you don't understand that programming efficiency, a thin API layer, and a customized API platform are the true computing power developers want. How much raw power a high-end GPU has doesn't concern developers one bit.

Which PC indie exclusive from the last 10 years can even hold up to the 360/PS3 standard?

Let's pretend physically based rendering, superior human skin shaders, and 10x the key-frame density in animation data don't matter; sure, then you can have Counter-Strike-level graphics on every platform at 4K/60fps.

Even Trine 2 on PS4 is 4K-ready. This ???p/???fps idiocy is entirely invented by the PC community, believed by it, and spread across the internet by it. Most developers don't have the time and patience to address this full-bloom stupidity, and when they do care to tell the truth in a couple of short statements, you morons manage to twist their words and rewrite them into your own theory. You are the reason this world needs birth control.

November 12, 2014 | 09:02 PM - Posted by Anonymous (not verified)

The whole world is laughing at Ryan. This is what he said:

"If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200)."

AC: unity's benchmark

DA:I's benchmark.

Not only do both games completely put these cards out of the picture for above 30fps at respectable settings, but 95% of PC gamers around the world won't even run ACU on the lowest settings.

The next 5 years?? lol!!

November 17, 2014 | 07:32 AM - Posted by Drb (not verified)

I don't know about the PS4 (hopefully Sony did not just copy and paste the way Microsoft programs). For the Xbox One, though, I can tell you the Xbox team's issue is Microsoft's security team, and probably the left hand not speaking to the right. Look at the MSI-X patent; it's right there in the drawbacks section. MSI-X's drawback: latency! And what is the number one enemy of gamers? Latency! On user hardware MSI-X is useless; IRQs (and the fact that PCI allows sharing of IRQs) are fine for mobile, tablet, and PC (yep, even if you stream). I don't see anybody mentioning this very important info; the only reason it's even in the patent is likely because it had to be disclosed. Another self-created issue on Microsoft's side is DMA and DCA deactivation (I saw it on Windows 8 and up, but I suspect Microsoft imposed this on the Azure and Xbox One teams too). Without DMA the XB1 just cannot do its job; the XB1 was optimized by its team with DMA/DCA in mind. So Halo 5? It's not released because of these changes; the Xbox team is likely searching for a nonexistent alternative. And now you begin to understand: the Xbox team got wrecked by Microsoft's security team and MSI-X. MSI-X is great everywhere but one place: GAMING.

November 22, 2014 | 07:47 PM - Posted by Drb (not verified)

Lastly! Be it on console or PC, performance is not where it should be. Why? MSI adds latency, tons of it, compared to IRQs. On top of this, some Guru3D users tweak their OS (via BIOS and bcdedit) to run only on the invariant TSC, but then how are you going to get MSI-X? MSI-X works on the slow LAPIC (Intel decided that). Yep, it's a chicken-and-egg issue. Google's Android is likely to find a fix before Microsoft or Sony's PS4 does.
