Sony PS4 and Microsoft Xbox One Already Hitting a Performance Wall

Subject: General Tech, Graphics Cards | October 27, 2014 - 04:50 PM |
Tagged: xbox one, sony, ps4, playstation 4, microsoft, amd

A couple of weeks back a developer on Ubisoft's Assassin's Creed Unity was quoted as saying that the team had decided to run both the Xbox One and PlayStation 4 versions of the game at 1600x900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded with theories about why that would be the case: were they paid off by Microsoft?

For those of us who focus more on the world of PC gaming, however, an email sent the following week to the Giantbomb.com weekly podcast from an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In the email, alongside points about the value of pixel count and the stunning visuals of the game, the developer asserted that we may have already peaked on the graphical compute capability of these two new gaming consoles. Here is a portion of the information:

The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. ...With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.

What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.

We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts..

So, if we take this anonymous developer's information as true - and this whole story is based on that assumption - then we have learned some interesting things.

  1. The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920x1080 resolution with Assassin's Creed Unity.
     
  2. The Xbox One (after giving developers access to more compute cycles previously reserved to Kinect) is within a 1-2 FPS mark of the PS4.
     
  3. The Ubisoft team sees Unity as "crazily optimized" for the architecture and the consoles, even as we only now approach the one-year anniversary of their release.
     
  4. Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game's AI and other systems are limited to the remaining 50% of CPU performance.

It would appear that, just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the PlayStation 4 and Xbox One undershoots what game developers need to truly build "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have already reached its performance limits, that's a bad sign for game developers who really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't custom-built cores or a Cell architecture - we are talking about very standard x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that the more advanced development teams have already hit peak performance.

If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team is completely off its rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:

|               | PlayStation 4        | Xbox One            |
|---------------|----------------------|---------------------|
| Processor     | 8-core Jaguar APU    | 8-core Jaguar APU   |
| Motherboard   | Custom               | Custom              |
| Memory        | 8GB GDDR5            | 8GB DDR3            |
| Graphics Card | 1152 Stream Unit APU | 768 Stream Unit APU |
| Peak Compute  | 1,840 GFLOPS         | 1,310 GFLOPS        |
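
For reference, those peak compute numbers fall out of a simple calculation: stream processors x 2 FLOPs per clock (a fused multiply-add) x GPU clock. Below is a minimal sketch of that math; the GPU clocks (roughly 800 MHz for the PS4 and 853 MHz for the Xbox One after its overclock) are commonly reported figures assumed here, not numbers taken from the table.

```python
# Rough sketch of where the "Peak Compute" figures come from.
# The GPU clocks are assumptions (commonly reported values), not official specs.

def peak_gflops(stream_processors: int, clock_mhz: float) -> float:
    """Peak single-precision GFLOPS: shaders * 2 FLOPs/clock (FMA) * clock."""
    return stream_processors * 2 * clock_mhz / 1000.0

consoles = {
    "PlayStation 4": (1152, 800.0),   # assumed ~800 MHz GPU clock
    "Xbox One":      (768, 853.0),    # assumed ~853 MHz after the overclock
}

for name, (shaders, clock) in consoles.items():
    print(f"{name}: {peak_gflops(shaders, clock):.0f} GFLOPS")

# PlayStation 4: 1843 GFLOPS  (~1,840 in the table above)
# Xbox One:      1310 GFLOPS
```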

The custom built parts from AMD both feature an 8-core Jaguar x86 architecture and either 768 or 1152 stream processors. The Jaguar CPU cores aren't high performance parts: single-threaded performance of Jaguar is less than the Intel Silvermont/Bay Trail designs by as much as 25%. Bay Trail is powering lots of super low cost tablets today and even the $179 ECS LIVA palm-sized mini-PC we reviewed this week. And the 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4 and the Radeon R7 250X is faster than what resides in the Xbox One.

If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).

Also note that if the developer is using 50% of the CPU resources for rendering computation and the remaining 50% isn't able to hold up its duties on AI, etc., we likely have hit performance walls on the x86 cores as well.
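
To put that 50/50 split in rough numbers, here is a back-of-the-envelope sketch. The 30 FPS target and the ~50% rendering share come from the discussion above; the assumption that roughly two of the eight Jaguar cores are reserved for the console OS is assumed here, and the exact reservation has shifted over time.

```python
# Back-of-the-envelope CPU frame budget. The OS core reservation is an
# assumption; the 30 FPS target and ~50% rendering share come from the article.

target_fps = 30
cores_total = 8
cores_reserved_for_os = 2      # assumed; actual reservations have changed over time
render_share = 0.50            # "around 50% of the CPU" per the quoted email

frame_ms = 1000.0 / target_fps                    # ~33.3 ms of wall-clock time per frame
game_cores = cores_total - cores_reserved_for_os  # cores left for the game
core_ms_per_frame = frame_ms * game_cores         # total core-milliseconds per frame
ai_and_logic_ms = core_ms_per_frame * (1 - render_share)

print(f"Core-ms per frame available to the game: {core_ms_per_frame:.0f}")   # ~200
print(f"Core-ms left for AI, logic, physics, etc.: {ai_and_logic_ms:.0f}")   # ~100
```

Roughly 100 core-milliseconds per frame, spread across Jaguar cores clocked somewhere around 1.6-1.75 GHz, is not a lot of headroom for the AI of a crowded open-world city.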

Even if this developer's quote is 100% accurate, that doesn't mean the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on the performance efficiency of current-generation hardware, will be coming to the Xbox One, and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next, which is due in the future. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.

But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then look at the minimum and recommended specifications for the game on the PC, there is a huge discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, creating a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 share the same architecture as the PC now.

Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?

UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case - regardless of which vendor's hardware is inside the consoles, had Microsoft and Sony targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher performance hardware, selling the consoles at a loss out of the gate and preparing each platform properly for the next 7-10 years. And again, the console manufacturers could have done that with higher end AMD hardware, Intel hardware or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.


October 27, 2014 | 04:55 PM - Posted by Anonymous (not verified)

Microsoft should have joined forces with AMD. They should have done Mantle/DX12 together, not separately.

MS should have jumped on HUMA/HSA/Mantle because the XBONE could benefit a lot from it.

October 28, 2014 | 10:43 AM - Posted by Anonymous (not verified)

Mantle isn't really for consoles though, it's for PC. The PS4 and X1 devs have direct access to every nook and cranny of hardware available to them through the dev kits' own proprietary interfaces. What they can do with consoles gives them a performance advantage beyond what DX/Mantle can provide to PC users. They can already tailor every single process to whatever core or shader they choose at any time.

Since we know that access to low-level functions is NOT the problem, the only remaining possibilities are sloppy programming on the part of devs or expectations set too high by devs/publishers. Everyone knew that the hardware was weaksauce going in, so I'm not sure what the surprise is on that one.

October 27, 2014 | 04:59 PM - Posted by ROdNEY

AI should be computed via GPGPU in the first place! Especially on HSA enabled HW!

Ubisoft is to blame yet again!

October 27, 2014 | 06:00 PM - Posted by Scott Michaud

AI is an interesting beast. Path finding and visibility are heavy, highly parallel, and (apparently) well suited for GPU-compute workloads. Logic, on the other hand, is branching and shallow. It does not really fit either architecture perfectly, but (as you said) an APU might be ideal. That is, of course, if they can get the task in (and result out) on this specific architecture, especially if the GPU is already fully queued-up.
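
To make that contrast concrete, here is a small illustrative sketch (not from any actual engine): the path-cost relaxation applies the same arithmetic to every grid cell, the kind of uniform, data-parallel work that maps well to GPU compute, while the per-agent decision logic is a pile of shallow, divergent branches that does not.

```python
import numpy as np

# Data-parallel side: one relaxation pass over a grid of path costs.
# Every cell runs the same arithmetic, so it vectorizes (and GPU-ifies) well.
def relax_costs(cost: np.ndarray, passable: np.ndarray) -> np.ndarray:
    padded = np.pad(cost, 1, constant_values=np.inf)
    best_neighbor = np.minimum.reduce([
        padded[:-2, 1:-1], padded[2:, 1:-1],   # cell above, cell below
        padded[1:-1, :-2], padded[1:-1, 2:],   # cell left, cell right
    ])
    updated = np.minimum(cost, best_neighbor + 1.0)  # step cost of 1 per move
    return np.where(passable, updated, np.inf)

# Branchy side: per-agent decision logic - shallow, divergent, a poor GPU fit.
def decide(agent: dict) -> str:
    if agent["alerted"]:
        return "attack" if agent["has_weapon"] else "flee"
    if agent["heard_noise"]:
        return "investigate"
    return "patrol"

# Usage: seed the goal cell with cost 0 and everything else with inf, then call
# relax_costs repeatedly until the field stops changing.
```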

October 27, 2014 | 04:59 PM - Posted by Anonymous (not verified)

"If the PS4, the slightly more powerful console of the pair"

You're either trolling for clicks and a long comment thread with that statement or you are incompetent.

October 27, 2014 | 05:16 PM - Posted by Anonymous (not verified)

I assume your beef with that statement is because he said "slightly"?

October 27, 2014 | 06:41 PM - Posted by nathanddrews

LOL

October 27, 2014 | 05:02 PM - Posted by ImmenseBrick (not verified)

I am not sure why anyone is surprised over this. It is not like magic programming exists that makes a $400 PC run like a $3k Titan Black wielding monster. What concerns me more than this is the growing demand for VRAM in PC "ports" coming from this gen. With the unified 8GB GDDR5 of the PS4 and the DDR3/eSRAM of the Xbone, it is causing problems. I feel there is something wrong when my 780 Ti gets stuttering in a couple of titles of note due to new VRAM demands. I am not sure if this is lazy programming or not, but it does seem strange. Will we all need 8GB video cards soon? I just hope Unity runs well on a PC, regardless of its reported requirements. Time will tell.

October 27, 2014 | 05:20 PM - Posted by Jesse (not verified)

I agree. All the extra RAM is encouraging devs to leave textures and audio uncompressed - to save time?

Titanfall, Watch Dogs, Shadow of Mordor, and it's looking like COD:AW and AC:U are blatant offenders in this area...

October 27, 2014 | 06:56 PM - Posted by arbiter

I believe using uncompressed audio and textures benefits rigs with slower CPUs. That was the reason the Titanfall devs gave for uncompressed audio in their game: it ran better on slower CPUs, which sounds reasonable since you don't need to decompress the data on the fly. I think it's more that game devs are starting to really push the limits of the hardware we have.

October 27, 2014 | 09:57 PM - Posted by Anonymous (not verified)

Don't kid yourself, it is bad optimization on the developers' side. Games run like crap on great hardware because the game is built crappy.

October 27, 2014 | 11:49 PM - Posted by Anonymous (not verified)

Titanfall had uncompressed audio because they wanted as few technical differences as possible from the Xbone version to ease patching (imo understandable). The Xbone had uncompressed audio because its CPU is shit.

October 27, 2014 | 06:57 PM - Posted by arbiter

damn double.

October 27, 2014 | 05:03 PM - Posted by ROdNEY

Plus the article title states the obvious, because every piece of HW has its limits!
If we want to complain about the HW then we should say what SoC should have been used instead a year ago! Because I would love to see that.

October 27, 2014 | 06:04 PM - Posted by Scott Michaud

Actually, this was discussed long ago. Epic Games, when they made the Samaritan Demo on Unreal Engine 3, was pushing console manufacturers to target 2.5 TFLOPs for this generation (~a GTX 670). Sony was closest with 1.84 TFLOPs, with Microsoft quite far behind - 1.31 TFLOPs after an overclock and the removal of Kinect's overhead.

http://www.pcper.com/news/Editorial/Epic-Games-disappointed-PS4-and-Xbox...
(See the paragraph below the blow torch screenshot).

And, at the PS4's launch event, we noted that the Elemental Demo (Unreal Engine 4) had been seriously scaled down compared to the PC version shown at E3 2012. The PC version was running at about 3 TFLOPs, while the PS4 version was running at 1.84 TFLOPs (Epic did not present any demo for the Xbox One launch). At the time, there was doubt about whether Epic just couldn't get a demo up in time for the event because they did not receive a dev kit quickly enough. That is looking, more and more, like it was not the case.

http://www.pcper.com/news/Editorial/Unreal-Engine-4-Demo-PS4-Reduced-Qua...

October 27, 2014 | 05:05 PM - Posted by Crazycanukk

These systems are designed and drawn out long before production, so by the time they hit the market their hardware has aged considerably. They have the advantage of being able to give the developers tools so that they can program closer to the metal, but you can only squeeze so much out of hardware that is already most likely 2-3 years old. Plus, as far as I know, these are not dedicated GPUs like you would find in a laptop, but more of an integrated GPU on the same die as the CPU.

October 27, 2014 | 05:12 PM - Posted by Anonymous (not verified)

I think you'd be surprised at how much "performance" you can get from an architecture when your paycheck depends on it. (I put performance in quotes because performance really becomes more about abstraction at a certain point.)

Imagine how good say, PlayStation games would look today if that was the only platform being developed for. Developers would have a heavy incentive to make better looking games (sales) and you can bet better looking games they would make.

Of course there is a wall at some point, but it would take a century to hit, and during that time new tricks would continually be found to make better use of clock cycles.

(Not to say that the best looking psx game would look as good as the worst looking ps4 game, just to say that you'd be floored by how good a psx game could look.)

October 27, 2014 | 07:58 PM - Posted by Aurorous (not verified)

The NES came out in Japan in 1983, the same year as the Atari 5200, and they had similar specs. Because it didn't come out in America until 1986, people think of it as a 2nd-gen system when in fact it was from the first-gen era. When you look at what programmers were able to do with that system and how they got around its limitations, it's pretty amazing. For example, the original spec limited cartridges to only 32KB in size; some of the final games in the mid 90's were 512KB.

October 27, 2014 | 07:01 PM - Posted by arbiter

The PS3/360, I think, were more on the bleeding edge of hardware when they released, hence why they lasted as long as they did, but that was also why they were as expensive as they were at the start. The problem with what they did now is that they used mid-range hardware that was a few years old to start with.

October 27, 2014 | 05:14 PM - Posted by Anonymous (not verified)

I think a big factor with regard to the longevity of these consoles is how quickly 4K catches on with the TV-buying public. If 1080p remains the standard for the next 5-7 years then I don't think their "compute power" is going to be a big issue for anyone other than us geeks and possibly the game developers. But if 4K really starts taking hold then I think we may see a console refresh sooner rather than later.

October 28, 2014 | 06:20 PM - Posted by aparsh335i (not verified)

I couldn't agree more.
Why did I go from a GTX 680 to 2x GTX 780 Tis? Well, for 4K of course. 4K is real, it's great, and it's available right now. Not only that, but it's somewhat affordable, with companies like Seiki selling cheap 4K TVs.
Can an Xbox One or PS4 play current-gen games at 4K with 30fps? No! Not even close. We are at one year since the launch of these consoles and they are so outdated it's ridiculous.
I think we will see the PC market grow a lot in the next few years, especially with great free-to-play games like League of Legends, Smite, and Ghost Recon Phantoms.

October 27, 2014 | 05:18 PM - Posted by AMDBumLover (not verified)

Ryan, I couldn't let this bit go: "The Jaguar CPU cores aren't high performance parts: single-threaded performance of Jaguar is less than the Intel Silvermont/Bay Trail designs by as much as 25%." Yes, they aren't high performance parts, but in no way are they slower than Bay Trail Atoms, especially stated in such a way without going into detail. The Jaguar core is about 50% faster than Bay Trail clock for clock, and it only takes a 1.6GHz Jaguar core to match a ~2.4GHz Bay Trail core. Also, why give the benefit of the doubt to Ubisoft when they are squandering their precious CPU power on "50% of the CPU resources for rendering computation" or whatever that means? I can't believe you guys are stooping low enough to ride this trend while not presenting any kind of facts or a more nuanced opinion.

October 27, 2014 | 05:46 PM - Posted by Ryan Shrout

I will stand by that statement though maybe it needs expanding.

http://www.anandtech.com/show/7314/intel-baytrail-preview-intel-atom-z3770-tested/2

Obviously the parts being used in the Xbox One and PS4 may not match this A4-5000, but the architecture is still based on Jaguar.

Add in the fact that BT-D generally has at least half the TDP of Jaguar cores and you could argue that had these SoCs gone with Intel's part they might have been able to implement more cores.

October 27, 2014 | 11:06 PM - Posted by amdbumlover (not verified)

But you haven't corrected or edited the statement, and you have linked me to an AT article that you are probably using to demonstrate the performance difference. The problem is that you have tried shifting the goalposts by bringing up power consumption without mentioning other factors to back up your deflection, like die size and ISA feature set - read AVX and virtualization capabilities. I expect you and your team to be a bastion of unbiased and factual data, not just another tech blog. Leave that to "pros" like me and joshtekk (2 ks and not 3 ;) ).

October 28, 2014 | 12:11 PM - Posted by Anonymous (not verified)

AMD Athlon 5350 - Jaguar 4 cores, 2GHz, 25W TDP, 28nm
Intel J1900 - Silvermont 4 cores, 2GHz, 10W TDP, 22nm

http://www.anandtech.com/bench/product/1223?vs=1227

They trade blows sometimes, but Intel is typically faster (up to 25% as Ryan accurately stated) in single- and multi-threaded scenarios. AMD's GPU performance crushes Intel's, but that's not under debate. Seems pretty factual to me: Jaguar's performance is poor. 4, 8, or 12 cores, doesn't really matter, Jaguar just isn't very quick. Ryan's point stands.

October 28, 2014 | 06:58 PM - Posted by AMDBumLover (not verified)

Ryan's point doesn't stand and has no legs without any details. "single-threaded performance of Jaguar is less than the Intel Silvermont/Bay Trail designs by as much as 25%." - that was the quote. This is untrue: match up a Beema A6-6310 and compare it to Bay Trail-D. Maybe it would hold if he specified that only at 1.6GHz, and in very few scenarios, is a 2.4GHz Atom faster. Jaguar's performance isn't poor, and that is a very negative way of looking at it; it is designed to give that level of performance and is IPC-competitive with most low-power uarchs out there.

October 28, 2014 | 10:51 AM - Posted by beckerjr

SoCs with Intel parts? Exactly how would they have done that, Ryan? Intel doesn't do custom SoC work for outside companies. The console makers went with AMD because they were willing to work with them to use their tech/IP in custom designs. They would have needed to buy a separate Intel CPU and then a standalone GPU, which would have made for a much more expensive console and defeated the purpose of the price points they wanted.

October 28, 2014 | 06:20 PM - Posted by aparsh335i (not verified)

What about an Nvidia SOC like a beefed up k1?

October 31, 2014 | 07:07 AM - Posted by Anonymous (not verified)

That's an impressive mobile part, but it's still mobile. Not anywhere near the same level of performance.

October 28, 2014 | 11:59 AM - Posted by Jaguar for consoles is better (not verified)

All the testing I have seen with the AM1 and other Jaguar variants shows it is massively held back by memory bandwidth. It doesn't have that bottleneck on the consoles.

October 28, 2014 | 09:11 PM - Posted by Anonymous (not verified)

You, sir, are a hopeless idiot, comparing SoC architectures to an open platform that wasn't designed for game production. For a programming-illiterate airhead who only learned how to put PC parts together, perhaps you don't understand: programming efficiency, a thin API layer, and a customized API platform are the true power developers want; how much raw power one has on a high end GPU doesn't concern developers one bit.

which PC indie exclusive from the last 10 years can even hold up to 360/PS3 standards?

let's pretend physically based rendering, superior human skin shaders, and 10x the key-frames in animation data don't matter; sure, you can have Counter-Strike level graphics on all platforms at 4k/60fps.

even Trine 2 on PS4 is 4k ready. this ???p/???fps idiocy is entirely invented by the PC virgin community, believed by PC virgins, and spread across the internet by PC virgins. most developers don't have the time and patience to address this full-bloom stupidity. when they do care to tell the truth in a couple of short statements, you morons are able to twist their words and rewrite them in your own theory. you are the reason this world needs birth control.

October 29, 2014 | 06:36 PM - Posted by NotaConsoleFanboy (not verified)

10 years? SC...nuff said

November 1, 2014 | 01:26 AM - Posted by Anonymous (not verified)

You're right, the SoCs used by Microsoft and Sony are built for gaming, yet the experience is wildly disappointing. The customized API that you speak of is showing absolutely zero results with the next-gen consoles; the equivalent PC graphics cards perform the same as the console GPUs even though the consoles are unified systems with lean, customized APIs. Maybe you know something everyone else doesn't and you should program all games from now on. You can clearly see that developers are using these "to the metal, customized APIs" just like they would DirectX 11; they aren't using any of those features because the "to the metal, customized APIs" are falling out of favor for the high-level simplicity afforded by DirectX 11. Programmers and developers don't want to optimize their code; they're happy with 1080p and 30 fps. I get sick when I hear about the optimization of last-gen games; console games looked like trash from a couple of years after release all the way up to when the PS4 and Xbone released. Console games look like someone spread butter on my TV screen; the customized API you're speaking of must be broken.

October 27, 2014 | 05:34 PM - Posted by beckerjr

I know PC Perspective needs to tout the superiority of the PC Master Race and all, but come on guys, you are better than posting this junk. Some Ubi devs cry and it gets made out that the consoles suck.

Look at the games already coming out and say with a straight face and honest opinion that they look dated or poor.

I also don't know why you talk about the PS4 getting OpenGL to make things faster. The PS4 already has its own high-level and low-level graphics APIs (GNM and GNMX).

October 27, 2014 | 05:51 PM - Posted by Anonymous (not verified)

Look at the games already coming out and say with a straight face and honest opinion that they look dated or poor.

The point is they don't look "next-gen", revolutionary, or long-term. Just about every console before was revolutionary. If you follow PC Gaming and the history you'd know consoles have a long history of being the main focus for innovation and true graphics horsepower. That started shifting in the late 90's and hasn't turned back since. The 360 managed to pull a fast one with Unified Shaders that actually was ahead of the PC industry for a solid year. It took multiple years before PC hardware actually made it obsolete.

Looking poor is one thing; looking and being mediocre from Day 1 to Day 1,500+ or whenever a replacement comes is another. Also, if you're playing on 40"+ TVs the lower resolution becomes noticeable more quickly. I'm not going to join the debate of PC's vs Consoles other than saying hUMA had a chance to do its thing and be what PCs lacked, similar to Unified Shaders.

The opportunity was lost on both sides by an idea that claimed to be revolutionary not being ready.

October 27, 2014 | 07:07 PM - Posted by arbiter

Last-gen consoles were pretty much the cutting edge of tech; the current-gen ones were mid-to-low-range PC hardware when they were released, and that hardware was mid-to-low range a year before that.

October 28, 2014 | 10:53 AM - Posted by beckerjr

Yes they DO, if you look at apples to apples in graphics. Compare the games at launch on the PS3/360 to the games at launch on these systems. Don't point out stuff like GTA V or The Last of Us as examples for last gen, as I see others do.

October 28, 2014 | 06:53 PM - Posted by aparsh335i (not verified)

" I'm not going to join the debate of PC's vs Consoles"
? You did join....Not that i disagree.

October 29, 2014 | 12:58 PM - Posted by Anonymous (not verified)

Referring to taking sides. I really could care less about it.

October 27, 2014 | 05:40 PM - Posted by Anonymous (not verified)

Who didn't see this happening? Even with a switch from PowerPC to x86 there is no easy way to make up the GHz when it comes to single-threaded performance. The 360 had a tri-core 3.2GHz part. That's a pretty large task to put on the developers: bring high-quality games to a part with half the clock rate. That was the one part of the specs that really made no sense to me.

Hopefully we see quick successors to both consoles that are backwards compatible, which shouldn't be hard to do now that they share the same architecture as PCs. There are much better parts out there. Taking damn near a mobile phone chip and putting it into a console is quite frankly embarrassing.

October 27, 2014 | 07:09 PM - Posted by arbiter

Clock rate hasn't meant much for a long time. Just because a part is clocked slower doesn't mean it is slower in performance.

October 27, 2014 | 08:06 PM - Posted by Anonymous (not verified)

It still matters quite a bit. Not everything is heavily multi-threaded, even in games. When you have to offload something onto a slower core, that is going to directly affect performance.

When you drop the clock rate that low there is no way it isn't going to affect things significantly. It just further adds to the bottleneck as other developers have already mentioned. They're basically relying on the improvement in IPC from last generation to this to account for some miracle performance gain and that's just not how it always works.

October 27, 2014 | 11:14 PM - Posted by amdbumlover (not verified)

I don't know what to tell you, but a 1.6GHz Jaguar core has more IPC and IPS than a dual-threaded Xenon core even at 3.2GHz. Cell is another story, but the X360 CPU is slower than an AMD netbook APU.

October 28, 2014 | 12:55 PM - Posted by Anonymous (not verified)

It's impossible to make that assumption comparing two different architectures, with the PowerPC variant being specially made in Microsoft's case while Jaguar is just a generic x86 chip.

You also aren't taking into account that the tri-core Xenon chip was mostly locked to 1, yes ONE, core for most games. Some games used 2, but we're now comparing apples to oranges. I sure hope a console clocked at 1.6GHz+ on a semi-modern x86 chip, with upwards of 4-6 cores at its disposal for multi-threading, can beat out a specially designed 3.2GHz Xenon chip locked to 1, maybe 2, usable cores.

October 27, 2014 | 05:40 PM - Posted by DataHound (not verified)

What's interesting is that the AI work is something that could be offloaded to Microsoft's Azure platform (as Respawn did in Titanfall).

If the CPU is the bottleneck and not the GPU, then Xbox performance might end up being better than PS4 in certain circumstances in some games, despite having inferior rendering hardware.

October 27, 2014 | 05:48 PM - Posted by Ryan Shrout

Meh, I would put little faith in any compute resources that are an Internet-connection latency away.

October 27, 2014 | 06:51 PM - Posted by DataHound (not verified)

If this is going to be another 6+ year generation they are going to have to do something, and cloud may be the only option they have got, short of clumsy bolt-on upgrades. Mobile devices will have parity with the next-gens before we know it, and the great divide between PC and the consoles will become so wide that we stop even bothering to draw comparisons.

October 27, 2014 | 11:55 PM - Posted by Anonymous (not verified)

Cloud is useless for anything interactive - games come to mind. Latency would be insane.

October 28, 2014 | 10:06 AM - Posted by renz (not verified)

So when my internet is out I couldn't even play the single-player portion, since the game couldn't run the AI?

October 27, 2014 | 05:47 PM - Posted by mAxius

You do know Nvidia is using Ubisoft to fight a proxy battle against the AMD hardware in both of the consoles, right?

http://kotaku.com/ubisofts-newest-deal-could-be-bad-news-for-some-pc-gam...

October 27, 2014 | 05:55 PM - Posted by Anonymous (not verified)

Not even an Nvidia part would make this better. The bottleneck from a spec sheet standpoint was always the CPU. If they had bundled it with Nvidia they'd just be back in the doghouse like they still are after the Xbox fiasco.

If I were Nvidia I'd want to let AMD take the double whammy (CPU/GPU) on the underwhelming consoles and just keep quiet. There's no reason to throw more gasoline on the fire unnecessarily when the person is already covered in gasoline by their own doing.

October 28, 2014 | 12:20 PM - Posted by Shrapnol (not verified)

No one listened to or watched this???

http://www.maximumpc.com/no_bs_podcast_226_-depth_interview_amd_graphics...

This week on episode 226 of the No BS Podcast, AMD graphics guru Richard Huddy joins the Maximum PC crew. For those unfamiliar with the AMD exec, Huddy has spent time working at ATI, AMD, Nvidia, Intel, and is now back at AMD. He’s also widely considered one of the pioneers of DirectX. Suffice it to say, he knows his stuff.

In this special edition of the podcast, we pick his brain on a variety of topics relating to AMD. Easily the most exciting part of the discussion is the fact that Huddy doubles down on AMD’s assertion that Nvidia Gameworks presents a threat to PC gamers, most notably by adding lines of code to games that would hinder performance on AMD graphics cards (It’s some pretty damning assertions). We also talk about the recent controversy surrounding the PC version of Watch Dogs, discuss Mantle vs DirectX, AMD’s answer to Nvidia Gsync, 4K, and the future of graphics/graphics cards.

October 27, 2014 | 06:09 PM - Posted by godrilla (not verified)

Nvidia benefits from bad ports causing disruptive upgrade cycles, makes a lot of sense.

October 27, 2014 | 06:02 PM - Posted by alkarnur

When bulk pricing is considered, Sony/MS could have easily made a $600 machine with an FX-8350 + Radeon 7970.

But really, there is no right answer because the console business model is broken. There are 3 major approaches when it comes to spec'ing and pricing and all three are severely flawed:

1)
If you make a more powerful machine that uses PC components that sits in the sweet spot of price/performance you're going to have more longevity and more power than a cheaper console competitor, but that cheaper console maker's machine is going to outsell yours just based on price. And building up an install base and critical mass is super important especially early on in a console generation.

2)
You decide to match your competitor's cheaper and weaker machine with hardware that's a tiny bit faster, and a price that's a tiny bit lower (i.e. PS4 vs $399 Kinectless Xbone). But then you have weak hardware, probably that barely clears the range of performance/graphics of the previous gen, and makes you hit a bottleneck / performance wall very early on in the generation, which would ultimately force you to shorten the console's life span if you want to overcome that performance wall. But at least you have a similar price as your competitor and you're not losing the market share / install base war to him.

3)
You spec your console with PC parts that sit in the sweet spot of price-performance, but you sell it for far less than is necessary to break even, at a loss, for a price matching your competitor's. Now you've eliminated problems both from weak hardware and from a higher price, but you're running massive loss leaders, on the off chance that by the time your BOM costs go down, you can recoup your initial investment and loss leaders, in the unlikely event that your competitor doesn't cut prices (and thus force you to cut prices, thus further delaying the day when you can finally start generating a profit on your hardware) or just cuts the current generation short and announces his next-gen console to catch up and possibly surpass your console's performance, thus ending your console's life span and preventing you from ever being able to turn a profit on it.

The reason why the console business model is broken is that there is no magic. Console makers and console fans act like it's possible to get equivalent hardware and gaming experience for a much lower price, but that's not possible. In any given year, a certain level of performance is going to cost a certain amount of cash, and it can't be had at half or a third of the price without massive subsidies/losses. In 2006, the performance that a $1,200 computer bought you could not be had in a $500 console. In 2013, the performance that a $900 computer buys you cannot be had in a $400 console.

October 31, 2014 | 07:12 AM - Posted by Anonymous (not verified)

Lolz, a $600 console? You want to see what happens when one of those comes out? See the PS3 and its total failure at launch, forcing Sony to do price drops in the first year and take heavy losses. That's why the PS4 is what it is today.

The console market isn't broken. The sales numbers since launch say it's fine.

October 27, 2014 | 06:13 PM - Posted by KittenMasher (not verified)

Two consoles perform nearly identically, even though one has 50% more GPU compute.

I'm not saying that Ubisoft is full of it, but between that and the ridiculously high PC requirements, I'm certainly going to at least imply it. Never mind the Watch_Dogs fiasco.

October 27, 2014 | 06:40 PM - Posted by Azmodan (not verified)

Really, we are surprised that AMD hardware is already hitting the wall? Their top-of-the-line 4-core CPU can barely compete with the lowly dual-core i3, and we are using integrated graphics on top of mediocre CPU cores. The consoles were outdated the moment they hit the shops, and this lightweight fluff will soldier on for another 5-8 years no doubt. The utter hype and BS coming from developers about how this was the duck's guts of hardware was appalling, and now of course reality bites hard and they can no longer hide the truth.

November 1, 2014 | 01:32 AM - Posted by Anonymous (not verified)

Why are you making this about AMD? If Sony and Microsoft had gone with Intel and Nvidia at the same cost, these consoles would be even more obsolete.

October 27, 2014 | 07:03 PM - Posted by Ophelos

These consoles are not meant for an 8 to 10 year cycle at all. It's more like a 5 to 6 year cycle, like what Nintendo has been doing for many years now.

Heck, game developers were bitching that Sony and Microsoft both needed to release new consoles after just 5 or 6 years with the PS3 and X360.

I think Microsoft always knew their system was a piece of crap out of the door, when they said the Xbox One would always need to be connected to the internet. That meant developers had no choice but to use Microsoft's cloud service as a back-end for a lot of its games.

October 28, 2014 | 08:36 AM - Posted by dreamer77dd

I agree with you.
I remember reading that the console cycle is not going to be 10 years but 5-6 years.

October 27, 2014 | 07:03 PM - Posted by John H (not verified)

Isn't the 7850 more comparable to the PS4 than the 7790?

PS4 specs:
1152:72:32 @ 800 mhz (shaders, tmus, rops)
and a 256-bit GDDR5 bus at a clock speed that gives 176 GB/sec of bandwidth

7790:
896:56:16 @ 1000 mhz / 128bit bus / 96GB/sec
(GPU slower and a lot less bandwidth)

7850:
1024:64:32 @ 860 mhz / 256bit bus - 153GB/sec
(GPU ~ 90-95% of PS4, bandwidth ~ 15-20% less)
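
As a quick cross-check of those bandwidth figures, here is a minimal sketch; the memory data rates (5.5 GT/s for the PS4's GDDR5, 6.0 for the 7790, 4.8 for the 7850) are commonly cited numbers assumed here rather than anything stated in the comment above.

```python
# Memory bandwidth = bus width in bytes (bits / 8) * effective data rate in GT/s.
# The data rates below are commonly cited figures, assumed for illustration.

def bandwidth_gb_s(bus_bits: int, data_rate_gtps: float) -> float:
    return (bus_bits / 8) * data_rate_gtps

parts = {
    "PS4 (GDDR5)": (256, 5.5),   # -> 176.0 GB/s
    "HD 7790":     (128, 6.0),   # ->  96.0 GB/s
    "HD 7850":     (256, 4.8),   # -> 153.6 GB/s
}

for name, (bus_bits, rate) in parts.items():
    print(f"{name}: {bandwidth_gb_s(bus_bits, rate):.1f} GB/s")
```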

October 27, 2014 | 08:00 PM - Posted by Anonymous (not verified)

Specs don't always tell you the whole story just like laptop parts almost never compare directly to a discrete card despite looking very close.

I'll let the Pro's explain things better as to why we don't have 2" laptop cards in desktops, but in this case I think the CPU is holding things back a lot more than the GPU side of things. The GPU part is nothing to brag about, it's decent, but it doesn't have the horsepower to be remotely pushed, not even the slightest.

October 31, 2014 | 07:15 AM - Posted by Anonymous (not verified)

Yes it is, and Ryan trying to say the PS4's GPU is just a 7790 shows his ignorance on the matter.

This is nothing more than PC Per trying to jump on one dev's claim and make a blanket statement to damn the current consoles, and show that the PC is awesome.

October 27, 2014 | 07:05 PM - Posted by Anonymous (not verified)

Personally I would love to see someone with deep investment pockets throw some more investment dollars Imagination technologies way, and take the PowerVR wizard mobile SOC GPU technology and ramp it up into a discrete GPU laptop product. The ray tracing hardware on the wizard, and the visual improvements in the gaming demos really impressed me, and if more lighting/illumination effects, that require CPU resources(Ray tracing, etc.), can be moved onto the GPU, then all this worry about having more powerful CPUs for gaming could be put aside. OpenCL for gaming physics will also help.

Consoles will never be a replacement for gaming rigs, but for those without any technical ability, consoles will be what many choose. Hopefully SteamOS and the Steam ecosystem can bring some of the expandability of PC gaming rigs, with a console-ecosystem games-purchasing economy of scale, to allow for a better gaming experience. I'm sure there will be an upgrade market around the Steam Machines that will allow even the most technologically inexperienced to bring their devices to a service and have the system upgraded for a small fee, and still save money compared to having to wait to purchase a next generation console to get any improvement. The Steam Boxes, when they do begin to appear, should force both M$ and Sony to upgrade their products more often, or offer more upgradeability between next generation console releases.

The GPU market does not have enough players in the discrete part of the market, and the mobile SOC GPU market appears to be using more of the open standard graphics APIs. This back and forth API warfare between M$, AMD, and Nvidia will continue, and get worse without other players in the market for GPUs, SOCs, and OSs. And as far as the current console offerings, they had a performance problem when they were released, by not having any ability to be upgraded when needed, and both M$ and Sony are responsible for upgrading at a faster pace, now that they have a platform built around the same basic APU design. How hard could it be to just change out to a more powerful, more power efficient APU without having to change the system board as much.

October 28, 2014 | 10:18 AM - Posted by renz (not verified)

I thought it was IMG's decision to leave the desktop market and put more focus on mobile. I'm not sure that, even if more money were poured into the company, they would get back into the desktop discrete game. Even on mobile they only license their designs.

October 28, 2014 | 12:18 PM - Posted by Anonymous (not verified)

Well Apple owns 10% of imagination technologies, and Apple for one has the money, and desire for more control of the hardware across all of its product line. I am not as interested in just a GPU doing current GPU tasks, as I am in having the GPU take over the ray tracing/ray interaction calculations in a massively parallel fashion, and saving perhaps 5 grand, or more, in required server CPU purchases, leaving more money available for the GPU. Thousands of GPU cores, if they had ray tracing circuitry built in, would trump the largest multi-socket motherboard full of very expensive top line server CPU SKUs. I was hoping the Apple's iPad AIR 2 would get the PowerVR wizard, and for all we know it may just have that, but none of the tech sites are paying much attention to the actual internal workings of the A8X, or its GPU, other than to wait for chipworks to spoon feed them some superficial information.

With Nvidia developing Maxwell as a microarchitecture for mobile first, and scaling it up into its desktop SKUs, you would think that the PowerVR developers would be capable of scaling the PowerVR Wizard core up in the same fashion, with a little financial support from Apple. The very reason that Nvidia's Maxwell desktop SKUs are so power efficient is that they are scaled up from power-optimized mobile IP, so maybe these mobile SOC GPU products all have the potential to be scaled up; the mobile GPU IP licensers, like Imagination Technologies, just need someone with the financial deep pockets to either pay Imagination Technologies to develop a custom discrete GPU (Apple-financed, for Apple's own use), or finance a third player's entry into the desktop GPU market for more competition and lower prices for all device OEMs. Apple could, with its investment in Imagination Technologies, have a custom discrete laptop GPU made for Apple's exclusive use, a discrete GPU with ray tracing circuitry built in, and the MacBook whatever doing graphics workloads that would normally require an expensive quad-core, 8-thread CPU SKU. I would gladly pay the price to have ray tracing moved to the GPU, because it would be worth the cost in time savings alone, as even a quad-core i7 can take hours rendering a single frame, for lack of the massive parallelism that a GPU could put to the task of computing tens of trillions of ray interactions.

Nothing is stopping Imagination Technologies from making a discrete GPU product for itself, other than the financial backing, and the long term support to re-enter the discrete market, but I do hope for at least great success for the PowerVR wizard in Apple's future SOC products, if the PowerVR wizard is not already in the A8X, and Apple is not saying! A race among the major GPU makers for ray tracing in the GPU hardware could result in everyday laptops with the ability through a discrete GPU to perform ray tracing tasks that currently require very expensive workstation SKUs with dual workstation CPUs.

October 27, 2014 | 07:14 PM - Posted by Anonymous (not verified)

Lazy developers and programmers who spent years only working on a single thread now find 2 threads a Mount Everest task. Ask them to utilize more than 2 and they throw their hands up and say it's too hard.

They'd rather stick to the lowest cost possible to them, which means using single-threaded game engines from 10+ years ago. We're lucky if we get 2-threaded or multi-threaded games that actually work, due to the quick turnarounds to satisfy the quarterly profits.

October 27, 2014 | 07:19 PM - Posted by Goofus Maximus (not verified)

This underwhelming graphical performance would seriously undermine my ability to play an enjoyable game of Tetris on either of these consoles!

Now that I've cleared the snark out of my system, I'll leave my actual opinion: maybe this will finally spur the growth of the PC as a gaming platform.

October 27, 2014 | 08:33 PM - Posted by Alien0227

Nice article. The big question is, and this may put everything in context: what does one expect from a gaming console priced at $399-$499? I think Sony and Microsoft got what they asked for... Another thing to consider, and I don't know what the answer is, is how would Ubisoft's new Assassin's Creed Unity game fare on the previous generation consoles? Is the new generation console leaps and bounds better? If the answer is yes, then the buyers are getting their money's worth. If they want 1080p at a consistent 28+ FPS then they need to invest in a PC with the appropriate CPU/GPU combination.

October 27, 2014 | 09:16 PM - Posted by Anonymous (not verified)

Optimization isn't a Ubisoft strong point. The AC franchise has never been optimized for multi-core. 1 core and it utilizes that well; 2 cores and it's okay; 4 and it only utilizes 2 cores with the other 2 for minor tasks. Any more than 4 is a waste.

They have to improve the core engine, and I don't think that is something Ubisoft is willing to do. Why spend money on game engine development and optimization when they can just blame it on something other than themselves, like they always do?

It's the same ol' same ol'. When their games don't sell they run out to blame pirates.

http://www.hardwarepal.com/assassins-creed-4-black-flag-benchmark-cpu-gp...

If you played earlier Assassin’s Creed sequels you will notice that CPU usage is more or less the same when it comes to CPU load. The first core on all of the CPU’s is working at around 70%, and the last core is at 60% of load most of the time, while the load on other cores or threads varies from time to time. The game does use all of the cores and threads but the engine is programmed in a way that nothing above a quad core CPU will give you better performance.

Maybe they'll realize how many cores the Xbox One and PS4 have and utilize them, and we PC users won't get the same old crappy Assassin's Creed port we always get.

October 27, 2014 | 09:08 PM - Posted by Commander Octavian (not verified)

Who cares? The Assassin Creed series has been the most anti-Western Marxist-propaganda-soaked POS series of games ever produced. It's a disgrace to see all those Western kids play that game.
We're already flooded with games promoting the demographic and cultural genocide of the West.

It's about time someone turns the tide and destroys this despicable suicidal "modern" culture along with the political and global powers that sustain and promote it for good.

October 28, 2014 | 03:18 AM - Posted by Anonymous (not verified)

I hear you sir.

I can't stand these goose liver commies. These rich people that are push marxism down our throats.

October 27, 2014 | 10:16 PM - Posted by Anonymous (not verified)

Is it possible that version 2 of both the xbone and ps4 will have a better apu?

October 27, 2014 | 10:35 PM - Posted by godrilla (not verified)

No because that would cause fragmentation

October 27, 2014 | 11:19 PM - Posted by Anonymous (not verified)

why would it? same architecture, just better apu.

October 28, 2014 | 01:40 AM - Posted by Bob (not verified)

Maybe now game developers will actually start focusing on making the games more creative and engaging rather than just graphics. Nintendo and Valve are primary examples of what you can achieve even with seemingly 'underpowered' hardware.

October 28, 2014 | 01:43 AM - Posted by Anonymous (not verified)

Ubisoft seems to be getting a lot of flak for this sub-1080p 30fps dilemma, with PS4 users saying their console can do it at 1080p and that it's other systems bringing them down. Well, The Order is a PS4-exclusive, corridor third-person shooter, and from all reports it's 800p @ 30fps.

I've got no idea why all these performance issues are showing up, but if first-party devs are struggling, multiplatform devs must be in a world of hurt.

October 28, 2014 | 04:26 AM - Posted by Anonymous (not verified)

Console gaming LOL I did that for a while.
Actually was useful for sending my HDTV to another room via Cat6
Can the XB1 do this ???

October 28, 2014 | 04:29 AM - Posted by Anonymous (not verified)

Damn AMD card...every time it quits working I re-seat it in the MOBO and it is good for months...still haven't figured that one out.

October 28, 2014 | 04:31 AM - Posted by Anonymous (not verified)

Peak Compute , Peak Google , Peak Twitter

October 28, 2014 | 04:33 AM - Posted by Anonymous (not verified)

I take it global lighting will not be coming to a console any time soon ?

October 28, 2014 | 04:44 AM - Posted by JohnGR

Most of those who use Steam use PCs with (much) lower specs than those in the consoles. This talk about consoles hitting the wall is rubbish. Even the highest end PC will hit the wall today if you go and use ultra settings in the highest resolution possible. What does this mean? Nothing.

Developers are simply doing what every one of us has done for the last 15+ years with a game on our PCs. We go into the graphics options and try to find the best mixture of graphical quality and resolution to play the game while keeping a minimum frame rate. Games will always look like they hit the wall on Xbox One and PS4, because in every game the settings will always be set to use the maximum potential of the consoles.

I don't really get it: why all this fuss over something that has been trivial and normal for over a decade in PC gaming? Everyone seems so surprised, having rediscovered the wheel. Maybe Intel and Nvidia need to convince the game console makers to buy their much more expensive hardware for the next Xbox Two and PS5.

October 28, 2014 | 04:44 AM - Posted by Anonymous (not verified)

I never did understand why they put such a crap CPU in both consoles! It had to be none other than to save money and give people crap hardware wrapped in a fancy package.

I don't mind paying for decent hardware. I guess people are cheap, judging by how people freaked out over the PS3 price when it first came out! (They were selling at a loss for god's sake, you cheap console people!)

What I am getting at is I don't know if we can place 100% of the blame on the console makers or on a public that won't buy anything but pure cheap crap; if they're forced to pay for great hardware they whine.....

They wanted cheap, so they all got a tablet CPU powering their games (mwhahahaha)

Even a half-decent CPU would have lasted them many years.

Take my Haswell i5: this thing will play games just fine for 6+ years easy, plus any games of the (far) future will be DX12, and just like Mantle does on the CPU side, it lets my GPU handle the task with fewer demands on my CPU.

My new GTX 970 will be happy to handle the work.

Plus, on the CPU side of things, consoles have such a low FOV (the CPU is often limited by the number of things that need to be rendered); console devs use many tricks to get their games running.

-----------

Hell, even Nvidia's senior vice president was predicting these very hardware issues way before the consoles even hit the market...

http://www.techradar.com/us/news/gaming/consoles/nvidia-compares-ps4-spe...

I love how he just came out and said it: "we came to the conclusion that we didn't want to do the business at the price those guys were willing to pay"

I count that alone as proof that MS/Sony did not want to pay for decent hardware.

There's no doubt Nvidia would have given them a great deal for millions of units of a great console GPU.....
They just didn't want to pay for anything better than bottom of the barrel, and AMD was the cheap harlot to service them.

October 28, 2014 | 05:47 AM - Posted by Mac (not verified)

Going with Nvidia would have meant sourcing the CPU from elsewhere, and that would have brought its own headaches. The APU made sense and I can see more APUs in the console future; hell, even Intel's iGPU might be good enough come PS5/XB2 time. Do we know for a fact that Ubisoft is developing on the console and not porting from PC? From what I've seen of these titles, the visuals are not that compelling.....

October 28, 2014 | 10:29 AM - Posted by renz (not verified)

I think using a CPU from one company and a GPU from another company would be a problem, even though it has been done before, because in the end it has to be a custom chip with its very own software. It just so happens that AMD can supply both.

October 28, 2014 | 05:49 AM - Posted by Anonymous (not verified)

lol, like they would have been able to put in an expensive i5 and the Nvidia Maxwell that had only just been released.

Even if that were possible, no one would buy a console with an i5 and Maxwell for what they would cost to build. MS and Sony would have had to sell them at a loss, which is bad business today when people play more on their phones and tablets, which sell for cheap.

The coders have to write better code, it is as simple as that. Few of them seem to know how to code well for even 4 cores. Time for them to stop whining and get the skills. Single-core performance hardly scales anymore with new chips. Coders are simply late to the party.

October 28, 2014 | 07:56 AM - Posted by Anonymous (not verified)

Maybe VISC will solve that problem.

http://wccftech.com/amd-invest-cpu-ipc-visc-soft-machines/

October 28, 2014 | 04:53 AM - Posted by JohnGR

Never forget whose paycheck Ubisoft is on. So you can expect anonymous developers to bash the consoles all day, every day.

October 28, 2014 | 06:03 AM - Posted by db87

Ubisoft signed a contract with Microsoft for AC Unity.
The Xbox One is the lead platform for AC Unity. This forces Ubisoft to make the best gaming experience on the Xbox One. Instead of a DLC exclusive for the Xbox One, they locked the PS4 version down to Xbox One levels. Ubisoft is not in a position to be honest and open about this matter. Ubisoft's PR department tried to keep gamers calm, but instead they made it worse (Ubisoft has a habit of doing this).

Ubisoft also rushes games to be ready by the set deadline, not allowing games to be delayed at a late stage. This compromises the performance that is possible on the hardware. With more time and dedication, greater performance was absolutely possible. For PC gamers this is more true than ever, despite the fact that Ubisoft titles are "Nvidia Gameworks" titles. Most Ubisoft games on PC are released in a beta state. Ubisoft also doesn't give AMD and Nvidia time before release to properly optimize the game. This is also very stressful for the Nvidia and AMD driver teams, which have to fix this worst-case scenario when the game is (almost) out.

Ubisoft makes great games but also does a lot of things horribly wrong. Their PR department needs a clean sweep. And they need to hire more technical people for their game engines, to meet Crytek and Unreal standards...

The graphics hardware is mature now (no revolutions, only evolutions), so to utilize its power a top-tier game engine is more important than ever.

October 28, 2014 | 07:52 AM - Posted by Master Chen (not verified)

FUD. Typical Ubishit FUD. Nothing more or less than FUD.

Unity "hit the peak" because it is, in actuality, horrendously optimized; there are massive memory leaks and tons of garbage code, akin to what you can see in "Hitman: Absolution" and "Metro: Last Light". Unity is coded just as poorly as Absolution and Last Light were. Ubishit is spreading this FUD for one reason only: because they know their game is fugly trash, optimization- and content-wise (the game weighs 50GB and requires a 2500K + GTX 680 combo for MINIMUM graphical settings at 1080p? That is ABSOLUTELY clear garbage coding, performed by people who have hands growing out of their ass).

If you seriously believe all of this FUD from Ubicrap, you must be a completely brainwashed zombie.

October 28, 2014 | 07:58 AM - Posted by annoyingmouse (not verified)

My first thought when the new consoles were introduced was "bad timing". The 20nm process was/is not ready for big chips, and 28nm is too old. The consoles were stagnant from the start. Had they been released two years earlier or a year and a half later, they might have enjoyed a nice honeymoon period.

October 28, 2014 | 08:13 AM - Posted by ChangWang

Call me crazy, but something about this story smells funny. And I'm speaking about the Ubisoft side, not necessarily Ryan's deductions.

I can't help but feel that this current 1080p/900p dust-up with AC Unity lies more with Ubisoft's engine than solely with both consoles' internals. Even on the best of PCs, Black Flag didn't run as well as it could have.

October 28, 2014 | 08:27 AM - Posted by Anonymous (not verified)

The consoles of this generation have become - with x86 architecture, a somewhat decent amount of memory and hard drive storage, and a "bloated" PC-ish OS - much more PC-like than all previous generations, but I think there is one more area where the consoles should follow the PC: power design and thermal design.

Many people complain about the weak hardware of the consoles, but at the same time their suggestions (an HD 7950- or HD 7970-based GPU) would be incredibly expensive.
But with better cooling and a more powerful PSU (although the PS4's PSU should be able to support my suggestions), the hardware manufacturers might have been able to unleash a considerable amount of "free" additional performance.

The graphics part of the PS4's APU is based on the Pitcairn GPU with just two compute units disabled. Now, thinking back about a year, what GPU was always recommended for a relatively cheap gaming build intended for 1080p gaming?
The R9 270X. Usually the R9 270X is clocked at 1,000 MHz or more. The PS4 and the Xbone use - like mobile GPUs normally do - much lower clock speeds, sitting at about 800 MHz.

A full R9 270X delivers up to 2.7 TFlops. At 1,000 MHz the PS4's APU would deliver roughly the performance Epic's engine devs wanted for Unreal Engine 4 at 1080p. Furthermore, an R9 270X Toxic at 1,150 MHz in combination with a Core i7-3960X uses only 252 watts at full load. A PS4/Xbone with fewer compute units and more energy-efficient processor cores should be able to handle the power consumption with the stock PSU.
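For what it's worth, the usual back-of-the-envelope math for these GCN parts is shaders x 2 FLOPs per clock x clock speed. A minimal sketch, using the commonly reported shader counts and clocks rather than any official figures:

# Rough single-precision throughput for GCN parts:
# shaders * 2 FLOPs per clock * clock (MHz) -> GFLOPS.
def gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000.0

# Shader counts and clocks are the commonly reported figures,
# not official Sony/Microsoft/AMD statements.
parts = {
    "R9 270X (full Pitcairn, ~1050 MHz)": (1280, 1050),
    "PS4 GPU as shipped (~800 MHz)":      (1152, 800),
    "PS4 GPU at a hypothetical 1000 MHz": (1152, 1000),
    "Xbox One GPU (~853 MHz)":            (768, 853),
}

for name, (shaders, clock) in parts.items():
    print(f"{name}: ~{gflops(shaders, clock):.0f} GFLOPS")

At the shipped ~800 MHz that works out to roughly 1.84 TFlops for the PS4, so most of the gap to the 2.7 TFlops 270X is clock speed rather than shader count.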

The CPU part of the APU also could have been beefed up a bit. The desktop Athlon APUs with Jaguar cores can usually reach clock rates of about 2.5 GHz on air cooling with nearly no increase in voltage. Stock out of the box, the Athlon 5350 runs at 2.05 GHz.

I think Microsoft and Sony could have done a better job with only a minor increase in price.

October 28, 2014 | 08:31 AM - Posted by Anonymous (not verified)

Btw about the consoles being more PC-ish:
On the PS4 only the following hardware can be utilized by game developers:
- 6 CPU cores (1 is reserved for OS, 1 is disabled by default)
- 5 GB RAM
- 1 MiByte L2 cache per cluster

October 28, 2014 | 10:41 AM - Posted by Cataclysm_ZA

This isn't entirely correct. All eight cores are enabled and active; it's the GPU that has compute units disabled to improve yields. The PS4 has between 4.5 and 5.5GB of GDDR5 memory to use for games, and that ceiling will increase with time as Sony works to optimise their OS more. The Xbox One has a definite 3GB/5GB split, and it will be more difficult to shift that around because they are using a hypervisor to separate the system OS from the game OS.

http://www.anandtech.com/show/7546/chipworks-confirms-xbox-one-soc-has-1...

http://www.extremetech.com/extreme/171375-reverse-engineered-ps4-apu-rev...

http://www.geek.com/games/ps4-gives-5-5gb-of-ram-to-games-out-of-8gb-156...

October 28, 2014 | 01:03 PM - Posted by Anonymous (not verified)

There has been an official presentation from Naughty Dog - a Sony-owned studio - in which they told other game developers about the PlayStation 4's hardware and how to make the best use of it.
I am aware that the die photographs from your links show the full 8 cores (as 8 cores are physically present on the PS4's APU), but they clearly stated that only 6 of the 8 cores can be utilized by game developers.
One of the other two cores is exclusively used by the OS. I assumed that increasing the yield was the reason for the other core not being usable by game developers, as this was essentially done with the Cell processor of the PS3 (one SPU was always disabled).

The disabled GPU compute units can also be found on the die, so it is kind of hard to say what the last core is doing. But maybe it is also reserved for the OS...

October 28, 2014 | 01:06 PM - Posted by Anonymous (not verified)

Ok, apparently two cores are used for OS... http://ps4daily.com/2013/07/playstation-4-os-uses-3-5-gb-of-ram/

October 28, 2014 | 01:26 PM - Posted by Cataclysm_ZA

I thought so: two cores reserved for the OS and peripherals. Perhaps they're waiting to see what the take-up on the PS Eye is before they remove the requirement to dedicate a core to it (if they are dedicating a single core, or most of a single core's CPU cycles, to operating the camera if you have one).

October 28, 2014 | 09:00 AM - Posted by Anonymous (not verified)

I'm sure there is a ceiling at the moment, but I would NOT put it past Ubisoft to be overestimating their optimization a bit. AC Black Flag on PC was HORRIBLY optimized, and then there was Watch Dogs...

October 28, 2014 | 09:04 AM - Posted by dreamer77dd

I don't think it was a person from Ubisoft. I also think the 8-10 year console cycle is wrong.
I think Sony does not want to invest too far into the future, as it takes them too long to get back the profit they would like from their investment.
I also expect 4K adoption in about 5 years, when things become cheaper and there are enough products on the market.

October 28, 2014 | 01:50 PM - Posted by Cataclysm_ZA

Another developer from Ubisoft making excuses for them? Please, Ubisoft, don't insult my intelligence. First they say that females are hard to animate, then they say that English voice actors with English accents are more appropriate for the time period, then they go on to say that when it comes to Unity there's only a 1-2fps difference between consoles, when they've had the time to work on ironing out the 50% CPU workload issue. They should be using GPGPU for this, not dragging out the rendering process on a weaker processor.

As a reminder, Unity is made by a team of SEVEN Ubisoft studios pooling resources together to make what is a soulless four-person co-op Assassin's Creed game that is capped, hard-limited, on the PC to 30fps no matter what hardware you're running. This is the same developer/publisher that brought The Crew on PC down to 30fps because the physics is tied to the rendering engine (see: Skyrim, NFS Rivals). These are the same people who said 1080p 60fps on the PS4 with Black Flag couldn't be done due to "optimisation considerations" (read: Xbox One parity) and later, once the launch was over and Microsoft got what they wanted, released a patch to let the PS4 version run at 1080p30 for the main storyline and 60fps for the multiplayer.

I haven't trusted anything Ubisoft claims for a while, and neither should you. It is in their best interests to keep Microsoft happy (can't market the games the same way if they're not going to play the same, can they?) and they will lie through their teeth to make sure their game sells well.

October 28, 2014 | 10:34 AM - Posted by Anonymous (not verified)

Consoles are hitting a wall and PC gamers are paying the price.

October 28, 2014 | 10:37 AM - Posted by collie

Every point from every angle has already been made, EXCEPT the simple fact that the greatest gaming experience of all time, OF ALL TIME was the Atari 2600 combined with lots of drugs. No matter how impressive the systems are, no matter how hard the developers work, no one will ever again hit the excitement and REALISM of mid 80's tripping balls Galaxian. And that is simply a fact!

:)

October 28, 2014 | 10:57 AM - Posted by Mutle (not verified)

Maybe they should have used Intel instead of a lackluster AMD CPU.

Maybe devs should get their thumbs out of their asses and make/use more optimized engines.

Maybe the Xbone should get devs to use the Mantle API instead of that bottleneck, DX11.

Maybe people should stop buying these shit games; then devs would stop pushing more shit.

October 28, 2014 | 01:44 PM - Posted by Cataclysm_ZA

The solution is that Microsoft needs to stop moneyhatting developers and publishers in order to make their console special or to make the Xbox One version equal in performance to the PS4. They should rather take that money and put it into their first-party studios and develop better first-party games.

But will they, even with Phil Spencer at the helm? No. Why? Because Spencer was in charge of their first-party developers for years. He doesn't care as much as he professes to AND he was the last person who could have stopped Microsoft from turning Rare into a Kinect-centered studio.

Also, the CPU is fine, and the Xbox won't ever use Mantle. What would be the point? It has a DX12-like API already!

October 28, 2014 | 11:48 AM - Posted by Booya (not verified)

Damn, Tim Sweeney was right when he said that if game consoles couldn't hit 2.5 or 3 teraflops they would not be truly NEXT GEN.

For this generation, however, the CPU is the biggest bottleneck. Too bad Intel doesn't license its Core architecture.

October 28, 2014 | 12:04 PM - Posted by Shrapnol (not verified)

Wouldn't it be better to spend the money and just build a PC for the same price as these two consoles? I don't know what the debate here is; they both run on basically PC hardware! And no one brought up the fact that the Xbox One with Kinect hooked up uses half its CPU and GPU cycles when it's in use! Besides, you can do more on a PC! I don't know anyone in my circle of nerds saying "oh, got to boot up my console to check my e-mail!"

October 28, 2014 | 12:05 PM - Posted by Shrapnol (not verified)

WHERE ARE THE STEAM BOXES? THEY DO THE SAME THING, DON'T THEY? LOL

October 28, 2014 | 12:24 PM - Posted by Shrapnol (not verified)

http://www.maximumpc.com/no_bs_podcast_226_-depth_interview_amd_graphics...

No one watched this???

October 28, 2014 | 12:54 PM - Posted by Darius510 (not verified)

I think it's way too early to declare the consoles tapped out. Even if we take Ubisoft at their word that the game is super well optimized by known standards, that's only a barrier if you accept that contemporary optimization techniques are as good as they'll ever be. To their knowledge maybe it's as optimized as possible - but that assumes there's nothing left to be discovered. If anything, being forced up against a wall like this will only accelerate the pace of research into squeezing every last drop out of the hardware using novel techniques.

For instance, think about how much performance AA had to burn when all they had was MSAA, and then along came FXAA and cut that burden *dramatically*. Maybe it's not every bit as good as MSAA, but you get a lot more quality relative to the performance hit with FXAA. I don't believe every last trick has already been discovered, and it's not like they're writing hand-coded assembly for games anymore either. There's still lots of room left to grow and avenues to explore. And maybe, just maybe, some of that will filter down to the PC.
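To put rough numbers on that trade-off, here is a minimal sketch of the render-target footprint 4x MSAA carries at 1080p versus the single-sample image a post-process filter like FXAA operates on. The byte sizes are simple assumptions (RGBA8 color, 32-bit depth/stencil), and real GPUs compress MSAA targets, so treat it as an illustration only:

# Back-of-the-envelope render-target footprint at 1080p, ignoring
# framebuffer compression and other driver/hardware tricks.
WIDTH, HEIGHT = 1920, 1080
COLOR_BYTES = 4   # assumed RGBA8 color
DEPTH_BYTES = 4   # assumed 24-bit depth + 8-bit stencil

def target_mib(samples):
    """Memory for a color+depth target storing `samples` samples per pixel."""
    return WIDTH * HEIGHT * samples * (COLOR_BYTES + DEPTH_BYTES) / (1024 ** 2)

print(f"1x target (what a post-process AA like FXAA reads): ~{target_mib(1):.0f} MiB")
print(f"4x MSAA target:                                     ~{target_mib(4):.0f} MiB")

Roughly 16 MiB versus 63 MiB of render-target data to write and resolve each frame, which is the kind of bandwidth a single full-screen post-process pass avoids.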

October 28, 2014 | 01:29 PM - Posted by Henrik (not verified)

This is excellent news.

Devs have to learn how to code now. There is no escape. PC gamers will benefit the most from this in the end.

The CPU takes up too much of the budget today when building a computer meant only for gaming. It should not be like that.

October 28, 2014 | 01:33 PM - Posted by Henrik (not verified)

Before, they had to tweak for PowerPC hardware and stuff that could not translate to the PC.

Now they have to tweak x86-based stuff to the max, and that work can translate directly to the PC. Things could not be better :)

October 28, 2014 | 03:48 PM - Posted by Anonymous (not verified)

I am a bit more pessimistic about this. I feel they are now only going to shoot for the "max" they have hit on consoles, and regardless of the common PC architecture, PC games are not going to get much more than the console max they are shooting for.

October 28, 2014 | 03:34 PM - Posted by Drb (not verified)

Lol! For all the MS Xbox naysayers, tell them this: go view a bit of MS R&D content. I suspect the issue isn't that it can't be done, but that most devs probably hate having to pre-chew everything in Azure (or whatever MS came up with) and then send the pre-chewed data to the Xbox. Yep, this means devs would likely need a license to do it, and I suspect not one dev wants to work that way, so they make do. But the Xbox is optimized to have everything done on the server and then just send the final data to the console. Done that way it would take very little resources, but either devs don't want to do it that way or MS hasn't released this yet (Halo).

October 28, 2014 | 03:36 PM - Posted by Daniel Masterson (not verified)

So many comments for something so small. People really get passionate about brand names.

October 28, 2014 | 04:53 PM - Posted by Master Chen (not verified)

FUD. FUD EVERYWHERE. STOP LISTENING. STOP BELIEVING IN THIS UBICRAP GARBAGE. JUST. STOP. IT'S ALL UTTER LIES AND SHEER AUTISM.

October 28, 2014 | 05:14 PM - Posted by Anonymous (not verified)

Another idiotic article from a low-IQ PC virgin with zero hours of training in data mining and QA, pretending he understands how complex algorithms work on different hardware architectures and API environments.

It's like saying a high-end sports car can carry more cargo than a freight truck because his reference is their engine horsepower. Try getting involved in even the simplest development of a 3D application, especially in a Windows/DirectX environment; see how many layers there are between the hardware and your middleware, and see how hard it is to get things working at an acceptable frame rate and resolution - unless, of course, you are already somewhat knowledgeable in C++/Objective-C etc.

All these brain-dead PC virgins make it sound like they could replace PhD coding veterans any minute, calling developers lazy because they have no time or patience to address this "bla, a console-specced PC can handle it" idiocy invented on the internet by PC-only gamers. Hilarious.

Most indie games released recently can be handled at 1080p/1440p/4K and beyond on any mid-to-high-end rig today, and only a few of them even hold up to the 10-year-old 360/PS3 standard.

Video game production needs to find a sweet spot between resolution and fps so they can put more detail into their games, not because of this "wall". If anything, what stops you from gaming Counter-Strike at 8K/120fps instead of BF4?

October 28, 2014 | 08:35 PM - Posted by ThorAxe

I think you need to take English classes before you start abusing people. Perhaps then they might understand your point.

October 28, 2014 | 08:55 PM - Posted by Anonymous (not verified)

I think you need to take an IQ test before I can take you seriously.

October 28, 2014 | 11:16 PM - Posted by ThorAxe

I have, it was 145.

October 29, 2014 | 05:00 PM - Posted by Anonymous (not verified)

"I have, it was 45"

I fixed your typo for you.

Next.

October 28, 2014 | 10:32 PM - Posted by Chris the PC Per fan (not verified)

Honestly guys we hear this story every new generation of consoles. There are two things to say:

1 - Every mainstream console that has been released in the past 20+ years or so has seen a significant improvement for the first 5 years of its life.

2 - Although graphics still have a long way to go before they achieve 'real life', they are certainly good enough for entertainment, as it always boils down to a great game.

Oh and that Uncharted 4 treat at E3. Wow!

October 30, 2014 | 07:49 AM - Posted by Shortwave (not verified)

I consider the SNES less dated and more relevant today than the PS4.
You can't really recreate that raw SNES experience through emulation; it's truly beautiful, artistic and unique. The music quality is unmatched in modern games as well.

What do you get with a PS4?
The same thing you do on PC, just shittier.

Just, crossed my mind as I read your comment.

October 31, 2014 | 11:13 PM - Posted by Chris the PC Per fan (not verified)

I think all the platforms have their awesome factors. Really great video games are such a rare thing that one must embrace many platforms to feed that ongoing lust for excellent games.

Consoles have certain exclusives, PCs do too, and once in a while they get a technically advanced game that goes down in the history books. Portable machines are having a hard time of it, except for the 3DS.

All in all, I buy these technologies (fortunate to have an Xbox One, PS4, PS3, PS Vita, 3DS, PC, iDevices and Wii U) to play the games I want to.

I love games and the technology that powers them. Here's hoping that Hollywood talent comes to gaming (in large numbers) sooner rather than later, but I'm not holding my breath.

October 29, 2014 | 08:22 AM - Posted by gamerk2 (not verified)

I remember being laughed at when I predicted this console generation would be heavily CPU-bound. Going forward, this WILL affect PC gaming, as devs do not like doing PC-exclusive features now that consoles have become their cash cows. I predict several upcoming games are going to show signs of being scaled back late in development and will be a lot less than what was promised.

One game I expect this to happen to is The Witcher 3. CDPR came out VERY early complaining about the lack of power in the consoles, and there have been a few rumors that the project is running into problems. I hope I'm wrong; I love the series. But the signs point to this being a game primed for underachieving.

In any case, when you look at the decline of independent studios, the growing reliance on mega-games to generate revenue, and what appears to be a VERY weak console generation, we're primed for a major 1983-style contraction of the industry.

October 29, 2014 | 09:31 AM - Posted by Anonymous (not verified)

Given the reported minimum specs for ACU on the PC, I place the blame solely on Ubisoft being lousy programmers.

October 29, 2014 | 02:22 PM - Posted by Truth_Teller (not verified)

There is a huge performance divide between the XBOne and the PS4 with regard to the GPUs used and the memory they are tied to. I'm surprised you continue to push the narrative that the PS4 is "only a little more powerful than the XBOne" when you've had first-hand interaction with the GPUs they are based on.

The PS4's GPU itself is closest to the Radeon 7850/R7 265. The PS4's version has 128 more shader cores than the 7850/265, but is also clocked lower. What keeps the chip strong is the fact that it uses the memory native to that Radeon: GDDR5. The CPU in the machine may be underpowered, but the GPU in the PS4 is up to the task.

The XBOne GPU, on the surface, is a dead ringer for the Radeon 7790/R7 260X: a similar number of shader cores, clocked a little lower, but the GPU itself is still basically a 7790/260X. HOWEVER, the memory used is DDR3, as opposed to the GPU's native GDDR5. The use of DDR3 is what collapses the performance of the GPU to Radeon 7750/R7 250 levels.

Basically, being a computer hardware site, just answer this: would a Radeon 7790/R7 260X using DDR3 memory instead of GDDR5 perform close to a standard Radeon 7850/R7 265?

Truth is, it wouldn't even be close - probably 50% of the performance. I implore you and all other hardware/gaming sites to stop pushing the false narrative that the PS4 is only a little more powerful than the XBOne, when MATH and LOGIC say the polar opposite. The performance difference between the XBOne and the PS4 is about the same as the difference between the Wii U and the XBOne.
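For what it's worth, the raw main-memory bandwidth math points in the same direction, even if the exact "50%" figure is debatable. A minimal sketch using the commonly cited bus widths and data rates, and leaving out the Xbox One's 32 MB of ESRAM, which exists precisely to offset the DDR3:

# Theoretical peak bandwidth = (bus width in bits / 8) * effective data rate (MT/s).
# Bus widths and data rates are the commonly cited figures for each machine.
def peak_bandwidth_gbs(bus_bits, data_rate_mtps):
    return (bus_bits / 8) * data_rate_mtps / 1000.0  # GB/s

print(f"PS4      - 256-bit GDDR5 @ 5500 MT/s: ~{peak_bandwidth_gbs(256, 5500):.0f} GB/s")
print(f"Xbox One - 256-bit DDR3  @ 2133 MT/s: ~{peak_bandwidth_gbs(256, 2133):.0f} GB/s")

That's roughly 176 GB/s versus 68 GB/s on paper, which is why the ESRAM scratchpad matters so much to the Xbox One's rendering pipeline.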

It is also openly known that MS and Ubisoft have a co-marketing campaign with AssCreed Unity. They really had to limit the PS4 version to 900p, or look like assholes to MS.

October 29, 2014 | 04:05 PM - Posted by Bret Bowlby (not verified)

They need to switch from x86 to a VISC CPU; that would help drastically with the CPU limitations.

October 29, 2014 | 10:59 PM - Posted by Anonymous (not verified)

How would that even help? You'd still have lousy CPU utilization: 99% on 1 core, 70% on 2, 40% on 4 cores.

Both the Xbox and the PS4 have 8 cores.

You'd just be moving lousy programming from one CPU architecture to the next.
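The utilization figures above are just illustrative, but the direction is what Amdahl's law predicts: if part of the frame is inherently serial, adding cores (or changing the instruction set) doesn't buy much. A minimal sketch, assuming a hypothetical 60% parallel fraction, which happens to land near the per-core utilization quoted above:

# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the fraction
# of the work that parallelizes and n is the number of cores.
# p = 0.6 is a made-up illustration, not a measured game workload.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.6  # hypothetical: 60% of the frame parallelizes, 40% stays serial
for cores in (1, 2, 4, 8):
    s = amdahl_speedup(p, cores)
    print(f"{cores} core(s): {s:.2f}x speedup, average utilization ~{s / cores:.0%}")

With that assumed split, eight Jaguar cores average around 26% utilization, no matter what the underlying architecture is.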

October 30, 2014 | 07:45 AM - Posted by Shortwave (not verified)

Isn't this what ran through your minds the second their hardware configurations were confirmed?

I basically fell over laughing.
Everyone kept assuming it'd be an AMD APU.
What I never assumed was that it wouldn't even be the flagship chip of that lineup.

I instantly knew that was one of the last nails in that coffin.

Now you can just play Sony games on their new TVs... (the last nail, struck in denial?)
It would make more sense to buy a Sony TV and just stream the new AC game (assuming they'd ALLOW you to, but you know, they're still trying to whore out these sub-par console systems), since it would likely be rendered on an epic high-end GPU farm, actually giving you 60fps.

HAAAAAAAAA!!!

Their game streaming service can provide a better gaming experience (good connection assumed) than their dedicated console.

Sad.

October 30, 2014 | 11:05 AM - Posted by Ohwhereohwhereishalo (not verified)

Let me get this straight: the Xbox is maxed out because... what? Halo, the game that made the Xbox something even with all the issues the original had, has saved the Xbox so many times it's insane, and suddenly devs are feeling disgruntled with the Xbox One, so the Xbox One is maxed? lol. Quad-channel DDR3 doesn't grow on trees, yet MS bothered putting it in; even Intel doesn't have many servers with that, it's meant for insane numbers, it's so new most don't know what to do with it, and it's deemed useless. MS even went to the trouble of implementing a form of Azure Donnybrook. And devs think the console is maxed? lol! The Xbox is made from the ground up for the cloud; it isn't there to do the calculation, Azure and Donnybrook in the cloud are there for that. Think of the Xbox as what an Xperia Z3 is to the PS4 and you get the idea.

October 30, 2014 | 11:11 AM - Posted by Ohwhereohwhereishalo (not verified)

With one difference: Azure doesn't send a video stream, it sends the pre-chewed numbers, likely compressed and then decompressed on the Xbox. Sadly, MS has probably not shown this yet, since the savior is meant to be Halo. TILL HALO IS OUT, NOTHING IS AS IT SEEMS.

October 30, 2014 | 11:37 AM - Posted by Ohwhereohwhereishalo (not verified)

Put another way: if MS's idea didn't have so many features, they could hook it directly to your TV via the internet. Yep, the Xbox is just there to fetch and mix; if you render everything on the Xbox, you aren't using the Xbox the way MS is clouding everything. And no, MS isn't streaming the game to the Xbox, lol - just the calculated data.

October 31, 2014 | 03:19 AM - Posted by praack

They are using APUs. Sony did have the initial lead out of the gate by having GDDR5 and slightly better specs, and by not messing about with junk like MS did, but in the end they are both using low-cost solutions.

Believing you'll get high settings at 50 fps, 1080p in DX11 games is not really possible given the loadout, even with offloading a lot of the needed power onto server workloads and such.

And on consoles you cannot give someone the choice to set graphics settings.

I am actually expecting this generation of consoles to be the first to get a hardware upgrade, so wait another year or two and see what happens.

On the other hand, at least they did not rehire all those DX9 programmers and go back to coding in DX9 again...

October 31, 2014 | 07:23 AM - Posted by beckerjr

It's so laughable to see Ryan's update to the article, where in a single sentence he claims to know how the console makers could have "properly" set up their systems for a long life cycle. Because I'm sure Ryan and the crew know so much better than Sony how to design, build, market, sell, and support a console over a period of years, within a proper budget.

The arrogance is amazing...

PS: the claim in the article that the PS4's GPU is weaker than even a 7790 is a flat-out lie too, or just complete ignorance of how to use Google to look up specs.
