Epic Games is disappointed in the PS4 and Xbox One?

Subject: Editorial, General Tech, Graphics Cards, Systems | May 23, 2013 - 06:40 PM |
Tagged: xbox one, xbox, unreal engine, ps4, playstation 4, epic games

Unreal Engine 4 was presented at the PlayStation 4 announcement conference through a new Elemental Demo. We noted how the quality seemed to have dropped in the eight months since E3, while the demo was being ported to the console hardware. The most noticeable differences were the severely reduced particle counts and the loss of fine lighting detail; of course, Epic pumped up the contrast in the PS4 version, which masked the lack of complexity as if it were a stylistic choice.

Still, the demo was clearly weakened. The immediate reaction was to assume that Epic Games simply did not have enough time to optimize the demo for the hardware. That is true to some extent, but there are theoretical limits on how much performance you can extract from hardware, even at 100% perfect utilization.

Now that we know the specifications of both the PS4 and, recently, the Xbox One, it is time to dissect more carefully.

A recent LinkedIn post from EA Executive VP and CTO Rajat Taneja claims that the Xbox One and PS4 are a generation ahead of the highest-end PC on the market. While there are many ways to interpret that statement, in terms of raw performance it is not valid.

As far as we currently know, the PlayStation 4 contains an eight-core AMD "Jaguar" CPU alongside an AMD GPU with 18 GCN compute units, for a total of 1152 shader units. Without knowing clock frequencies, this chip should be slightly faster than the Xbox One's 768 shader units across 12 GCN compute units. Sony claims the PS4 has a theoretical total of 2 teraFLOPs of performance, and the Xbox One is almost certainly slightly behind that.

Back in 2011, the Samaritan Demo was created by Epic Games to persuade console manufacturers; it represented how Epic expected the next generation of consoles to perform. They said, back in 2011, that the demo would theoretically require 2.5 teraFLOPs of performance to run at 30 FPS at true 1080p; ultimately, it ran on a PC with a single GTX 680, good for approximately 3.09 teraFLOPs.

That required performance, again approximately 2.5 teraFLOPs, is higher than either console can theoretically achieve, as both fall below 2 teraFLOPs. The PC may carry more overhead than consoles, but the PS4 and Xbox One would be too slow even with zero overhead.
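Those theoretical numbers are easy to sanity-check. A quick sketch in Python (the console clock speeds below are assumptions, since neither GPU clock had been officially confirmed at the time): peak single-precision throughput is simply shader count, times clock, times 2 FLOPs per cycle, because each shader unit can retire one fused multiply-add per clock.

```python
# Theoretical single-precision throughput: each shader unit retires one
# fused multiply-add (2 FLOPs) per clock cycle. Clocks marked "assumed"
# were not officially confirmed at the time of writing.

def peak_tflops(shader_units, clock_ghz):
    """Peak single-precision teraFLOPs for a GPU."""
    return shader_units * clock_ghz * 2 / 1000.0

ps4      = peak_tflops(1152, 0.800)   # 18 GCN CUs, assumed 800 MHz clock
xbox_one = peak_tflops(768,  0.800)   # 12 GCN CUs, assumed 800 MHz clock
gtx_680  = peak_tflops(1536, 1.006)   # Kepler, 1006 MHz base clock

print(f"PS4:      {ps4:.2f} TFLOPs")       # 1.84
print(f"Xbox One: {xbox_one:.2f} TFLOPs")  # 1.23
print(f"GTX 680:  {gtx_680:.2f} TFLOPs")   # 3.09
```

At an assumed 800 MHz, the 768-shader chip lands around 1.23 teraFLOPs and the 1152-shader chip around 1.84 teraFLOPs; both fall noticeably short of the ~2.5 teraFLOPs Samaritan target.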

Now, of course, this does not account for reducing quality where it will be the least noticeable and other cheats. Developers are able to reduce particle counts and texture resolutions in barely-noticeable places; they are also able to render below 1080p or even below 720p, as was the norm for our current console generation, to save performance for more important things. Perhaps developers might even use different algorithms which achieve the same, or better, quality for less computation at the expense of more sensitivity to RAM, bandwidth, or what-have-you.

But, in the end, Epic Games did not get the ~2.5 teraFLOPs they originally hoped for when they created the Samaritan Demo. This likely explains, at least in part, why the Elemental Demo looked a little sad at Sony's press conference: it was a little FLOP.

Update, 5/24/2013: Mark Rein of Epic Games responds to the statement made by Rajat Taneja of EA. While we do not know his opinion on consoles... we know his opinion on EA's opinion:



May 24, 2013 | 08:05 AM - Posted by MarkT (not verified)

MarkT like MarkR !!!!

May 24, 2013 | 08:12 AM - Posted by Vampirr (not verified)

People, people...

Xbox One and PlayStation 4 are not ordinary consoles: they don't have a dedicated CPU and GPU, it's combined.

HSA/hUMA eliminates so many bottlenecks. The Elemental Demo ran on a PC with similar performance, except it didn't have HSA/hUMA, so please shut up!

May 24, 2013 | 12:40 PM - Posted by Jordan (not verified)

You have no idea what you are talking about.

That is all.

May 25, 2013 | 04:27 PM - Posted by Klimax (not verified)

That'll reduce latency and increase bandwidth available, but will not increase compute performance itself.

May 26, 2013 | 03:13 PM - Posted by Scott Michaud

It won't increase compute performance but it will allow developers to choose compute-lighter algorithms that are sensitive to latency and bandwidth.

But the point is that Epic, as far as we can tell, were pushing for approx. 2.5 teraFLOPs and they did not get it. This could change plans with Unreal Engine 4...

May 24, 2013 | 08:36 AM - Posted by kukreknecmi (not verified)

Please, sir, admit that you just don't care about unified memory, which requires a new coding paradigm, and that the more flexible compute capability on consoles doesn't interest you. I'd even bet you don't care about the vertex culling capabilities that are done via firmware on the GPU. You could also write a compute shader that will optimize the scene, which I doubt they even care about. I'm sure they are great coders, since they supported 3DNow! back on the K6-2, yet they neglect many things now. Blaming Intel for making its GPUs weak is another issue that suggests Epic is not as good as it was. They are in the pit they dug themselves; now they face the truth and it is hard to accept. I hope someone notices that Intel will be the greatest threat to all developers and starts demanding new things that suit them (like PixelSync in GRID 2). It is somehow affecting Epic. I'm not sure if they are doing this because they are cozy with NVIDIA and NVIDIA doesn't want a good-looking UE4 on consoles.


May 24, 2013 | 01:00 PM - Posted by Kevin G (not verified)

Being a generation ahead may not be indicative of performance but rather feature set. The Xbox One, PS4, and AMD's new Kabini-based SoCs all have a nice feature missing from the current generation of GPUs: a unified memory space. So instead of having to transfer bulk data between a CPU and GPU, a pointer is, in many cases, all that is necessary now. This helps simplify algorithms, as there is no longer a need to manage memory on the CPU and GPU independently.
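The pointer-versus-copy point can be illustrated with a rough Python analogy (only an analogy; real GPU code would go through an API such as OpenCL or CUDA): a discrete card forces a full copy of the data into its own memory, while a unified memory space lets both processors work through the same buffer.

```python
# Rough analogy, not actual GPU code: a discrete GPU requires duplicating
# data across the bus, while unified memory shares one buffer.

cpu_buffer = bytearray(b"scene vertex data" * 1000)

# Discrete CPU/GPU model: the data must be duplicated into "GPU memory".
gpu_copy = bytes(cpu_buffer)        # full copy of the buffer, O(n)

# Unified memory model: hand over a view (conceptually, a pointer).
gpu_view = memoryview(cpu_buffer)   # zero-copy, O(1)

# A write through the shared view is immediately visible on the "CPU"
# side; the discrete-style copy, by contrast, is already stale.
gpu_view[0:5] = b"SCENE"
print(bytes(cpu_buffer[:17]))  # b'SCENE vertex data'
print(gpu_copy[:17])           # b'scene vertex data' (unchanged copy)
```

The same shape of saving applies on real hardware: the algorithm no longer pays for the transfer, only for the work.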

Raw-performance-wise, the PS4 has about as much compute as a Radeon 7850. I would expect games in the midst of the PS4's life span to produce higher image quality than what can be done on the PC equivalent (Radeon 7850) due to lower software overhead and the unified memory space. I'd guesstimate that performance would be closer to a Radeon 7870, still within the bounds of current PC hardware. By the time PS4 developers get familiar enough to pull that off, several more generations of PC hardware will have arrived.

May 24, 2013 | 01:52 PM - Posted by Anonymous (not verified)

Phhhbt. It has already been revealed that, of the 8GB in the Xbox One, 3GB is reserved for the OS and apps, leaving 5GB for games, and that must include code, level maps, and audio before you even get around to textures and GPU-related processing.

There are video cards in the PC world with more than 5GB of memory.

Anandtech has done the math: the Xbox One's SoC has about 1.23 TFLOPs of processing power. The 7850 has 1.76 TFLOPs.

These consoles are not a generation ahead. They are a generation behind.

May 24, 2013 | 05:14 PM - Posted by Kevin G (not verified)

The PS4's GPU compute is rated at 1.84 TFLOPs, clearly in the same league as the Radeon 7850. I even referenced the PS4 in the same sentence that compared it to the Radeon 7850, not the Xbox One, as you've clearly misread.

Only one consumer video card has more than 5 GB of memory (the GeForce Titan); beyond that, it is strictly workstation-class cards. The average discrete graphics card carries 2 GB, with nearly an equal share of 1 GB cards at the low end and a minority of 3 GB and 4 GB cards at the high end. Outside of the few Titan owners, the Xbox One will be able to provide more graphics memory than the average PC today. The PS4 is said to reserve only 512 MB of its 8 GB, so in that case a developer could actually allocate more memory to graphics than can be held on a GeForce Titan.

Those two things are actually irrelevant to my overall point: it isn't about raw compute but rather feature set.

The Xbox One and PS4 can do something in hardware that even the high-end GeForce Titan cannot: share the same virtual memory space as an x86 processor. That is a key feature being added to the next generation of discrete graphics cards from both NVIDIA (the Maxwell generation) and AMD (GCN 2.0 and HSA). These features don't change the GPUs' theoretical raw compute limits, but they do help developers reach those limits.

And this has happened numerous times before in this space: a low-end part getting more features than the previous generation's high-end part. For example, would you expect a GeForce 8500 GT to outrun a GeForce 7900 GTX? Nope, but there are games out there that won't run on the GeForce 7900 GTX yet will work (very slowly) on the GeForce 8500 GT because of its more modern feature set. That is a clear generational divide.

May 25, 2013 | 11:59 PM - Posted by Anonymous (not verified)

You just went full EA. You never go full EA.

May 27, 2013 | 06:59 PM - Posted by Anonymous (not verified)

Instead of posting comments with random incorrect information, try learning the facts. I realize that you gathered some information online and took 25 minutes writing your comment to sound like you know what you're talking about, but people who actually know about this stuff can tell that you don't. Do a little more research and you may begin to understand how very incorrect you are. Glad I could help, - Jordan

May 24, 2013 | 01:10 PM - Posted by Anonymous (not verified)

Look at it like this: a GTX 650 is Kepler and has newer tech than a Fermi GTX 480, yet it performs far worse. Someone could still insist that "a 650 has much newer tech than a 480!!" and be technically correct while missing the point.

Typical lawyer tactics.

May 24, 2013 | 01:36 PM - Posted by Anonymous (not verified)

"claims that the Xbox One and PS4 are a generation ahead of highest-end PC on the market"

Haha. Yeah, a $500-$600 game console is faster than, say, MaximumPC.com's yearly Dream Machine. Whatever, EA. You're just getting the 13-year-olds excited so they'll run out and buy new consoles and games.

May 24, 2013 | 02:56 PM - Posted by Scott Michaud

To be clear, Microsoft and Sony cannot build their $500 or $600 consoles for $500 or $600. About the only console in recent memory to be sold above parts and labour costs is the original Wii. Even if you remove all of the research, development, and marketing costs, just the sum of the parts and the wages required to assemble the box is almost always more than what they sell it for.

Example: 20GB PS3 at launch sold for $499. To make each 20GB PS3, Sony paid $805... a loss of $306... per unit. (+ the billions in research, development, distribution, marketing, etc.)
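The subsidy math above, in miniature (hardware loss per unit only, ignoring R&D, marketing, and distribution entirely):

```python
# Per-unit launch subsidy: build cost minus retail price, using the
# 20GB PS3 figures cited above.

def per_unit_loss(retail_price, build_cost):
    """Money lost on every console sold (positive means a loss)."""
    return build_cost - retail_price

loss = per_unit_loss(retail_price=499, build_cost=805)
print(f"20GB PS3: ${loss} lost per unit")  # $306 lost per unit
```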

So you cannot equate a $500 console near its launch to a $500 computer... because when you buy the computer, you stop giving the computer maker money. Console manufacturers price the box as low as it can go, almost always below cost, to get in your house and derp around until you give them more money... because your box is worthless to them until you do.


But yes, this is "allegedly" just EA trying to rile people up for "A Next Generation!" so investors will buy stock and gamers will buy product.

May 25, 2013 | 11:58 AM - Posted by Iconic (not verified)

Excellent point Scott!

It's my estimation that this year's new console offerings are attempting a smaller wholesale/retail loss margin. I think MS/Sony may be hoping to drive down the initial loss-per-unit at launch and get to breaking even, or profiting, on the hardware sooner. We should be able to accurately assess this once we get good pricing numbers in the coming weeks.

As an early adopter, this will probably make me less inclined to go out on Day 1 and purchase a new console. Part of the fun of buying the PS3 and others was feeling like you just ripped off "The Man" for ~300 bones.

May 24, 2013 | 03:11 PM - Posted by Anonymous (not verified)

Did you guys see this?

Big time facepalm. What are they smoking over at MS?

May 24, 2013 | 03:37 PM - Posted by Vampirr (not verified)

A lot has changed...

The hardware in the Xbox One and PlayStation 4 is not exotic, except in features. HSA/hUMA will allow AI to be run on the GPU, and that will allow a much higher degree of artificial intelligence, since AI needs parallel computing; HSA/hUMA enables that option.

Now just wait for some idiot to make an AI that is like Skynet. All the Xbox Ones, PlayStation 4s, Temash/Kabinis, and Kaveris get infected, and mushroom clouds sprout all around the world. ;)

May 24, 2013 | 05:21 PM - Posted by Kevin G (not verified)

Running AI on the GPU can be done on the Xbox One/PS4, but it'd be suboptimal. AI is very branchy, and GPUs are absolutely horrible at that type of work. It is still wiser to run AI on a beefy CPU core.

*Though there is an interesting sub-case for running path finding on the GPU instead of the CPU.

May 24, 2013 | 05:41 PM - Posted by Scott Michaud

Actually, path finding (as you've said) and visibility are two very parallel calculations, which I believe take up something like 80% of AI computation time. The real problem, as I've been reading up on for a personal project, is the latency of computing the batch.

Now, with lower latency between the CPU and GPU, it might be a good idea.
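To illustrate why path finding parallelizes well: breadth-first flood fill, the core of many grid path finders, expands an entire frontier of cells with the same operation applied to each, which is exactly the shape of work GPUs like. A minimal, CPU-side Python sketch (illustrative only):

```python
# Breadth-first flood fill on a 2D grid (0 = open, 1 = wall). The
# distance map doubles as the "visited" set.

def flood_fill_distances(grid, start):
    """Distance from `start` to every reachable cell, frontier by frontier."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    frontier = [start]
    step = 0
    while frontier:
        step += 1
        # On a GPU, every cell in `frontier` would be expanded in parallel.
        next_frontier = []
        for (r, c) in frontier:
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols \
                        and grid[nr][nc] == 0 and (nr, nc) not in dist:
                    dist[(nr, nc)] = step
                    next_frontier.append((nr, nc))
        frontier = next_frontier
    return dist

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
d = flood_fill_distances(grid, (0, 0))
print(d[(2, 0)])  # 6: the path must route around the wall row
```

The per-frontier expansion is embarrassingly parallel; the serial part is only the handoff between frontiers, which is where the CPU-GPU latency mentioned above bites.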

June 25, 2013 | 12:50 PM - Posted by Anonymous (not verified)

You can discuss all night long, but the PS4 will sell like hot cakes :) and games for all platforms will be affected.
