Epic Games is disappointed in the PS4 and Xbox One?

Subject: Editorial, General Tech, Graphics Cards, Systems | May 23, 2013 - 06:40 PM |
Tagged: xbox one, xbox, unreal engine, ps4, playstation 4, epic games

Unreal Engine 4 was presented at the PlayStation 4 announcement conference through a new Elemental Demo. We noted how the quality seemed to have dropped in the eight months following E3 while the demo was being ported to the console hardware. The most noticeable differences were the severely reduced particle counts and the nonexistent fine lighting details; of course, Epic pumped up the contrast in the PS4 version, which masked the missing complexity as if it were a stylistic choice.

Still, the demo was clearly weakened. The immediate reaction was to assume that Epic Games simply did not have enough time to optimize the demo for the hardware. That is true to some extent, but there are theoretical limits on how much performance you can push out of hardware, even at perfect utilization.

Now that we know the specifications of both the PS4 and, very recently, the Xbox One, it is time to dissect more carefully.

A recent LinkedIn post from EA Executive VP and CTO Rajat Taneja claims that the Xbox One and PS4 are a generation ahead of the highest-end PC on the market. While there are many ways to interpret that claim, in terms of raw performance it is not valid.

To the best of our current knowledge, the PlayStation 4 contains an eight-core AMD "Jaguar" CPU with an AMD GPU built from 18 GCN compute units, for a total of 1152 shader units. Even without knowing the driving frequencies, this chip should be faster than the Xbox One's 768 shader units across 12 GCN compute units. Sony claims the PS4 has a theoretical total of nearly 2 teraFLOPS of performance, and the Xbox One will almost certainly sit slightly behind that.

Back in 2011, the Samaritan Demo was created by Epic Games to persuade console manufacturers; it represented how Epic expected the next generation of consoles to perform. They said, back then, that the demo would theoretically require 2.5 teraFLOPS of performance to run at 30 FPS at true 1080p; ultimately it ran on a PC with a single GTX 680, a card rated at approximately 3.09 teraFLOPS.

This required performance, again approximately 2.5 teraFLOPS, is higher than what is theoretically possible on the consoles, which sit below 2 teraFLOPS. The PC may have more overhead than the consoles, but the PS4 and Xbox One would be too slow even with zero overhead.
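For a rough sense of where these numbers come from, peak single-precision throughput for a GPU is commonly estimated as shader units x 2 operations per clock (one fused multiply-add) x clock speed. The sketch below runs that arithmetic; the ~0.8 GHz console clocks are assumptions for illustration only, since neither had been confirmed at the time of writing.

    /* Back-of-the-envelope peak throughput: shaders x 2 ops/clock x clock (GHz).
       The console clock speeds are assumed here, not confirmed specifications. */
    #include <stdio.h>

    static double peak_tflops(int shaders, double clock_ghz)
    {
        return shaders * 2.0 * clock_ghz / 1000.0; /* GFLOPS -> TFLOPS */
    }

    int main(void)
    {
        printf("PS4      (1152 shaders @ ~0.8 GHz, assumed): %.2f TFLOPS\n",
               peak_tflops(1152, 0.8));   /* ~1.84 */
        printf("Xbox One ( 768 shaders @ ~0.8 GHz, assumed): %.2f TFLOPS\n",
               peak_tflops(768, 0.8));    /* ~1.23 */
        printf("GTX 680  (1536 shaders @ ~1.0 GHz boost):    %.2f TFLOPS\n",
               peak_tflops(1536, 1.006)); /* ~3.09 */
        return 0;
    }

However you shuffle the assumed clocks, neither console plausibly clears the roughly 2.5 teraFLOPS that the Samaritan Demo called for.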

Now, of course, this does not account for reducing quality where it will be the least noticeable, along with other cheats. Developers are able to reduce particle counts and texture resolutions in barely noticeable places; they are also able to render below 1080p, or even below 720p as was the norm for the current console generation, to save performance for more important things. Developers might even use different algorithms which achieve the same, or better, quality for less computation at the expense of more sensitivity to RAM, bandwidth, or what-have-you.
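To put a number on the sub-1080p cheat: per-pixel shading work scales with the pixel count of the render target, so dropping from 1920x1080 to 1280x720 leaves well under half the pixels to shade. A minimal sketch of that arithmetic:

    #include <stdio.h>

    int main(void)
    {
        const double full_hd = 1920.0 * 1080.0; /* 2,073,600 pixels */
        const double hd      = 1280.0 *  720.0; /*   921,600 pixels */

        /* 720p carries roughly 44% of the per-pixel shading cost of 1080p. */
        printf("720p / 1080p pixel ratio: %.0f%%\n", 100.0 * hd / full_hd);
        return 0;
    }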

But, in the end, Epic Games did not get the ~2.5 teraFLOPs they originally hoped for when they created the Samaritan Demo. This likely explains, at least in part, why the Elemental Demo looked a little sad at Sony's press conference: it was a little FLOP.

Update, 5/24/2013: Mark Rein of Epic Games has responded to the statement made by Rajat Taneja of EA. While we do not know his opinion on the consoles, we do know his opinion on EA's opinion.

Unreal Engine 3 compiled to asm.js

Subject: Editorial, Mobile | May 7, 2013 - 12:07 AM |
Tagged: unreal engine, firefox, asm.js

Over the weekend we published a post detailing JavaScript advancements that position the web browser as a respectable replacement for native code. Asm.js allows C-like languages to be compiled into easily optimized script that executes at near-native performance on asm.js-aware browsers, but still runs as plain JavaScript everywhere else. If you wish to see a presentation about asm.js and compiling native code into web code, check out the online slideshow from Alon Zakai of Mozilla.
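As a hypothetical example of that workflow (not taken from the post or slideshow above): you write ordinary C, hand it to Emscripten's emcc compiler, and the output is JavaScript whose hot loops follow the asm.js pattern of explicit int and double coercions, which asm.js-aware engines can compile ahead of time.

    /* sumsq.c -- a minimal sketch of C code suitable for compilation to asm.js.
       Built with something like: emcc -O2 sumsq.c -o sumsq.html
       (file names and flags here are illustrative). */
    #include <stdio.h>

    /* A tight numeric loop; in the asm.js output every value is pinned to an
       int or a double via |0 and +() coercions, so the JavaScript engine can
       skip most of its dynamic-typing work. */
    double sum_of_squares(const double *buf, int n)
    {
        double sum = 0.0;
        for (int i = 0; i < n; i++)
            sum += buf[i] * buf[i];
        return sum;
    }

    int main(void)
    {
        double samples[4] = { 1.0, 2.0, 3.0, 4.0 };
        printf("%f\n", sum_of_squares(samples, 4)); /* prints 30.000000 */
        return 0;
    }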

If, on the other hand, you wish to see an example of a large application compiled for the browser: would Unreal Engine 3 suffice?


Clearly a computer hardware website would take the effort required to run a few benchmarks, and we do not disappoint. Epic Citadel was run in its benchmark mode in Firefox 20.0.1, Firefox 22.0a2, and Google Chrome; true, it did not run for long in Chrome before the tab crashed, but you cannot blame me for trying.

Each benchmark was run at full-screen 1080p on the "High Performance" setting, on a PC with a Core i7-3770, a GeForce GTX 670, and more available RAM than the browser could possibly allocate. The usual Firefox framerate limit was removed; the benchmark was the only tab open on the same fresh profile; the setting layout.frame_rate.precise was tested in both positions because I cannot keep track of the current state of the requestAnimationFrame callback delay; and each scenario was performed twice and averaged.

Firefox 20.0.1

  • layout.frame_rate.precise true: 54.7 FPS
  • layout.frame_rate.precise false: 53.2 FPS

Firefox 22.0a2 (asm.js)

  • layout.frame_rate.precise true: 147.05 FPS
  • layout.frame_rate.precise false: 144.8 FPS

Google Chrome 26.0.1410.64

  • Crashy-crashy

For Unreal Engine 3 compiled into JavaScript, we see an almost three-fold improvement in average framerate (147.05 FPS versus 54.7 FPS, roughly 2.7x) from asm.js and the handful of other tweaks to rendering, JavaScript, and WebGL performance between Firefox 20 and 22. I would say that is pretty enticing for developers who are considering compiling into web standards.

It is very enticing for Epic as well. A little over a month ago, Mark Rein and Tim Sweeney of Epic were interviewed by Gamasutra about HTML5 support for Unreal Engine. Due in part to the removal of UnrealScript in favor of game code being written in C++, Unreal Engine 4 will support HTML5. Epic is working with Mozilla to make the browser a reasonable competitor to consoles: write once, and run on Mac, Windows, Linux, or anywhere compatible browsers can be found. Those familiar with my past editorials know this excites me greatly.

So what do our readers think? Comment away!

NVIDIA Shows Unreal Engine 3 Running On Windows RT Tablet

Subject: Mobile | August 29, 2012 - 03:45 PM |
Tagged: unreal engine, tegra 3, tablet, nvidia, gaming

One of the reasons why I have hope for Windows RT is its gaming potential. Microsoft has been hit-or-miss with its gaming projects, but when it succeeds, it really knocks it out of the park – see DirectX, the Xbox 360 and Microsoft’s digital distribution via its console. Bringing Windows to tablets could make life easier for game developers in that space and offer a wider selection of mature titles rather than mobile-focused games, which often (in my opinion) feel watered down and look underwhelming.  

NVIDIA showcased this potential at IFA 2012 by demonstrating a Windows RT tablet (with Tegra 3 hardware, of course) running Unreal Engine 3. The tablet is shown playing the "Epic Citadel" demo, which we saw at the editor's day event used to debut the GTX 680 earlier this year. Quality details are probably reduced compared to the version that ran on the GTX 680 (it's hard to tell in the video), but it still looks excellent and runs smoothly.

The demonstration highlighted the fact that this isn't some one-off or stripped-down version of the engine designed only for mobile devices. It's a port of the existing Unreal Engine 3 used to make Windows PC games, which means developers shipping games that use UE3 should have minimal trouble porting their games to a Windows RT tablet. Mark Rein, vice president of Epic Games, stated that Windows RT code is now available to UE3 licensees. It will be interesting to see which game developer is first to jump on board.

The tablet in the video is an ASUS Vivo Tab RT, an upcoming Windows RT tablet with an 11.6” IPS display at 1366x768 resolution and a Tegra 3 SoC. A tablet like this could be a compelling mobile gaming device if the games become available. I’ve got my fingers crossed.

Source: NVIDIA