NVIDIA CloudLight: White Paper for Lighting in the Cloud

Subject: Editorial, General Tech | August 3, 2013 - 01:03 PM |
Tagged: nvidia, CloudLight, cloud gaming

Trust the cloud... be the cloud.

The executives on stage might as well have waved their hands while reciting that incantation during the announcement of the Xbox One. Why not? The audience would have just assumed Don Mattrick was trying to get some weird Kinect achievement on stage. You know, kill four people with one laser beam while trying to sink your next-generation platform in a ranked keynote. 50 Gamerscore!


Microsoft stated, during and after the keynote, that each Xbox One would have access to cloud servers for certain processing tasks. Xbox Live would be receiving enough servers such that each console could access three times its performance, at launch, to do... stuff. You know, things that are hard to calculate but are not too dependent upon latency. You know what we mean, right?

Apparently Microsoft did not realize that was a detail they were supposed to sell us on.

In the meantime, NVIDIA has been selling us on offloading computation to cloud architectures. We already knew Global Illumination (GI) was a very complicated problem; much of the last couple of decades of rendering research has gone into progressively removing approximations to what light truly does.

CloudLight is their research project, presented at SIGGRAPH Asia and via Williams College, to demonstrate server-processed indirect lighting. In their video, each of the three effects is demonstrated at multiple latencies. The results look pretty good until about 500 ms, at which point the brightest points are noticeably in the wrong locations.


Again, the video is available here.

The three methods used to generate indirect lighting are: irradiance maps, where lightmaps are continuously calculated on a server and streamed as H.264 video; photons, which ray-traces lighting for the scene as previous rays expire and streams only the most current ones to the clients that need them; and voxels, which stream fully computed frames to the clients. The most interesting part is that, as you add more users, server processing in most cases remains fairly constant.
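That amortization argument can be sketched with a toy cost model. The numbers and function names below are hypothetical, not from NVIDIA's paper; the point is only that scene-level lighting work is paid once and shared, while any per-client work scales with the user count.

```python
# Toy cost model (assumed numbers): scene-level lighting work is shared across
# all connected clients, while per-client work scales with the user count.
def server_cost(users, scene_work_ms, per_client_ms):
    """Total server time per frame, in milliseconds."""
    return scene_work_ms + users * per_client_ms

# Shared pipeline: one lightmap solve and one shared H.264 stream for everyone,
# so the cost curve is flat as users are added.
shared_only = [server_cost(n, scene_work_ms=30.0, per_client_ms=0.0)
               for n in (1, 16, 64)]

# A per-client pipeline: some work (e.g. a per-user encode) repeats for every
# connected player, so cost grows linearly instead.
per_user = [server_cost(n, scene_work_ms=30.0, per_client_ms=2.0)
            for n in (1, 16, 64)]
```

Under this model the shared pipeline costs the same for 1 user as for 64, which is the property that makes cloud lighting attractive at scale.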

It should be noted, however, that each of these demonstrations only moved the most intense lights slowly. I would expect that an effect such as switching on a light in an otherwise dark room would create a "pop-in" effect if it lags too far behind user interaction or the instantaneous dynamic lights.

That said, for a finite number of instant switches, it would be possible for a server to render both results and have the client choose the appropriate lightmap (or the appropriate set of pixels from one large lightmap). For an Unreal Tournament 3 mod, I was experimenting with using a Global Illumination solver to calculate lighting. My intention was to allow users to turn a handful of lights in each team's base on and off. As lights were shot out or activated by a switch, the shader would switch to the appropriate pre-rendered solution. I would expect a similar method to work here.
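The precomputed-solution trick above can be sketched in a few lines. This is a minimal illustration with hypothetical names (`bake_all_states`, `solve_gi`), not the actual mod code: for n toggleable lights, bake a GI solution for each of the 2**n on/off combinations, then select one by bitmask at runtime.

```python
def bake_all_states(lights, solve_gi):
    """Map each on/off combination (as a bitmask) to its baked lightmap."""
    n = len(lights)
    solutions = {}
    for mask in range(2 ** n):
        enabled = [lights[i] for i in range(n) if mask & (1 << i)]
        solutions[mask] = solve_gi(enabled)  # expensive; done offline or server-side
    return solutions

def current_mask(light_states):
    """Pack a list of booleans (light on/off) into a bitmask."""
    mask = 0
    for i, on in enumerate(light_states):
        if on:
            mask |= 1 << i
    return mask

# Stand-in solver: a real one would run the GI bake; here we just record
# which lights were enabled for each state.
solutions = bake_all_states(["red_base", "blue_base"],
                            solve_gi=lambda enabled: sorted(enabled))

# Red base light shot out, blue still on -> look up the matching bake.
lightmap = solutions[current_mask([False, True])]
```

The cost is exponential in the number of switchable lights, which is why it only works for "a finite number of instant switches": a handful of lights per base is fine, but dozens would blow up the bake.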

What other effects do you believe can withstand a few hundred milliseconds of latency?

Source: NVIDIA
August 3, 2013 | 06:55 PM - Posted by kukreknecmi (not verified)

In the demos, the problem is that you just don't know which parts are rendered locally and which remotely; you can only infer it from inconsistencies in what you see. From the video, it looks like most of the secondary lighting is being calculated on the remote system, which lightens the task for the local machine. And since secondary / indirect lights matter a bit less, it can be hard to tell what the exact difference is.

Here is the main problem: if the latency is high enough, the indirect lighting will chase the main lighting by a visible distance and cause inconsistency. At low latency, the trailing indirect light from the server stays close to the main lights; the viewer perceives it as an extension of the main lights, which is why people call it something like ghosting.

Say you fire a rocket launcher. The primary lighting is calculated locally, so you immediately see the rocket's bright tail lighting the environment. After some delay, the secondary light starts to come into the scene, illuminating the ground just in front of you from nowhere; since the rocket is already far away, a pale secondary glow follows the rocket's route, but far behind it.
On the other hand, if the latency is low, the additional lighting just makes the glow look a bit more widespread, illuminating the ground in a weird fashion.
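A quick back-of-the-envelope check shows why the rocket case is so sensitive: the gap between the rocket and its trailing indirect glow is just projectile speed times round-trip latency. The 35 m/s speed below is an assumed figure for illustration, not taken from any particular game.

```python
def trail_distance(speed_m_s, latency_ms):
    """Distance (meters) the indirect glow trails a moving light source."""
    return speed_m_s * (latency_ms / 1000.0)

# At an assumed 35 m/s rocket speed:
gap_at_100ms = trail_distance(35.0, 100.0)  # ~3.5 m: glow hugs the rocket
gap_at_500ms = trail_distance(35.0, 500.0)  # ~17.5 m: glow appears "from nowhere"
```

So the same latency that is tolerable for a slowly moving area light becomes many meters of visible lag for a fast projectile.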

In a game where the camera is mostly fixed or not first-person (RTS, MMO, etc.), the remote-calculation delay probably will not cause much perceived inconsistency, because the updates needed on screen do not affect perception that much. With a first-person camera, though, it will probably look weird in both cases: either lagging too much or producing a ghosting effect.

August 4, 2013 | 11:58 AM - Posted by razor512

Cloud processing for games is just horrible. They are trying to market to the same users who believe the reason their phone has no user-replaceable battery is that it lasts forever and never needs to be replaced. (You would not believe how many people believe this.)

If you have games which require server processing, then you are reliant on the game servers staying up at all times. If the company decides the game is no longer selling enough to justify keeping a large number of servers running, they will simply take them offline or repurpose them for a new game and leave you with digital $60 paperweights.

August 4, 2013 | 12:09 PM - Posted by Scott Michaud

They do not *need* to do this; they could drop the extra effects, or raise the system requirements as clients become capable of doing the processing themselves.

But, you're right, they won't. Entertainment, not art; a consumable model, not an intrinsic-value one.

I wanted to talk about this when I wrote the article, but decided not to go off on a tangent.

August 5, 2013 | 05:55 AM - Posted by mLocke

Somehow this makes me think of a dark future where we all have to run server instances of our games on secondary machines before we can play them. Either that, or live under the thumb of Actiblizzion and deal with lag in a single-player game, or not be able to play the game at all, just like Diablo 3. This is normally where I would say something about voting with your wallet, but it seems the market wants multimillion-dollar games where more of the budget is spent on marketing than on development of the game itself. Prepare to feel war like you've felt it in a dozen other games in Call of Crysisfield 360.

August 5, 2013 | 12:45 PM - Posted by Scott Michaud

Likely the latter "not be able to play the game at all".

These are the things I've been talking about when I say, "the industry wants to be consumable". The game will eventually go away, but don't worry: buy the sequel at full price. Sadder still, they want to be exclusively consumable.

I complain about disposable art in specific contexts, such as consoles and the Windows 8 store. It is something to think about everywhere, though, including non-gaming media.
