NVIDIA VR Headset Patent Found

Subject: Graphics Cards, Mobile | June 6, 2015 - 04:05 PM |
Tagged: VR, nvidia, gameworks vr

So I'm not quite sure what this hypothetical patent device is. According to its application, it is a head-mounted display that contains six cameras (??) and two displays, one for each eye. The usage of these cameras is not defined, but two will point forward, two will point down, and the last two will point left and right. The only clue that we have is in the second patent application photo, where unlabeled hands are gesturing in front of a node labeled “input cameras”.


Image Credit: Declassified

The block diagram declares that the VR headset will have its own CPU, memory, network adapter, and “parallel processing subsystem” (GPU). VRFocus believes that this will be based on the Tegra X1, and that it was supposed to be revealed three months ago at GDC 2015. In its place, NVIDIA announced the Titan X at the Unreal Engine 4 keynote, hosted by Epic Games. GameWorks VR was also announced with the GeForce GTX 980 Ti launch; it was mostly described as a way to reduce rendering cost by shading peripheral regions at a lower resolution, since the lens warp compresses those regions to a lower effective resolution in the final displayed image anyway.
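The savings from that kind of multi-resolution scheme are easy to estimate with back-of-the-envelope arithmetic. The sketch below is illustrative only: the eye-buffer size, center-region fraction, and peripheral scale factor are assumptions for the example, not NVIDIA's actual parameters.

```python
# Back-of-the-envelope estimate of multi-resolution shading savings.
# All numbers here are illustrative assumptions, not NVIDIA's parameters.

EYE_W, EYE_H = 1512, 1680   # assumed render-target size per eye
CENTER_FRAC = 0.6           # central region kept at full resolution
EDGE_SCALE = 0.5            # linear scale applied to peripheral regions

def shaded_pixels(width, height, center_frac, edge_scale):
    """Split the eye buffer into a 3x3 grid: the center cell stays at
    full resolution, while the 8 peripheral cells are shaded at a
    reduced scale and warped back during lens distortion."""
    full = width * height
    center = (width * center_frac) * (height * center_frac)
    periphery = (full - center) * edge_scale ** 2  # area scales quadratically
    return center + periphery

full = EYE_W * EYE_H
reduced = shaded_pixels(EYE_W, EYE_H, CENTER_FRAC, EDGE_SCALE)
print(f"full-res pixels per eye:  {full}")
print(f"multi-res pixels per eye: {reduced:.0f}")
print(f"savings: {100 * (1 - reduced / full):.0f}%")
```

With these assumed numbers, shading the periphery at half linear resolution cuts the per-eye pixel count by nearly half, which is the kind of win that makes 90 Hz stereo rendering more attainable.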


Image Credit: Declassified

VRFocus suggests that the reveal could happen at E3 this year. The problem with that theory is that, as far as we know, NVIDIA has neither a keynote at E3 this year nor even a slot in someone else's keynote, just a booth and meeting rooms. Of course, they could still announce it through other channels, but that seems less likely. Maybe they will avoid the E3 hype and announce it later (unless something changes behind the scenes, of course).

Source: VRFocus

June 6, 2015 | 05:38 PM - Posted by Edmond (not verified)

GOOD.

Maybe it will come with gsync support and make all the other VR stuff look like shit.

June 6, 2015 | 07:21 PM - Posted by Anonymous (not verified)

There are issues with attempting to have a variable refresh and low persistence - so I wouldn't expect it. Nvidia's said there may be ways around it, but considering we don't even have Gsync/ULMB monitors yet, it's that much more unlikely to be in any headset.

My only fear with them making a headset is some of their VR stuff being exclusive to Nvidia hardware, à la 3D Vision.

June 6, 2015 | 09:03 PM - Posted by Anonymous (not verified)

We do have G-Sync/ULMB monitors already. They just cannot be active at the same time. If that's what you mean, okay, but they do exist.

June 7, 2015 | 10:41 AM - Posted by Anonymous (not verified)

That was the whole point. You have to make the choice. And low persistence in VR is king. If you don't have that, you lose presence.

June 8, 2015 | 02:23 AM - Posted by Edmond (not verified)

No it isn't... as strobing IS flicker by itself, this is just a temporary solution.

No one wants this constant flicker so close to their eyes, even if it's at a constant 90 Hz. Not to mention you still get all the bullshit that a non-G-Sync screen has.

Michael Abrash explained a solution recently: you want to have a flicker-free G-Sync screen first. AND for motion blur reduction, you either want hundreds of fps/Hz or you want to do frame doubling like TVs do.

June 8, 2015 | 06:51 AM - Posted by Anonymous (not verified)

"Noone wants this constant flicker so close to their eyes, even if its @ constant 90hz"

This 'noone' doesn't have a clue what he's talking about then. Low persistence driving at high refresh rates is vastly preferable to any other driving method we have available.

June 7, 2015 | 01:25 AM - Posted by H1tman_Actua1

hell yah! GSYNC everything!

June 6, 2015 | 07:07 PM - Posted by Anonymous (not verified)

Read the fine print

In order to get it to work you need

A Nvidia Shield tablet
With a Nvidia Shield Controller
Attached to Nvidia Shield
Connected to a Nvidia GTX 980 Ti on your PC
Displaying to a G-Sync Monitor

They all have to be turned on and running so the headset can boot properly.

$$$

Or

You can save the earth and just get a View-master

http://www.view-master.com/en-us

June 6, 2015 | 09:02 PM - Posted by Anonymous (not verified)

I think you're overreacting, and just... Look, VR will always take a powerful GPU. That is nothing new, and something every VR implementation will have to deal with. Doesn't matter if you're team Red or Blue... there is NO WAY a cheaper graphics card is going to power what is essentially TWO monitors at or above the needed framerates. This isn't like flat displays; 30 fps isn't going to cut it. You need, in my estimation, at least 90 fps in each eye to avoid issues.
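The "two monitors at 90 fps" point can be put in rough numbers. The 1080x1200-per-eye figure below is an assumption based on the headsets being announced around this time, not anything from the patent:

```python
# Rough pixel-throughput comparison: stereo VR vs. a conventional monitor.
# 1080x1200 per eye at 90 Hz is an assumed figure, not from the patent.
vr_rate = 2 * 1080 * 1200 * 90    # two eye buffers, 90 frames each per second
desktop_rate = 1920 * 1080 * 60   # a single 1080p monitor at 60 Hz
print(f"VR shades {vr_rate / desktop_rate:.2f}x the pixels/sec of 1080p60")
```

And that understates the gap, since VR render targets are typically supersampled above panel resolution to survive the lens warp.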

June 6, 2015 | 09:50 PM - Posted by Anonymous (not verified)

No, the anon is not overreacting; vendor lock-in is very high up in Nvidia's business model, and a high-priced one at that. Nvidia's plan includes top-to-bottom milking for profits, from expensive gaming SKUs stripped of DP functionality down to the nickel-and-dime Nvidia app store trinkets, all obtained through Nvidia's ecosystem. The VR market is just another profit center that Nvidia can hitch their GPU gravy train to, like G-Sync monitors. How long before Nvidia-branded PCs will be required just to utilize Nvidia-branded GPUs, etc., etc. ($$$)!
