Subject: General Tech | February 1, 2018 - 05:53 PM | Sebastian Peak
Tagged: VR, virtual reality, Tobii, htc vive, eye tracking, CES 2018, CES
Last month in Tobii's suite at CES I was given a demonstration of a prototype VR headset that looked like any other HTC Vive - except for the ring of Tobii eye-tracking sensors inside and around each lens. While this might seem like a bit of an odd concept at first, I was patient as the benefits were explained to me - and then blown away when I actually tried it myself.
As you know, if you have used a VR headset like the Oculus Rift or HTC Vive, the basic mechanics of VR interaction involve pointing your head in the direction you want to look, reaching with your hand (and controller) to point to an object, and then pressing a button on the controller to act. I will be completely honest here: I don't like it. After a little while, the fatigue and generally unnatural feeling of rapid, bird-like head movements kills whatever enthusiasm I might have for the experience, and I would be the last person to give high praise to a new VR product. HOWEVER, I will attempt to explain why simply adding eye tracking actually made the entire experience 1000 times better (for me, anyway).
When I put on the prototype headset, the only setup I had to do was quickly follow a dot in my field of vision as it moved up/down/left/right, like a vision test for a driver's license. That's the entire calibration process, and with that out of the way I was suddenly able to look around without moving my head, which made the head movements that did follow feel completely natural. I would instinctively look up, or to the side, with my head following when I decided to focus attention on that area. The amount of physical head movement was reduced to normal, human levels, which alone prevented me from feeling sick after a few minutes. Of course, this was not the only demonstrated feature of the integrated eye tracking, and if you are familiar with Tobii you will know what's next.
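Tobii's actual calibration procedure is proprietary, but the idea behind a follow-the-dot pass can be sketched conceptually: record raw sensor readings while the user tracks dots at known positions, then fit a simple mapping from raw readings to screen coordinates. Everything below - the dot positions, the simulated sensor distortion, the affine model - is an illustrative assumption, not Tobii's method.

```python
import numpy as np

# Known on-screen calibration dot positions (normalized 0..1), like the
# up/down/left/right dots in the demo. Hypothetical values for illustration.
targets = np.array([[0.5, 0.1], [0.5, 0.9], [0.1, 0.5], [0.9, 0.5], [0.5, 0.5]])

# Raw (uncalibrated) gaze readings from the sensors while the user followed
# each dot -- simulated here with a made-up per-axis scale and offset.
true_transform = np.array([[1.2, 0.0], [0.0, 0.8]])
true_offset = np.array([0.05, -0.02])
raw = targets @ true_transform.T + true_offset

# Fit an affine map (2x2 matrix + offset) from raw readings to screen
# coordinates with ordinary least squares.
A = np.hstack([raw, np.ones((len(raw), 1))])        # one [x, y, 1] row per sample
coef, *_ = np.linalg.lstsq(A, targets, rcond=None)  # 3x2 solution

def calibrated_gaze(sample):
    """Map a raw gaze sample to screen coordinates."""
    return np.append(sample, 1.0) @ coef

# After calibration, raw readings land back on the dot positions.
print(np.allclose(calibrated_gaze(raw[0]), targets[0], atol=1e-6))
```

Five dots are enough here because an affine model only has six unknowns; a real tracker would need more samples and a richer model to handle lens distortion and per-eye differences.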
This looks primitive, but it was an effective demo of the eye-tracking integration
The ability of the headset to know exactly where you are looking allows you to aim based on your line of sight if the game implements it, and I tried some target practice (throwing rocks at glass bottles in the demo world) and it felt completely natural. After launching a few rocks at distant bottles I instantly decided that this should be the mechanic of a fantastic VR football video game - that I could throw to different receivers just by looking at them.
I also received a demo of simulated AR integration (still within the VR world), and a demo of what eye-tracking adds to a home theater experience - and it was pretty convincing. I could scroll around and select movie titles from an interface by simply looking around, and within the VR world it was as if I was looking up at a big projection screen. Throughout the different demos I kept thinking about how much more natural everything felt when I wasn't constantly moving my head around and pointing at things with my controller.
Finally, there was another side to everything I experienced - and it might have been the most interesting thing from a PC enthusiast perspective: if the VR headset can track your focus, the GPU doesn't have to render anything else at full resolution. That alone could make this something of a breakthrough addition to the current VR headset space, as performance is very expensive (even before the mining craze) and absolutely necessary for a smooth, high frame-rate experience. After 45 minutes with the headset on, I felt totally fine - and that was a change.
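The rendering-performance idea described above is commonly called foveated rendering: shade at full resolution only near the gaze point and at reduced resolution in the periphery. The sketch below illustrates the concept with made-up tile sizes and thresholds - these are illustrative numbers, not Tobii's or any vendor's actual parameters.

```python
import math

def shading_rate(tile_center, gaze, inner=0.15, outer=0.4):
    """Fraction of full resolution to shade a screen tile at, based on its
    distance from the tracked gaze point (normalized 0..1 coordinates).
    The radii are illustrative assumptions."""
    d = math.dist(tile_center, gaze)
    if d < inner:
        return 1.0      # fovea: full resolution
    if d < outer:
        return 0.5      # transition band: half resolution
    return 0.25         # periphery: quarter resolution

# Rough estimate of shading work saved versus uniform full-resolution
# rendering, on a 16x9 grid of tiles with the user looking at center-screen.
# Pixel count scales with the square of the per-axis rate.
tiles = [((x + 0.5) / 16, (y + 0.5) / 9) for x in range(16) for y in range(9)]
work = sum(shading_rate(t, (0.5, 0.5)) ** 2 for t in tiles) / len(tiles)
print(f"shaded pixels vs. full res: {work:.0%}")
```

Even this toy version shades well under half the pixels of a naive renderer, which is why eye tracking is so attractive given how expensive VR-capable GPUs are.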
So what is the takeaway from all this? I'm just an editor who had a meeting with Tobii at CES, and I walked out of the meeting with a couple of business cards and nothing else. I admit that I am a VR skeptic who went into the meeting with no expectations. And I still left thinking it was the best product I saw at the show.
More information and media about the CES demos are available from Tobii on their CES blog post.
Subject: Mobile | July 27, 2017 - 01:12 PM | Ryan Shrout
Tagged: htc, vive, VR, virtual reality, qualcomm, snapdragon, snapdragon 835
During the ChinaJoy 2017 event in Shanghai, VR pioneer HTC announced a standalone VR headset aimed at the China market. This makes HTC the first major player in the virtual reality space to officially reveal a standalone product intended for the broad consumer market, which demands a more affordable, portable VR solution.
Standalone VR headsets differ from the current options on the market in two distinct ways. First, they are disconnected from a PC and don’t require attachment to a desktop for processing or display output. The current HTC Vive product that ships in the market, as well as Facebook’s Oculus Rift, require a high-end PC to play VR games and use HDMI and USB connections to power the headsets. This new standalone design also moves away from the slot-in design of the Samsung Gear VR and doesn’t require the user to monopolize their smartphone for VR purposes.
Though mobile-first VR solutions like Gear VR have existed for several years, selling on the market before the PC-based solutions were released, the move of HTC from tethered virtual reality to a wireless standalone unit signals a shift in the market. Consumers see the value and quality experiences that VR can provide, but the expense and hassle of in-place configurations have stalled adoption.
HTC is using the Qualcomm Snapdragon 835 Mobile Platform to power the Vive Standalone VR Headset, the same chipset used in many high-end smartphones on the market today. Qualcomm and HTC can modify traits of the processor to improve performance without worrying about the sensitive battery life of a consumer’s phone. Though we don’t know the specifics of what HTC might have modified for the configuration of this standalone unit, it likely is a mirror of the Qualcomm Snapdragon 835 VR hardware development kit that was announced in February. That design includes the capability for six degrees of freedom tracking (moving around a space accurately without external sensors), high resolution displays for each eye, and a full suite of graphics and digital signal processors to handle the complex workloads of VR experiences.
Though HTC is the first to announce a complete standalone VR product, HTC and others have announced their intent to release standalone units in the US later this year through Google’s Daydream program. Lenovo plans to build a VR headset using the same Qualcomm reference design for the Daydream platform.
Facebook-owned Oculus has not officially announced its intent but rumors in July point us to another Qualcomm-powered headset that will sell for around $200. Facebook plans to reveal the hardware in October.
HTC’s decision to target the China market first is driven by its ability to promote its custom Viveport software store in a region that does not offer Google services like the Android Play Store or Daydream. HTC will leverage a customer base that is larger than North America and Western Europe combined, and one that is expected to grow rapidly. IDC statistics show VR headset shipments reaching 10.1 million units worldwide this year and 61 million units by 2020. iResearch Consulting estimates Chinese VR market revenues will reach $8.1B in that same time frame.
Growth in VR and AR (augmented reality) is driven by the consumer markets, but it is the enterprise implementations that provide the push for expanded usage models. Medical professionals already utilize VR technology to analyze data, and mechanical engineers can dissect and evaluate models of products in a virtual space to improve and speed up workflows. Target fields also include factory workers, emergency personnel, the military, delivery drivers, and nearly all facets of business. As VR technology improves in usability, comfort, and general societal acceptance, the merger of virtual and augmented reality hardware will create a new age of connected consumers.
VR Performance Evaluation
Even though virtual reality hasn’t taken off with the momentum that many in the industry had expected on the heels of the HTC Vive and Oculus Rift launches last year, it remains one of the fastest growing aspects of PC hardware. More importantly for many, VR is also one of the key inflection points for performance moving forward; it requires more hardware, scalability, and innovation than any other sub-category including 4K gaming. As such, NVIDIA, AMD, and even Intel continue to push the performance benefits of their own hardware and technology.
Measuring and validating those claims has proven to be a difficult task. Tools that we used in the era of standard PC gaming just don’t apply. Fraps is a well-known and well-understood tool for measuring frame rates and frame times, utilized by countless reviewers and enthusiasts. But Fraps lacked the ability to tell the complete story of gaming performance and experience. NVIDIA introduced FCAT and we introduced Frame Rating back in 2013 to expand the capabilities that reviewers and consumers had access to. Using a more sophisticated technique that included direct capture of the graphics card output in uncompressed form, a software-based overlay applied to each frame being rendered, and post-process analysis of that data, we were able to communicate the smoothness of a gaming experience, better articulating it to help gamers make purchasing decisions.
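The core of this kind of analysis can be shown with a small sketch: given per-frame timestamps (as a capture tool might record them), derive frame times and flag the outliers that show up as visible stutter even when the average frame rate looks healthy. The timestamp values are hypothetical and the 1.5x threshold is an illustrative choice, not the actual Frame Rating or FCAT methodology.

```python
# Hypothetical per-frame present timestamps in milliseconds.
timestamps = [0.0, 16.6, 33.4, 50.1, 83.3, 100.0, 116.7, 133.2]

# Frame time = gap between consecutive presents.
frame_times = [b - a for a, b in zip(timestamps, timestamps[1:])]

avg = sum(frame_times) / len(frame_times)
worst = max(frame_times)
# A simple smoothness flag: any frame taking more than ~1.5x the average
# reads as a visible hitch even if the average FPS looks fine.
stutters = [ft for ft in frame_times if ft > 1.5 * avg]

print(f"avg frame time: {avg:.1f} ms (~{1000 / avg:.0f} FPS)")
print(f"worst frame: {worst:.1f} ms, stutter frames: {len(stutters)}")
```

In this made-up capture, one frame took twice as long as its neighbors - exactly the kind of event an FPS average hides and a frame-time analysis exposes.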
VR pipeline when everything is working well.
For VR though, those same tools just don’t cut it. Fraps is a non-starter, as it measures frame rendering from the GPU point of view and completely misses the interaction between the graphics system and the VR runtime environment (OpenVR for Steam/Vive and OVR for Oculus). Because the rendering pipeline is drastically changed in the current VR integrations, what Fraps measures is completely different than the experience the user actually gets in the headset. Previous FCAT and Frame Rating methods were still viable, but the tools and capture technology needed to be updated. The hardware capture products we had used since 2013 were limited in their maximum bandwidth, and the overlay software did not have the ability to “latch in” to VR-based games. Not only that, but measuring frame drops, time warps, space warps, and reprojections would be a significant hurdle without further development.
VR pipeline with a frame miss.
NVIDIA decided to undertake the task of rebuilding FCAT to work with VR. And while the company is obviously hoping that it will prove its claims of performance benefits for VR gaming, the investment of time and money should not be overlooked: the project is to be open sourced and freely available to the media and the public.
NVIDIA FCAT VR comprises two different applications. The FCAT VR Capture tool runs on the PC being evaluated and has a similar appearance to other performance and timing capture utilities. It uses data from Oculus Event Tracing (part of Windows ETW) and SteamVR’s performance API, along with NVIDIA driver stats when used on NVIDIA hardware, to generate performance data. It works perfectly well on any GPU vendor’s hardware, though, thanks to its access to the VR vendors’ own timing results.
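The reprojection problem such a tool has to catch can be sketched as follows: a VR display refreshes at a fixed 90 Hz, and every full refresh interval an app frame overruns, the compositor must re-show (reproject) the previous frame instead of a fresh one. The render times below are hypothetical and the counting rule is a simplification - this is a conceptual illustration, not NVIDIA's FCAT VR data format or algorithm.

```python
# A Vive or Rift refreshes at 90 Hz, so the app has ~11.1 ms per frame.
REFRESH_MS = 1000 / 90

# App frame render times in ms for successive frames (hypothetical capture).
render_times = [9.8, 10.5, 13.2, 9.9, 22.7, 10.1]

reprojected = 0
for rt in render_times:
    # Each whole refresh interval the frame overruns costs one refresh
    # where the runtime reprojects the old frame instead of a fresh one.
    reprojected += int(rt // REFRESH_MS)

delivered = len(render_times)
print(f"app frames: {delivered}, reprojected refreshes: {reprojected}")
```

This is exactly the information Fraps cannot see: from the GPU's point of view six frames were rendered, but the headset wearer experienced several refreshes of warped, re-shown imagery.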
Subject: General Tech | October 6, 2016 - 07:00 PM | Tim Verry
Tagged: virtual reality, htc vive, assistive technology
As technology continues to advance, virtual reality is slowly but surely becoming more of a reality. For many readers, VR is the next step in gaming and achieving an immersive (virtual) experience. However, for Jamie Soar virtual reality is being used to allow him to experience what it is like to have "normal" vision in the real world. Mr. Soar lives with a genetic and progressive eye condition called Retinitis Pigmentosa as well as diplopia (or double vision) which means that he has severely limited night and peripheral vision. Jamie uses a white cane for mobility and needs to get close to things like computer monitors and signs in order to read them.
EIC Ryan Shrout using the HTC Vive to enter a VR world (Job Simulator) during a live stream.
Enter the HTC Vive and its dual lens solution that puts the displays (and the virtual world) front and center. After donning the virtual reality headset at a PC World demo in the UK, Jamie was amazingly able to experience the virtual world in a similar way to how many people see the real world. His eyes were able to refocus on the close-up displays, and thanks to the illusion of depth created by the dual lenses, he was able to look around the virtual world and see everything clearly and in brilliant color, both near and far!
Via Blindness.org: An example of what vision is like with Retinitis Pigmentosa in an advanced stage. Peripheral and night vision are generally the first aspects to be lost as photoreceptors (rods) on outer edges of retina die.
In an interview with Upload VR, Mr. Soar had this to say to those with similar visual impairments:
“Try VR. Find a means to try it because I went so long without ever knowing that this extra dimension existed that you can see. Try out as many experiences as possible. It might not be for everyone but it might give people a lot more freedom or independence in what they do.”
This is a very cool story and I am excited for Mr. Soar. The aspiring music producer plans to continue experimenting with VR, and I hope that as it continues to advance it can help him even more. My first thought jumped to Scott's desire to use VR for productivity work with an infinite desktop, and how it could help Jamie compose and produce his music - getting the same, or better, benefits most people get from multiple monitor setups without having to lean in to each monitor. I do not have nearly the vision loss that Mr. Soar has, but I can definitely empathize with him on many points. I think that it is awesome that he was able to test out VR and explore how he can use it to help him!
In my case I am more looking forward to AR (augmented reality) and future products built on things like OrCam, Microsoft's Seeing AI project (which I thought I wrote about previously but cannot find via Google, heh), and even things like AiPoly (iOS) that use neural networks and can identify objects, people and their facial expressions, and even describe what is happening in natural language (we are not quite there yet but are definitely getting there).
Whether it's AR or VR, the advances in technology in just my 26 years have been amazing, and the assistive technology available now is unbelievable. The future is exciting, indeed, and I can't wait to see what comes next!
Subject: General Tech, Processors, Displays, Shows and Expos | August 16, 2016 - 01:50 PM | Ryan Shrout
Tagged: VR, virtual reality, project alloy, Intel, augmented reality, AR
At the opening keynote of this summer’s Intel Developer Forum, CEO Brian Krzanich announced a new initiative to enable a completely untethered VR platform called Project Alloy. Using Intel processors and sensors, the goal of Project Alloy is to move all of the necessary compute into the headset itself, including enough battery to power the device for a typical session, removing the need for a high-powered PC and delivering a truly cordless experience.
This is indeed the obvious end-game for VR and AR, though Intel isn’t the first to demonstrate a working prototype. AMD showed the Sulon Q, an AMD FX-based wireless VR headset. It had real specs too, including a 2560x1440 90Hz OLED display, 8GB of DDR3 memory, and an AMD FX-8800P APU with embedded R7 graphics. Intel’s Project Alloy is currently using unknown hardware and won’t have a true prototype release until the second half of 2017.
There is one key advantage that Intel has implemented with Alloy: RealSense cameras. The idea is simple but the implications are powerful. Intel demonstrated using your hands and even other real-world items to interact with the virtual world. RealSense cameras use depth sensing to track hands and fingers very accurately, and with a device integrated into the headset and pointed out and down, Project Alloy prototypes will be able to “see” and track your hands, integrating them into the game and VR world in real time.
The demo that Intel put on during the keynote definitely showed the promise, but the implementation was clunky and less than what I expected from the company. Real hands just showed up in the game, rather than being represented by rendered hands that track accurately, and it definitely broke the immersion. Obviously it’s up to the application developer to determine how your hands are actually represented, but it would have been better to showcase that capability in the live demo.
Better than just tracking your hands, Project Alloy was able to track a dollar bill (why not a Benjamin Intel??!?) and use it to interact with a spinning lathe in the VR world. It interacted very accurately and with minimal latency – the potential for this kind of AR integration is expansive.
Those same RealSense cameras and their data are used to map the space around you, preventing you from running into things or people or cats in the room. This enables the first “multi-room” tracking capability, giving VR/AR users a new range of flexibility and usability.
Though I did not get hands-on time with the Alloy prototype itself, the unit on stage looked pretty heavy and bulky. Comfort will obviously be important for any kind of head-mounted display, and Intel has plenty of time to iterate on the design over the next year to get it right. Both AMD and NVIDIA have been talking up the importance of GPU compute to provide high quality VR experiences, so Intel has an uphill battle to prove that its solution, without the need for external power or additional processing, can truly provide the untethered experience we all desire.
Subject: Graphics Cards | April 4, 2016 - 09:00 AM | Sebastian Peak
Tagged: workstation, VR, virtual reality, quadro, NVIDIA Quadro M5500, nvidia, msi, mobile workstation, enterprise
NVIDIA's VR Ready program, which is designed to inform users which GeForce GTX GPUs “deliver an optimal VR experience”, has moved to enterprise with a new program aimed at NVIDIA Quadro GPUs and related systems.
“We’re working with top OEMs such as Dell, HP and Lenovo to offer NVIDIA VR Ready professional workstations. That means models like the HP Z Workstation, Dell Precision T5810, T7810, T7910, R7910, and the Lenovo P500, P710, and P910 all come with NVIDIA-recommended configurations that meet the minimum requirements for the highest performing VR experience.
Quadro professional GPUs power NVIDIA professional VR Ready systems. These systems put our VRWorks software development kit at the fingertips of VR headset and application developers. VRWorks offers exclusive tools and technologies — including Context Priority, Multi-res Shading, Warp & Blend, Synchronization, GPU Affinity and GPU Direct — so pro developers can create great VR experiences.”
Partners include Dell, HP, and Lenovo, with new workstations featuring NVIDIA professional VR Ready certification.
Desktop isn't the only space for workstations, and in this morning's announcement NVIDIA and MSI are introducing the WT72 mobile workstation, “the first NVIDIA VR Ready professional laptop”:
"The MSI WT72 VR Ready laptop is the first to use our new Maxwell architecture-based Quadro M5500 GPU. With 2,048 CUDA cores, the Quadro M5500 is the world’s fastest mobile GPU. It’s also our first mobile GPU for NVIDIA VR Ready professional mobile workstations, optimized for VR performance with ultra-low latency."
Here are the specs for the WT72 6QN:
- GPU: NVIDIA Quadro M5500 3D (8GB GDDR5)
- CPU Options:
- Xeon E3-1505M v5
- Core i7-6920HQ
- Core i7-6700HQ
- Chipset: CM236
- Memory (Xeon): 64GB ECC DDR4 2133 MHz
- Memory (Core i7): 32GB DDR4 2133 MHz
- Storage: Super RAID 4, 256GB SSD + 1TB SATA 7200 rpm
- Display (Xeon, i7-6920HQ): 17.3” UHD 4K
- Display (i7-6700HQ): 17.3” FHD Anti-Glare IPS
- LAN: Killer Gaming Network E2400
- Optical Drive: BD Burner
- I/O: Thunderbolt, USB 3.0 x6, SDXC card reader
- Webcam: FHD type (1080p/30)
- Speakers: Dynaudio Tech Speakers 3Wx2 + Subwoofer
- Battery: 9 cell
- Dimensions: 16.85” x 11.57” x 1.89”
- Weight: 8.4 lbs
- Warranty: 3-year limited
- Price (Xeon E3-1505M v5): $6899
- Price (Core i7-6920HQ): $6299
- Price (Core i7-6700HQ): $5499
No doubt we will see details of other Quadro VR Ready workstations as GTC unfolds this week.
Subject: General Tech, Graphics Cards | March 26, 2016 - 12:11 AM | Ryan Shrout
Tagged: VR, vive pre, vive, virtual reality, video, pre, htc
On Friday I was able to get a pre-release HTC Vive Pre in the office and spend some time with it. Not only was I interested in getting more hands-on time with the hardware without a time limit but we were also experimenting with how to stream and record VR demos and environments.
Enjoy and mock!
Subject: General Tech | March 15, 2016 - 05:32 PM | Sebastian Peak
Tagged: VRScore, VR, virtual reality, gdc 2016, GDC, crytek, CRYENGINE, benchmark, Basemark
Basemark has announced VRScore, a new benchmarking tool for VR produced in partnership with Crytek. The benchmark uses Crytek’s CRYENGINE along with the Basemark framework, and can be run with or without a head-mounted display (HMD).
"With VRScore, consumers and companies are able to reliably test their PC for VR readiness with various head mounted displays (HMDs). Unlike existing tools developed by hardware vendors themselves, VRScore has been developed independently to be an essential source of unbiased information for anyone interested in VR."
An independent solution is certainly welcome as we enter what promises to be the year of VR, and Basemark is well known for providing objective benchmark results with applications such as Basemark X and OS II, cross-platform benchmarks for mobile devices. The VRScore benchmark supports the Oculus Rift, HTC Vive, and Razer's OSVR headsets, and the corporate versions include VRTrek, a left/right eye latency measurement device.
Here’s the list of features from Basemark:
- Supports HTC Vive, Oculus Rift and OSVR
- Uses CRYENGINE
- Supports both DirectX 12 and DirectX 11
- Features Codename: Sky Harbor, an original IP game scene by Crytek
- Includes tests for interactive VR (VR game), non-interactive VR (360 VR video) and VR spatial audio (360 sound)
- Can be used with or without an HMD
- Power Board, an integrated online service, gives personalized PC upgrading advice and features performance ranking lists for HMDs, CPUs and GPUs
- Corporate versions include VRTrek, a patent-pending latency testing device with dual phototransistors for testing application-to-photon latency, display persistence, left and right eye latency, dropped frames, and duplicated frames
VRScore Trek eye latency measurement device, included with corporate version
VRScore is currently available only to corporate customers via the company’s early access program and Benchmark Development Program. The consumer versions (free and paid) will be released in June.
Subject: General Tech | March 14, 2016 - 01:51 PM | Sebastian Peak
Tagged: wireless vr headset, vr headset, VR, virtual reality, Sulon Q, FX-8800P, amd fx, amd
AMD is powering the world's first truly self-contained VR solution, the Sulon Q, a wireless headset with a powerful computer built in.
AMD has partnered with Sulon Technologies, a startup based in Toronto, to produce this new headset, which seems to have the potential to disrupt the fledgling VR market. The idea is simple and unique: unlike existing designs that require a VR-ready PC (Oculus Rift, HTC Vive) or the latest smartphone (Gear VR) to work, the Sulon Q VR headset incorporates a full gaming PC inside the headset, allowing for the first actually wireless experience in this young technology's existence.
As Ars Technica notes in their post on the Sulon Q this morning:
"According to the announcement, that 'wear and play' untethered design makes the Sulon Q quite different from competition like the Oculus Rift or SteamVR-powered HTC Vive, which both need a relatively high-end PC to actually generate the images on the headset. With the Sulon Q, the Windows 10 PC hardware is built into the unit, including an expected four-core AMD FX-8800P processor with a Radeon R7 graphics card."
Who wouldn't want to wear an entire PC on their head? Thermal (and other health) concerns aside, just what sort of hardware is under the hood (so to speak)? According to the report published at VideoCardz this morning, it will offer a new AMD FX processor (the FX-8800P) and overall specs that look like they belong more to a gaming laptop than a VR headset.
(Quoting directly from the report on VideoCardz via this Reddit post):
- Experiences: VR, AR, and spatial computing
- Ergonomics: Lightweight, comfortable, ergonomically designed all-in-one tether-free form factor
- Processor: AMD FX-8800P at up to 35W with Radeon R7 Graphics, leveraging AMD’s Graphics Core Next architecture; 4 compute cores and 8 GPU cores unlocked through Heterogeneous System Architecture (HSA); Sulon Spatial Processing Unit (SPU)
- Memory: 8 GB DDR3
- Storage: 256 GB SSD
- Display: 2560×1440 OLED at 90 Hz, 110-degree field of view
- Audio: 3D spatial audio powered by GenAudio’s AstoundSound® technology; built-in 3.5 mm audio jack; custom spatially-optimized Sulon Q earbuds; dual noise-cancelling embedded microphones
- Tracking: Sulon Spatial Processing Unit combining real-time machine vision technologies and mixed reality spatial computing for real-time environment mapping and tracking from the inside outward, dynamic virtualization for VR/AR fusion, and gesture recognition
- Sensors: Accelerometer, gyroscope, magnetometer, SPU
- Software: Microsoft Windows® 10; “Project Dragon” application for spatial computing; AMD LiquidVR technologies to ensure smooth and responsive VR and AR experiences
- Peripherals: Wireless keyboard and mouse provided in box; any other Windows 10-compatible controllers and joysticks
- Connectivity: WiFi 802.11ac + Bluetooth 4.1, 2x USB 3.0 Type A, Micro HDMI out
A video for the Sulon Q is also up on YouTube this morning:
The two biggest questions that always accompany any new hardware announcement - how much will it cost, and when is it available - have not been answered just yet. We'll await further information as GDC has just begun, but it seems very safe to say that 2016 will be focused very heavily on VR.
Subject: Graphics Cards, Shows and Expos | January 5, 2016 - 09:39 PM | Ryan Shrout
Tagged: vr ready, VR, virtual reality, video, Oculus, nvidia, htc, geforce, CES 2016, CES
Other than the in-depth discussion from NVIDIA on the Drive PX 2 and its push into autonomous driving, NVIDIA didn't have much other news to report. We stopped by the suite and got a few updates on SHIELD and the company's VR Ready program to certify systems that meet minimum recommended specifications for a solid VR experience.
For the SHIELD, NVIDIA is bringing Android 6.0 Marshmallow to the device, with new features like shared storage and the ability to customize the home screen of the Android TV interface. Nothing earth-shattering - all of it is part of the 6.0 rollout.
The VR Ready program from NVIDIA will validate notebooks, systems, and graphics cards that have enough horsepower to meet the minimum performance levels for a good VR experience. At this point, the specs essentially match what Oculus has put forth: a GTX 970 or better on the desktop and a GTX 980 (full, not 980M) on mobile.
Other than that, Ken and I took in some of the more recent VR demos including Epic's Bullet Train on the final Oculus Rift and Google's Tilt Brush on the latest iteration of the HTC Vive. Those were both incredibly impressive though the Everest demo that simulates a portion of the mountain climb was the one that really made me feel like I was somewhere else.
Check out the video above for more impressions!
Follow all of our coverage of the show at http://pcper.com/ces!