Subject: Graphics Cards | March 28, 2016 - 02:20 PM | Ryan Shrout
Tagged: vive, valve, steamvr, rift, Oculus, nvidia, htc, amd
As the first Oculus Rift retail units begin hitting hands in the US and abroad, both AMD and NVIDIA have released new drivers to help gamers ease into the world of VR gaming.
Up first is AMD, with Radeon Software Crimson Edition 16.3.2. It adds support for Oculus SDK v1.3 and the Radeon Pro Duo...for all none of you that have that product in your hands. AMD claims that this driver offers "the most stable and compatible driver for developing VR experiences on the Rift to date." AMD tells us that the latest implementation of LiquidVR features in the software helps the SDKs and the VR games available at launch take better advantage of AMD Radeon GPUs. This includes capabilities like asynchronous shaders (which AMD thinks should be capitalized for some reason??) and Quick Response Queue (which I believe refers to the ability to process compute work without context-switch penalties) to help Oculus implement Asynchronous Timewarp.
NVIDIA's release is a bit more substantial, with GeForce Game Ready 364.72 WHQL drivers adding support for the Oculus Rift, HTC Vive and improvements for Dark Souls III, Killer Instinct, Paragon early access and even Quantum Break.
For the optimum experience when using the Oculus Rift, and when playing the thirty games launching alongside the headset, upgrade to today's VR-optimized Game Ready driver. Whether you're playing Chronos, Elite Dangerous, EVE: Valkyrie, or any of the other VR titles, you'll want our latest driver to minimize latency, improve performance, and add support for our newest VRWorks features that further enhance your experience.
Today's Game Ready driver also supports the HTC Vive Virtual Reality headset, which launches next week. As with the Oculus Rift, our new driver optimizes and improves the experience, and adds support for the latest Virtual Reality-enhancing technology.
Good to see both GPU vendors giving us new drivers for the release of the Oculus Rift...let's hope it pans out well and the response from the first buyers is positive!
Subject: Graphics Cards | March 19, 2016 - 07:02 PM | Ryan Shrout
Tagged: VR, vive, valve, htc, gdc 2016, GDC
A story posted over at UploadVR has some interesting information that came out of the final days of GDC last week. We know that Valve, HTC and Oculus have recommended users have a Radeon R9 290 or GTX 970 GPU or higher to run virtual reality content on both the Vive and the Rift, and that comes with a high cost for users that weren't already invested in PC gaming. Valve’s Alex Vlachos has other plans that might enable graphics cards from as far back as 2012 to work in Valve's VR ecosystem.
Valve wants to lower the requirements for VR
Obviously there are some trade-offs to consider. The reason GPUs have such high requirements for the Rift and Vive is the need to run at 90 FPS / 90 Hz without dropping frames to create smooth and effective immersion. Deviation from that means the potential for motion sickness and a poor VR experience in general.
From UploadVR's story:
“As long as the GPU can hit 45 HZ we want for people to be able to run VR,” Vlachos told UploadVR after the talk. “We’ve said the recommended spec is a 970, same as Oculus, but we do want lesser GPUs to work. We’re trying to reduce the cost [of VR].”
It's interesting that Valve would be talking about a 45 FPS target now, implying there would be some kind of frame doubling or frame interpolation to get back to the 90 FPS mark that the company believes is required for a good VR experience.
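To put rough numbers on that (my own illustration of the math, not Valve's implementation): at 90 Hz the display needs a new image roughly every 11.1 ms, so a GPU that takes ~22 ms per frame renders at 45 FPS and each rendered frame has to be presented twice, with the repeat ideally re-warped to the latest head pose rather than shown unchanged.

```python
import math

# Rough sketch of the 45 -> 90 FPS frame-doubling idea (illustrative only).
REFRESH_HZ = 90
FRAME_BUDGET_MS = 1000.0 / REFRESH_HZ  # ~11.1 ms per displayed refresh

def frames_to_present(render_time_ms):
    """How many refresh intervals one rendered frame must cover."""
    return max(1, math.ceil(render_time_ms / FRAME_BUDGET_MS))

# A GPU needing 20 ms per frame renders at ~50 FPS, so each rendered frame
# is shown for two refreshes; in a real system the repeated frame would be
# re-warped to the newest head pose (reprojection), not shown unchanged.
print(frames_to_present(10.0))  # 1 -> native 90 FPS
print(frames_to_present(20.0))  # 2 -> 45 FPS rendering, frame doubling
```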
Image source: UploadVR
Vlachos also mentioned some other avenues that Valve could expand on to help improve performance. One of them is "adaptive quality", a feature we first saw discussed with the release of the Valve SteamVR Performance Test. This would allow the game to lower image quality dynamically (texture detail, draw distance, etc.) based on hardware performance, but it might also include something called fixed foveated rendering. With FFR, only the center of the image is rendered at maximum detail while the surrounding image is rendered at lower quality; the theory being that you are focused on the center of the screen anyway and human vision already blurs the periphery. This is similar to NVIDIA's multi-res shading technology, which is already integrated into UE4, so I'm curious to see how this one shakes out.
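The core idea behind fixed foveated rendering is simple enough to sketch in a few lines (a toy illustration of the concept, not Valve's or NVIDIA's code - real implementations manipulate GPU viewports or shading rates rather than per-pixel scales):

```python
# Toy sketch of fixed foveated rendering: assign a render-quality scale to
# each screen region based on its distance from the image center.
def render_scale(x, y, width, height):
    cx, cy = width / 2, height / 2
    # Normalized distance from center: 0.0 at center, ~1.4 at the corners
    dist = (((x - cx) / cx) ** 2 + ((y - cy) / cy) ** 2) ** 0.5
    if dist < 0.4:
        return 1.0   # center ("fovea"): full resolution
    elif dist < 0.8:
        return 0.6   # mid-periphery: reduced shading
    return 0.3       # far periphery: heavily reduced

print(render_scale(960, 540, 1920, 1080))  # center -> 1.0
print(render_scale(0, 0, 1920, 1080))      # corner -> 0.3
```

The thresholds and scale factors here are made up for illustration; the point is just that shading cost drops sharply outside the region the user is assumed to be looking at.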
Another quote from UploadVR:
“I can run Aperture [a graphically rich Valve-built VR experience] on a 680 without dropping frames at a lower quality, and, for me, that’s enough of a proof of concept,” Vlachos said.
I have always said that neither Valve nor Oculus are going to lock out older hardware, but that they wouldn't directly support it. That a Valve developer can run its performance test (with adaptive quality) on a GTX 680 is a good sign.
The Valve SteamVR Performance Test
But Vlachos' point that "most art we're seeing in VR isn't as dense" as in other PC titles is a bit worrisome. We WANT VR games to reach the same image quality and realism levels that we see in modern PC titles, not depend solely on artistic direction to hit the necessary performance levels for high quality virtual reality. Yes, the entry price for PC-based VR today is going to be steep, but I think "console-ifying" the platform will do a disservice in the long run.
Subject: Graphics Cards | February 22, 2016 - 11:03 PM | Ryan Shrout
Tagged: vive, valve, steamvr, steam, rift, performance test, Oculus, htc
Though I am away from my stacks of hardware at the office attending Mobile World Congress in Barcelona, Valve dropped a bomb on us today in the form of a new hardware performance test that gamers can use to determine if they are ready for the SteamVR revolution. The aptly named "SteamVR Performance Test" is a free title available through Steam that any user can download and run to get a report card on their installed hardware. No VR headset required!
And unlike the Oculus Compatibility Checker, the application from Valve runs actual game content to measure your system. Oculus' app only looks at the hardware on your system for certification, not taking into account the performance of your system in any way. (Overclockers and users with Ivy Bridge Core i7 processors have been reporting failed results on the Oculus test for some time.)
The SteamVR Performance Test runs a set of scenes from the Aperture Science Robot Repair demo, an experience developed directly for the HTC Vive and one that I was able to run through during CES last month. Valve is using a very interesting new feature called "dynamic fidelity" that adjusts the image quality of the game to avoid dropped frames and frame rates under 90 FPS, maintaining a smooth and comfortable experience for the VR user. Though it is the first time I have seen it used, it sounds similar to what John Carmack did with the id Tech 5 engine: balancing image quality against hardware performance to hold a targeted frame rate.
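The basic feedback loop behind a scheme like this is easy to sketch (my guess at the general shape of such a controller, not Valve's actual code): measure how long the last frame took, step the quality level down when you blow the ~11.1 ms budget, and step it back up when there is comfortable headroom.

```python
# Sketch of a "dynamic fidelity" controller: cut quality when frames exceed
# the 90 FPS budget, raise it when there is headroom. (Illustrative guess at
# the general approach, not Valve's implementation.)
FRAME_BUDGET_MS = 1000.0 / 90.0  # ~11.1 ms

class FidelityController:
    def __init__(self, levels=11):
        self.max_level = levels - 1
        self.level = self.max_level  # start at maximum quality

    def update(self, last_frame_ms):
        if last_frame_ms > FRAME_BUDGET_MS:          # dropped below 90 FPS
            self.level = max(0, self.level - 1)      # cut quality
        elif last_frame_ms < 0.8 * FRAME_BUDGET_MS:  # ~20% headroom
            self.level = min(self.max_level, self.level + 1)
        return self.level

ctrl = FidelityController()
print(ctrl.update(14.0))  # over budget -> quality drops to 9
print(ctrl.update(8.0))   # plenty of headroom -> back up to 10
```

A real implementation would smooth over several frames and react asymmetrically (drop quality fast, raise it slowly) to avoid visible oscillation, but the principle is the same.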
The technology could be a perfect match for VR content where frame rates above or at the 90 FPS target are more important than visual fidelity (in nearly all cases). I am curious to see how Valve may or may not pursue and push this technology in its own games and for the Vive / Rift in general. I have some questions pending with them, so we'll see what they come back with.
A result for a Radeon R9 Fury provided by AMD
Valve's test offers a very simple three tiered breakdown for your system: Not Ready, Capable and Ready. For a more detailed explanation you can expand on the data to see metrics like the number of frames you are CPU bound on, frames below the very important 90 FPS mark and how many frames were tested in the run. The Average Fidelity metric is the number that we are reporting below and essentially tells us "how much quality" the test estimates you can run at while maintaining that 90 FPS mark. What else that fidelity result means is still unknown - but again we are trying to find out. The short answer is that the higher that number goes, the better off you are, and the more demanding game content you'll be able to run at acceptable performance levels. At least, according to Valve.
Because I am not at the office to run my own tests, I decided to write up this story using results from a third party. That third party is AMD - let the complaining begin. Obviously this does NOT count as independent testing but, in truth, it would be hard to cheat on these results unless you go WAY out of your way to change control panel settings, etc. The demo runs on its own, and AMD detailed the hardware and drivers used in the results.
- Intel i7-6700K
- 2x4GB DDR4-2666 RAM
- Z170 motherboard
- Radeon Software 16.1.1
- NVIDIA driver 361.91
- Win10 64-bit
- 2x Radeon R9 Nano: 11.0
- GeForce GTX 980 Ti: 11.0
- Radeon R9 Fury X: 9.6
- Radeon R9 Fury: 9.2
- GeForce GTX 980: 8.1
- Radeon R9 Nano: 8.0
- Radeon R9 390X: 7.8
- Radeon R9 390: 7.0
- GeForce GTX 970: 6.5
These results were provided by AMD in an email to the media. Take that for what you will until we can run our own tests.
First, the GeForce GTX 980 Ti is the highest performing single GPU tested, with a score of 11 - because of course it goes to 11. The same score is reported for the multi-GPU configuration with two Radeon R9 Nanos, so clearly we are hitting a ceiling in this version of the SteamVR Performance Test. With a single R9 Nano scoring 8.0, that is only a 37.5% scaling rate, but I think we are limited by the test's cap in this case. Either way, it's great news to see that AMD has affinity multi-GPU up and running, utilizing one GPU for each eye's rendering. (AMD pointed out that users who want to test the multi-GPU implementation will need to add the -multigpu launch option.) I still need to confirm whether GeForce cards scale accordingly. UPDATE: Ken at the office ran a quick check with a pair of GeForce GTX 970 cards using the same -multigpu option and saw no scaling improvements. It appears NVIDIA has work to do here.
Moving down the stack, it's clear why AMD was so excited to send out these early results. The R9 Fury X and R9 Fury both come out ahead of the GeForce GTX 980, while the R9 Nano, R9 390X and R9 390 all score better than NVIDIA's GeForce GTX 970. This comes as no surprise - AMD's Radeon parts tend to offer better performance per dollar in benchmarks and many games.
There is obviously a lot more to consider than the results this SteamVR Performance Test provides when picking hardware for a VR system, but we are glad to see Valve out in front of the many, many questions that are flooding forums across the web. Is your system ready??
Subject: Displays, Shows and Expos | February 22, 2016 - 01:27 AM | Scott Michaud
Tagged: MWC, mwc 16, valve, htc, vive, Oculus
Valve and HTC announced that the Vive consumer edition will be available in April for $799 USD, with pre-orders beginning on February 29th. Leave it to Valve to open pre-orders on a date that doesn't always exist. The system comes with the headset, two VR controllers, and two base stations. The unit will have “full commercial availability” when it launches in April, but that means little if it sells out instantly. There's no way to predict that.
The announcement blog post drops a subtle jab at Oculus. “Vive will be delivered as a complete kit” seems to refer to the Oculus Touch controllers being delayed (and thus not in the hands of every user). This also makes me think about the price. The HTC Vive costs $200 more than the Oculus Rift. That said, it also includes motion controllers, which could shrink that gap. It does not, however, come with a standard gamepad like the Rift does, although that's just wasted money if you already have one.
Unlike the Rift, which has its own SDK, the Vive is powered by SteamVR. Most engines and middleware that support one seem to support both, so I'm not sure if this will matter, but it could end up blocking content in an HD DVD vs. Blu-ray fashion. Hopefully Valve/HTC and Oculus/Facebook, or every software vendor on an individual basis, work through these interoperability concerns and create an open platform. Settling on a standard tends to commoditize industries, and that will eventually happen to VR at some point anyway. Hopefully, if it doesn't happen sooner, cross-compatibility at least arrives then.
That Depends on Whether They Need One
Ars Technica UK published an editorial called "Hey Valve: What's the point of Steam OS?" The article does not actually pose the question in its text -- it mostly rants about technical problems with a Zotac review unit -- but the headline is interesting nonetheless.
Here's my view of the situation.
The Death of Media Center May Have Been...
There are two parts to this story, and both center around Windows 8. The first was addressed in an editorial that I wrote last May, titled The Death of Media Center & What Might Have Been. Microsoft wanted to expand the PC platform into the living room. Beyond the obvious support for movies, TV, and DVR, they also pushed PC gaming in a few subtle ways. The Games for Windows certification required games to be launchable by Media Center and support Xbox 360 peripherals, which pressured game developers to make PC games comfortable to play on a couch. They also created Tray and Play, an optional feature that allowed PC games to be played from the disc while they installed in the background. Back in 2007, before Steam and other digital distribution services really took off, this eliminated install time, which was a major user experience problem with PC gaming (and a major hurdle for TV-connected PCs).
It also had a few nasty implications. Games for Windows Live tried to eliminate modding by requiring all content to be certified (or severely limiting the tools as seen in Halo 2 Vista). Microsoft was scared about the content that users could put into their games, especially since Hot Coffee (despite being locked, first-party content) occurred less than two years earlier. You could also argue that they were attempting to condition PC users to accept paid DLC.
Regardless of whether it would have been positive or negative for the PC industry, the Media Center initiative launched with Windows Vista, which is another way of saying “exploded on the launch pad, leaving no survivors.” Windows 7 cleared the wreckage with a new team, which aimed for the stars with Windows 8. They ignored the potential of the living room PC, preferring devices and services (i.e. Xbox) over an ecosystem provided by various OEMs.
If you look at the goals of Steam OS, they align pretty well with the original, Vista-era ambitions. Valve hopes to create a platform that hardware vendors could compete on. Devices, big or small, expensive or cheap, could fill all of the various needs that users have in the living room. Unfortunately, unlike Microsoft, they cannot be (natively) compatible with the catalog of Windows software.
This may seem like Valve is running toward a cliff, but keep reading.
What If Steam OS Competed with Windows Store?
Windows 8 did more than just abandon the vision of Windows Media Center. Driven by the popularity of the iOS App Store, Microsoft saw a way to end the public perception that Windows is hopelessly insecure. With the Windows Store, all software needs to be reviewed and certified by Microsoft. Software based on the Win32 API, which is all software for Windows 7 and earlier, was only allowed within the “Desktop App,” which was a second-class citizen and could be removed at any point.
This potential made the PC software industry collectively crap themselves. Mozilla was particularly freaked out, because the Windows Store demanded (at the time) that all web browsers be reskins of Internet Explorer. That meant Firefox would not have been able to implement any new web standards on Windows, because it could only present what Internet Explorer (Trident) rendered. Mozilla's mission is to develop a strong, standards-based web browser that forces all others to interoperate or die.
Remember: “This website is best viewed with Internet Explorer”?
Executives from several PC gaming companies, including Valve, Blizzard, and Mojang, spoke out against Windows 8 at the time (along with browser vendors and so forth). Steam OS could be viewed as a fire escape for Valve if Microsoft decided to try its luck and kill, or further deprecate, Win32 support. In the meantime, Windows PCs could stream to it until Linux gained a sufficient catalog of software.
Image Credit: Wikipedia
This is where Steam OS gets interesting. Its software library cannot compete against Windows with its full catalog of Win32 applications, at least not for a long time. On the other hand, if Microsoft continues to support Win32 as a first-class citizen, and they returned to the level of openness with software vendors that they had in the Windows XP era, then Valve doesn't really have a reason to care about Steam OS as anything more than a hobby anyway. Likewise, if doomsday happens and something like Windows RT ends up being the future of Windows, as many feared, then Steam OS wouldn't need to compete against Windows. Its only competition from Microsoft would be Windows Store apps and first-party software.
I would say that Valve might even have a better chance than Microsoft in that case.
Subject: General Tech | January 9, 2016 - 12:06 AM | Scott Michaud
Tagged: valve, half life 3
I won't blame them if they hide the silverware, however. I can be trusted with company secrets, but not with spoons. Never with spoons.
Marc Laidlaw is a science fiction author who wrote much of the story of Half-Life, its expansions, and Half-Life 2. Valve's flat corporate structure (at least at the time) makes it difficult to find out who did what; all employees are listed alphabetically in the credits. He hasn't been given much public credit since Half-Life 2, though.
Whatever he's been working on, he has since retired from the company after eighteen years. On his way out, he emailed a Reddit user with his thoughts on his departure, because that's a Valve thing to do, I guess. Gamasutra confirmed it's genuine. It's a relatively short, interview-format letter. The Reddit user apparently initiated contact without realizing Marc had just retired.
He wouldn't go into too many details about why he left the company, except that he's “old” and he wants to start writing his own narratives. He published several novels before being hired at Valve Software, which he apparently shelved after The 37th Mandala and the short story Catamounts in 1996. He wrote a couple of short stories in the late 2000s, right after Half-Life 2: Episode Two launched. He wishes to go back to doing that again, which should be a nice retirement pastime.
What this means for future Half-Life titles? Who knows.
He says that everything's in Valve's hands at the moment, but he could very well have wrapped up involvement in a project just before he left. I mean, it's been five or six years since his last publicly credited work. That's plenty of time to finish an unannounced product. Again, who knows?
Subject: General Tech | January 6, 2016 - 03:03 AM | Scott Michaud
Tagged: valve, CS:GO, esports
About a year ago, Valve blocked several players from participating in their sponsored tournaments when the players were believed to be match fixing. This is the practice of arranging outcomes in events and tournaments. It is often accompanied by betting on the pre-arranged winners, but it could also be used to shift around positions in seed brackets by having one or more members intentionally lose winnable games. This is bad all around, and it can even be illegal (due to the implications of fraud and so forth).
Since then, the game developer has reviewed their earlier decision, and they decided to make it permanent. They did not state how many players were involved, although PC Gamer knows of 21. These individuals will never be allowed to compete at any Valve-sponsored tournaments, and other organizers will be able to extend those bans to their events, too.
A similar incident happened in the Korean StarCraft II scene. In that situation, a dozen individuals were arrested and detained by Korean law enforcement, charged with betting (or enabling third parties to bet) on predetermined outcomes. This has been an ongoing problem.
Subject: General Tech | December 31, 2015 - 08:00 PM | Scott Michaud
Tagged: htc, valve, vive, vive vr
This bit of news is a little more pleasant for Valve. According to Engadget, the HTC Vive has passed FCC approval. HTC recently announced that the product would launch in April, slipping from its original launch date, Holiday 2015, by a few months. This was due to a “very, very big technological breakthrough” that was in no way elaborated on.
The linked FCC report calls the device the “HTC Base station.” This likely refers to the Lighthouse laser tracking system, whose emissions are monitored by light sensors on the headset and controllers. The public notice includes the FCC warning label, which mentions that the device is a Class 1 laser system. There are five classifications of lasers, from Class 1 through Class 4 (with Class 3 split into Class 3a and Class 3b). Class 1 means that the laser is incapable of producing harmful radiation; Class 4 can cause fires. Since HTC's device is Class 1, either the laser's intensity is too low to cause damage, even with sustained viewing, or the laser never produces a harmful amount of radiation in a way that could be viewed under normal operation. For instance, a laser printer is a “Class 1” laser because everything occurs within the device. Laser pointers, on the other hand, are typically Class 2.
This raises an interesting question about how the lasers are used. They are clearly emitted into open space, because the sensors are on the visor. This suggests that the lasers are either very low power, or the beam is manipulated in such a way that it cannot be pointed into someone's eye for a meaningful amount of time. How? No idea.
HTC and Valve are expected to fully unveil the product at CES. PC Perspective will be at the event, and we'll probably have more information at that time.
Subject: General Tech | December 31, 2015 - 04:48 AM | Scott Michaud
Tagged: valve, steam, security, Privacy
On Christmas Day, Valve had a few hours of problems: their servers were being overloaded by malicious traffic. The best analogy I can offer is a bad actor sending a thousand people to Walmart to do nothing but stand in the check-out line and ask the cashier for the time. This clogs up the infrastructure, preventing legitimate customers from completing their transactions. Attacks like this are often preceded by a ransom demand. Don't pay? Your servers get clogged at the worst possible time.
A little too much sharing...
Broadly, there are two ways to counteract a DDoS attack: add hardware, or make your site more efficient.
When a website is requested, the server generates the page and sends it to the customer. This process is typically slow, especially for complicated sites that pull data from one or more databases. The server can instead hand finished pages to content delivery partners, which serve them to customers from nearby locations. Some pages, like the Steam Store's front page, are mostly the same for everyone who views them (from the same geographic region). Other pages, like your order confirmation page, are individual. You can save server performance by regenerating pages only when they change, and serving them to the relevant users from the closest delivery server.
Someone, during a 20-fold spike in traffic relative to typical Steam Sale volume, accidentally started saving (caching) pages containing private information and delivering them to random users. This included things like order confirmation and contact information pages for whatever logged-in account generated them. This is pretty terrible for privacy. To be clear, it did not allow users to interact with other users' accounts - only to see pages those users had generated.
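A minimal sketch of how that kind of caching mistake happens (hypothetical code for illustration, not Valve's actual system): if the cache key is only the URL, the first logged-in user's personalized page gets served to everyone who requests the same URL afterward.

```python
# Hypothetical sketch of the caching bug: keying a cache on the URL alone
# means one user's personalized page is served to every later requester.
cache = {}

def get_page_buggy(url, user):
    if url not in cache:                  # BUG: key ignores who is asking
        cache[url] = f"Account details for {user}"
    return cache[url]

def get_page_fixed(url, user):
    key = (url, user)                     # personalized pages must be keyed
    if key not in cache:                  # per user -- or never cached
        cache[key] = f"Account details for {user}"
    return cache[key]

print(get_page_buggy("/account", "alice"))  # Account details for alice
print(get_page_buggy("/account", "bob"))    # Account details for alice (!)
```

In real HTTP caching terms, this is what `Cache-Control: private` and the `Vary` header exist to prevent; shared pages are safe to cache broadly, personalized ones are not.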
But this is still quite bad.
Users complained, especially on Twitter, that Valve should have shut down their website immediately. From my position, I agree, especially since attempting to make a purchase tells the web server to pull the most sensitive information (billing address, etc.) from the database. I don't know why Valve didn't; I can't see their reasoning from the outside.
But again, I don't work there. I don't know the details.
Subject: General Tech | December 18, 2015 - 09:26 PM | Scott Michaud
Tagged: valve, htc, vive, vive vr
A grain of salt is needed for this one. Users on Reddit claim to have found a pair of renders, one of the headset and one of the controllers, for the HTC Vive VR system. They also have a screenshot of the page, although the first words you see are “This Is Real,” which are the most sketchy, ironic, and unfortunate words to be greeted with in a product leak.
The current HTC Vive prototype looks like a rough version of this. There are some significant differences, though. My major concern is at the front of the headset. You can clearly see a front-facing camera as well as two nubs below it, one to the bottom-left, and one to the bottom-right. If those two nubs are also cameras, then that makes a bit more sense.
If those two nubs are not cameras, then Valve would have downgraded from the original prototype's two-camera system to a single camera, which sounds unlikely. Valve has already claimed that the Vive will have front-facing cameras, plural, to track objects (like pets) for safety reasons. I can see them adding an extra camera, but I doubt that they would cut down to just one; two cameras allow more accurate depth tracking at short distances, which is when objects risk... interacting... with the user.
If it's three cameras? That makes sense.
Kyle Orland of Ars Technica is using the original prototype during GDC 2015.
Image Credit: Ars Technica
The controllers are also interesting, but mostly from an aesthetic standpoint. The hexagonal plates, which apparently functioned as sensors, seem to have been changed into circular rings (if the hole goes all the way through). They retain their thumb trackpads, triggers, and a couple of buttons. It's unclear whether each controller is identical, or if there's a difference between the intended-left and intended-right models. Being a lefty, I hope not.
At roughly the same time, Cher Wang, the CEO of HTC, announced that the HTC Vive will be unveiled at CES (in January). It won't be available until around April, but we should know basically all there is to know about the system at next month's trade show. Given this timing, and that multiple users have been posting the leak seemingly independently, it sounds valid. The camera configuration, on the other hand, takes a bit away from that.