Subject: Graphics Cards | June 30, 2017 - 02:17 PM | Ryan Shrout
Tagged: Vega, radeon, Frontier Edition, amd
Hopefully you have already read up on my review of the new Radeon Vega Frontier Edition graphics card; it is full of interesting information about the gaming and professional application performance.
But I thought it would be interesting to share the bare card and GPU in its own post, just to help people find it later on.
For measurements, here's what we were able to glean with the calipers.
(Editor's Update: we have updated the die measurements after doing a remeasure. I think my first was a bit loose as I didn't want to impact the GPU directly.)
- Die size: 25.90mm x 19.80mm (GPU only, not including memory stacks)
- Area: 512.82mm2
- Package size: 47.3mm x 47.3mm
- Area: 2,237mm2
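As a sanity check, the quoted areas follow directly from the caliper measurements above; a quick sketch (the die-fraction figure is our own derived number, not from AMD):

```python
# Derive the quoted areas from the caliper measurements above.
die_w, die_h = 25.90, 19.80          # mm, GPU die only (no HBM stacks)
pkg_side = 47.3                      # mm, square package

die_area = die_w * die_h             # mm^2
pkg_area = pkg_side ** 2             # mm^2

print(f"Die area:     {die_area:.2f} mm^2")
print(f"Package area: {pkg_area:.2f} mm^2")
print(f"Die fraction of package: {die_area / pkg_area:.1%}")
```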
Enjoy the sexy!
- There is a LOT of empty PCB space on the Vega FE card. This is likely indicative of the added area needed for a large heatsink and fan to cool a 300-375 watt TDP without throttling.
- The benefit of the smaller HBM-based package appears to come at the cost of SMT components on the GPU substrate and the PCB.
- The die size of Vega is large - bigger than GP102 even, despite running at a much lower performance level. It will be interesting to see how AMD answers the question of why the die has expanded as much as it did.
Feel free to leave us some comments if anything stands out!
Subject: General Tech | June 29, 2017 - 09:34 PM | Scott Michaud
Tagged: pc gaming, blizzard
We first reported on StarCraft Remastered when it was announced, which was alongside the GSL 2017 Season 1 finals on March 26th. This was accompanied by a patch that brought the base game up to modern standards, which conveniently allows it to be multiplayer-compatible with Remastered, although skill-based matchmaking is exclusive to Remastered.
It has now been given a trailer, above, and a release date: August 14th, 2017.
As for the price? Pre-orders for StarCraft Remastered are available at $14.99 USD, although it’s unclear whether this price will stick after the pre-order period. I should note that the page states that StarCraft Remastered requires “StarCraft Anthology”. The way it’s worded makes it look like you need to buy something else, but StarCraft Anthology was made free with the aforementioned 1.18a patch. Basically, it looks like Blizzard is treating StarCraft Remastered as a paid booster to StarCraft Anthology, but, again, the latter is free, so it probably only matters in terms of the install process. At least, that’s how it looks to me.
Subject: Mobile | June 29, 2017 - 03:09 PM | Jeremy Hellstrom
Tagged: smartphone, oneplus 5, oneplus
You can pick up the OnePlus 5 with 8GB of RAM and 128GB of storage for $640, or if you really want you could grab the model Ars Technica reviewed for $620, though you get half the storage and only 6GB of RAM. There are likely better deals out there if you shop around; Ars found their review model at $479.
The phone uses the same Snapdragon 835 SoC and Adreno 540 GPU as the Galaxy S8+ which Sebastian just tested, and it shows in the benchmarks Ars Technica ran it through, up to and including battery life. In all but the storage tests the OnePlus meets or exceeds the S8+; however, the screen cannot compete. It is a 1080p panel with a lot more bezel than you will find on a Galaxy, or even an iPhone for that matter. Take a look at the review and decide if you value form over function when it comes to your mobile phone.
"Today OnePlus is both announcing the OnePlus 5 and lifting the review embargo on the device, which we've had for about two weeks now. $479 (£449) gets you an aluminum-clad pocket computer with a 2.45GHz Snapdragon 835 SoC, 6GB of RAM, 64GB of storage, and a 3,300mAh battery."
Here are some more Mobile articles from around the web:
More Mobile Articles
- Wise Pad W7 Windows 10 4G LTE Phablet @ TechARP
- Surface Pro review: Incremental improvement isn’t enough @ Ars Technica
- Asus ROG GX501VI Zephyrus with Nvidia Max-Q technology @ Kitguru
Subject: Memory | June 29, 2017 - 02:03 PM | Tim Verry
Tagged: x299, trident z, samsung 8Gb, overclocking, G.Skill, ddr4
G.Skill recently announced new DDR4 memory kits for the Intel X299 HEDT platform. The new kits include a dual channel DDR4 4400 MHz kit for Kaby Lake X and a quad channel DDR4 4200 MHz kit for Skylake X. The dual channel kit is available under the company’s Trident Z RGB and Trident Z Black brands depending on whether you want RGB lighting or simple black heatspreaders. The quad channel DDR4-4200 kit is only available in non-RGB Trident Z modules.
According to G.Skill, all of the new memory kits use Samsung 8Gb dies and feature CAS latencies of 19-19-19-39. The quad channel 4200 MHz DDR4 modules need 1.40V to hit those specifications, and while it is not yet known what the higher clocked dual channel DDR4 4400 MHz kits need to hit CL19 timings I would presume they need a bit more.
The new kits will be available in an 8GB x 2 (16GB) 4400 MHz kit and up to 64GB (8GB x 8) 4200 MHz kits. Pricing has not yet been announced, but the new RAM kits should be available soon. While Intel processors do not get as much of a boost from increased memory speeds as AMD’s APUs and Ryzen CPUs do, there are still noticeable gains to be had with faster memory, though gamers should still prioritize graphics cards and processors over memory when picking parts for a budget build.
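For context on those CL19 timings, absolute (first-word) CAS latency in nanoseconds can be estimated from the data rate; a rough sketch using the standard 2000 × CL / data-rate approximation:

```python
# Estimate absolute CAS latency (ns) from CAS cycles and DDR data rate (MT/s).
def cas_latency_ns(cl: int, data_rate_mts: int) -> float:
    # DDR transfers twice per clock, so one CAS cycle = 2000 / data_rate ns.
    return cl * 2000 / data_rate_mts

for name, cl, rate in [("DDR4-4200 CL19", 19, 4200),
                       ("DDR4-4400 CL19", 19, 4400)]:
    print(f"{name}: ~{cas_latency_ns(cl, rate):.2f} ns")
```

Both kits land around 9 ns of absolute latency, in the same ballpark as lower-clocked kits with tighter timings.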
Note that since these kits are using Samsung 8Gb ICs, they also have a good chance of working with Ryzen, but check with your motherboard manufacturer and reviews before ponying up the cash.
Subject: Graphics Cards | June 29, 2017 - 01:33 PM | Jeremy Hellstrom
Tagged: asus, ASUS ROG, gtx 1080 ti, Poseidon GTX 1080 Ti Platinum Edition, poseidon, DirectCU H20, factory overclocked
We've seen the ASUS ROG Poseidon before, the last one that comes to mind being the GTX 980 Ti from Computex 2015. The name refers to the hybrid cooling solution, which incorporates both watercooling and aircooling, giving you the option of adding watercooling to increase thermal dissipation or sticking with air alone. [H]ard|OCP is working on a two-part review of the card, with this first article covering its performance on aircooling alone. The card exceeded the quoted boost clock of 1708MHz, averaging 1939MHz in the BF1 test at default Gaming Mode clocks and 2025MHz once they overclocked. That is an impressive clock, but there are other air-cooled cards able to reach higher frequencies, so it will be interesting to see what adding watercooling to the card will do.
"Air cooling? Liquid Cooling? How about both, the ASUS ROG Poseidon GeForce GTX 1080 Ti Platinum Edition hybrid video card can run them both. In Part 1 of our evaluation we will test the video card on "air cooling" and overclock it as high as possible. In Part 2, we pump liquid through its veins and compare overclocks."
Here are some more Graphics Card articles from around the web:
- Corsair's Hydro GFX GeForce GTX 1080 Ti @ The Tech Report
- Radeon Vega Frontier Edition launches today for $999 and up @ The Tech Report
- MSI Radeon RX 570 GAMING X @ [H]ard|OCP
Subject: General Tech | June 29, 2017 - 12:43 PM | Jeremy Hellstrom
Tagged: amd, ryzen pro, EPYC
Official news about Ryzen Pro has finally arrived and The Tech Report was right on top of it. This is the first we have seen of the "3" parts, the Ryzen 3 Pro 1300 and Ryzen 3 Pro 1200, their four non-SMT cores clocked at a decent 3.5/3.7GHz and 3.1/3.4GHz respectively. That makes the Ryzen 3 Pro 1300 essentially the same chip as the Ryzen 5 Pro 1500 but with half the total cache and without multi-threading, theoretically reducing the price. Five of the six new parts have a TDP of 65W, with only the top-tier Ryzen 7 Pro 1700X hitting 95W with its 8 cores and 16 threads operating at 3.5/3.7GHz.
The speeds and core counts are not the most important aspects of these chips, however; it is the features they share with AMD's soon-to-arrive EPYC chips: AMD Secure Processor, TPM 2.0, and DASH, which offers functionality similar to Intel's vPro. This is one area in which AMD offers a broader choice of products than Intel, whose Core i3 parts do not support enterprise features, at least not yet. Click the link above to check out more.
"AMD's Ryzen Pro platform blends business-class security and management features with the performance of the Zen architecture. We take an early look at how AMD plans to grapple with Intel in the battle for the standard corporate desktop."
Here is some more Tech News from around the web:
- AMD's Ryzen Pro desktop line-up targets Intel's enterprise dominance @ The Inquirer
- Everything you need to know about the Petya, er, NotPetya nasty trashing PCs worldwide @ The Register
- Google Photos 3.0 is out now, with automatic sharing features @ Ars Technica
- See you in 2023 – Bitcoin exchange Coin.mx bigwig gets 66 months in the slammer @ The Register
- Amazon Will Offer Prime Video At Half-Price In All New Markets For Six More Months @ Slashdot
- Hot news! Combustible Galaxy Note 7 to return as 'Galaxy Note FE' @ The Register
- Let's Encrypt Hits New Milestone: Over 100,000,000 Certificates Issued @ Slashdot
Subject: General Tech | June 29, 2017 - 11:06 AM | Alex Lustenberg
Tagged: video, Vega FE, thermalright, Spirit 140, Samsung, radeon, prorender, podcast, mining, mini ITX, microcode, logitech, GTX USB, gigabyte, galaxy s8+, G433, amd, AM4
PC Perspective Podcast #456 - 06/28/17
Join us for talk about Radeon Vega FE, Intel SSD 545S, GTX USB, Mining Specific Cards, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, Allyn Malventano
Peanut Gallery: Alex Lustenberg, Ken Addison
Week in Review:
News items of interest:
Hardware/Software Picks of the Week
Jeremy: Cheap drone with 720p video
Subject: General Tech | June 28, 2017 - 11:39 PM | Scott Michaud
Tagged: pc gaming, gdq, speedrun
Starting on Sunday, Games Done Quick will be hosting their twice-annual, around-the-clock speedrun marathon until 3am on the following Sunday. It will begin with a one-handed playthrough of NieR: Automata, and just keep going through game after game, including a handful of races between popular runners of applicable titles. (Personally, those tend to be my favorite segments.) Many are run on the PC!
This event will benefit Doctors Without Borders.
Until Awesome Games Done Quick 2017, it looked like the amount raised per week-long event had settled at around $1.3 million. That one, however, leapfrogged the previous year’s total by a whole million dollars, ending up at $2.22 million USD. Summer Games Done Quick, apart from last year, tends to do a little less, but who knows?
Subject: General Tech | June 28, 2017 - 11:17 PM | Scott Michaud
Tagged: Unity, machine learning, deep learning
Unity, who makes the popular 3D game engine of the same name, has announced a research fellowship for integrating machine learning into game development. Two students, who must have been enrolled in a Masters or a PhD program on June 26th, will be selected and provided with $30,000 for a 6-month fellowship. The deadline is midnight (PDT) on September 9th.
We’re beginning to see a lot of machine-learning applications being discussed for gaming. There are some cases, like global illumination and fluid simulations, where it could be faster for a deep-learning algorithm to hallucinate a convincing result than for a physical solver to produce a correct one. In those cases, it makes sense to post-process each frame, so, naturally, game engine developers are paying attention.
If eligible, you can apply on their website.
Subject: Graphics Cards | June 28, 2017 - 11:00 PM | Scott Michaud
Tagged: epic games, ue4, nvidia, geforce, giveaway
If you are an indie game developer, and you could use a little more GPU performance, NVIDIA is hosting a hardware giveaway. Starting at the end of July, and ongoing until Summer 2018, NVIDIA and Epic Games will be giving away GeForce GTX 1080 and GeForce GTX 1080 Ti cards to batches of Unreal Engine 4 projects.
To enter, you need to share screenshots and videos of your game on Twitter, Facebook, and Instagram, tagging both UnrealEngine and NVIDIA. (The specific accounts are listed on the Unreal Engine blog post that announces this initiative.) They will also feature these projects on both the Unreal Engine and the NVIDIA blog, which is just as valuable for indie projects.
So... hey! Several chances at free hardware!
Subject: General Tech | June 28, 2017 - 10:40 PM | Scott Michaud
Tagged: square enix, pc gaming, eidos montreal, deus ex: mankind divided
Frames of modern video games can be made up of tens of thousands of draw calls, each consisting of a set of polygons and the shader pipeline that operates on them, plus compute tasks. Last September, we found an article by Adrian Courrèges that broke down a single frame of DOOM and discussed all of the techniques based on information from debug tools and SIGGRAPH slides.
This time, we found a video from János Turánszki that analyzes the ~32,000 - 33,000 graphics API calls of a single Deus Ex: Mankind Divided frame, using NVIDIA Nsight. As he scrubs through these events, he mentions things like how text is painted, a bug with temporal anti-aliasing, what appears to be a multi-pass blur for frosted glass, and so forth.
János Turánszki develops the open-source (MIT licensed) Wicked Engine.
Subject: Storage | June 28, 2017 - 09:49 PM | Allyn Malventano
Tagged: wdc, WD, toshiba, QLC, nand, BiCS, 96-layer, 3d
A couple of announcements came out of Toshiba and Western Digital today. First up is Toshiba announcing QLC (4 bit per cell) flash on their existing BiCS 3 (64-layer) technology. QLC may not be the best for endurance, as the voltage tolerances become extremely tight with 16 individual voltage states per cell, but Toshiba has been working on this tech for a while now.
In the above slide from the Toshiba keynote at last year's Flash Memory Summit, we see the use case here is for 'archival grade flash', which would still offer fast reads but is not meant to be written as frequently as MLC or TLC flash. Employing QLC in Toshiba's current BiCS 3 (64-layer) flash would enable 1.5TB of storage in a 16-die stack (within one flash memory chip package).
Next up is BiCS 4, which was announced by Western Digital. We knew BiCS 4 was coming but did not know how many layers it would be. We now know that figure, and it is 96. The initial offerings will be the common 256Gbit (32GB) capacity per die, but stacking 96 cells high means the die will come in considerably smaller, meaning more per wafer, ultimately translating to lower cost per GB in your next SSD.
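The capacity claims in both announcements reduce to simple per-die arithmetic; a sketch (the 16-die QLC stack figure is from Toshiba, while the 16-die BiCS 4 stack total is our own extrapolation, not an announced product):

```python
# Per-die and per-package capacity arithmetic for the announcements above.
GBIT_PER_GB = 8

# Toshiba QLC on BiCS 3: 1.5TB (1536 GB) in a 16-die stack.
qlc_stack_gb = 1536
qlc_die_gbit = qlc_stack_gb / 16 * GBIT_PER_GB
print(f"QLC die: {qlc_die_gbit:.0f} Gbit ({qlc_stack_gb // 16} GB)")

# Western Digital BiCS 4: 256 Gbit (32 GB) per die.
bics4_die_gb = 256 / GBIT_PER_GB
# Hypothetical 16-die stack, for comparison with the QLC package above:
print(f"BiCS 4, 16-die stack: {bics4_die_gb * 16:.0f} GB")
```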
While these announcements are welcome, their timing and coordinated launch from both companies seems odd. Perhaps it has something to do with this?
Subject: General Tech | June 28, 2017 - 06:24 PM | Scott Michaud
Tagged: solidworks, ray tracing, radeon, prorender, nvidia, mental ray, Blender, amd
AMD has released a free ray-tracing engine for Blender, as well as Maya, 3D Studio Max, and SolidWorks, called Radeon ProRender. It uses a physically-based workflow, which allows multiple materials to be expressed in a single, lighting-independent shader, making it easy to color objects and have them usable in any sensible environment.
Image Credit: Mike Pan (via Twitter)
I haven’t used it yet, and I definitely haven’t tested how it stacks up against Cycles, but we’re beginning to see some test renders from Blender folks. It looks pretty good, as you can see with the water-filled Cornell box (above). Moreover, it’s rendered on an NVIDIA GPU, which I’m guessing they had because of Cycles, but that also shows that AMD is being inclusive with their software.
Radeon ProRender puts more than a little pressure on Mental Ray, which is owned by NVIDIA and licensed on annual subscriptions. We’ll need to see how quality evolves, but, as you see in the test render above, it looks pretty good so far... and the price can’t be beat.
Subject: General Tech | June 28, 2017 - 05:09 PM | Scott Michaud
Tagged: ubisoft, pc gaming
Honestly, I don’t really know how many first-party engines Ubisoft currently maintains anymore. Anvil is one of their more popular ones, which was used in Assassin’s Creed, Steep, For Honor, and Tom Clancy’s Ghost Recon Wildlands. Far Cry 5 will be using the Dunia Engine, which was forked from the original CryEngine. Tom Clancy’s The Division, Mario + Rabbids, and the new South Park use Snowdrop. I know that I’m missing some.
Add another one to the list: Voyager, which will be used in Beyond Good and Evil 2.
From what I gather from the video, this engine is optimized for massive differences in scale. The Creative Director for Beyond Good and Evil 2, Michel Ancel, showed the camera (in developer mode) smoothly transitioning from a highly detailed player model out to a part of a solar system. They claim that the sunset effects are actually caused by the planet’s rotation. Interesting stuff!
Subject: Processors | June 28, 2017 - 03:03 PM | Jeremy Hellstrom
Tagged: 7900x, Core i9, Intel, skylake-x, x299
The Tech Report recently wrapped up the first part of their review of Intel's new Core i9-7900X, focusing on its effectiveness in a production machine. Their benchmarks cover a variety of scientific tasks such as PhotoWorxx, FPU Julia, and Mandel, as well as creativity benchmarks like picCOLOR, DAWBench DSP 2017, and STARS Euler3D. During their testing they saw the same peaks in power consumption as Ryan did in his review, 253W under a full Blender load. Their follow-up review will focus on the new chip's gaming prowess; for now, you should take a look at how your i9-7900X will perform when you are not playing around.
"Intel's Core i9-7900X and its Skylake-X brethren bring AVX-512 support, a new cache hierarchy, and a new on-die interconnect to high-end desktops. We examine how this boatload of high-performance computing power advances the state of the art in productivity applications."
Here are some more Processor articles from around the web:
- Intel Core i9 7900X Linux Benchmarks @ Phoronix
- Intel Core i7 7740X Benchmarks On Linux @ Phoronix
- Ryzen 5 1400 @ Hardware Secrets
Subject: Storage | June 28, 2017 - 02:12 PM | Jeremy Hellstrom
Tagged: Toshiba XG5, toshiba, ssd, NVMe, nand, M.2, BiCS, 64-Layer
We first heard about the Toshiba XG5 1TB NVMe SSD at Computex, with its 64-layer BiCS flash and stated read speeds of 3GB/s and writes just over 2GB/s. Today Kitguru published a review of the new drive, including ATTO results which match and even exceed the advertised read and write speeds. Their real-world test involved copying 30GB of movies from a 512GB Samsung 950 Pro to the XG5; only Samsung's new 960 lineup and the OCZ RD400 were able to beat Toshiba's new SSD. Read more in their full review, right here.
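As a rough floor for that 30GB copy test, the sequential specs set a lower bound, since the transfer can go no faster than the slower side; a quick sketch (the ~2.5GB/s read and ~2GB/s write figures are the drives' stated sequential specs, so a real multi-file copy will take longer):

```python
# Lower-bound estimate for a large sequential copy between two drives.
def min_copy_time_s(size_gb: float, read_gbps: float, write_gbps: float) -> float:
    # Bounded by the slower of source read speed and target write speed.
    return size_gb / min(read_gbps, write_gbps)

# 30GB of movies: 950 Pro reads ~2.5GB/s, XG5 writes ~2GB/s (stated specs).
print(f"~{min_copy_time_s(30, 2.5, 2.0):.0f} s minimum")
```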
"The Toshiba XG5 1TB NVMe SSD contains Toshiba's newest 3D 64-Layer BiCS memory and our report will examine Toshiba's newest memory, as well as their newest NVMe controller to go along with it."
Here are some more Storage reviews from around the web:
- Toshiba N300 8TB HDD @ Kitguru
- Kingston Gold Series UHS-1 Speed Class 3 64GB MicroSDXC @ Modders-Inc
- Kingston DataTraveler Ultimate GT 2TB USB 3.1 Gen 1 Flash Drive Review @ NikKTech
- Drobo 5D3 DAS Review (Thunderbolt 3) @ Kitguru
- LaCie 2TB Rugged Thunderbolt USB-C Professional All-Terrain Mobile Storage Review @ NikKTech
Subject: General Tech | June 28, 2017 - 01:07 PM | Jeremy Hellstrom
Tagged: gaming, Intel, ddr3, ddr4
Overclockers Club has completed a daunting task, testing the effect of RAM frequency on game performance from DDR3-1333 through DDR4-3200. In theory Intel's chips will not see the same improvements as AMD's Ryzen, lacking the Infinity Fabric which has proved to be sensitive to memory frequency. Since OCC covers two generations of RAM, they also needed to test with two different processors, in this case the i7-4770K and i7-7700K, and they tested performance at 1440p as well as 1080p. Read the full article to see the results, which do show some performance deltas; however, they are nothing compared to spending more on your GPU.
"After running through all of the tests, it appears that what I previously thought was an easy and clear answer is in fact more complicated. With the evidence provided I can safely say that memory can play a large role in some games over all frame rates. However, other factors like the processor, type of video card, and resolution will usually provide bigger impact in the final frame rates. Strictly speaking of game performances, the fastest memory tested does yield better results."
Here is some more Tech News from around the web:
- Nintendo New 2DS XL mini-review: The best version of the 3DS hardware yet @ Ars Technica
- The Steam Sale continues ... if you somehow didn't realize it by now
- Unknown Pleasures: Steam’s latest hidden delights @ Rock, Paper, SHOTGUN
- Racing games on sale @ Humble Store
- Wot I Think: Darkest Dungeon – The Crimson Court @ Rock, Paper, SHOTGUN
- Sand worms and lightning: Aven Colony takes city-building to exoplanets @ Ars Technica
- Double Dragon Trilogy free with any purchase @ GOG
Subject: General Tech | June 28, 2017 - 12:53 PM | Ryan Shrout
Tagged: giveaway, contest
It seems like it has been forever since we had a contest on the site...let's remedy that with our friends at FSP!
Anyone around the globe is able to enter - good luck!
Subject: Motherboards | June 28, 2017 - 01:44 AM | Tim Verry
Tagged: gigabyte, mini ITX, b350, amd, AM4, raven ridge, SFF, ryzen
Gigabyte is joining the small form factor Ryzen motherboard market with its new GA-AB350N-Gaming WIFI. The new Mini ITX motherboard sports AMD’s AM4 socket and B350 chipset and supports Ryzen “Summit Ridge” CPUs, Bristol Ridge APUs (7th Gen/Excavator), and future Zen-based Raven Ridge APUs. The board packs a fair bit of hardware into the Mini ITX form factor and is aimed squarely at gamers and enthusiasts.
The AB350N-Gaming WIFI has an interesting design in that some of the headers and connectors are flipped versus where they are traditionally located. The chipset sits to the left of the CPU socket above the 6-phase VRMs and PowIRStage digital ICs. Four SATA 6Gbps ports and a USB 3.0 header occupy the top edge of the board. Two dual channel DDR4 memory slots are aligned on the right edge and support (overclocked) frequencies up to 3200 MHz depending on the processor used. The Intel wireless NIC, Realtek Gigabit Ethernet, and Realtek ALC1220 audio chips have been placed in the space between the AM4 socket and the single PCI-E 3.0 x16 slot. There is also a single M.2 (PCI-E 3.0 x4 32Gbps) slot on the underside of the motherboard.
Gigabyte has also integrated “RGB Fusion” technology with two on-board RGB LED lighting zones and two RGBW headers for off-board lighting strips, as well as high-end audio capacitors and a headphone amplifier. Smart Fan 5 technology is allegedly capable of automatically differentiating between fans and water pumps connected to the two fan headers, and will automatically provide the correct PWM signal based on fan curves the user can customize in the UEFI BIOS. The motherboard is powered by a 24-pin ATX and an 8-pin EPS connector, and while it does not have a very beefy power phase setup, it should be plenty for most overclocks (especially since Ryzen does not easily go much past 4 GHz anyway).
Rear I/O includes:
- 1 x PS/2
- 2 x Antenna (Intel 802.11ac Wi-Fi + BT 4.2)
- 2 x USB 2.0
- 2 x USB 3.1 Gen 2 (10Gbps)
- 4 x USB 3.1 Gen 1 (5Gbps)
- 6 x Audio (5 x analog, 1 x S/PDIF)
- 1 x DisplayPort 1.2
- 1 x HDMI 1.4
- 1 x Realtek GbE
Gigabyte has an interesting SFF motherboard with the GA-AB350N-Gaming WIFI and I am interested in seeing the reviews. More Mini ITX options for Ryzen and other Zen-based systems are a good thing, and moving the power phases to the left may end up helping overclocking and cooling in smaller cases with tower coolers.
Unfortunately, Gigabyte has not yet revealed pricing or availability. Looking around online at its competition, I would guess it will come in around $85 though.
- Biostar's ITX Ryzen motherboard in action; the X370GTN
- BIOSTAR Shows Mini-ITX AM4 Motherboard for AMD Ryzen
- ASRock's Fatal1ty X370 Gaming-ITX/ac Mini ITX Motherboard for Ryzen Coming Soon
- The AMD Ryzen 7 1800X Review: Now and Zen
- The Ryzen 5 Review: 1600X and 1500X Take on Core i5
Subject: Mobile | June 27, 2017 - 08:00 PM | Ryan Shrout
Tagged: xr, VR, qualcomm, google, daydream, AR
Qualcomm has put in steady work on creating a vibrant hardware ecosystem for mobile VR to facilitate broad adoption of wireless, dedicated head mounted displays. Though the value of Samsung’s Gear VR and Google’s Daydream View cannot be overstated in moving the perception of consumer VR forward, the need to slot your smartphone into the headset has its limitations. It consumes battery that you may require for other purposes, it limits the kinds of sensors that the VR system can utilize, and it creates a sub-optimal form factor in order to allow for simple user installation.
The Qualcomm Snapdragon 835 VR Reference Device
Qualcomm created the first standalone VR HMD reference design back in early 2016, powered by the Snapdragon 820 processor. Google partnered with Qualcomm at I/O to create the Daydream standalone VR headset reference design with the updated Snapdragon 835 Mobile Platform at its core, improving performance and graphical capability along the way. OEMs like Lenovo and HTC have already committed to Daydream standalone units, with Qualcomm at the heart of the hardware.
Qualcomm Technologies recently announced a HMD Accelerator Program (HAP) to help VR device manufacturers quickly develop premium standalone VR HMDs. At the core of this program is the standalone VR HMD reference design. It goes beyond a simple prototype device, offering a detailed reference design that allows manufacturers to apply their own customizations while utilizing our engineering, design, and experience in VR. The reference design is engineered to minimize software changes, hardware issues, and key component validation.
- Hugo Swart, Senior Director, Product Management, Qualcomm Atheros, Inc.
As part of this venture, and to continue pushing the VR industry forward to more advanced capabilities like XR (extended reality, a merger of VR and AR), Qualcomm is announcing agreements with key component vendors aiming to tighten and strengthen the VR headset ecosystem.
Ximmerse has built a high-precision and drift-free controller for VR applications that offers low latency input and 3DoF (3 degrees of freedom) capability. This can “provide just about any interaction, such as pointing, selecting, grabbing, shooting, and much more. For precise 6 DoF positional tracking of your head, tight integration is required between the sensor fusion processing (Snapdragon) and the data from both the camera and inertial sensors.”
Bosch Sensortec has the BMX055 absolute orientation sensor that performs the function that its name would imply: precisely locating the user in the real world and tracking movement via accelerometer, gyroscope, and magnetometer.
Finally, OmniVision integrates the OV9282 which is a 1MP high speed shutter image sensor for feature tracking.
These technologies, paired with the work Qualcomm has already done for the Snapdragon 835 VR Development Kit, including on the software side, are an important step in the growth of this segment of the market. I don’t know of anyone who doesn’t believe standalone, wireless headsets are the eventual future of VR and AR, and the momentum created by Qualcomm, Google, and others continues at a steady pace.