GDC 2018: Qualcomm Talks Future of VR and AR with Upcoming Dev Kit

Subject: General Tech | March 21, 2018 - 09:20 AM |
Tagged: xr, VR, Tobii, qualcomm, HMD, GDC 2018, GDC, eye-tracking, developers, dev kit, AR

We have recently covered Qualcomm's ongoing VR/AR efforts (the two terms now combine as "XR", for eXtended reality), first with the Snapdragon 845-powered reference HMD and more recently with the collaboration with Tobii to bring eye-tracking to the Qualcomm development platform. Today at GDC, Qualcomm is mapping out its vision for the future of XR, providing additional details about the Snapdragon 845 dev kit, and announcing support for the HTC Vive Wave SDK.


From Qualcomm:

For the first time, many new technologies that are crucial for an optimal and immersive VR user experience will be supported in the Snapdragon 845 Virtual Reality Development Kit. These include:


  • Room-scale 6DoF SLAM: The Snapdragon 845 Virtual Reality Development Kit is engineered to help VR developers create applications that allow users to explore virtual worlds, moving freely around in a room, rather than being constrained to a single viewing position. Un-tethered mobile VR experiences like these can benefit from the Snapdragon 845 Virtual Reality Development Kit’s pre-optimized hardware and software for room-scale six degrees of freedom (6DoF) with “inside-out” simultaneous localization and mapping (SLAM). All of this is designed to be accomplished without any external setup in the room by the users, and without any cables or wires.
  • Qualcomm® Adreno™ Foveation: Our eyes are only able to observe significant details in a very small center of our field of vision - this region is called the “fovea”. Foveated rendering utilizes this understanding to boost performance & save power, while also improving visual quality. This is accomplished through multiple technology advancements for multi-view, tile-based foveation with eye-tracking and fine grain preemption to help VR application developers deliver truly immersive visuals with optimal power efficiency.


  • Eye Tracking: Users naturally convey intentions about how and where they want to interact within virtual worlds through their eyes. Qualcomm Technologies worked with Tobii AB to develop an integrated and optimized eye tracking solution for the Snapdragon 845 VR Development Kit. The cutting-edge eye tracking solution on Snapdragon 845 VR Development Kit is designed to help developers utilize Tobii’s EyeCore™ eye tracking algorithms to create content that utilizes gaze direction for fast interactions, and superior intuitive interfaces.
  • Boundary System: The new SDK for the Snapdragon 845 VR Development Kit supports a boundary system that is engineered to help VR application developers accurately visualize real-world spatial constraints within virtual worlds, so that their applications can effectively manage notifications and play sequences for VR games or videos, as the user approaches the boundaries of the real-world play space.
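
To put the Adreno Foveation item above in more concrete terms, here is a minimal, illustrative C++ sketch of the core idea (our own example, not Qualcomm code): the renderer spends full shading effort only on tiles near the tracked gaze point and progressively less toward the periphery. The function name and thresholds are hypothetical.

    #include <cstdio>

    // Hypothetical helper (our naming, not Qualcomm's): pick how many shading
    // samples a screen tile receives based on how far its center sits from the
    // tracked gaze point, measured in degrees of visual angle.
    int shadingSamplesForTile(float eccentricityDegrees)
    {
        if (eccentricityDegrees < 5.0f)  return 4;  // foveal region: full quality
        if (eccentricityDegrees < 15.0f) return 2;  // near periphery: half rate
        return 1;                                   // far periphery: quarter rate
    }

    int main()
    {
        // Walk a strip of tiles away from the gaze point and print the shading
        // rate each one would receive.
        for (float ecc = 0.0f; ecc <= 40.0f; ecc += 5.0f)
            std::printf("%4.0f deg from gaze -> %d samples per tile\n",
                        ecc, shadingSamplesForTile(ecc));
        return 0;
    }

Eye tracking is what makes this pay off: without it, the "fovea" has to be assumed to sit at screen center, forcing the full-quality region to be much larger.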


In addition to enhancing commercial reach for the VR developer community, Qualcomm Technologies is excited to announce support for the HTC Vive Wave™ VR SDK on the Snapdragon 845 Virtual Reality Development Kit, anticipated to be available later this year. The Vive Wave™ VR SDK is a comprehensive tool set of APIs that is designed to help developers create high-performance, Snapdragon-optimized content across diverse hardware vendors at scale, and offer a path to monetizing applications on future HTC Vive ready products via the multi-OEM Viveport™ application store.

The Snapdragon 845 HMD/dev kit and SDK are expected to be available in Q2 2018.

Source: Qualcomm

AMD finalizing fixes for Ryzen, EPYC security vulnerabilities

Subject: Processors | March 20, 2018 - 04:33 PM |
Tagged: ryzenfall, masterkey, fallout, cts labs, chimera, amd

AMD’s CTO Mark Papermaster released a blog today that both acknowledges the security vulnerabilities first disclosed in a CTS Labs report last week and lays the foundation for the mitigations to be released. Though the company had already acknowledged the report, and at least one other independent security company validated the claims, we had yet to hear from AMD officially on the potential impact and what fixes might be possible for these concerns.

In the write-up, Papermaster is clear to call out the short period of time AMD was given with this information, quoting “less than 24 hours” from the time it was notified to the time the story was public on news outlets and blogs across the world. For those who may not follow the security landscape closely, it is important to note that this has no relation to the Spectre and Meltdown issues affecting the industry, and that what CTS did find has nothing to do with the Zen architecture itself. Instead, the problem revolves around the embedded secure processor; while that is an important distinction moving forward, from a practical view for customers it is one and the same.


AMD states that it has “rapidly completed its assessment and is in the process of developing and staging the deployment of mitigations.” Rapidly is an understatement – going from blindsided to an organized response is a delicate process and AMD has proven its level of sincerity with the priority it placed on this.

Papermaster goes on to mention that all these exploits require administrative access to the computer being infected, a key differentiator from the Spectre/Meltdown vulnerabilities. The post points out that “any attacker gaining unauthorized administrative access would have a wide range of attacks at their disposal well beyond the exploits identified in this research.” I think AMD does an excellent job threading the needle in this post balancing the seriousness of these vulnerabilities with the overzealous hype that was created upon their initial release and the accompanying financial bullshit that followed.

AMD provides an easy-to-understand table with a breakdown of the vulnerabilities, the potential impact of the security risk, and what the company sees as its mitigation capability. Both sets that affect the secure processor in the Ryzen and EPYC designs are addressable with a firmware update for the secure unit itself, distributed through a standard BIOS update. For the Promontory chipset issue, AMD is utilizing a combination of a BIOS update and continued work with ASMedia to enhance the security fixes.



That is the end of the update from AMD at this point. In my view, the company is doing a satisfactory job addressing the problems on what must be an insanely accelerated timetable. I do wish AMD were willing to offer more specific timelines for the distribution of those security patches, and how long we should expect to wait to see them in the form of BIOS updates for consumer and enterprise customers. For now, we’ll monitor the situation and look for other input from AMD, CTS, or secondary security firms to see if the risks laid out ever materialize.

For what could have been a disastrous week for AMD, it has pivoted to provide a controlled, well-executed plan. Despite the hype and hysteria that might have started with stock-shorting and buzzwords, the outlook for the AMD processor family looks stable.

Source: AMD

NVIDIA RTX Technology Accelerates Ray Tracing for Microsoft DirectX Raytracing API

Subject: Graphics Cards | March 19, 2018 - 01:00 PM |
Tagged: rtx, nvidia, dxr

The big news from the Game Developers Conference this week was Microsoft’s reveal of its work on a new ray tracing API for DirectX called DirectX Raytracing. As the name would imply, this is a new initiative to bring the image quality improvements of ray tracing to consumer hardware with the push of Microsoft’s DX team. Scott already has a great write up on that news and current and future implications of what it will mean for PC gamers, so I highly encourage you all to read that over before diving more into this NVIDIA-specific news.

Ray tracing has long been the holy grail of real-time rendering. It is the gap between movies and games – though ray tracing continues to improve in performance, it still takes the power of offline server farms to render the images for your favorite flicks. Modern game engines continue to use rasterization, an efficient method for rendering graphics but one that depends on tricks and illusions to recreate the intended image. Ray tracing inherently solves the problems that rasterization works around, including shadows, transparency, refraction, and reflection, but it does so at a prohibitive performance cost. That will be changing with Microsoft’s enablement of ray tracing through a common API and technology like what NVIDIA has built to accelerate it.
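
To make that contrast concrete, here is a tiny, unoptimized shadow-ray test (our own illustrative C++, not code from NVIDIA or Microsoft): instead of rendering a shadow map and sampling it, a ray tracer simply asks whether anything sits between a surface point and the light.

    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };
    static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    struct Sphere { Vec3 center; float radius; };

    // Does the ray origin + t*dir hit the sphere for some 0 < t < maxT?
    // Standard quadratic intersection test.
    static bool hitsSphere(Vec3 origin, Vec3 dir, const Sphere& s, float maxT)
    {
        Vec3 oc = sub(origin, s.center);
        float a = dot(dir, dir);
        float b = 2.0f * dot(oc, dir);
        float c = dot(oc, oc) - s.radius * s.radius;
        float disc = b * b - 4.0f * a * c;
        if (disc < 0.0f) return false;
        float t = (-b - std::sqrt(disc)) / (2.0f * a);
        return t > 0.0f && t < maxT;
    }

    int main()
    {
        Sphere occluder = {{0.0f, 1.0f, 0.0f}, 0.5f};  // something between point and light
        Vec3 surfacePoint = {0.0f, 0.0f, 0.0f};
        Vec3 light = {0.0f, 2.0f, 0.0f};

        // Shadow ray: from the shaded point toward the light. Any hit closer than
        // the light means the point is in shadow; no shadow maps, no bias tricks.
        Vec3 toLight = sub(light, surfacePoint);
        float distToLight = std::sqrt(dot(toLight, toLight));
        Vec3 dir = {toLight.x / distToLight, toLight.y / distToLight, toLight.z / distToLight};

        bool shadowed = hitsSphere(surfacePoint, dir, occluder, distToLight);
        std::printf("Surface point is %s\n", shadowed ? "in shadow" : "lit");
        return 0;
    }

The catch, of course, is that a real scene fires millions of such rays per frame against millions of triangles, which is exactly the cost problem DXR and RTX are trying to attack.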


Alongside support and verbal commitment to DXR, NVIDIA is announcing RTX Technology. This is a combination of hardware and software advances to improve the performance of ray tracing algorithms on its hardware, and it works hand in hand with DXR. NVIDIA believes this is the culmination of 10 years of development on ray tracing, much of which we have talked about on this site from the world of professional graphics systems. Think Iray, OptiX, and more.

RTX will run on Volta GPUs only today, which does limit its usefulness to gamers. Since the only graphics card on the market that comes even close to being a gaming product is the $3000 TITAN V, RTX is more of a forward-looking technology announcement for the company. We can safely assume, then, that RTX technology will be integrated into future consumer gaming graphics cards, be that a revision of Volta or something completely different. (NVIDIA refused to acknowledge plans for any pending Volta consumer GPUs during our meeting.)

The idea I get from NVIDIA is that today’s RTX is meant as a developer enablement platform: getting developers used to the idea of adding ray tracing effects to their games and engines, and convincing them that NVIDIA provides the best hardware to get that done.

I’ll be honest with you – NVIDIA was light on the details of what RTX exactly IS and how it accelerates ray tracing. One very interesting example I was given was first seen with the AI-powered ray tracing optimizations for OptiX from last year’s GDC. There, NVIDIA demonstrated that using the Volta Tensor cores it could run an AI-powered de-noiser on the ray-traced image, effectively improving the quality of the resulting image and emulating much higher ray counts than are actually processed.
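
Why lean on a de-noiser instead of simply tracing more rays? Monte Carlo noise falls off only with the square root of the sample count, so buying cleaner images with brute force gets expensive fast. The toy C++ sketch below (our own illustration, using a made-up per-ray "radiance" distribution) measures that falloff directly.

    #include <cmath>
    #include <cstdio>
    #include <random>

    int main()
    {
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> light(0.0, 1.0); // fake per-ray radiance

        for (int n : {8, 64, 512}) {
            // Average many independent n-ray pixel estimates and measure their spread.
            const int trials = 10000;
            double sum = 0.0, sumSq = 0.0;
            for (int t = 0; t < trials; ++t) {
                double estimate = 0.0;
                for (int i = 0; i < n; ++i) estimate += light(rng);
                estimate /= n;
                sum += estimate;
                sumSq += estimate * estimate;
            }
            double mean = sum / trials;
            double stddev = std::sqrt(sumSq / trials - mean * mean);
            std::printf("%4d rays/pixel -> noise (stddev) %.4f\n", n, stddev);
        }
        // Each 8x jump in rays only cuts the noise by about 1/sqrt(8), which is
        // why inferring the clean image from a few rays is so attractive.
        return 0;
    }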

By using the Tensor cores with RTX for the DXR implementation on the TITAN V, NVIDIA will be able to offer image quality and performance for ray tracing well ahead of even the TITAN Xp or GTX 1080 Ti, as those GPUs do not have Tensor cores on-board. Does this mean that all (or flagship) consumer graphics cards from NVIDIA will include Tensor cores to enable RTX performance? Obviously, NVIDIA wouldn’t confirm that, but to me it makes sense that we will see that in future generations. The scale of Tensor core integration might change based on price points, but if NVIDIA and Microsoft truly believe in the future of ray tracing to augment and, in large part, replace rasterization methods, then it will be necessary.

Though that is one example of hardware-specific features being used for RTX on NVIDIA hardware, it is not the only one present on Volta. But NVIDIA wouldn’t share more.

The relationship between Microsoft DirectX Raytracing and NVIDIA RTX is a bit confusing, but it’s easier to think of RTX as the underlying brand for the ability to ray trace on NVIDIA GPUs. The DXR API is still the interface between the game developer and the hardware, but RTX is what gives NVIDIA the advantage over AMD and its Radeon graphics cards, at least according to NVIDIA.

DXR will still run on other GPUs from NVIDIA that aren’t utilizing the Volta architecture. Microsoft says that any board that can support DX12 Compute will be able to run the new API. But NVIDIA did point out that, in its mind, even with a high-end SKU like the GTX 1080 Ti, ray tracing performance will limit the ability to integrate ray tracing features and enhancements in real-time game engines in the immediate timeframe. That’s not to say it is impossible, and some engine devs might spend the time to build something unique, but it is interesting to hear NVIDIA imply that only future products will benefit from ray tracing in games.
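
For developers wondering whether a given system exposes DXR at all, the DirectX 12 pattern is a CheckFeatureSupport query. The minimal C++ sketch below is our own illustration and uses the enum names from the DXR support that later shipped in the Windows SDK, not the experimental GDC-era preview; error handling is trimmed and d3d12.lib must be linked.

    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    // Ask an existing D3D12 device whether the driver reports a raytracing tier.
    bool DeviceSupportsRaytracing(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &options5, sizeof(options5))))
            return false;
        return options5.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
    }

    int main()
    {
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device)))) {
            std::printf("No D3D12 device available\n");
            return 1;
        }
        std::printf("Raytracing tier exposed: %s\n",
                    DeviceSupportsRaytracing(device.Get()) ? "yes" : "no");
        return 0;
    }

Note that a driver reporting the tier is not the same thing as the hardware being fast enough for real-time effects, which is precisely NVIDIA's point about pre-Volta GPUs.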

It’s also likely that we are months if not a year or more from seeing good integration of DXR in games at retail. And it is also possible that NVIDIA is downplaying the importance of DXR performance today if it happens to be slower than the Vega 64 in the upcoming Futuremark benchmark release.


Alongside the RTX announcement comes GameWorks Ray Tracing, a collection of turnkey modules based on DXR. GameWorks has its own reputation, and we aren't going to get into that here, but NVIDIA wants developers to think of this addition as a way to "turbo charge enablement" of ray tracing effects in games.

NVIDIA believes that developers are incredibly excited for the implementation of ray tracing into game engines, and that the demos being shown at GDC this week will blow us away. I am looking forward to seeing them and to getting the reactions of major game devs on the release of Microsoft’s new DXR API. The performance impact of ray tracing will still be a hindrance to larger-scale implementations, but with DXR driving the direction with a unified standard, I still expect to see some games with revolutionary image quality by the end of the year.

Source: NVIDIA

HTC announces VIVE Pro Pricing, Available now for Preorder

Subject: General Tech, Graphics Cards | March 19, 2018 - 12:09 PM |
Tagged: vive pro, steamvr, rift, Oculus, Lighthouse, htc

Today, HTC has provided what VR enthusiasts have been eagerly waiting for since the announcement of the upgraded VIVE Pro headset earlier in the year at CES: the pricing and availability of the new device.


Available for preorder today, the VIVE Pro will cost $799 for the headset-only upgrade. As we mentioned during the VIVE Pro announcement, this first upgrade kit is meant for existing VIVE users who will be reusing their original controllers and lighthouse trackers to get everything up and running.

The HMD-only kit, with its upgraded resolution and optics, is set to start shipping very soon on April 3 and can be preordered now on the HTC website.

Additionally, VIVE Pro purchases made through June 3rd, 2018 will come with a free six-month subscription to HTC's VIVEPORT game service, which grants access to up to 5 titles per month (chosen from the VIVEPORT catalog of 400+ games).

There is still no word on the pricing and availability of the full VIVE Pro kit including the updated Lighthouse 2.0 trackers, but it seems likely that it will come later in the summer, after the upgrade kit has saturated the market of current VIVE owners.

As far as system requirements go, the HTC site doesn't list any difference between the standard VIVE and the VIVE Pro. One change, however, is the lack of an HDMI port on the new VIVE Pro link box, so you'll need a graphics card with an open DisplayPort 1.2 connector. 

Source: HTC

PNY Adds CS900 960GB SATA SSD To Budget SSD Series

Subject: General Tech, Storage | March 18, 2018 - 12:20 AM |
Tagged: ssd, sata 3, pny, 3d nand

PNY has added a new solid-state drive to its CS900 lineup, doubling the capacity to 960GB. The SATA-based SSD is a 2.5" 7mm affair suitable for use in laptops and SFF systems as well as a budget option for desktops.


The CS900 960GB SSD uses 3D TLC NAND flash and offers ECC, end-to-end data protection, secure erase, and power saving features to protect data and battery life in mobile devices. Unfortunately, information on the controller and NAND flash manufacturer is not readily available though I suspect it uses a Phison controller like PNY's other drives.

The 960GB capacity model is rated for sequential reads of 535 MB/s and sequential writes of 515 MB/s. PNY rates the drive at 2 million hours MTBF and they cover it with a 3-year warranty.

We may have to wait for reviews (we know how Allyn loves to tear apart drives!) for more information on this drive especially where random read/write and latency percentile performance are concerned. The good news is that if the performance is there the budget price seems right with an MSRP of $249.99 and an Amazon sale price of $229.99 (just under 24 cents/GB) at time of writing. Not bad for nearly a terabyte of solid state storage (though if you don't need that much space you can alternatively find PCI-E based M.2 SSDs in this price range).

Source: PNY

Tobii and Qualcomm Announce Collaboration on Mobile VR Headsets with Eye-Tracking

Subject: General Tech | March 16, 2018 - 09:45 AM |
Tagged: xr, VR, Tobii, snapdragon 845, qualcomm, mobile, HMD, head mounted display, eye tracking, AR, Adreno 630

Tobii and Qualcomm's collaboration in the VR HMD (head-mounted display) space is a convergence of two recent stories, with Tobii's impressive showing of a prototype HMD featuring their eye-tracking technology at CES, and Qualcomm's unveiling last month of their updated mobile VR platform, featuring the new Snapdragon 845.


The Qualcomm Snapdragon 845 Mobile VR Reference Platform

What does this new collaboration mean for the VR industry? For now it means a new reference design and dev kit with the latest tech from Tobii and Qualcomm:

"As a result of their collaboration, Tobii and Qualcomm are creating a full reference design and development kit for the Qualcomm Snapdragon 845 Mobile VR Platform, which includes Tobii's EyeCore eye tracking algorithms and hardware design. Tobii will license its eye tracking technologies and system and collaborate with HMD manufacturers on the optical solution for the reference design."

The press release announcing this collaboration recaps the benefits of Tobii eye tracking in a mobile VR/AR device, which include:

  • Foveated Rendering: VR/AR devices become aware of where you are looking and can direct high-definition graphics processing power to that exact spot in real time. This enables higher definition displays, more efficient devices, longer battery life and increased mobility.
  • Interpupillary Distance: Devices automatically orient images to align with your pupils. This enables devices to adapt to the individual user, helping to increase the visual quality of virtual and augmented reality experiences.
  • Hand-Eye Coordination: By using your eyes in harmony with your hands and associated controllers, truly natural interaction and immersion, not possible without the use of gaze, is realized.
  • Interactive Eye Contact: Devices can accurately track your gaze in real time, enabling content creators to express one of the most fundamental dimensions of human interaction – eye contact. VR technologies hold the promise of enabling a new and immersive medium for social interaction. The addition of true eye contact to virtual reality helps deliver that promise.


Tobii's prototype eye-tracking HMD

For its part, Qualcomm's Snapdragon 845-powered mobile VR platform promises a better and more portable VR experience, with expanded freedom of movement on top of the improved graphics horsepower from the new Adreno 630 GPU in the Snapdragon 845. This portability includes 6DoF (six degrees of freedom) tracking using external cameras to identify location within a room, eliminating the need for external room sensors.

"Together, 6DoF and SLAM deliver Roomscale - the ability to track the body and location within a room so you can freely walk around your XR environment without cables or separate room sensors – the first on a mobile standalone device. Much of this is processed on the new dedicated Qualcomm Hexagon Digital Signal Processor (DSP) and Adreno Graphics Processing Unit within the Snapdragon 845. Qualcomm Technologies’ reference designs have supported some of the first wave of standalone VR devices from VR ecosystem leaders like Google Daydream, Oculus and Vive."

It will be up to developers, and to consumer interest in VR moving forward, to determine what this collaboration produces. To editorialize briefly, from first-hand experience I can vouch for the positive impact of eye-tracking with an HMD, and if future products live up to the promise of a portable, high-performance VR experience (with a more natural feel from less rapid head movement), a new generation of VR enthusiasts could be born.

Source: PR Newswire

PCPer Mailbag #35 - 3/16/2018

Subject: Editorial | March 16, 2018 - 09:00 AM |
Tagged: video, Ryan Shrout, pcper mailbag

It's time for the PCPer Mailbag, our weekly show where Ryan and the team answer your questions about the tech industry, the latest and greatest GPUs, the process of running a tech review website, and more!

On today's show:

00:50 - Rumble gaming chairs?
02:18 - SSD for VMs?
04:08 - Playing old games in a virtual machine?
05:54 - Socketed GPUs?
09:08 - PC middlemen?
11:02 - NVIDIA tech demos?
13:10 - Raven Ridge motherboard features?
14:45 - x8 and x16 PCIe SSDs?
17:50 - Ryzen 2800X and Radeon RX 600?
19:53 - GeForce Partner Program?
21:14 - Goo goo g'joob?

Want to have your question answered on a future Mailbag? Leave a comment on this post or in the YouTube comments for the latest video. Check out new Mailbag videos each Friday!

Be sure to subscribe to our YouTube Channel to make sure you never miss our weekly reviews and podcasts, and please consider supporting PC Perspective via Patreon to help us keep videos like our weekly mailbag coming!

Source: YouTube

Intel promises 2018 processors with hardware mitigation for Spectre and Meltdown

Subject: Processors | March 15, 2018 - 10:29 AM |
Tagged: spectre, meltdown, Intel, cascade lake, cannon lake

In a continuing follow-up to the spectacle that surrounded the Meltdown and Spectre security vulnerabilities disclosed in January, Intel announced that it has provided patches and updates that address 100% of the products it has launched in the last 5 years. The company also revealed its plan for updated chip designs that will address both the security and performance concerns surrounding the vulnerabilities.

Intel hopes that by releasing new chips to address the security and performance questions quickly it will cement its position as the leader in the enterprise compute space. Customers like Amazon, Microsoft, and Google that run the world’s largest data centers are looking for improved products to make up for the performance loss and assurances moving forward that a similar situation won’t impact their bottom line.


For current products, patches provide mitigations for the security flaws in the form of operating system updates (for Windows, Linux) and what are called microcode updates – small-scale firmware changes that adjust how a processor handles instructions. Distributed by Intel OEMs (system vendors and component providers) as well as Microsoft, the patches have seemingly negated the risks to consumer and enterprise customer data, but with a questionable impact on performance.

The mitigations cause the processors to operate differently than originally designed and will cause performance slowdowns on some workloads. These performance degradations are the source of the handful of class-action lawsuits hanging over Intel’s head and are a potential sore spot for its relationship with partners. Details on the performance gaps from the security mitigations have been sparse from Intel, with only small updates posted on corporate blogs. And because the problem has been so widespread, covering the entire Intel product line of the last 10 years, researchers are struggling to keep up.

The new chips that Intel is promising will address both security and performance considerations in silicon rather than software, and will be available in 2018. For the data center this is the Cascade Lake server processor, and for the consumer and business markets this is known as Cannon Lake. Both will include what Intel is calling “virtual fences” between user and operating system privilege levels and will create a significant additional obstacle for potential vulnerabilities.

The chips will also lay the groundwork for future security improvements, providing a method to more easily update the security of the processors through patching.

By moving the security mitigations from software (both operating system and firmware) into silicon, Intel is reducing the performance impact that Spectre and Meltdown cause on select computing tasks. Assurances that future generations of parts won’t suffer a performance hit are good news for Intel and its customer base, but I don’t think currently afflicted customers will be satisfied with the assertion that they need to buy updated Intel chips to avoid the performance penalty. It will be interesting to see how, if at all, the legal disputes are affected.


The speed at which Intel is releasing updated chips to the market is an impressive engineering feat, and indicates a top-level directive to get this fixed as quickly as possible. In the span of just 12 months (from Intel’s apparent notification of the security vulnerability to the expected release of this new hardware) the company will have integrated fairly significant architectural changes. While this may have been a costly move for the company, it is a drop in the bucket compared to the potential risks of lowered consumer trust or partner migration to competitive AMD processors.

For its part, AMD has had its own security issues pop up this week from a research firm called CTS Labs. While there are extenuating circumstances that cloud the release of the information, AMD does now have a template for how to quickly and effectively address a hardware-level security problem, if it exists.

The full content of Intel's posted story on the subject is included below:

Hardware-based Protection Coming to Data Center and PC Products Later this Year

By Brian Krzanich

In addressing the vulnerabilities reported by Google Project Zero earlier this year, Intel and the technology industry have faced a significant challenge. Thousands of people across the industry have worked tirelessly to make sure we delivered on our collective priority: protecting customers and their data. I am humbled and thankful for the commitment and effort shown by so many people around the globe. And, I am reassured that when the need is great, companies – and even competitors – will work together to address that need.

But there is still work to do. The security landscape is constantly evolving and we know that there will always be new threats. This was the impetus for the Security-First Pledge I penned in January. Intel has a long history of focusing on security, and now, more than ever, we are committed to the principles I outlined in that pledge: customer-first urgency, transparent and timely communications, and ongoing security assurance.

Today, I want to provide several updates that show continued progress to fulfill that pledge. First, we have now released microcode updates for 100 percent of Intel products launched in the past five years that require protection against the side-channel method vulnerabilities discovered by Google. As part of this, I want to recognize and express my appreciation to all of the industry partners who worked closely with us to develop and test these updates, and make sure they were ready for production.

With these updates now available, I encourage everyone to make sure they are always keeping their systems up-to-date. It’s one of the easiest ways to stay protected. I also want to take the opportunity to share more details of what we are doing at the hardware level to protect against these vulnerabilities in the future. This was something I committed to during our most recent earnings call.

While Variant 1 will continue to be addressed via software mitigations, we are making changes to our hardware design to further address the other two. We have redesigned parts of the processor to introduce new levels of protection through partitioning that will protect against both Variants 2 and 3. Think of this partitioning as additional “protective walls” between applications and user privilege levels to create an obstacle for bad actors.

These changes will begin with our next-generation Intel® Xeon® Scalable processors (code-named Cascade Lake) as well as 8th Generation Intel® Core™ processors expected to ship in the second half of 2018. As we bring these new products to market, ensuring that they deliver the performance improvements people expect from us is critical. Our goal is to offer not only the best performance, but also the best secure performance.

But again, our work is not done. This is not a singular event; it is a long-term commitment. One that we take very seriously. Customer-first urgency, transparent and timely communications, and ongoing security assurance. This is our pledge and it’s what you can count on from me, and from all of Intel.

Source: Intel

Podcast #491 - Intel Optane 800P, UltraWide Monitors, and more!

Subject: General Tech | March 15, 2018 - 09:07 AM |
Tagged: video, ultrawide, podcast, Optane, Intel, Huawei, GeForce Partner Program, FreeSync2, cts labs, caldigit

PC Perspective Podcast #491 - 03/14/18

Join us this week for discussion on Intel Optane 800P, UltraWide Monitors, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: - Share with your friends!

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, Allyn Malventano

Peanut Gallery: Alex Lustenberg, Jim Tanous

Program length: 1:34:30

Podcast topics of discussion:
  1. Week in Review:
  2. News items of interest:
  3. Picks of the Week:
    1. 1:25:45 Ryan: Basta ZMover
  4. Closing/outro

Logitech Announces G560 Speakers and G513 Keyboard with LIGHTSYNC

Subject: General Tech | March 15, 2018 - 08:00 AM |
Tagged: speakers, romer-g, mechanical, logitech, keyboard, key switches

Logitech has announced a pair of devices featuring the company’s “LIGHTSYNC” technology,  with the new G560 gaming speaker and G513 gaming keyboard. What is LIGHTSYNC? According to Logitech, the technology synchronizes audio and RGB lighting with gameplay, to create a more immersive experience. We have seen a similar concept with the Philips Ambilight TVs, which project lighting onto a rear wall, synchronized with on-screen content to enhance immersion.


How is this new lighting tech implemented with gaming peripherals, exactly? We begin with the G560, which offers plenty of power with a 120W (240W peak) amplifier, and 3D audio effects courtesy of DTS:X Ultra 1.0. As to the LIGHTSYNC implementation, Logitech explains:

“Powered by advanced LIGHTSYNC technology, the Logitech G560 synchronizes brilliant RGB lighting and powerful audio in real time to match on-screen gameplay action. Light and animation effects can be customized across approximately 16.8 million colors, with four lighting zones.”


The G560 offers USB and 3.5mm connectivity, as well as Bluetooth, and Logitech's "Easy-Switch" technology allows for switching between any four connected devices. The included LIGHTSYNC technology works with both video and music.


Next we have the G513 Mechanical Gaming Keyboard, which combines Romer-G switches (your choice of either Romer-G Tactile or Linear), and support for LIGHTSYNC. The Logitech G513 offers a brushed aluminum top case and USB passthrough port, and includes an optional palm rest.

As to pricing, the Logitech G560 PC Gaming Speaker will carry an MSRP of $199.99, with a $149.99 MSRP for the G513 Mechanical Gaming Keyboard. Availability is set for April 2018.

Source: Logitech

3000MHz of RGB LEDs, ADATA's XPG Spectrix D40 DDR4

Subject: Memory | March 12, 2018 - 03:05 PM |
Tagged: adata, xpg spectrix d40, DDR4-3000, RGB

ADATA's new DDR4-3000 DIMMs have ASUS Aura Sync-compatible RGB LEDs, or you can download ADATA's own software to power your lightshow if you aren't running an ASUS board.  The DIMMs each have 5 LEDs which you can program to display a single colour, cycle colours, or set a gradient, or you can opt for breathing or music modes if you prefer.  We won't bore you with unimportant details such as the default timings of 16-18-18 or that Modders Inc hit 3733 MHz at 18-20-20 timings with a voltage of 1.38 V, as that has nothing to do with shiny lights.


"The XPG line of memory modules from ADATA is considered to be its enthusiast line. The XPG SPECTRIX D40 is the first DDR-4 RAM that features RGB LED. The memory starts off with a base speed of 2,666MHz and is offered in speeds up to 4000Mhz. The kit featured in this review is the DDR-4 3,000MHz version."

Here are some more Memory articles from around the web:



Source: Modders Inc

Need a new display, right now?

Subject: Displays | March 12, 2018 - 02:03 PM |
Tagged: 1080p, 1440p, 4k, 21:9, g-sync, freesync

Today is perhaps not the best day to buy a new monitor: FreeSync 2 should be arriving soon, as should high refresh rate UHD models, and, well, the HDR standard is a wee bit more dynamic than we want right now.  There are some out there who will feel the need to upgrade or to replace a veteran panel which has hit retirement age, so check out TechSpot's current recommendations.  They have split their displays into four categories, 1080p, 1440p, 4K, and ultrawide aka 21:9, plus a budget category.  For the most part, they chose G-SYNC as NVIDIA holds the largest market share, but they did include a few FreeSync alternatives.

Check out their recommendations to see if anything might fit your immediate needs.


"With the gaming monitor market expanding to all sorts of display types and technologies, it's time we had a dedicated Best Of feature dedicated to them. Today we'll provide you with 5-10 key monitor recommendations across a variety of popular categories."

Here are some more Display articles from around the web:


Source: TechSpot

Design a thing, win a prize; change the world?

Subject: General Tech | March 12, 2018 - 01:29 PM |
Tagged: hack, DIY, nifty, hackaday prize

Last year's grand prize winner of the Hackaday Prize picked up $50,000 for creating the Open Source Underwater Glider, an autonomous underwater vehicle which uses a buoyancy engine instead of screws to travel underwater.  That makes it silent and able to roam around for a week or more before returning home, and the plans and materials are readily available for anyone who wants to build one.

Today the 2018 Hackaday Prize launches, commencing with the Open Hardware Design Challenge.  For this challenge you need only provide detailed plans of your project and the theory behind it; if your plans are among the best 20 and fit into one of the next four challenges you might just pick up $1000 and move on to the next stage.  The four specific challenges are Robotics, Power Harvesting, Human Computer Interface, and Musical Instruments; so if you have an existing project or an idea bouncing around in your brain, then here is your chance to shine!  Check out the full rules and details here.


"The Hackaday Prize begins with 5 themed challenges which run in nonstop series (one directly after the other). Each challenge lasts 6 weeks long, with the first challenge beginning on March 12th and the last ending October 8th. The top 20 projects from each round win $1000 and advance to the finals."

Here is some more Tech News from around the web:

Tech Talk

Source: Hack a Day

AMD FreeSync 2 for Xbox One S and Xbox One X

Subject: General Tech | March 10, 2018 - 05:11 PM |
Tagged: xbox, freesync, amd

Next week, in the Xbox One alpha release ring, Microsoft will enable AMD FreeSync 2 for the Xbox One S and the Xbox One X. This allows compatible displays, ones that accept FreeSync variable refresh rate signals over HDMI, to time their refresh rate to the console’s rendering rate, removing the micro-stutter that could otherwise be seen due to a mismatch between the two.


Because it is FreeSync 2, it will also work with HDR content.

As stated, FreeSync over HDMI will be required to use this feature, which has two caveats. The first is that DisplayPort will not work, so that’s something to be careful about if you’re planning to buy something (either a display or an Xbox itself) for this feature. The second is that, as far as I know, not a single TV currently supports FreeSync – but that could change. There is now a major console manufacturer pushing the standard, which is a stronger use case than “maybe someone with an AMD (or potentially Intel someday) GPU will plug their PC into this TV”.


The menu to enable FreeSync on the Xbox One

The Xbox Insider Program Alpha Preview ring is invite only. It will then trickle to Beta, Delta, and Omega, before being released to the general public.

Mid-octane Optane, Intel's 800P series

Subject: Storage | March 9, 2018 - 05:08 PM |
Tagged: ssd, PCIe 3.0 x2, Optane, NVMe, Intel, Brighton Beach, 800p, 58GB, 3D XPoint, 118GB

The price of the 480GB 900P is somewhat prohibitive, but the small size of the 32GB gumstick also gives one pause; hence the 800P family with a 58GB and a 118GB model.  They bear price tags of $130 and $200, as you may remember from Al's review.  The Tech Report also had a chance to test these two Optane sticks out, with some tests not covered in our review, such as their own real world copying benchmark.  If you are looking for a second opinion, drop by and take a look.


"Intel's duo of Optane SSD 800P drives promises the same blend of impressively-low latency and performance consistency as its larger Optane devices at a price more builders can afford. We ran these drives through our storage-testing gauntlet to see whether they can make a name for themselves as primary storage."

Here are some more Storage reviews from around the web:


All in the wrist? Fingering your Threadripper's TIM

Subject: Cases and Cooling, Processors | March 9, 2018 - 02:45 PM |
Tagged: amd, Threadripper, tim, ryzen

If you are looking for advice on how to install and cool a Threadripper, [H]ard|OCP have quickly become the site to reference.  They've benchmarked the majority of waterblocks which are compatible with AMD's big chip, as well as publishing videos on how to install it on your motherboard.  Today the chip is out again, and this time it is getting a manually applied TIM facial.  Check out Kyle's tips on getting ready to coat your chip and the best way to spread the TIM to ensure even cooling.


"AMD's Threadripper has shown to be a very different CPU in all sorts of ways and this includes how you install the Thermal Interface Material as well should you be pushing your Threadripper's clocks beyond factory defaults. We show you what techniques we have found to give us the best temperatures when overclocking. "

Here are some more Processor articles from around the web:


Source: [H]ard|OCP

Windows 10's latest trick? Perpetually hitting EOL; secretly.

Subject: General Tech | March 9, 2018 - 01:49 PM |
Tagged: windows 10, spring creators update, microsoft

Microsoft demonstrated once again how little it learns from past mistakes.  Those who chose to opt out of updates to reduce the amount of data which Microsoft collects, or to ensure a production machine remains in a known state, will soon find themselves running the latest build of Win10.  This will not be a choice, as it bypasses Windows Update and will install even if you have blocked that service, similar to the last three major updates.  Microsoft decided not to officially inform users of this, perhaps in the hopes no one would notice.

It seems that Windows 10 builds will essentially hit EOL every time a new major update is pushed out, and if you manage to successfully block the update, you won't receive any new security patches.  The Inquirer is as unimpressed with this as you are.


"Users, particularly those who have opted out of data collection, are being told that they must update to Build 1709 (the most recent) in order to continue receiving security patches."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

PCPer Mailbag #34 - It's Pronounced "Burger"

Subject: Editorial | March 9, 2018 - 09:00 AM |
Tagged: video, pcper mailbag, Josh Walrath

It's time for the PCPer Mailbag, our weekly show where Ryan and the team answer your questions about the tech industry, the latest and greatest GPUs, the process of running a tech review website, and more!

Josh takes the hot seat this week, so please try to constrain yourselves:

00:37 - Ryzen APUs a target for crypto miners?
03:07 - Replacing old AT power supply with ATX?
04:24 - M.2 SSD performance: motherboard vs. PCIe adapter?
05:47 - Max length for a USB cable?
07:07 - SUPoX motherboards absent from US market?
10:11 - Global Foundries able to hit 7nm without Samsung?
14:38 - What's so super about a supa-pipe?
15:54 - Josh's best and worst burger experiences?
18:19 - Josh's question for the audience?
19:03 - Josh vs. Ryan: FIGHT!

Want to have your question answered on a future Mailbag? Leave a comment on this post or in the YouTube comments for the latest video. Check out new Mailbag videos each Friday!

Be sure to subscribe to our YouTube Channel to make sure you never miss our weekly reviews and podcasts, and please consider supporting PC Perspective via Patreon to help us keep videos like our weekly mailbag coming!

Source: YouTube

The GeForce Partner Program has some Kool-Aid it would like you to try

Subject: General Tech | March 8, 2018 - 03:26 PM |
Tagged: dirty pool, nvidia, gpp, GeForce Partner Program

[H]ard|OCP have posted an article looking at the brand new GeForce Partner Program which NVIDIA has announced, a program that bears a striking resemblance to a certain Intel initiative ... which turned out poorly.  After investigating the details for several weeks, including attempts to talk with OEMs and AIBs, [H] has raised some serious concerns, including what seems to be a membership requirement to sell only NVIDIA GPUs in a product line which is aligned with GPP.  As membership in the GPP offers "high-effort engineering engagements -- early tech engagement -- launch partner status -- game bundling -- sales rebate programs -- social media and PR support -- marketing reports -- Marketing Development Funds (MDF)", this would cut a company which chose to sell competitors' products out of quite a few things.

At this time NVIDIA has not responded to inquiries, and the OEMs and AIBs which [H] spoke to declined to make any official comments; off the record, there were serious concerns about the legality of this project.  Expect to hear more about this from various sites as they seek the transparency which NVIDIA Director John Teeple mentioned in his post.


"While we usually like to focus on all the wonderful and immersive worlds that video cards and their GPUs can open up to us, today we are tackling something a bit different. The GeForce Partner Program, known as GPP in the industry, is a "marketing" program that looks to HardOCP as being an anticompetitive tactic against AMD and Intel."

Here is some more Tech News from around the web:

Tech Talk


Source: [H]ard|OCP

Asus Introduces Gemini Lake-Powered J4005I-C Mini ITX Motherboard

Subject: Graphics Cards, Motherboards | March 8, 2018 - 02:55 AM |
Tagged: passive cooling, mini ITX, j4005i-c, Intel, gemini lake, fanless, asus

Asus is launching a new Mini ITX motherboard packing a passively-cooled Intel Celeron J4005 "Gemini Lake" SoC. The aptly-named Asus Prime J4005I-C is aimed at embedded systems such as point of sale machines, low end networked storage, kiosks, and industrial control and monitoring systems, and features "5x Protection II" technology which includes extended validation and compatibility/QVL testing, overcurrent and overvoltage protection, network port surge protection, and ESD resistance. The board also features a UEFI BIOS with AI Suite.


The motherboard features an Intel Celeron J4005 processor with two cores (2.0 GHz base and 2.7 GHz boost), 4MB cache, Intel UHD 600 graphics, and a 10W TDP. The SoC is passively cooled by a copper colored aluminum heatsink. The processor supports up to 8GB of 2400 MHz RAM and the motherboard has two DDR4 DIMM slots. Storage is handled by two SATA 6 Gbps ports and one M.2 slot (PCI-E x2) for SSDs. Further, the Prime J4005I-C has an E-key M.2 slot for WLAN and Bluetooth modules (PCI-E x2 or USB mode) along with headers for USB 2.0, USB 3.1 Gen 1, LVDS, and legacy LPT and COM ports.

Rear I/O includes two PS/2, two USB 2.0, one Gigabit Ethernet (Realtek RTL8111H), two USB 3.1 Gen 1, one HDMI, one D-SUB, one RS232, and three audio ports (Realtek ALC887-UD2).

The motherboard does not appear to be for sale yet in the US, but Fanless Tech notes that it is listed for around 80 euros overseas (~$100 USD). More Gemini Lake options are always good, and Asus now has one with PCI-E M.2 support, though I see this board being more popular with commercial/industrial sectors than with enthusiasts unless it goes on sale.

Source: Asus