All | Editorial | General Tech | Graphics Cards | Networking | Motherboards | Cases and Cooling | Processors | Chipsets | Memory | Displays | Systems | Storage | Mobile | Shows and Expos
Subject: General Tech | December 15, 2017 - 06:12 PM | Jeremy Hellstrom
Tagged: mionix, wei keyboard, mechanical keyboard, Cherry MX RGB red
You know the Mionix Wei keyboard has to be good, just look at that classic curly tail! In addition to its nice looks, the board uses Cherry MX RGB Red switches for your typing pleasure. You can swap out the default keycaps if you opt to purchase one of their replacement sets, which come in light red/pink, yellow, and blue, so you can have a rather unique looking board if you so desire. The lighting, on the other hand, is relatively simple; TechPowerUp felt the software doesn't match up to the competition's, but it still lets you change the behaviour of your LEDs somewhat. Pop by for their full impressions of the Wei keyboard right here.
"The Wei is Mionix's first keyboard, and part of their new "Get Fresh!" campaign with lighter color shades, optional keycap sets, and wrist rests; color schemes based off food items and clean aesthetics inside and out, which make this a keyboard targeting gamers and professionals alike."
Here is some more Tech News from around the web:
- The HyperX Alloy FPS Pro Mechanical Gaming Keyboard @ TechARP
- Patriot Viper V570 and RGB Mouse Pad Review @ Hardware Asylum
- Roccat Kone AIMO Gaming Mouse @ Benchmark Reviews
- G.Skill Ripjaws MX780 RGB game mouse @ Guru of 3D
Subject: General Tech | December 15, 2017 - 01:59 PM | Jeremy Hellstrom
Tagged: stereo, review, neodymium, HS50, headset, headphones, gaming, corsair, 50mm, audio
Sebastian was not the only one who listened to Corsair's new HS50 gaming headset; several other sites tested out this $50 headset and offered their thoughts. The Tech Report contrasted it with their favoured Audio-Technica ATH-M50x headphones, which cost roughly three times as much. They found the audio to be somewhat clearer on the more expensive headset, as you might expect, but Corsair's offering came very close, and they congratulated Corsair on the quality it managed in such an inexpensive pair of headphones. Check out their review in full right here.
"Corsair's HS50 Stereo Gaming Headset boasts solid build quality and classy looks, plus a Swiss Army knife's worth of compatibility. We gamed with the HS50 on consoles and listened to its musical chops to see whether it's a winner for $50."
Here is some more Tech News from around the web:
- Corsair HS50 Stereo Gaming Headset @ Guru of 3D
- Corsair HS50 Gaming Headset @ Kitguru
- Corsair HS50 Stereo Gaming Headset @ Benchmark Reviews
- 1MORE MK802 Bluetooth Headphones @ TechPowerUp
- Kingston HyperX Cloud Alpha Pro Gaming Headset Review @ OCC
Subject: General Tech | December 15, 2017 - 01:35 PM | Jeremy Hellstrom
Tagged: ataribox, Indiegogo
Yesterday was supposed to be the first day you could pre-order the AMD-powered Ataribox retro-gaming machine on Indiegogo. Unfortunately that proved not to be the case, as the launch has been delayed indefinitely. The Inquirer was not able to find out any details on what is causing the hold-up, nor do we have any more information on the hardware that will be inside the box. This announcement also implies the spring launch date is now questionable; hopefully some details will be released in the near future about the delay and its effect on delivery times.
"Earlier this week, Atari announced that it would be launching its crowdfunding campaign the Ataribox on 14 December, but just days the firm has been forced to tell prospective customers that pre-orders have been delayed indefinitely."
Here is some more Tech News from around the web:
- The 2017 Ars Technica gadget gift guide: Power-user edition
- Motherboard and VICE Are Building a Community Internet Network @ Slashdot
- BlackBerry will support BB10 until 2020, but the Priv has been flushed @ The Inquirer
- Want to really understand how bitcoin works? Here’s a gentle primer @ Ars Technica
- Please, please, c'mon, just... please, pretty please, just, like, please use our AI – Microsoft @ The Register
- Company of Heroes 2 is FREE for a Limited Time @ TechARP
Subject: Editorial | December 15, 2017 - 09:00 AM | Jim Tanous
Tagged: video, Ryan Shrout, pcper mailbag, pcper
It's time for the PCPer Mailbag, our weekly show where Ryan and the team answer your questions about the tech industry, the latest and greatest GPUs, the process of running a tech review website, and more!
This week, Ryan's back from Hawaii to tackle these topics:
00:26 - Ryzen laptop reviews?
02:45 - Component order in custom water cooling loop?
04:46 - NVIDIA variable refresh rate in HDMI 2.1?
07:19 - Next-gen VR headset resolution?
09:15 - Worth it to buy a 6950X now?
11:23 - Windows on ARM battery life?
14:12 - Reverse hyper-threading?
16:03 - What is deep learning?
18:47 - Best SSD for storage server cache?
20:54 - Ryan's daughter a future gamer?
Be sure to subscribe to our YouTube Channel to make sure you never miss our weekly reviews and podcasts, and please consider supporting PC Perspective via Patreon to help us keep videos like our weekly mailbag coming!
Subject: Graphics Cards | December 15, 2017 - 09:00 AM | Scott Michaud
Tagged: vega 64 liquid, vega 64, vega 56, Vega, sapphire, radeon, amd
SAPPHIRE has just launched a pair of custom cooled, factory overclocked, RX Vega-based graphics cards. As you might guess: the SAPPHIRE NITRO+ Radeon RX Vega 64 uses the Vega 64 chip with its 4096 stream processors, while the SAPPHIRE NITRO+ Radeon RX Vega 56 uses the Vega 56 chip and its 3584 stream processors. Both cards have 8GB of HBM2 memory (two stacks of 4GB). The cooler design uses three fans and vapor chambers, with separate heat pipes for the GPU+Memory (six pipes) and VRMs (two pipes).
It also has a back plate!
The clock rate is where it gets interesting. The NITRO+ RX Vega 64 will have a boost clock of 1611 MHz out-of-the-box. This is above the RX Vega 64 Air’s boost clock (1546 MHz) but below the RX Vega 64 Liquid’s boost clock (1677 MHz). The liquid-cooled Radeon RX Vega 64 still has the highest clocks, but this product sits almost exactly half-way between it (the liquid-cooled RX Vega 64) and the air-cooled RX Vega 64.
The NITRO+ Radeon RX Vega 56, with its 1572 MHz boost clock, is well above the stock RX Vega 56’s 1471 MHz boost clock, though. It’s a clear win.
As for enthusiast features, this card has quite a few ways to keep it cool. First, it will operate fanless until 56C. Second, the card accepts a 4-pin fan connector, which allows it to adjust the speed of two case fans based on the temperature readings from the card. I am a bit curious whether it’s better to let the GPU control the fans, or whether having them all attached to the same place allows them to work together more effectively. Either way, if you ran out of fan headers, then I’m guessing that this feature will be good for you anyway.
The SAPPHIRE NITRO+ Radeon RX Vega 64 and 56 are available now.
Subject: Cases and Cooling | December 14, 2017 - 07:14 PM | Jeremy Hellstrom
Tagged: optimus, V1 Silver, waterblock, watercooler
The name might fool you, but Optimus designs waterblocks for Intel's LGA 115X and 20XX sockets, and the V1 Silver is their initial offering to the public, and to reviewers of course. The block is designed with 0.005" fins and 0.004" channels and has directional water flow, which is the only challenge when installing; otherwise [H]ard|OCP were very impressed with the ease with which the V1 was installed. That said, [H] greatly improved the performance of the cooler by adding heavier springs; doing so allowed it to surpass the XSPC RayStorm Neo and take the Gold at the end of the review. Check out the full review, plus a strip show, right here.
"Optimus Water Cooling may be a company that you are not familiar with. That is because this is its first publicly available computer water cooling product in retail. Optimus is a US company building products made and sourced right here in the good ole US of A. We just took delivery of it new V1 Silver water block for Intel sockets."
Here are some more Cases & Cooling reviews from around the web:
Subject: Storage | December 14, 2017 - 04:43 PM | Jeremy Hellstrom
Tagged: western digital, wd gold, hdd, 12TB
The 12TB WD Gold is not quite as impressive as Toshiba's 14TB drive, but it should be more affordable for consumers with specific needs or for SMBs. Like the Toshiba drive it uses PMR as opposed to a shingled design, which helps keep the drive's price under $600 and in the price range Ryan would like to see SSDs reach. The drive is rated at 2.5 million hours MTBF, and as far as performance goes, Kitguru saw 245.58MB/s for writes and 237.01MB/s for reads. This is not a drive for most, but for those with huge amounts of data who need to move it frequently and at decent speeds, this review is worth looking at.
"Western Digital’s Gold range of hard drives have been designed to service nearline enterprise environments and as such they have a range of sensors and technologies onboard to help them maintain peak performance in such environments."
Here are some more Storage reviews from around the web:
- HP SSD M700 @ Benchmark Reviews
- LaCie 2big Dock Thunderbolt 3 @ Kitguru
- Synology DiskStation DS418play NAS @ Modders-Inc
- ASUSTOR AS6302T NAS Server Review @ NikKTech
- Synology DS918+ 4-Bay NAS @ TechPowerUp
- SilverStone TS421S 4-Disk SATA/SAS Disk Enclosure @ Phoronix
- Thecus N4350 4-Bay NAS @ Kitguru
- Synology DiskStation DS418j @ Kitguru
Subject: General Tech | December 14, 2017 - 01:49 PM | Jeremy Hellstrom
Tagged: AT&T, direcTV, security, networking, linksys
To start off with the bad news, as is our wont, DirecTV kits have a rather serious code injection problem. A researcher was able to access the root shell on the Linksys WVBR0-25 wireless video bridge in less than 30 seconds, once he had access to one of the devices the bridge was streaming to. Since many of the devices these bridges stream to are simple, poorly secured machines used only as video players, they could be recruited into a botnet or mining pool quite easily. The researcher passed on his findings to AT&T and Linksys 181 days ago and, according to The Register, is quite disappointed they have yet to start developing a patch.
On a more positive note, AT&T is testing broadband over powerlines in Georgia and at an undisclosed location outside the USA. They did not release any specifics on the bandwidth they can currently provide, though their goal is to surpass 1 gigabit per second. This will be quite the project, as the testing we have done with powerline adapters did not show network connectivity anywhere near that speed in the best case scenarios, let alone when less than perfect wiring or distance degraded the overall performance. You can check out more on that topic over at Slashdot.
"AT&T's DirecTV wireless kit has an embarrassing vulnerability in its firmware that can be trivially exploited by miscreants and malware to install hidden backdoors on the home network equipment, according to a security researcher."
Here is some more Tech News from around the web:
- How fast is a piece of string? Boffin shoots ADSL signal down twine @ The Register
- Crytek sues Star Citizen developers over game engine @ Ars Technica
- Flash bang walloped: Toshiba, Western Digital sign peace treaty over memory chip fabs @ The Register
- The Last Mile to Civilization 2.0: Technologies From Our Not Too Distant Future @ Techspot
- Intel to slap hardware lock on Management Engine code to thwart downgrade attacks @ The Register
- TSMC to spend $20bn on 3-nanometer chips @ Nikkei
- New battery boffinry could 'triple range' of electric vehicles @ The Register
Subject: General Tech | December 14, 2017 - 12:09 PM | Alex Lustenberg
Tagged: video, vesa, toshiba, titan v, synaptics, Silverstone, shazam, radeon, podcast, PBT, nvidia, nervana, keylogger, jonsbo, Intel, hp, hdr, corsair, Clear ID, apple, amd, Adrenalin, 14tb
PC Perspective Podcast #479 - 12/14/17
Join us for discussion on NVIDIA Titan V, AMD Adrenalin, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom, Allyn Malventano
Peanut Gallery: Ken Addison, Alex Lustenberg
Program length: 1:12:23
Subject: General Tech | December 13, 2017 - 03:43 PM | Jeremy Hellstrom
TechSpot took a look at the effect of system RAM on gaming performance when using a GPU with less VRAM than a game prefers. They tested both the 3GB and 6GB models of the GTX 1060, with 4, 8, 16 and 32GB of system memory installed. With the rising costs of RAM, their findings suggest paying for the extra VRAM is worth it as the 3GB model didn't really start to offer proper performance with less than 16GB of DDR in the system. That extra RAM will often cost you more than purchasing a better GPU, though perhaps not enough to justify that GTX 1080 Ti. Check out the full review to see what effect extra RAM has on your favourite games.
"Measuring the impact that RAM capacity has on gaming is harder than it sounds because of all the factors at play. However we've tested different hardware configurations to determine how much memory is truly useful for gaming from 4GB up to 32GB."
Here is some more Tech News from around the web:
- Star Wars Battlefront 2’s new trailer teases December plans @ Rock, Paper, SHOTGUN
- Wolfenstein II: The New Colossus Review @ OCC
- Playerunknown’s Battlegrounds new map made me feel lost again, and I love it @ Rock, Paper, SHOTGUN
- Humble Racing Bundle
- Tom Clancy vs. Predator: Wildlands introduces movie monster @ Rock, Paper, SHOTGUN
- Ubisoft are testing For Honour dedicated servers, and you can join them even if you don’t own the game @ Rock, Paper, SHOTGUN
- GOG Winter Sale
- Giant wars and a rain of elves in Total Warhammer 2’s free Laboratory DLC @ Rock, Paper, SHOTGUN
Subject: Graphics Cards | December 13, 2017 - 02:05 PM | Jeremy Hellstrom
Tagged: gtx 1080 ti, phanteks, G1080, watercooling
Phanteks announced their G1080 water block for the GTX 1080 Ti a while back, but we hadn't seen it in action until now. [H]ard|OCP installed the block on a Founders Edition card and created a video of the process. Not only do they show how to properly install the water block, they also cover a few of the possible issues you might encounter while doing so. They also made a video showing how coolant flows through the waterblock, which is not only pretty but can help you determine where to insert your GPU into your watercooling loop.
"Phanteks recently sent us its Glacier series water block for our Founders Edition GTX 1080 Ti. We take you through the full process of getting it installed. We check out the mating surfaces of the GPU, capacitors, and MOSFETs and show you just how well it all fits together. Then finally we show exactly how the coolant flows in 4K!"
Here are some more Graphics Card articles from around the web:
- ASUS ROG STRIX GTX 1070 Ti GAMING Advanced Edition @ [H]ard|OCP
- Zotac GTX 1070 Ti AMP Extreme @ Kitguru
- EVGA GTX 1070 Ti FTW2 iCX 8 GB @ TechPowerUp
- MSI GTX 1080 Ti Gaming X Trio Review @ OCC
- Radeon vs. NVIDIA With Windows 10 & Ubuntu Linux @ Phoronix
- AMDGPU-PRO 17.50 vs. RADV/RadeonSI Radeon Linux Gaming Performance @ Phoronix
- AMD Radeon Software Adrenalin Edition Introduction @ [H]ard|OCP
- AMD Adrenalin Drivers: Monitor & Control Your System From A Smartphone @ Techgage
- The AMD Radeon Software Adrenalin Edition Tech Report @ TechARP
- Radeon Software Adrenalin Edition Driver December 2017 Analysis @ Guru of 3D
- AMD Finally Pushing Out Open-Source Vulkan Driver @ Phoronix
- AMD Radeon Software Adrenalin Edition Overview @ TechPowerUp
Subject: Graphics Cards | December 12, 2017 - 07:51 PM | Ryan Shrout
Tagged: nvidia, titan, titan v, Volta, video, teardown, unboxing
NVIDIA launched the new Titan V graphics card last week, a $2999 part targeted not at gamers (thankfully) but instead at developers of machine learning applications. Based on the GV100 GPU and equipped with 12GB of HBM2 memory, the Titan V is an incredibly powerful graphics card. We have every intention of looking at the gaming performance of this card as a "preview" of potential consumer Volta cards that may come out next year. (This is identical to our stance on testing the Vega Frontier Edition cards.)
But for now, enjoy this unboxing and teardown video that takes apart the card to get a good glimpse of that GV100 GPU.
A couple of quick interesting notes:
- This implementation has 25% of the memory and ROPs disabled, giving us 12GB of HBM2, a 3072-bit bus, and 96 ROPs.
- Clock speeds in our testing look to be much higher than the base AND boost ratings.
- So far, even though the price takes this out of the gaming segment completely, we are impressed with some of the gaming results we have found.
- The cooler might LOOK the same, but it is definitely heavier than the cooler built for the Titan Xp.
- Champagne. It's champagne colored.
- Double precision performance is insanely good, spanking the Titan Xp and Vega so far in many tests.
- More soon!
Subject: General Tech, Processors | December 12, 2017 - 04:52 PM | Tim Verry
Tagged: training, nnp, nervana, Intel, flexpoint, deep learning, asic, artificial intelligence
Intel recently provided a few insights into its upcoming Nervana Neural Network Processor (NNP) on its blog. Built in partnership with deep learning startup Nervana Systems, which Intel acquired last year for over $400 million, the AI-focused chip previously codenamed Lake Crest uses a new architecture designed from the ground up to accelerate neural network training and AI modeling.
The full details of the Intel NNP are still unknown, but it is a custom ASIC with a tensor-based architecture placed on a multi-chip module (MCM) along with 32GB of HBM2 memory. The Nervana NNP supports optimized and power efficient Flexpoint math, and interconnectivity is huge on this scalable platform. Each AI accelerator features 12 processing clusters (with an as-yet-unannounced number of "cores" or processing elements) paired with 12 proprietary inter-chip links that are 20 times faster than PCI-E, four HBM2 memory controllers, a management-controller CPU, as well as standard SPI, I2C, GPIO, PCI-E x16, and DMA I/O. The processor is designed to be highly configurable and to meet both model and data parallelism goals.
The processing elements are all software controlled and can communicate with each other using high speed bi-directional links at up to a terabit per second. Each processing element has more than 2MB of local memory, and the Nervana NNP has 30MB of local memory in total. Memory accesses and data sharing are managed with QoS software, which controls adjustable bandwidth over multiple virtual channels with multiple priorities per channel. Processing elements can talk to and send/receive data between each other and the HBM2 stacks locally, as well as off die to processing elements and HBM2 on other NNP chips. The idea is to allow as much internal sharing as possible and to keep as much data stored and transformed in local memory as possible. This saves precious HBM2 bandwidth (1TB/s) for pre-fetching upcoming tensors, reduces the number of hops and the resulting latency of going out to HBM2 memory and back to transfer data between cores and/or processors, and saves power. This setup also helps Intel achieve an extremely parallel and scalable platform where multiple Nervana NNP co-processors on the same and remote boards effectively act as a massive singular compute unit!
Intel's Flexpoint is also at the heart of the Nervana NNP and allegedly allows Intel to achieve results similar to FP32 with twice the memory bandwidth while being more power efficient than FP16. Flexpoint is used for the scalar math required for deep learning, and uses fixed point 16-bit multiply and addition operations with a shared 5-bit exponent. Unlike FP16, Flexpoint uses all 16 bits for the mantissa and passes the exponent in the instruction. The NNP architecture also features zero cycle transpose operations and optimizations for matrix multiplication and convolutions to optimize silicon usage.
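To get a feel for the shared-exponent idea behind Flexpoint, here is a simplified sketch (not Intel's actual format or implementation; the function names and rounding behavior are our own assumptions). A whole tensor is stored as 16-bit integers that all share one exponent, chosen so the tensor's largest magnitude just fits:

```python
import math

def flexpoint_quantize(values, mantissa_bits=16):
    """Quantize a list of floats to integers sharing one exponent,
    mimicking a Flexpoint-style shared-exponent tensor format."""
    max_val = max(abs(v) for v in values)
    if max_val == 0:
        return [0] * len(values), 0
    # Choose the exponent so the largest magnitude fits the signed mantissa range.
    exp = math.ceil(math.log2(max_val)) - (mantissa_bits - 1)
    scale = 2.0 ** exp
    limit = 2 ** (mantissa_bits - 1)
    # Every element is scaled by the same power of two, then rounded and clipped.
    q = [max(-limit, min(limit - 1, round(v / scale))) for v in values]
    return q, exp

def flexpoint_dequantize(q, exp):
    """Recover approximate floats from the shared-exponent integers."""
    return [x * 2.0 ** exp for x in q]
```

Because the exponent is shared per tensor rather than per element, the hardware can run on cheap integer multiply-add units; the trade-off is lost precision whenever a single tensor mixes very large and very small values.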
Software control allows users to dial in the performance for their specific workloads, and since many of the math operations and data movement are known or expected in advance, users can keep data as close to the compute units working on that data as possible while minimizing HBM2 memory accesses and data movements across the die to prevent congestion and optimize power usage.
Intel is currently working with Facebook and hopes to have its deep learning products out early next year. The company may have axed Knights Hill, but it is far from giving up on this extremely lucrative market as it continues to push towards exascale computing and AI. Intel is aiming for a 100x increase in neural network performance by 2020, which is a tall order, but Intel throwing its weight around in this ring should give GPU makers pause: such an achievement could cut heavily into their GPGPU-powered entries in a market that is only just starting to heat up.
You won't be running Crysis or even Minecraft on this thing, but you might be using software on your phone for augmented reality or in your autonomous car that is running inference routines on a neural network that was trained on one of these chips soon enough! It's specialized and niche, but still very interesting.
- Intel Launches Stratix 10 FPGA With ARM CPU and HBM2
- Intel's Nervana chip targets Nvidia on artificial intelligence
- New AI products will Crest Computex
- Intel to Ship FPGA-Accelerated Xeons in Early 2016
- Intel Kills Knights Hill, Will Launch Xeon Phi Architecture for Exascale Computing @ ExtremeTech
- NVIDIA Discusses Multi-Die GPUs
Subject: General Tech | December 12, 2017 - 04:25 PM | Jeremy Hellstrom
Tagged: destiny 2, Bungie
With the newest expansion to Destiny 2 came an unpleasant surprise for those who did not pay for the DLC: an inability to access parts of the game they previously could. The Prestige level challenges were intended to scale with a player's power, and with the new cap in place they were no longer available to those who did not fork over cash for the DLC; several PvP modes were also blocked. Bungie has since realized the error of their ways and both restored access and apologized; you can read that apology at [H]ard|OCP.
"The Prestige Raid was a novel experience that players value, even if they don’t own Curse of Osiris, and it was a mistake to move that experience out of reach. Throughout the lifetime of the Destiny Franchise, Trials has always required that players owned the latest Expansion."
Here is some more Tech News from around the web:
- Samsung vs. Acer Mixed Reality headsets: Which handles VR best? @ Engadget
- TR's 2017 Christmas giveaway: goodies from MSI, Antec, and OCZ @ The Tech Report
- 129 Million Americans Can Only Get Internet Service From Companies That Have Violated Net Neutrality @ Slashdot
- Hackers' delight: Mobile bank app security flaw could have smacked millions @ The Register
- Netflix silent about ridicule as it discusses punters' viewing habits @ The Register
- Holiday Guide to the Best 4K TV PC Gaming @ BabelTechReviews
- TR's 2017 Christmas giveaway featuring goodies from MSI, Antec, and Toshiba OCZ
Non-profit standards association VESA has put forth a new open standard called DisplayHDR for defining HDR specifications and performance for PC laptop and desktop LCDs. The new test specification, dubbed DisplayHDR 1.0, defines a transparent testing methodology along with three tiers of HDR system performance that will identify displays as certified for minimum, mid-range, and high-end HDR with their respective badges of DisplayHDR 400, DisplayHDR 600, and DisplayHDR 1000. Consumers will be able to easily identify which panels have HDR and how they stack up.
The new HDR standard was devised by VESA with input from over two dozen of its member companies including major OEMs of displays, panels, graphics cards, CPUs, display drivers, and color calibration providers. DisplayHDR is reportedly a fully open and transparent standard with automated tools that end users can download and run to verify the results for themselves. The standard includes three peak luminance tests, two contrast measurement tests (native and local dimming), color testing and validation of BT.709 and DCI-P3 color gamuts, bit-depth requirement tests (see below), and HDR backlight response time measurements.
DisplayHDR 400 represents the minimum, entry-level tier of HDR per the VESA specification; it requires that an LCD display feature at least 400 nits brightness (both short, local bursts and full screen flashes), 8-bit color depth, HDR-10 support, and global dimming. VESA notes that many non-HDR displays advertised as supporting 8-bit color actually use 6-bit panels with a dithering algorithm to simulate 8 bits. DisplayHDR specifies true 8-bit as a minimum, and for DisplayHDR 600 and DisplayHDR 1000, displays must achieve 10-bit depth, at a minimum using 8-bit panels combined with 2-bit dithering.
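To summarize the tiers described above in one place, here is a hypothetical lookup (the helper and its names are our own, and it captures only the peak luminance and bit-depth requirements, not the full set of contrast, gamut, and backlight response tests in the spec):

```python
# Minimum requirements per tier, as described in the article.
DISPLAYHDR_TIERS = {
    "DisplayHDR 400":  {"peak_nits": 400,  "bit_depth": 8},
    "DisplayHDR 600":  {"peak_nits": 600,  "bit_depth": 10},
    "DisplayHDR 1000": {"peak_nits": 1000, "bit_depth": 10},
}

def highest_tier(peak_nits, bit_depth):
    """Return the highest DisplayHDR badge a panel could qualify for
    on these two criteria alone, or None if it meets none of them."""
    best = None
    # Dict preserves insertion order, so tiers are checked lowest to highest.
    for name, req in DISPLAYHDR_TIERS.items():
        if peak_nits >= req["peak_nits"] and bit_depth >= req["bit_depth"]:
            best = name
    return best
```

For example, a 650-nit panel with true 10-bit depth would top out at the DisplayHDR 600 badge, while a 700-nit panel stuck at 8-bit would still only qualify for DisplayHDR 400.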
Display and PC manufacturers have reportedly had their hands on the DisplayHDR test specification for some time now and are working on validating their displays so that they can offer products with the DisplayHDR logos. New product announcements and demonstrations are expected during CES 2018 next month with DisplayHDR compatible products showing up as early as Q1 2018. VESA notes that while DisplayHDR currently only targets LCDs, it hopes to extend the open standard to include OLED displays in the future.
I think this is a good thing as there is a lot of confusing and conflicting advertising out there when it comes to HDR. A vendor neutral specification and badge that can also be independently tested may be just what the display market needs to push HDR into the mainstream.
Subject: General Tech, Mobile | December 12, 2017 - 02:37 AM | Tim Verry
Tagged: synaptics, security patch, security, keylogger, hp, Cyber Security
HP has issued security patches for more than 460 models of the company's laptops and thin clients to address a hidden keylogger present in the Synaptics touchpad drivers. Discovered by security researcher Michael Myng while delving into the Synaptics Touchpad Software in an attempt to change the backlight behavior of the keyboard, the keylogger was reportedly built into the software stack to debug errors. While it shipped to customers disabled by default, an attacker that was able to achieve administrative privileges could change the appropriate registry value and enable keylogging to locally record all of the user's keystrokes without their knowledge. Further malicious code or local physical access could then be used to retrieve data for analysis of possible passwords, usernames, account numbers, and other personal information.
Image courtesy Robbert van der Steeg via Flickr Creative Commons
HP claims in its security bulletin that at no time did it or Synaptics have access to customer data and that this security vulnerability is a "local loss of confidentiality" and should be acted upon as soon as possible by downloading the security patch for your laptop from HP or by running Windows Update.
According to the HP security bulletin, the vulnerability reportedly affects all Synaptics OEM partners including HP that have shipped systems with certain Synaptics Touchpad driver versions. In the case of HP this includes commercial / enterprise notebooks, tablets, thin clients, and mobile workstations from their G2, G4, G6, Elite X2, EliteBook, Thin Client, ProBook, Spectre Pro, Stream, X360, and ZBook Mobile Workstation series and consumer devices with Compaq, Beats, ENVY, OMEN, Pavilion, Spectre, Split, Stream, and even the 15" Star Wars Special Edition laptop!
While this is a serious security risk, there is no need to panic. You should apply the patch manually or through Windows Update as soon as possible, but so long as you have been and continue to follow security best practices (strong passwords, running anti-virus and anti-malware scans regularly, restricting physical access, and not running as administrator on your daily driver user account, etc.) you should be safe, as there are several steps that would need to be completed before an attacker could take advantage of this hidden keylogger, especially remotely.
You can find the full list of affected laptops and their associated security patches on HP's support website. For a PGP signed version of the page you can email firstname.lastname@example.org.
Subject: General Tech, Mobile | December 12, 2017 - 12:00 AM | Tim Verry
Tagged: synaptics, security, fingerprint sensor, fingerprint, biometrics
Synaptics, the California-based "human interface solution developer" founded in 1986, is no stranger to PC input and interface devices, with more than 5 billion units shipped to OEM partners and a large patent portfolio. Today, the company is getting into the lucrative smartphone market in a big way with its Clear ID In-Display fingerprint sensor, which sits just under a standard smartphone display and scans a user's fingerprint through it (including the glass overlay and screen protector).
Current sensors sit in a fixed position at OEM discretion and activate only when needed. Future in-display sensors will target larger areas and eventually the entire display.
Synaptics has designed and positioned the FS9500 fingerprint sensor for smartphones with so-called Infinity Displays, and seeks to address the problem of where and how to mount a biometrics sensor (or even a camera for that matter, but that's a different problem) on smartphones that are moving towards edge-to-edge displays with no physical button area or bezels to mount front facing sensors. Rather than requiring an unusual cut-out display (like the iPhone X) or settling for larger bezels, Synaptics has instead opted to take advantage of the large display area by laminating the thin fingerprint sensor module to the underside of the display. The OLED display itself serves as the light source to illuminate the user's fingerprint, so that the optimized CMOS image sensor can scan the fingerprint from the reflected light bounced through the gaps between pixels.
Synaptics claims it is using "Quantum Matcher" and "PurePrint" machine learning technology to enhance security as well as to adapt to different external lighting environments (e.g. direct sunlight), and as a result its fingerprint sensor is able to work faster and in more situations than competing 3D facial recognition systems. Specifically, the company claims its fingerprint sensor is able to accurately read a user's fingerprint in 0.7 seconds versus 1.4 seconds for a facial recognition camera biometrics sensor. The Clear ID In-Display fingerprint sensor is rated at an approximate 99% spoof rejection rate thanks to the AI-powered PurePrint technology that discerns real fingerprints from fakes along with an on-board encryption module that establishes an encrypted connection from the biometrics sensor to the phone.
With the biggest hurdle to the continued relevance of the fingerprint sensor on modern smartphones solved by placing it under the display itself, smartphone users are able to continue to enjoy the benefits of fingerprint biometrics versus the fancy new facial recognition systems (which are admittedly cool, but do seem a bit "gimmicky" to me) including being able to unlock the phone naturally by picking it up, not having to look directly at it before it will unlock, and being able to quickly unlock the phone even in bad lighting (too dark or too bright) situations.
If it works as well as they claim, it seems like a neat way to integrate a secure fingerprint reader. Fortunately, we should not have to wait long to see it in action, with devices using the technology expected as soon as next year. Perhaps we will see more information or even product announcements and design wins at CES! (Also, how the heck is it almost CES already?? heh)
Interestingly, Synaptics is not the first company to attempt an under-display fingerprint sensor: Qualcomm showed off an "ultrasonic" sensor earlier this year, and Samsung has reportedly filed patents for a pressure-sensitive in-display sensor. Synaptics may well be the first to actually deliver, however, and with a product that is faster and works through thicker "top stacks" (the distance between the top of the sensor and the finger, including the display, glass overlay, and screen protector) of up to 1.5mm.
Subject: General Tech | December 11, 2017 - 06:46 PM | Tim Verry
Tagged: shazam, music streaming, augmented reality, apple music, apple
Apple has confirmed its plans to acquire London-based Shazam, which is best known for its song recognition app for smartphones. The deal, which industry sources estimate to be worth a bit over $400 million, would see Shazam and its employees become part of Apple; the two companies have reportedly been in talks for the past five months and exclusively dating for two.
TechCrunch quotes Apple in stating:
“We are thrilled that Shazam and its talented team will be joining Apple. Since the launch of the App Store, Shazam has consistently ranked as one of the most popular apps for iOS. Today, it’s used by hundreds of millions of people around the world, across multiple platforms. Apple Music and Shazam are a natural fit, sharing a passion for music discovery and delivering great music experiences to our users. We have exciting plans in store, and we look forward to combining with Shazam upon approval of today’s agreement.”
Currently, Shazam is available on a massive number of devices, with apps for Android, iOS, watchOS (Apple Watch), BlackBerry OS, macOS, and Windows machines equipped with a microphone. Its apps have been downloaded well over 1 billion times, and its users have performed more than 30 billion song searches – "Shazams" – since its launch. The Shazam app identifies songs by recording short clips, generating a time-frequency spectrogram from each clip, and comparing it against the company's database of known spectrograms covering some 11 million songs in an attempt to find a match. It's not perfect, especially if you are in a loud bar or the song you want to identify is playing in the background of a TV show under a lot of dialogue, but it works for the most part. Shazam has further updated its app through the years to incorporate social networking features, link YouTube videos of identified songs, provide links to Amazon Music and Apple Music to purchase the song, and display song lyrics and information on the artist. The company has also branched out into marketing partnerships as well as image recognition and augmented reality projects, which may have also piqued Apple's interest.
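The matching step above can be sketched in miniature. The published descriptions of Shazam's approach pair spectrogram peaks into hashes ("constellations") and vote on time offsets; the toy version below uses a naive DFT on tiny frames, and every name and parameter in it is illustrative rather than anything from Shazam's production system:

```python
# Toy spectrogram-peak ("constellation") audio fingerprinting sketch.
# Illustrative only — real systems use FFTs on 44.1 kHz audio, many
# peaks per frame, and far more robust hashing.
import math
from collections import defaultdict

FRAME = 64   # samples per analysis window
STEP = 32    # hop between windows
FANOUT = 3   # how many later peaks each anchor peak is paired with

def spectrogram_peaks(samples):
    """Return (frame_index, bin) for the loudest frequency bin per frame."""
    peaks = []
    for i, start in enumerate(range(0, len(samples) - FRAME + 1, STEP)):
        frame = samples[start:start + FRAME]
        best = (0.0, 0)
        for k in range(1, FRAME // 2):  # naive DFT magnitude, skipping DC
            re = sum(s * math.cos(2 * math.pi * k * n / FRAME) for n, s in enumerate(frame))
            im = sum(s * math.sin(2 * math.pi * k * n / FRAME) for n, s in enumerate(frame))
            best = max(best, (math.hypot(re, im), k))
        peaks.append((i, best[1]))
    return peaks

def fingerprints(samples):
    """Yield (hash, anchor_time), where hash = (anchor_bin, target_bin, dt)."""
    pts = spectrogram_peaks(samples)
    for i, (t1, f1) in enumerate(pts):
        for t2, f2 in pts[i + 1:i + 1 + FANOUT]:
            yield (f1, f2, t2 - t1), t1

def build_index(catalog):
    """Map each hash to the (song_id, time) positions where it occurs."""
    index = defaultdict(list)
    for song_id, samples in catalog.items():
        for h, t in fingerprints(samples):
            index[h].append((song_id, t))
    return index

def identify(index, clip):
    """Vote on (song, time-offset) pairs; a true match aligns at one offset."""
    votes = defaultdict(int)
    for h, t_clip in fingerprints(clip):
        for song_id, t_song in index.get(h, []):
            votes[(song_id, t_song - t_clip)] += 1
    if not votes:
        return None
    (song_id, _offset), _count = max(votes.items(), key=lambda kv: kv[1])
    return song_id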
Interestingly, Apple was not the only – or even the first – company to approach Shazam about a possible acquisition. Specifically, Snap (the company behind Snapchat) and Spotify were also interested in buying up the London-based developer. While the talks with Spotify fell through, Snap originally approached Shazam six months ago, beating Apple to the punch, but apparently neither company was able to muster a suitable or sizeable-enough offer. It is natural that these three companies would be interested in folding Shazam into their own business units since they already have partnerships in place with Shazam for various functionality and marketing reasons. Ars Technica notes that Shazam is used on the backend when asking Siri to identify a song, for example. Further, Spotify members with paid subscriptions could listen to full songs from within the Shazam app, and Shazam can be used within Snapchat to discover and share out songs.
With Apple winning the war for Shazam, I am curious what this will mean for the future of Apple Music as well as the future of the standalone Shazam apps (especially those on non-Apple platforms, such as the Android app and the song recognition functionality embedded in third-party apps). Bringing Shazam in house is a smart move for Apple, which is looking to advance its streaming music service. If anything, should Apple pull Shazam inside its walled garden as an Apple-exclusive offering, it will open the Play Store up for new apps to move in.
What are your thoughts on the acquisition? Do you use Shazam?
Subject: Motherboards | December 11, 2017 - 05:11 PM | Jeremy Hellstrom
Tagged: Z370 Aorus Ultra Gaming, Z370, Intel, gigabyte, coffee lake
A Z370 board for Coffee Lake may look the same as a Z270 board for Kaby Lake, but unfortunately looks are deceiving and your Kaby Lake CPU is not going to work in it. For those who did not upgrade during the previous generation and have been patiently awaiting the availability of Coffee Lake CPUs, [H]ard|OCP's review of the Gigabyte Z370 Aorus Ultra Gaming is worth checking out. The board can be had for around $170 and currently includes a free PCIe WiFi card; for that price there are a lot of extras to be had. The board also offers the possibility of a decent overclock!
"Intel’s launched yet another chipset, so for better or worse that means new motherboards for Intel’s mainstream market. We look at GIGABYTE’s Z370 Aorus Ultra Gaming to see if it’s worthy of a Coffee Lake CPU. And now that you can actually find the 8700K in stock, it is worth talking about."
Here are some more Motherboard articles from around the web:
- Gigabyte Z370 Aorus Ultra Gaming @ Kitguru
- The EVGA Z370 FTW MB review – the i7-8700K Road to 5.0 GHz @ BabelTechReviews
- MSI Z370 GAMING PRO CARBON AC @ TechPowerUp
- Asus ROG Rampage VI Extreme @ Guru of 3D
- ASUS Republic Of Gamers Maximus X Apex @ Guru of 3D
- ASRock X299 Taichi XE @ Kitguru
Subject: General Tech, Mobile | December 11, 2017 - 04:30 PM | Tim Verry
Tagged: Windows 10 S, snapdragon 835, qualcomm, NovaGo, asus
The Asus NovaGo was announced last week at Qualcomm’s Snapdragon Tech Summit, and now the company is sharing additional specifications on one of the first Windows On Snapdragon devices. Powered by a Qualcomm Snapdragon 835 and running Windows 10 S, Asus is promising a convertible tablet with up to 22 hours of battery life capable of running most of your usual Windows applications (even non-Store / UWP apps so long as they are 32-bit and don’t require kernel mode drivers).
Measuring 316 x 221.6 x 14.9mm, the Asus NovaGo TP370 is constructed of dark gray plastic (and some metal bits) and weighs in at 3.06 pounds (1.39 kg). The top half of the device is dominated by a 13.3” 1920 x 1080 LTPS “NanoEdge” display with 8.9mm bezels and also hosts the 720p webcam, which isn’t great but apparently does support Windows Hello. The display offers 10-point multi-touch as well as stylus support in the form of the Asus Pen with 1024 levels of pressure sensitivity.
A 360° silver-colored hinge connects the two halves of the PC and enables tablet and tent modes. The bottom half of the NovaGo holds most of the hardware of the device along with the external I/O ports. The NovaGo has a chiclet-style keyboard with flat-looking keys and the arrow keys nestled in the bottom right corner. The trackpad does appear to be fairly large though. There are two SonicMaster stereo speakers, two USB 3.1 Gen 1 (5Gbps) Type A ports, one HDMI video output, an audio combo jack, a microSD card slot, a Nano SIM slot, and a DC power input (no USB Type-C charging here unfortunately).
Internal hardware centers around the 10nm Snapdragon 835 SoC and its X16 LTE modem. The Snapdragon 835 features eight Kryo 280 64-bit ARM cores clocked at up to 2.45 GHz, an Adreno 540 GPU at 710 MHz, a Hexagon 682 DSP, support for aptX audio and the Aqstic audio codec, a Spectra 180 ISP (which seems to be underutilized here with only a 1MP webcam in play), and a platform security module. The SoC is paired with up to 8GB of RAM and 256 GB of UFS 2.0 flash storage (rated at up to 175 MB/s, or roughly 1,400 Mbps).
The NovaGo has four antennas and supports Gigabit LTE (1 Gbps down, 150 Mbps up) and dual-band 802.11ac MU-MIMO Wi-Fi. Users can use a Nano SIM or eSIM (embedded SIM) functionality to connect to their wireless carriers with the eSIM able to be set up through the Windows Store by purchasing a data plan locally when traveling. A 52 watt-hour battery allegedly keeps the NovaGo running for up to 22 hours and sitting in connected standby for up to a month. Windows 10 S is bundled with the system, but power users can upgrade to Windows 10 Pro for free until September 2018.
Hexus.net reports that the NovaGo will be available in early spring 2018 and will hit the US, UK, Italy, France, China, and Taiwan first with other countries to follow later. There are several models at play with 4GB, 6GB, and 8GB of RAM as well as 64GB, 128GB, and 256GB of UFS 2.0 storage. The base model has a MSRP of $599 and the top end SKU has a MSRP of $799.
The pricing does seem to be on the more expensive side, but these devices are aimed at mobile professionals and businesses with expense accounts, so it’s not that out of line. If the build quality is there and the battery life gets close to the lofty promises, I can see them catching on.
- AMD Partners With Qualcomm For Always Connected Ryzen Mobile PCs
- Qualcomm aptX Bluetooth Review: Improving Wireless Sound