Subject: General Tech | January 26, 2016 - 12:13 PM | Jeremy Hellstrom
Tagged: security, Lenovo, idiots
Lenovo chose the third most popular password of 2015, "12345678", to secure its ShareIT for Windows application and, for bonus points, hard-coded it, which is utterly inexcusable in this day and age. If you aren't familiar with the software, it is another Dropbox-style app that lets you share files and folders, apparently with anyone now that this password ridiculousness has been exposed. As you read on at The Inquirer the story gets even better: files are transferred in the clear without any encryption, and the app even creates an open WiFi hotspot for you, making it easier to share your files with all and sundry. There are more than enough unintentional vulnerabilities in software and hardware; we really don't need companies programming them in on purpose. If you have ShareIT, you should probably DumpIT.
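For the curious, the kind of sanity check that would have caught this is trivial to write. Here is a minimal sketch (the password list below is illustrative, drawn from widely reported 2015 rankings, and the helper is hypothetical, not anything Lenovo ships):

```python
# Reject credentials that are too short or that appear on a published
# most-common-passwords list. Any release process could run this check.
COMMON_PASSWORDS = {
    "123456", "password", "12345678", "qwerty", "12345",
    "123456789", "football", "1234", "1234567", "baseball",
}

def is_weak_password(candidate: str) -> bool:
    """Flag passwords that are too short or on the common list."""
    return len(candidate) < 12 or candidate.lower() in COMMON_PASSWORDS

print(is_weak_password("12345678"))  # the ShareIT default -> True
```

A hard-coded password fails an even more basic test, of course: anyone can pull it out of the binary with a strings dump, so no amount of complexity would have saved it.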
We received word that there is an updated version of ShareIT available for those who do use the app and would like to continue to do so.
They can also access the latest versions which are posted and available for download on the Lenovo site. The updated Android version of SHAREit is also available for download on the Google Play store. Please visit the Lenovo security advisory page for the latest information and updates: (https://support.lenovo.com/us/en/product_security/len_4058)
"HOLY COW! Lenovo may have lost its mind. The firm has created vulnerabilities in ShareIT that could be exploited by anyone who can guess that '12345678' could be a password."
Here is some more Tech News from around the web:
- Simple solution foils lithium-battery freeze @ Nanotechweb
- Terrible infections, bad practices, unclean kit – welcome to hospital IT @ The Register
- Microsoft struggles against self-inflicted Office 365 IMAP outage @ The Register
- AI pioneer Marvin Minsky passes away aged 88 @ The Inquirer
- Surface Pro 3 users are getting all kinds of borkage after bad driver drop @ The Inquirer
- Exposed HP LaserJet Printers Offer Anonymous FTP To the Public @ Slashdot
Subject: Graphics Cards | January 25, 2016 - 03:19 PM | Jeremy Hellstrom
Tagged: XFX R9 380 Double Dissipation Black Edition OC 4GB, xfx, gtx 960
In one corner is the XFX R9 380 DD Black Edition OC 4GB, sporting a factory overclock of 1170MHz core and 6.4GHz memory; in the other corner is a GTX 960 with a 1178MHz Boost clock and 7GHz memory. These two contenders will compete in a six-round 1080p match featuring Fallout 4, Project Cars, Witcher 3, GTAV, Dying Light and BF4 to see which is worthy of your hard earned buckaroos. Your referee for today will be [H]ard|OCP; tune in to see the final results.
"Today we evaluate a custom R9 380 from XFX, the XFX R9 380 DD BLACK EDITION OC 4GB. Sporting a hefty factory overclock and the Ghost Thermal 3.0 custom cooling with Double Dissipation, we compare it to an equally priced reference GeForce GTX 960. Find out which video card provides the better bargain."
Here are some more Graphics Card articles from around the web:
- ASUS STRIX R9 380X DirectCU II OC 1080p @ [H]ard|OCP
- Sapphire R9 390 Nitro 8 GB @ techPowerUp
- The OpenGL Speed & Perf-Per-Watt From The Radeon HD 2000/3000 Series Through The R9 Fury @ Phoronix
- 1080p NVIDIA Linux Comparison From GeForce 8 To GeForce 900 Series @ Phoronix
Subject: General Tech | January 25, 2016 - 01:27 PM | Jeremy Hellstrom
Tagged: input, Steam Controller
The claims seem suspect; how exactly can a Steam Controller replace a mouse and keyboard when gaming? That suspicion is being tested over at The Tech Report, which recently tried out Valve's new Steam Controller, comparing it not only to a standard PC input setup but also to an Xbox controller. For the test they used Rocket League, Team Fortress 2, Just Cause 3, and Helldivers, with mixed results. In the end the Steam Controller was just not as useful as the Logitech M570 trackball, wireless keyboard, and Xbox 360 controller the reviewer is used to. That said, with a lot of practice and time spent tweaking your input profiles you could find the Steam Controller is for you ... if you want it.
"Valve's Steam Controller is supposed to obviate the mouse and keyboard for PC gaming in the living room. We put our thumbs on the Steam Controller's twin trackpads and took it for a spin to see whether it does the job."
Here is some more Tech News from around the web:
- CMStorm Sentinel III Gaming Mouse @ Kitguru
- Cougar 450M Optical Gaming Mouse @ eTeknix
- Razer Diamondback Gaming Mouse @ eTeknix
- Rantopad MXX @ techPowerUp
- Cooler Master Quick Fire XTi Mechanical Gaming Keyboard Review @ Techgage
- COUGAR 450k Hybrid Mechanical Gaming Keyboard Review @ NikKTech
- Mad Catz S.T.R.I.K.E. M Wireless Keyboard Review @ Madshrimps
Subject: General Tech | January 25, 2016 - 12:40 PM | Jeremy Hellstrom
Tagged: wifi, camera, DIY, iot
Hack a Day has posted a perfect example of how inexpensive and easy it is to build useful things yourself instead of shopping for expensive electronics. If you have looked at the prices of cameras or adapters which allow you to wirelessly take a picture you have probably been disappointed, but you don't have to stay that way. Instead, take an existing manual remote trigger, add in a WiFi-enabled SoC module like the ESP8266 suggested in the video, download and compile the code, and the next thing you know you will have a camera with wireless focus and shutter trigger. Not too shabby for a ~$5 investment.
"It’s just ridiculous how cheap and easy it is to do some things today that were both costly and difficult just two or three years ago. Case in point: Hackaday.io user [gamaral] built a WiFi remote control for his Canon E3 camera out of just three parts"
Here is some more Tech News from around the web:
- Thought you were safe from the Fortinet SSH backdoor? Think again @ The Register
- Windows 10: Microsoft confirms results of its Sophie's Choice on chipset support @ The Register
- Five technologies you shouldn't bother looking out for in 2016 @ The Register
- Cabling horrors unplugged: Reg readers reveal worst nightmares @ The Register
Introduction and First Impressions
The new Corsair Carbide 600Q and 600C enclosures are the company's first inverted ATX designs, and the layout promises improved airflow for better cooling.
The Carbide Series from Corsair has encompassed enclosures from budget-friendly options such as the $59 Carbide 100R to high-performance options like the $159 Carbide Air 540. This new Carbide 600 enclosure is available in two versions, the 600C and 600Q, which both carry an MSRP of $149. This positions the 600C/600Q enclosures near the Graphite and Obsidian series models, but this is only fitting as there is nothing "budget" about these new Carbide 600 models.
The Carbide Series 600Q we have in for review differs from the 600C most obviously in its lack of the latter's hinged, latching side panel with its large window. But the differences extend to the internal makeup of the enclosure, as the 600Q includes significant noise damping inside the front, top, and side panels. We'll be taking a close look at noise levels along with thermal performance of this "Q" version of the new enclosure.
Subject: Graphics Cards | January 25, 2016 - 11:51 AM | Ryan Shrout
Tagged: fury x2, Fiji, dual fiji, amd
Lo and behold! The dual-Fiji card that we have previously dubbed the AMD Radeon Fury X2 still lives! Based on a tweet from AMD PR man Antal Tungler, a PC from Falcon Northwest at the VRLA convention was using a dual-GPU Fiji graphics card to power some demos.
— Antal Tungler (@coloredrocks) January 23, 2016
This prototype Falcon Northwest Tiki system was housing the GPU beast but no images were shown of the interior of the system. Still, it's good to see AMD at least recognize that this piece of hardware still exists at all, since it was initially promised to the enthusiast market by "fall of 2015." Even in October we had hints that the card might be coming soon after seeing some shipping manifests leak out to the web.
Better late than never, right? One theory floating around inside the offices here is that AMD is going to release the Fury X2 alongside the VR headsets coming out this spring, with hopes of making it THE VR graphics card of choice. The value of using multi-GPU for VR is interesting, with one GPU dedicated to each eye, though the pitfalls that could haunt both AMD and NVIDIA in this regard (latency, frame time consistency) leave the real-world benefit of the technology open to debate.
Subject: Processors | January 24, 2016 - 12:19 PM | Sebastian Peak
Tagged: Tigerlake, rumor, report, processor, process node, Intel, Icelake, cpu, Cannonlake, 10 nm
A report from financial website The Motley Fool discusses Intel's plan to introduce three architectures at the 10 nm node, rather than the expected two. This comes after news that Kaby Lake will remain at the present 14 nm, interrupting Intel's 2-year manufacturing tech pace.
(Image credit: wccftech)
"Management has told investors that they are pushing to try to get back to a two-year cadence post-10-nanometer (presumably they mean a two-year transition from 10-nanometer to 7-nanometer), however, from what I have just learned from a source familiar with Intel's plans, the company is working on three, not two, architectures for the 10-nanometer node."
Intel's first 10 nm processor architecture will be known as Cannonlake, with Icelake expected to follow about a year afterward. With Tigerlake expected to be the third architecture built on 10 nm, and not coming until "the second half of 2019", we probably won't see 7 nm from Intel until the second half of 2020 at the earliest.
It appears that the days of two-year, two product process node changes are numbered for Intel, as the report continues:
"If all goes well for the company, then 7-nanometer could be a two-product node, implying a transition to the 5-nanometer technology node by the second half of 2022. However, the source that I spoke to expressed significant doubts that Intel will be able to return to a two-years-per-technology cycle."
(Image credit: The Motley Fool)
It will be interesting to see how players like TSMC, themselves "planning to start mass production of 7-nanometer in the first half of 2018", will fare moving forward as Intel's process development (apparently) slows.
Subject: Systems | January 23, 2016 - 02:26 AM | Scott Michaud
Tagged: microsoft, surface, surface pro 4, surface book
The Microsoft Surface Book and Surface Pro 4 launched back in October, and Ryan published a review of them in December. He didn't really make reference to it, but the highest-end model of each was unavailable until a later date. As it turns out, that time is roughly now. I say “roughly” because, while Microsoft has launched the devices, Amazon's landing page doesn't list them, and searching for the product directly shows a price tag of just under $10,000. I assume Amazon hasn't pushed the appropriate buttons yet.
The only real improvement that you will see, versus the second-highest SKU, is a jump in SSD capacity from 512GB to 1TB. This extra storage will cost roughly $1/GB, but this is also a very fast NVMe SSD. If 512GB was too small and you were holding out for availability of the 1TB model, then your wait should (basically) be over.
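Using that rough $1/GB figure, the premium for the capacity bump works out as follows (a back-of-envelope sketch; actual SKU pricing may differ):

```python
# Rough premium for the 1 TB SKU over the 512 GB SKU, using the
# "roughly $1/GB" figure from the text (illustrative, not exact pricing).
extra_gb = 1024 - 512
price_per_gb = 1.0
premium = extra_gb * price_per_gb
print(premium)  # 512.0 -> roughly a $500 premium for the 1 TB model
```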
That said, since you waited this long, you might want to hold off a little longer. Microsoft is supposed to be correcting what some say are severe issues in upcoming firmware. You may want to see whether the problems are solved before dropping two-and-a-half to three grand.
My new desk mate
Earlier this month at the 2016 edition of the Consumer Electronics Show, Logitech released a new product for the gaming market that might have gone unnoticed by some. The G502 Proteus Spectrum is a new gaming mouse that takes an amazing product and makes it just a little better with the help of some RGB goodness. The G502 Proteus Core has been around for a while now and has quickly become one of the best selling gaming mice on Amazon, a testament to its quality and popularity. (It has been as high as #1 overall in recent days.)
We have been using the G502 Proteus Core in our gaming test beds at the office for some months, and during that time I often lamented that I wanted to upgrade the mouse on my own desk to one. While I waited for myself to stop being lazy and swap one in for the G402 currently in use at my workstation, Logitech released the new G502 Proteus Spectrum and handed me a sample at CES to bring home. Perfect!
| Logitech G502 Proteus Spectrum Specifications | |
|---|---|
| Resolution | 200 - 12,000 DPI |
| Max Speed | >300 IPS |
| USB Data | 16 bits/axis |
| USB Report Rate | 1000 Hz (1 ms) |
| Button rating | 20 million clicks |
| Feet rating | 250 kilometers |
| Price | $79 - Amazon.com |
The G502 Proteus Spectrum is very similar to the Core model, the only difference being the addition of RGB lighting under the G logo and the DPI resolution indicators. This allows you to use the Logitech Gaming Software to customize its color and its pattern (breathing, still or rotating), as well as pair it up and sync with the RGB lights of other Logitech accessories you might have. If you happen to own a Logitech G910 or G410 keyboard, or one of the new headsets (G633/933), then you'll quickly find yourself in color-coordinated heaven.
In the box you'll find the mouse, attached to a lengthy cable that works great even with my standing desk, and a set of five weights that you can install on the bottom if you like a heavier feel to your mousing action. I installed as many as I could under the magnetic door on the underside of the mouse and definitely prefer it. The benefit of the weights (as opposed to just a heavier mouse out of the box) is that users can customize it as they see fit.
Subject: General Tech | January 22, 2016 - 01:47 PM | Jeremy Hellstrom
Tagged: hifiman, headphones, HE-1000, audio
HiFiMAN has been producing mid-level and high-end audio products for quite some time, straddling the line between affordable and audiophile quality. The HE-1000 sits firmly at the audiophile level; at $3000 you really have to have discerning ears to want to pick up these cans. The headset is quite pretty, built with leather, wood, and aluminum, with soft cloth earcups and a window-blind design on the exterior which HiFiMAN claims has a positive effect on audio quality. techPowerUp tested these headphones out; you can read the description of their experience in the audio soundstage these headphones create in their review ... or not.
"HiFiMAN is constantly developing their planar technology, and today, we will take a look at their latest state-of-the-art headphone. It is dubbed the HE-1000 and features a nanometer thick diaphragm, leather headband, and milled aluminum. We take HiFiMAN's most audacious and pricey headphone for a ride!"
Here is some more Tech News from around the web:
- Logitech G933 Artemis Spectrum Wireless Headset @ eTeknix
- Rock Jaw Audio Alfa Genus V2 earphones @ Kitguru
- Corsair VOID USB Dolby 7.1 Gaming Headset Review @ Madshrimps
Subject: General Tech | January 22, 2016 - 12:15 PM | Jeremy Hellstrom
Tagged: microsoft, Windows Store, windows 10
If the complaints from the developers in this story over at The Register are accurate, then the problem with the Windows Store might not be that there are no good apps, but that you simply can't find them. If a developer can't find their own app in the store using keywords from its title or description, or even the keywords they submitted to the store, then how can you expect to? If the only reliable way to find an app is to know its exact name, how many apps are there in the store that no one but the developer has ever seen? It is still possible that an improved search function will not save the Windows Store, but at this point its reputation could not get all that much worse.
"Looking at the developer forums though, it seems that official guidance and assistance for this issue is not easy to find, which will not help Microsoft in its efforts to establish a strong Windows 10 app ecosystem."
Here is some more Tech News from around the web:
- Containers! Containers! Containers! And RHEL 7.2. Employ as you wish @ The Register
- Microsoft launches Office Insider programme for Mac OS X users @ The Inquirer
- Security company RSA wants your plain text Twitter log-in @ The Inquirer
- Alcatel Flash 2 Upgrades @ TechARP
Subject: Graphics Cards, Memory | January 22, 2016 - 11:08 AM | Ryan Shrout
Tagged: Polaris, pascal, nvidia, jedec, gddr5x, GDDR5, amd
Though information about the technology has been making rounds over the last several weeks, GDDR5X technology finally gets official with an announcement from JEDEC this morning. The JEDEC Solid State Technology Association is, as Wikipedia tells us, an "independent semiconductor engineering trade organization and standardization body" that is responsible for creating memory standards. Getting the official nod from the org means we are likely to see implementations of GDDR5X in the near future.
The press release is short and sweet. Take a look.
ARLINGTON, Va., USA – JANUARY 21, 2016 –JEDEC Solid State Technology Association, the global leader in the development of standards for the microelectronics industry, today announced the publication of JESD232 Graphics Double Data Rate (GDDR5X) SGRAM. Available for free download from the JEDEC website, the new memory standard is designed to satisfy the increasing need for more memory bandwidth in graphics, gaming, compute, and networking applications.
Derived from the widely adopted GDDR5 SGRAM JEDEC standard, GDDR5X specifies key elements related to the design and operability of memory chips for applications requiring very high memory bandwidth. With the intent to address the needs of high-performance applications demanding ever higher data rates, GDDR5X is targeting data rates of 10 to 14 Gb/s, a 2X increase over GDDR5. In order to allow a smooth transition from GDDR5, GDDR5X utilizes the same, proven pseudo open drain (POD) signaling as GDDR5.
“GDDR5X represents a significant leap forward for high end GPU design,” said Mian Quddus, JEDEC Board of Directors Chairman. “Its performance improvements over the prior standard will help enable the next generation of graphics and other high-performance applications.”
JEDEC claims that, while using the same signaling type as GDDR5, GDDR5X is able to double the per-pin data rate to 10-14 Gb/s. In fact, based on leaked slides about GDDR5X from October, JEDEC actually calls GDDR5X an extension to GDDR5, not a new standard. How does GDDR5X reach these new speeds? By doubling the prefetch from 32 bytes to 64 bytes. This will require a redesign of the memory controller for any processor that wants to integrate it.
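To put the per-pin numbers in perspective, here is a back-of-envelope bandwidth calculation. The 256-bit bus width is an assumption for illustration (a GTX 960-class configuration), not something the press release specifies:

```python
def memory_bandwidth_gbs(bus_width_bits: int, per_pin_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s: pins * per-pin rate / 8 bits per byte."""
    return bus_width_bits * per_pin_gbps / 8

# GDDR5 at 7 Gb/s on a hypothetical 256-bit bus:
print(memory_bandwidth_gbs(256, 7))   # 224.0 GB/s
# GDDR5X at the low end of the announced 10-14 Gb/s range, same bus:
print(memory_bandwidth_gbs(256, 10))  # 320.0 GB/s
```

Keep in mind these are peak theoretical figures; as noted below, the usable gain is likely smaller than the raw per-pin numbers suggest.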
Image source: VR-Zone.com
As for usable bandwidth, the press release doesn't quote figures directly, but the gain is likely much smaller than the per-pin numbers suggest. Because the memory bus width remains unchanged, and GDDR5X simply grabs twice the chunk size on each prefetch, we should expect an incremental change. Power efficiency is not mentioned either, and that was one of the driving factors in the development of HBM.
Performance efficiency graph from AMD's HBM presentation
I am excited about any improvement in memory technology that will increase GPU performance, but I can tell you that from my conversations with both AMD and NVIDIA, no one appears to be jumping at the chance to integrate GDDR5X into upcoming graphics cards. That doesn't mean it won't happen with some version of Polaris or Pascal, but it seems that there may be concerns other than bandwidth that keep it from taking hold.
Subject: General Tech | January 22, 2016 - 02:06 AM | Tim Verry
Tagged: tomb raider, pc gaming, min specs, can it run
Crystal Dynamics has revealed the minimum system requirements for Rise of the Tomb Raider on PC. This latest Lara Croft adventure sees the ever-resilient tomb raider following in the footsteps of her father in search of an artifact said to grant immortality amidst the lost city of Kitezh. Fortunately for gamers, Rise of the Tomb Raider has quite a low bar for entry with modest minimum system requirements. You will need more powerful hardware than its 2013 predecessor (Tomb Raider), but it is still quite manageable.
PC gamers will need a 64-bit version of Windows, a dual-core Intel Core i3-2100 (2 cores, 4 threads at 3.1 GHz) or, for example, an AMD FX 4100 processor, 6 GB of system memory, 25 GB of storage space for all the game files, and, of course, a graphics card with 2 GB of video memory such as the NVIDIA GTX 650 or AMD Radeon HD7770. Naturally, hardware with higher specifications/capabilities will get you better performance and visuals, but the above is what you will need to play.
Minimum PC System Requirements:
- Dual Core Processor (e.g. Core i3-2100 or FX 4100)
- 6 GB RAM
- 25 GB Available Storage Space
- 2 GB Graphics Card (e.g. GTX 650 or Radeon HD7770)
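For fun, checking your rig against those minimums boils down to a few comparisons. This is a hypothetical sketch; the field names and helper are made up for illustration, not from any official tool:

```python
# Rise of the Tomb Raider's published minimums, expressed as simple numbers.
MINIMUM = {"cpu_cores": 2, "ram_gb": 6, "storage_gb": 25, "vram_gb": 2}

def meets_minimum(system: dict) -> bool:
    """True if every spec in the system meets or exceeds the minimum."""
    return all(system.get(key, 0) >= value for key, value in MINIMUM.items())

print(meets_minimum({"cpu_cores": 4, "ram_gb": 8, "storage_gb": 120, "vram_gb": 2}))  # True
print(meets_minimum({"cpu_cores": 2, "ram_gb": 4, "storage_gb": 500, "vram_gb": 1}))  # False
```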
For those curious, Tomb Raider (2013) required XP SP3 32-bit, a dual core Intel Core 2 Duo E6300 or Athlon 64 X2 4050+ CPU, 1 GB of RAM (2 GB for Vista), and an NVIDIA 8600 or AMD HD2600 XT GPU with 512MB of video memory.
Rise of the Tomb Raider will reportedly add new stealth and crafting components along with new weapons and options for close quarters combat. Further, the game will feature day and night cycles with realistic weather which should make for cold snow-filled nights in Siberia as well as opportunities to sneak up on unwitting guards freezing their buns off!
The game is set to release on January 28th for the PC, following the Xbox One version that launched back in November 2015 as a timed exclusive (it will come to the PS4 later this year).
Personally, I am excited for this game. I picked up its predecessor during a Steam sale for super cheap only to let it sit in my inventory for about a year. It was one of those 'I'll play it eventually, but it's not really a priority' things where the price finally got me (heh). Little did I know how wrong I was, because once I finally got around to firing up the game, I played it near constantly until I beat it! It was a surprisingly fun reboot of the series, and I am hopeful that RotTR will be more of the same!
Subject: Cases and Cooling | January 21, 2016 - 10:44 PM | Tim Verry
Tagged: modular psu, gigabyte, ATX PSU
Gigabyte made an announcement teasing two new power supplies last week. The G750H and B700H are 80 PLUS rated models topping out at 750W and 700W respectively. Gigabyte is best known for its motherboards, so it was somewhat surprising to see the company tease power supplies, and to discover that these PSUs are not even the first to be sold under its branding.
The G750H and B700H are ATX form factor and use a semi-modular design that leaves the 24-pin ATX and 8-pin CPU power cables permanently attached and uses modular cables for all other connections (see below). One neat thing is that Gigabyte is using all-black, flat, individually sleeved cables, which may make it easier to hide and route them behind the motherboard tray (which on some cases can be an especially narrow channel). Both models are rated for SLI and Crossfire multi-GPU setups, use at least some Japanese capacitors (the G750H uses all Japanese capacitors), have an MTBF of 100,000 hours, and carry five-year warranties.
In addition to the motherboard and CPU power, users can install two eight-pin PCIe, five SATA power, three Molex, and one floppy power connector. The modular cable configuration is the same on both PSU models.
The G750H is up to 90% efficient (80+ Gold) and uses a 140mm temperature controlled fan to keep noise levels low and the internal components cool (and efficient). Gigabyte has opted for a single rail design that sees the 12V rail rated at up to 62 amps.
On the other hand, the B700H is up to 85% efficient (80+ Bronze) at typical loads. It has a smaller 120mm temperature controlled fan for cooling. This model also uses a single 12V rail, but it tops out at 54 amps.
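A quick bit of arithmetic shows how much of each unit's total capacity sits on that single 12V rail (P = V * I), which is worth knowing since modern systems draw almost everything from 12V:

```python
def rail_watts(volts: float, amps: float) -> float:
    """Power available on a rail: P = V * I."""
    return volts * amps

# Gigabyte's published single-rail 12V ratings:
print(rail_watts(12.0, 62))  # 744.0 W of the G750H's 750 W total
print(rail_watts(12.0, 54))  # 648.0 W of the B700H's 700 W total
```

In other words, nearly all of the G750H's rated capacity is deliverable over 12V, while the B700H reserves a larger share for its minor rails.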
Several sites around the Internet (including Maximum PC) have indicated that Gigabyte has made the G750H and B700H available now, but they do not seem to be for sale yet in the US. I have tried to unearth pricing as well as the identity of the ODM Gigabyte is using for these new units, but no such luck so far. From my research, it appears that Gigabyte has used a number of different ODM/OEMs of varying quality for its past power supplies. It seems that we will have to wait for reviews to know for sure how these new PSUs will perform. I hope that Gigabyte has stepped up its power supply game, as it has quite a bit of competition these days!
Subject: General Tech | January 21, 2016 - 06:39 PM | Scott Michaud
Tagged: windows 10, microsoft
I wasn't planning on reporting every Windows 10 Insider build, but I actually have something to say about this one. The last couple of builds were examples of Microsoft switching to a faster release cycle for preview users, although the known issues lists were quite benign. The worry was that semi-frequent upgrades would shake users away from the Insider program, or push them to the Slow ring.
Microsoft now seems ready to accept less QA polish for Fast ring users.
Five issues are known about build 11102. First, internal changes to the Windows Graphics architecture cause some games to crash on launch, full screen, or resolution changes. The Witcher 3, Fallout 4, Tomb Raider, Assassin's Creed, and Metal Gear Solid V are known to be affected by this bug, but other games (and maybe even other software) could be affected too. Second, screen readers and other accessibility software may crash randomly. If you require those accommodations, then this build could make your device functionally unusable to you.
Beyond those two big issues, three others are present: an error message on login (with a workaround), a breaking change for old wireless drivers that you should probably update beforehand if you rely upon wireless to download drivers, and “The Connect button does not show up in Action Center.”
Microsoft is currently updating the deep insides of the OS, which means that they will be poking around the source code in weird places. Once it's completed, this should make Windows more maintainable, especially across multiple types of hardware. But again, if you don't want to be a part of this, switch to Slow or leave the Insider program.
Subject: Mobile | January 21, 2016 - 06:14 PM | Jeremy Hellstrom
Tagged: toshiba, Satellite Radius 12
Ah, the old Toshiba Satellite; like a Volvo it was never the best nor the prettiest, but short of a major collision nothing could kill it. Since those times Toshiba has had a rough go of it; The Inquirer states the company has predicted a $4.5bn loss, just after being caught cooking the books. That has not stopped them from improving their Satellite lineup, and the Satellite Radius 12 ultraportable is a great example of that.
The screen on this 300x209x15.4mm (11.8x8.2x0.6"), 1.32kg (2.9lb), 12.1" convertible laptop is an impressive 3840x2160 IPS display which can be fully flipped open into a tablet-like form factor. An i7-6500U, 8GB of RAM and an unspecified 256GB SSD offer great performance, although battery life does suffer somewhat due to the screen and components. Toshiba also included a dedicated Cortana button, a cellphone-like volume rocker, a 0.9MP webcam and an infrared camera which works with Windows Hello but is not a RealSense camera. The Inquirer found a lot to like about this laptop as well as some fairly serious shortcomings; read about them all in their review.
"This is the latest in Toshiba's rotating display convertible line, and the first of its kind to include a so-called 4K screen, making it an interesting proposition regardless of its creator's misfortunes."
Here are some more Mobile articles from around the web:
- HP Pavilion Gaming Notebook 15-ak020NB Review @ Madshrimps
- MSI GT80S 6QF Titan (SLI GTX980’s) @ Kitguru
- Microsoft Lumia 950 XL @ The Inquirer
- ASUS ZenFone Zoom Offers 3X Optical Zoom & OIS @ Tech ARP
Subject: General Tech | January 21, 2016 - 02:34 PM | Ken Addison
Tagged: x99-m, X170, X150, video, Silent Base 800, Q4 2015, Predator X34, podcast, gigabyte, g-sync, freesync, earnings, be quiet, asus, amd, acer
PC Perspective Podcast #383 - 01/21/2016
Join us this week as we discuss the Acer Predator X34, ASUS X99-M, AMD Q4 Earnings and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store (audio only)
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:25:53
Subject: General Tech | January 21, 2016 - 12:52 PM | Jeremy Hellstrom
Tagged: Intel, intel driver update utility, security
The Intel Driver Update Utility is not the most commonly found application on PCs, but someone you know may have stumbled upon it or had it installed by Geek Squad or the local equivalent. The tool has been available since Windows Vista; it checks your system for any Intel parts, from your APU to your NIC, and then looks for any applicable drivers that are available. Unfortunately, it was doing so over a non-SSL URL, which leaves the utility wide open to a man-in-the-middle attack, and you really do not want a compromised NIC driver. The Inquirer reports today that Intel quietly updated the tool on January 19th to resolve the issue, ensuring all communication and downloads are over SSL. If you know anyone using this tool, recommend they update it immediately.
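The core of the fix is conceptually simple: never fetch an update over plain HTTP. A minimal illustrative sketch of that policy (this is not Intel's actual code; the helper and URL are hypothetical):

```python
from urllib.parse import urlparse

def safe_update_url(url: str) -> str:
    """Reject any driver-download URL that isn't HTTPS.

    A plain-HTTP URL can be intercepted and rewritten by a
    man-in-the-middle, which is exactly the flaw described above.
    """
    if urlparse(url).scheme != "https":
        raise ValueError(f"refusing insecure download URL: {url}")
    return url

print(safe_update_url("https://downloadcenter.intel.com/driver.exe"))
```

HTTPS alone isn't the whole story (the client also needs to verify the server's certificate chain), but refusing plaintext transport closes the door this utility left open.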
"Intel has issued a fix for a major security vulnerability in a driver utility tool that could have allowed a man-in-the-middle attack and a malware maelstrom on victims' computers."
Here is some more Tech News from around the web:
- Eighteen year old server trumped by functional 486 fleet! @ The Register
- Solus Project: No Longer Just A Chrome OS Alternative @ Linux.com
- Google Chrome is getting a speed boost of up to 25 percent - and soon @ The Inquirer
- E-Mail Spam Goes Artisanal @ Slashdot
- Facebook Messenger: All your numbers are belong to us @ The Register
- IRS 'inadvertently' wiped hard drive Microsoft demanded in audit row @ The Register
- Synaptics IronVeil Fingerprint Security Technology @ eTeknix
- Through The Looking Glass - Questions Of VR's Viability On The PC @ Techgage
Introduction and First Impressions
The Scythe Ninja 4 (SCNJ-4000) is the latest model in the Ninja series, and an imposing air cooler with dimensions similar to Noctua's massive NH-D14. But there's more to the story than size, as this is engineered for silence above all else. Read on to see just how quiet it is, and of course how well it's able to cope with CPU loads.
"The Ninja 4 is the latest model in the Ninja CPU Cooler Series, developed for uncompromising performance. It features the new T-M.A.P.S technology, an optimized alignment of heatpipes, and the back-plate based Hyper Precision Mounting System (H.P.M.S) for firm mounting and easy installation procedure. These improvements and a special, adjustable Glide Stream 120mm PWM fan result in an increased cooling performance while reducing the weight compared to his predecessor. Also the design of the heat-sink allows fan mounting on all four sides. This enables the optimal integration of the Ninja 4 in the air flow of the pc-case and reduces turbulence and the emergence of hotspots."
The Ninja 4 is built around a very large, square heatsink, which allows the single 120 mm fan to be mounted on any side, and this PWM fan offers three speed settings to further control noise. And noise is really what the Ninja is all about, with some really low minimum speeds possible on what is a very quiet Scythe fan to begin with.
Will a single low-speed fan design affect the ability to keep a CPU cool under stress? Will the Ninja 4's fan spin up and become less quiet under full load? These questions will soon be answered.
Subject: General Tech | January 21, 2016 - 02:59 AM | Scott Michaud
Tagged: google, chrome
Web browsers are typically on rapid release cycles so they can get features out frequently. The Web is changing on a constant basis as it becomes an effective application platform, one that is cross-compatible between competing implementations. A common complaint is that the rapid cycle exists to yield high version numbers for marketing, giving a false sense of maturity, but I'd expect that frequent, breaking changes are kind-of necessary to synchronize features between implementations. If Google lands a feature a month after Mozilla publishes a new version, should they really wait two years for their next one? Granted, they probably knew about it pre-release, but you get the idea. Also, even if the theory is true, artificially high version numbers are one of the most benign things a company could do.
Some versions introduce fairly interesting features, though. This one, Google Chrome 48, deprecates RC4 encryption for HTTPS, which forces web servers to use newer ciphers; sites that offer only RC4 will simply fail to load.
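For those curious what this policy looks like in code, here's a minimal Python sketch that configures a TLS context to refuse RC4, much as Chrome 48 now does. Note that modern OpenSSL builds already exclude RC4 from the default cipher list; the explicit `!RC4` just makes the intent visible:

```python
import ssl

# Build a client context that, like Chrome 48, will not negotiate RC4.
ctx = ssl.create_default_context()
ctx.set_ciphers("DEFAULT:!RC4")

# None of the ciphers this context will offer should involve RC4.
rc4_ciphers = [c["name"] for c in ctx.get_ciphers() if "RC4" in c["name"]]
print(rc4_ciphers)  # []
```

A server that only supports RC4 would fail the handshake against this context, which is exactly the "fail to load" behavior Chrome users will see.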
Another major one, and probably more interesting for our audience, is the introduction of VP9 to WebRTC. This video codec is Google's open competitor to H.265. At similar quality settings, VP9 will use about half the bandwidth (or storage) of VP8. WebRTC is mostly used for video conferencing, but it's really an open platform for webcam, microphone, audio, video, and raw, peer-to-peer data connections. There are even examples of it being used to synchronize objects in multiplayer video games, which has nothing to do with video or audio streaming. I'm not sure what is possible with this support, but it might even lead to web applications that can edit video.
Google Chrome 48 is available today. As a related note, Firefox 44 should release next week with its own features, like experimental offscreen, multi-threaded rendering of WebGL images. The full changelog for Google Chrome 48 from Git is about 42 MB in size and, ironically, tends to crash Firefox.