Subject: General Tech | November 9, 2016 - 01:10 PM | Jeremy Hellstrom
Tagged: hack, iot, philips, hue
If you were hoping to drive someone a wee bit crazy by remotely controlling their light bulbs you have probably missed your opportunity, as Philips has patched the vulnerability. That is a good thing, as it was a very impressive flaw. Security researchers discovered a vulnerability in the ZigBee system used to control Philips Hue smart light bulbs, and they did not need to be anywhere near the lights to exploit it. Using a drone from over 1000 feet away they broke into the system to make the lights flash and, even worse, were able to ensure that the bulbs would no longer accept firmware updates, making their modifications permanent. Unpatched systems could be leveraged to turn all the lights off permanently, or to start an unexpected disco light show if you wanted to be creative. You can pop by Slashdot for a bit more information on how this was carried out.
"Researchers were able to take control of some Philips Hue lights using a drone. Based on an exploit for the ZigBee Light Link Touchlink system, white hat hackers were able to remotely control the Hue lights via drone and cause them to blink S-O-S in Morse code. The drone carried out the attack from more than a thousand feet away."
Here is some more Tech News from around the web:
- NASA Puts its 3D Models Up on GitHub @ Hack a Day
- Google to patch Chrome mobile hole after bank trojan hits 318k users @ The Register
- Microsoft prises open Azure containers, pours in a little Kubernetes @ The Register
- TSMC board approves US$4.91 billion for capacity expansion @ DigiTimes
- Microsoft launches Skype Insiders Programme - but don't tell anyone @ The Inquirer
- Tobii Tracker 4C Review @ OCC
Subject: Cases and Cooling | November 8, 2016 - 04:28 PM | Jeremy Hellstrom
Tagged: supernova, evga, 80 Plus Gold, modular psu, SuperNOVA G3
Lee has reviewed more than a few EVGA SuperNOVA PSUs over the years and they tend to win the Gold, as they are very well made. The new G3 models are a bit smaller in volume than the previous G2 PSUs at 85mm high, 150mm wide and 150mm long; previously they tended to be 165mm long. The smaller size also means the Hydraulic Dynamic Bearing fan shrinks to 130mm, and it continues to support EVGA ECO mode for those who desire quiet operation. There is a seven year warranty on the 550W and 650W models, while the higher-wattage models are covered for a full decade. We don't have a review up yet, so keep an eye out for that; in the meantime have some PR.
November 8th, 2016 - EVGA SuperNOVA power supplies are well-known for their extreme efficiency, performance and reliability. In fact, over the last 3 years EVGA SuperNOVA power supplies have won over 70 awards from leading review sites. EVGA’s dedication to performance has forged the latest power supply platform: the EVGA SuperNOVA G3 Series. With these new power supplies, EVGA took the best features of the award-winning G2 lineup and made them even better. The SuperNOVA G3’s smaller size, improved performance and a new Hydraulic Dynamic Bearing fan give you ultra-quiet performance with an increased lifespan.
Small Size, Big Performance
A reduced size does not mean reduced performance with the EVGA SuperNOVA G3 power supplies. At only 150mm long, these are some of the smallest power supplies on the market today, while offering improved performance and features.
Whisper Silent Hydraulic Dynamic Bearing Fan
Better performance, quieter operation and a longer lifespan. EVGA ECO mode gives you silent operation at low to medium loads.
Next Gen Performance
Next-generation systems deserve next-generation performance. The EVGA SuperNOVA G3 takes the already efficient G2 series and makes it even better with improved efficiency and lower ripple. These power supplies are over 91% efficient!
Fully Modular Design
Use only the cables you need, reducing cable clutter and improving case airflow.
Subject: Displays | November 8, 2016 - 03:44 PM | Jeremy Hellstrom
Tagged: ZeroFrame, ips, Acer BE270U, acer, 75hz, 1440p
Acer has just released a new model in their ZeroFrame series of displays, the BE270U. It is a 27" IPS 100% sRGB display at 1440p resolution with a 75Hz refresh rate and a 6ms response time. For connectivity you can choose between a pair of Mobile High-Definition Link (MHL) ports for charging and displaying from portable devices, DisplayPort 1.2 in and out (which allows you to chain displays), MiniDP and a USB 3.0 hub (one up and four down). Two 2W speakers are slipped into the display as well, if you desire to make use of them, along with Picture-in-Picture functionality. The display is aimed at content creators and other professionals but is still capable of offering decent performance when gaming.
It is available for $500 from Acer. PR after the picture.
SAN JOSE, Calif. (Nov. 8, 2016) Acer America today announced the U.S. availability of the Acer BE270U monitor, boasting stunning images with a brilliant WQHD 2560x1440@75Hz resolution and a ZeroFrame design in a 27-inch panel.
This premium, feature-rich display is ideal for graphics professionals, such as art directors, photographers, videographers and website designers as well as enthusiasts who enjoy photography and video editing as hobbies. Thanks to an IPS panel, wide 178-degree viewing angles enhance visual collaboration with others on joint projects or simply sharing photos and videos with friends.
“Our newest monitor delivers gorgeous images in a feature-rich, ergonomic design for customers wanting a no-compromise viewing experience,” said Ronald Lau, Acer America director – stationary computing. “In addition to optimizing viewing comfort, it provides superb ability for multi-tasking with multi-streaming and picture-in picture capability in an energy-efficient design.”
Picture-in-Picture, Multi-Streaming Increase Productivity
Picture-in-picture capability allows customers to watch a movie or video while working. Multi-stream technology supports up to three additional monitors through a single cable leveraging a video hub or daisy-chainable displays via DisplayPort. Thanks to the ZeroFrame design, it provides seamless viewing among all linked monitors.
The Acer BE270U meets the highest standards for color accuracy with 100 percent sRGB coverage and 6-axis color adjustment. It boasts a 16:9 aspect ratio, a 350 cd/m2 brightness and 16.07 million colors. A crisp 100,000,000:1 maximum contrast ratio and a 6ms response time contribute to the stunning picture quality. Acer EyeProtect may help reduce eye fatigue, incorporating several features that take into consideration prolonged usage by heavy users such as programmers, writers and graphic designers. An ergonomic, multi-function ErgoStand tilts from -5 to 35 degrees, swivels up to 60 degrees, adjusts in height up to 5.9 inches and pivots 90 degrees. A quick release design lets users separate the monitor from its stand, so it can be VESA wall-mounted to conserve desk space.
This practical monitor provides an array of connectivity options including MHLx2 for charging portable devices, DisplayPort (v1.2), Mini DisplayPort, DisplayPort out+SPK, and a USB 3.0 hub (1 up/4 down) for connecting multiple peripherals simultaneously. Two 2W speakers deliver quality audio.
In addition to being ENERGY STAR and TCO 7.0 qualified, the Acer BE270U monitor is EPEAT Gold registered, the highest level of EPEAT registration available. Mercury-free and LED-backlit, the Acer BE270U reduces energy costs by consuming less power than standard CCFL-backlit displays, making it safer for the environment.
Pricing and Availability
The new Acer BE270U monitor is offered through online channel partners in the United States with a three-year limited parts and labor warranty. Estimated selling prices begin at $499.99.
Subject: General Tech | November 8, 2016 - 01:59 PM | Jeremy Hellstrom
Tagged: long term storage, nitrogen vacancy, diamond
Atomic impurities in diamonds, specifically negatively charged nitrogen vacancy centres, could be used for extremely long term storage. Researchers have used optical microscopy to read, write and reset the charge state and spin properties of those defects. This means you could store data, in three dimensions, within these diamonds almost perpetually. There is one drawback: because the storage medium uses light, similar to a Blu-ray or other optical media, exposure to light can degrade the storage over time. You can read more about this over at Nanotechweb.
"The nitrogen vacancy (NV) centre can be used for long-term information storage. So say researchers at City University of New York–City College of New York who have used optical microscopy to read, write and reset information in a diamond crystal defect."
Here is some more Tech News from around the web:
- Orange Pi Releases Two Boards @ Hack a Day
- The Sega Genesis Is Officially Back In Production @ Slashdot
- What the Dell? NAND flash drought hits Texan monster – sources @ The Register
- 'F*cking crap' aside, Linus Torvalds says Linux 4.9 is coming along nicely @ The Register
- Google plugs Gmail vulnerability that allowed hackers to post from your account @ The Inquirer
- Samsung smartphone explodes in France, and it's not the Galaxy Note 7 @ The Inquirer
- Engineered spinach detects explosives @ Nanotechweb
- Archeer Foldable Solar Charger @ Benchmark Reviews
Subject: General Tech | November 8, 2016 - 03:00 AM | Scott Michaud
Tagged: voco, stylit, premiere pro, clovervr, audition, Adobe
At their annual MAX show, Adobe hosts a keynote called “Sneak Peeks”. Some of these contain segments that are jaw-dropping. For instance, there was an experimental plug-in at Adobe MAX 2011 that analyzed how a camera moved while its shutter was open, and used that data to intelligently reduce the resulting motion blur in the image. Two years later, the technology made its way into Photoshop. If you're wondering, the shadowy host on the right was Rainn Wilson from the US version of The Office, which should give some context to the humor.
While I couldn't find a stream of this segment as it happened, Adobe published three videos after-the-fact. The keynote was co-hosted by Jordan Peele and, while I couldn't see her listed anywhere, I believe the other co-host is Elissa Dunn Scott from Adobe. ((Update, November 8th @ 12pm EST: Turns out I was wrong, and it was Kim Chambers from Adobe. Thanks Anonymous commenter!))
The first (and most widely reported on) is VoCo, which is basically an impressive form of text-to-speech. Given an audio waveform of a person talking, you are able to make edits by modifying the transcript. In fact, you can even write content that wasn't in the original recording, and the plug-in will synthesize it based on what it knows of that person's voice. They claim that about 20 minutes of continuous speech is required to train the plug-in, so it's mostly for editing bloopers in audio books and podcasts.
In terms of legal concerns, Adobe is working on watermarking and other technologies to prevent spoofing. Still, it proves that the algorithm is possible (and on today's hardware) so I'm sure that someone else, if they weren't already working on it, might be now, and they might not be implementing the same protections. This is not Adobe's problem, of course. A company can't (and shouldn't be able to) prevent society from inventing something (although I'm sure the MPAA would love that). They can only research it themselves, and be as ethical with it as they can, or sit aside while someone else does it. Also, it's really on society to treat the situations correctly in the first place.
Moving on to the second demo: Stylit. This one is impressive in its own way, although not quite as profound. Basically, using a 2D drawing of a sphere, an artist can generate a material that can be applied to a 3D render. Using whatever they like, from pencil crayons to clay, the image will define the color and pattern of the shading ramp on the sphere, the shadow it casts, the background, and the floor. It's a cute alternative to mathematically-generated cel shading materials, and it even works in animation.
I guess you could call this a... 3D studio to the MAX... ... Mayabe?
The Stylit demo is available for free at their website. It is based on CUDA, and requires a fairly modern card (they call out the GTX 970 specifically) and a decent webcam (C920) or Android smartphone.
Lastly, CloverVR is an Adobe Premiere Pro interface in VR. This will seem familiar if you were following Unreal Engine 4's VR editor development. Rather than placing objects in a 3D scene, though, it helps the editor visualize what's going on in their shot. The on-stage use case is to align views between shots, so someone staring at a specific object will cut to another object without needing to correct with their head and neck, which is unnecessarily jarring.
Annnd that's all they have on their YouTube at the moment.
Subject: General Tech | November 7, 2016 - 09:07 PM | Scott Michaud
Tagged: valve, steam, pc gaming
As we mentioned last week, Valve was working on a major refresh of the Steam homepage, with a heavy emphasis on letting users find products that interest them. This update is now live, and will be presented to you the next time you load (or reload) the store page. They also have a banner link, right near the top, that highlights changes, including a few they've already made over the course of 2016.
One glaring thing that I note is the “Recently Viewed” block. There doesn't seem to be a way to disable this or otherwise limit the amount of history that it stores. While this is only visible to your account, which should be fairly obvious, it could be a concern for someone who shares a PC or streams regularly. It's not a big issue, but it's one that you would expect to have been considered.
Otherwise, I'd have to say that the update looks better. The dark gray and blue color scheme seems a bit more consistent than it was, and I definitely prefer the new carousel design.
What do you all think?
Subject: Graphics Cards | November 7, 2016 - 06:00 PM | Scott Michaud
Tagged: nvidia, graphics drivers
Update, November 7th @ 5:25pm EST:
First, NVIDIA gave Ryan their official statement, which I included below verbatim.
GeForce Experience collects data to improve the application experience; this includes crash and bug reports as well as system information needed to deliver the correct drivers and optimal settings. NVIDIA does not share any personally identifiable information collected by GeForce Experience outside the company. NVIDIA may share aggregate-level data with select partners, but does not share user-level data. The nature of the information collected has remained consistent since the introduction of GeForce Experience 1.0. The change with GeForce Experience 3.0 is that this error reporting and data collection is now being done in real-time.
They also pointed to their GeForce Experience FAQ.
It sounds like there's a general consensus, both from NVIDIA and even their harshest critics, that telemetry only affects GeForce Experience, and not their base driver. I still believe that there should be a more granular opt-out that still allows access to GeForce Experience, like the install-time checkboxes that web browsers and Visual Studio prompt with. Still, if this concerns you (and, like Windows 10, it might not, and that's okay), you can remove GeForce Experience.
Also, GamersNexus yet again did a very technical breakdown of the situation. I think they made an error, though, since they claimed to have recorded traffic "for about an hour", which may not have included the once-per-day reporting time from Windows Task Scheduler. (My image below suggests, at least for my system, monitoring once per hour but reporting at 12:25pm and on user login.) I reached out to them on Twitter for clarification, but it looks like they may have just captured GeForce Experience's typical traffic.
Update, November 7th @ 7:15pm EST: Heard back from GamersNexus. They did check at the Windows Task Scheduler time as well, and they claim that they didn't see anything unusual. They aren't finished with their research, though.
Original news, posted November 6th @ 4:25pm EST, below.
Over the last day, users have found NVIDIA Telemetry Monitor added to Windows Task Scheduler. We currently don't know what it is or exactly when it was added, but we do know its schedule. When the user logs in, it runs an application that monitors... something... once every hour while the computer is active. Then, once per day (at just after noon on my PC) and once on login, it runs an application that reports that data, which I assume means sends it to NVIDIA.
Before we begin, NVIDIA (or anyone) should absolutely not be collecting data from personal devices without clearly explaining the bounds and giving a clear option to disable it. Lots of applications, from browsers to software development tools, include crash and error reporting, but they usually and rightfully ask you to opt-in. Microsoft is receiving a lot of crap for this practice in Windows 10, even with their “Basic” option, and, while most of those points are nonsense, there is ground for some concern.
I've asked NVIDIA if they have a statement regarding what it is, what it collects, and what their policy will be for opt-in and opt-out. I haven't received a response yet, because I sent it less than an hour ago on a weekend, but we'll keep you updated.
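If you want to check your own machine, `schtasks /Query /FO CSV /NH` will dump every scheduled task as name, next run time, and status; a quick filter along these lines will surface anything NVIDIA-related (the sample rows below are placeholders I made up for illustration, not the actual task names):

```python
import csv
import io

# Stand-in for the output of: schtasks /Query /FO CSV /NH
# (the task names on a real system will differ)
sample = io.StringIO(
    '"\\NvTelemetry\\Monitor","11/7/2016 1:00:00 PM","Ready"\n'
    '"\\Microsoft\\Windows\\Defrag\\ScheduledDefrag","N/A","Ready"\n'
)

# Keep only rows whose task name mentions "nv"
nvidia_tasks = [row for row in csv.reader(sample) if "nv" in row[0].lower()]
for name, next_run, status in nvidia_tasks:
    print(name, next_run, status)
```

On a real system you would pipe `schtasks` output into the same filter instead of the hard-coded sample.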
Subject: Mobile | November 7, 2016 - 03:04 PM | Jeremy Hellstrom
Tagged: msi, GS73 6RF Stealth Pro, 4k, GTX 1060M
You have two choices of display when purchasing an MSI GS73 6RF Stealth Pro: a 120Hz 1080p panel, which supports neither FreeSync nor G-SYNC, or a 4K display. It is the 4K version which Kitguru has reviewed, powered by the mobile version of the GTX 1060, an i7-6700HQ and 16GB of DDR4-2400. Storage is handled by a PCIe-based M.2 SSD as well as an HDD for extra capacity. Kitguru loved the look of the panel, but unfortunately the 1060M just doesn't have the power to game at that resolution; it also came with more third-party software than they would have liked, but that did not ruin it for them. Check out the full review here.
"MSI have been producing a fine line of gaming-oriented laptops for the last couple of years and today we look at their latest super slimline 17 inch model which features a Core i7 processor, Nvidia GTX 1060 graphics, and a 4k IPS panel along with Steelseries keyboard and Killer networking."
Here are some more Mobile articles from around the web:
- Surface 4 Pro - A Real Laptop Replacement @ Hardware Secrets
- Microsoft's Surface Studio desk-slab, Dial knob, Surface Book: We get our claws on new kit @ The Register
- The honor 8 Aurora Glass Smartphone @ TechARP
Subject: General Tech | November 7, 2016 - 01:24 PM | Jeremy Hellstrom
Tagged: contest, Dtto
This year there were over 1000 entries to the Hackaday Prize; entries needed to be new projects which exemplified the five themes of the contest: Assistive Technologies, Automation, Citizen Scientist, Anything Goes, and Design Your Concept. The top prize winner is a modular robot, made from 3D printed parts, servo motors, magnets, and electronics you can easily source. There was also a Reflectance Transformation Imaging project to photograph a fixed object in varying light conditions, an optics bench for making science projects involving light much easier to set up, and new high-resolution tilt sensors and stepper motors. Check out the projects over at Hack a Day; they include notes on how to replicate these builds yourself.
"Dtto, a modular robot designed with search and rescue in mind, has just been named the winner of the 2016 Hackaday Prize. In addition to the prestige of the award, Dtto will receive the grand prize of $150,000 and a residency at the Supplyframe Design Lab in Pasadena, CA."
Here is some more Tech News from around the web:
- Microsoft has ramped up the pop-up ads in Windows 10 again @ The Inquirer
- VicoVation Vico-MF3 Extreme Car Camcorder Review @ NikKTech
- The Linux Foundation Issues 2016 Guide to Open Source Cloud Projects @ Linux.com
- NikKTech & AeroCool / Thunder X3 Be Very Cool WorldWide Giveaway
Subject: Graphics Cards | November 7, 2016 - 09:32 AM | Josh Walrath
Tagged: WX 7100, WX 5100, WX 4100, workstation, radeon pro, radeon, quadro, Polaris, amd
The professional card market is a lucrative one. For many years NVIDIA has had a near-stranglehold on it with their Quadro series of cards. Offering features and extended support far beyond that of their regular desktop cards, Quadros became the go-to cards for many professional applications. AMD has not been overlooking this area, though, and has a history of professional cards that also include features and support not seen in the standard desktop arena. AMD has slowly been chipping away at Quadro’s marketshare, and they hope that today’s announcement will further that particular goal.
It has now been around five months since the initial release of the Polaris based graphics cards from AMD. Featuring the 4th generation GCN architecture and fabricated on Samsung’s latest 14nm process, the RX 4x0 series of chips has proven to be a popular option in the sub-$250 range of cards. While these products may not have been the slam-dunk that many were hoping for from AMD, they have kept the company competitive in terms of power and performance. AMD has also seen a positive impact from the sales of these products on the overall bottom line.
Today AMD is announcing three new professional cards based on the latest Polaris based GPUs. These range in power and performance from a sub 50 watt part up to a very reasonable 130 watts. These currently do not feature the SSD that was shown off earlier this year.
The lowest end offering is the Radeon Pro WX 4100. This is a low profile, single slot card that consumes less than 50 watts. It features 1024 stream units, which is greater than that of the desktop RX 460’s 896. The WX 4100 features 2.4 TFLOPS of performance while the RX 460 is at 2.2 TFLOPS. AMD did not specify exactly what chips were used in the professional cards, but the assumption here is that this one is a fully enabled Polaris 11.
The power consumption of this card is probably the most impressive part. Also of great interest is the DP 1.4 support and the four outputs. Finally the card supports 5K monitors at 60 Hz. This is a small, quiet, and cool running part that features the entire AMD Radeon Enterprise software support of the professional market.
The next card up is the Pro WX 5100. This features a sub-75 watt GPU with 1792 stream units. We guess that this chip is a cut-down Polaris 10. On the desktop side it is similar to the RX 470, but that particular card features more stream units and a faster clockspeed. The RX 470 is rated at 4.9 TFLOPS while the WX 5100 is at 3.9 TFLOPS. Fewer stream units and a lower clockspeed allow it to hit that sub-75 watt figure.
It supports the same number of outputs as the 4100, but they are full sized DP. The card is full sized but still only single slot due to the very conservative TDP.
The final card is the WX 7100. This is based on the fully enabled Polaris 10 GPU and is physically similar to the RX 480. They both feature 2304 stream units, but the WX 7100 is slightly clocked down from the RX 480 as it features 5.7 TFLOPS of performance vs. 5.8 TFLOPS. The card is rated below 130 watts TDP which is about 20 watts lower than a standard RX 480. AMD did not explain to us how they were able to lower the TDP of this card, but it could be simple binning of parts or an upcoming revision of Polaris 10 to improve thermals.
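The TFLOPS figures quoted above follow directly from shader count and clock, since peak FP32 throughput for GCN is stream units × 2 ops (fused multiply-add) × clock. A quick sanity check of what clocks the rated numbers imply (the derived clocks are my arithmetic, not AMD-published specs):

```python
def peak_tflops(stream_units, clock_mhz):
    """Peak FP32 rate: 2 ops (one FMA) per stream unit per clock."""
    return stream_units * 2 * clock_mhz * 1e6 / 1e12

def implied_clock_mhz(stream_units, rated_tflops):
    """Invert the peak-rate formula to estimate the clock from a TFLOPS rating."""
    return rated_tflops * 1e12 / (stream_units * 2 * 1e6)

# WX 7100: 2304 stream units rated at 5.7 TFLOPS
print(round(implied_clock_mhz(2304, 5.7)))  # 1237 (MHz)
# RX 480: same shader count, rated at 5.8 TFLOPS
print(round(implied_clock_mhz(2304, 5.8)))  # 1259 (MHz)
```

That small clock gap is consistent with the article's note that the WX 7100 is slightly clocked down from the RX 480.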
This card is again full sized but single slot. It features the same 4 DP connectors as the WX 5100 and the full monitor support that the 1.4 standard entails.
These products should see initial availability this month, though plans may of course change and they could be introduced slightly later. Currently the 7100 and 4100 are expected after the 10th, while the 5100 should show up on the 18th.
AMD is also releasing the Radeon Pro Software. This is essentially their professional driver development effort that improves upon features, stability, and performance over time. AMD aims to release new drivers for this market on the fourth Thursday of each quarter.
This is certainly an important area for AMD to address with their new cards and this updated software scheme. NVIDIA has made a pretty penny over the years from their Quadro stack due to the extremely robust margins for these cards. The latest generation of AMD Radeon Pro WX cards look to stack up favorably against the latest products from NVIDIA.
The WX 7100 will come in at a $799 price point, while the WX 5100 and WX 4100 will hit $499 and $399 respectively.
Subject: Mobile | November 7, 2016 - 07:01 AM | Scott Michaud
Tagged: VR, google, daydream
Now that Android 7.0 Nougat is beginning to ship on phones, and the holidays are around the corner, Google is releasing accessories to support it. The Daydream View, which you can dock phones into for VR purposes, will be on sale on November 10th. The viewer will cost $79 USD, minus the phone of course, and can be purchased in the US, Canada, UK, Germany, and Australia.
Currently, the only Daydream-compatible phone is Google's Pixel. As mentioned before, Nougat is only beginning to ship on phones, and the OS is required for this feature. One thing that's not entirely clear to me, after reading several sites, is whether all Daydream-compatible phones will be able to use this specific viewer, or if some will need a different chassis. You would think that variations in attributes like screen size might complicate things, but we know it will support other, non-Pixel phones; I'm just not clear whether it's some or all.
Anyway, that concern aside, it's almost cheap enough to be a “why not?” addition to any phone that is compatible. It certainly will not put the HTC Vive PC-based VR system out of business, but I'm interested in how it works with mobile content, especially linear video, going forward.
Subject: Systems, Mobile | November 6, 2016 - 07:00 AM | Scott Michaud
Tagged: Nintendo, nes, Cortex A7, arm, Allwinner
It looks like Peter Brown, Senior Reviews Editor at GameSpot, received an NES Classic and promptly disassembled it for a single photo. From there, users on Reddit searched the component model numbers and compiled specifications. According to their research, the system (unless Nintendo made multiple, interchangeable models) is based on an Allwinner R16 SoC, which has four ARM Cortex A7 cores and an ARM Mali 400 MP2 GPU. Attached to this is 256MB of DDR3 RAM and 512 MB of flash.
Image Credit: Peter Brown
Thankfully, the packaging of each chip has quite large, mostly legible branding, so it's easy to verify.
In terms of modern phone technology, this is about the bottom of the barrel. The Allwinner R16 should be roughly comparable to the Raspberry Pi 2, only that system has about four times the RAM of Nintendo's. This is not a bad thing, of course, because its entire goal is to emulate a device that was first released in 1983 (in Japan), albeit at high resolution. Not all of the games will be free for them to include, either. Mega Man 2, PAC-MAN, Final Fantasy, Castlevania 1 and 2, Ninja Gaiden, Double Dragon II, Bubble Bobble, Tecmo Bowl, Super C, and Galaga are all from third-party publishers, who will probably take some cut of sales.
Users are claiming that it doesn't look like it could be updated. Counting the ports, it doesn't look like there's any way in, but I could be wrong. That said, I never expected it to be upgradeable so I guess that's that?
The NES Classic Edition goes on sale on November 11th for $59.99 USD MSRP.
Subject: General Tech | November 6, 2016 - 02:11 AM | Scott Michaud
If you're using the free version of LastPass to guard your passwords, you can now access your vault through the mobile app for free. Previously, a subscription to LastPass Premium, which is about $12 per year, was required to use the service outside of desktop browser extensions and the website. The subscription service still exists for its other benefits, like sharing your vault with up to five other users, two-factor authentication through YubiKey, or securely storing 1GB of files (which could be good for things like encryption keyfiles to personal web servers).
The native Windows “LastPass for Applications” is still Premium-class, though.
If you are still interested in LastPass Premium, they are participating in the Humble Lifehacker Software Bundle. Users who do not currently have LastPass Premium, who also pay more than (currently) $7.64 USD, will get 12 months. They will also get DisplayFusion and CyberGhost VPN, as well as everything in the $1 tier, like a couple of Stardock enhancements for Windows.
Subject: General Tech | November 6, 2016 - 12:00 AM | Scott Michaud
Tagged: left 4 dead, pc gaming
The first Left 4 Dead was developed by Turtle Rock Studios during their brief time as a Valve subsidiary. At some point, they were working on a fifth campaign that was intended to bridge the gap between Dead Air and Blood Harvest. Players start at the landed plane and work their way toward a hydroelectric dam. One section, contrary to the game's theme, was apparently designed to encourage survivors to split up, some in a sniper tower and others running through a trench.
Image Credit: Wikipedia
Turtle Rock Studios apparently streamed gameplay of it at some point. Since then, they have published the campaign as a Left 4 Dead mod. It's not finished, and apparently quite glitchy (although I haven't played it yet), but it's free.
Check out the post at their forums, which has a link to concept material and the mod itself.
Subject: Graphics Cards | November 5, 2016 - 08:19 PM | Scott Michaud
Tagged: linux, DOTA 2, valve, nvidia, vulkan, opengl
Phoronix published interesting benchmark results for OpenGL vs Vulkan on Linux, across a wide spread of thirteen NVIDIA GPUs. Before we begin, the CPU they chose was an 80W Intel Xeon E3-1280 v5, which fits somewhere between the Skylake-based Core i7-6700k and Core i7-6700 (no suffix). You may think that Xeon v5 would be based on Broadwell, but, for some reason, Intel chose the E3-1200 series to be based on Skylake. Regardless, the choice of CPU will come into play.
They will apparently follow up this article with AMD results.
A trend arose throughout the whole article. At 1080p, everything, from the GTX 760 to the GTX 1080, was rendering at ~101 FPS on OpenGL and ~115 FPS on Vulkan. The obvious explanation is that the game is 100% CPU-bound on both APIs, but Vulkan is able to relax the main CPU thread enough to squeeze out about 14% more frames.
The thing is, the Xeon E3-1280 v5 is about as high-end of a mainstream CPU as you can get. It runs the most modern architecture and it can achieve clocks up to 4 GHz on all cores. DOTA 2 can get harsh on the CPU when a lot of units are on screen, but this is still surprisingly low. Then again, I don't have any experience running DOTA 2 benchmarks, so maybe it's a known thing, or maybe even a Linux-version thing?
Moving on, running the game at 4K, the results get more interesting. In GPU-bound scenarios, NVIDIA's driver shows a fairly high performance advantage on OpenGL. Basically all GPUs up to the GTX 1060 run at a higher frame rate in OpenGL, only switching to Vulkan's favor with the GTX 1070 and GTX 1080, where OpenGL hits that 101 FPS ceiling and Vulkan goes a little above.
Again, it will be interesting to see how AMD fares against this line of products, both in Vulkan and OpenGL. Those results will apparently come “soon”.
Subject: General Tech | November 5, 2016 - 04:09 PM | Scott Michaud
Tagged: Windows 7, windows 10, microsoft
For the second month in a row, NetMarketShare are reporting that Windows 7 is gaining market share faster than Windows 10. It's difficult to know exactly what this means, and for whom, but one possible explanation is that users upgraded to Windows 10 and then rolled back to 7 in significant numbers. It will be interesting to monitor the next couple of months, now that Windows 7 is no longer available at retail, to see how its market share shifts. Then, a few months after that, we'll need to see how Zen and Kaby Lake, which will not be supported on Windows 7 and Windows 8.x, change that further.
I'll now spend the rest of the post discussing statistics... because I can visualize the comments.
NetMarketShare records browser identification strings from partnered websites. As you would expect, there's a bit of controversy regarding how accurate their numbers are. Some of this criticism is simply wrong, usually misunderstanding how small a truly random sample can be while still converging to the same ratios you would see in the full population. Just a thousand truly random samples can get you within a few percentage points of the true figure, even across hundreds of millions of people. Studies like this, if they are truly random, have more than enough data to produce a very precise ratio.
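The "thousand samples" claim comes straight from the standard margin-of-error formula for a proportion, which you can sketch in a few lines (illustrative numbers only, not NetMarketShare's actual methodology):

```python
import math

# 95% confidence margin of error for a simple random sample of size n:
#   moe = z * sqrt(p * (1 - p) / n),  z = 1.96 for 95% confidence.
# Note the population size does not appear: once the population is much
# larger than n, only the sample size matters.
def margin_of_error(n, p=0.5, z=1.96):
    """Worst case is p = 0.5, where the variance p*(1-p) peaks."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (1_000, 10_000, 100_000):
    print(f"n = {n:>7,}: +/- {margin_of_error(n):.2%}")
```

With n = 1,000 the worst-case margin is about ±3%, which is exactly the "within a few percent" ballpark — but notice how many samples you would need before comparisons at a hundredth of a percent become meaningful.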
A valid concern, however, is whether their pool of websites under- or over-represents certain groups, especially when you attempt to make comparisons on the order of a hundredth of a percent. NetMarketShare claims to aim for global representation, including government websites, and they weight their traffic against the CIA's per-country statistics. Still, it's worth questioning whether the group of people you are trying to investigate is represented by NetMarketShare's traffic, and how such limitations lower your effective precision.
Subject: General Tech | November 5, 2016 - 07:01 AM | Scott Michaud
Tagged: Samsung, euv, 7nm, 14nm, 10nm
As the comments usually remind us, the smallest feature size varies in interpretation from company to company, and from node to node. You cannot assume, from the nanometer rating alone, whether Samsung's process is better or worse than Intel's, GlobalFoundries', or TSMC's. In fact, any specific fabrication process, compared to another, might be better in some ways yet worse in others.
With all of that in mind, Samsung has announced the progress they've made with their 14nm, 10nm, and 7nm fabrication processes. First, they plan to expand 14nm production with 14LPU. I haven't been able to figure out what this specific branding stands for, but I'm guessing it's something like “Low Power Ultra”, given that it's an engineering name and those are usually quite literal (like the other suffixes).
As for the other suffixes, Samsung begins manufacturing nodes with Low Power Early (LPE). From there, they improve upon their technique, providing higher performance and/or lower power, and call this new process Low Power Plus (LPP). LPC, which I believe stands for something like Low Power Cost, although I haven't seen this acronym officially expanded, removes a few manufacturing steps to make the end product cheaper. LPU is an extension of LPC with higher performance. Add the appropriate acronym as a suffix to the claimed smallest feature size, and you get the name of the node: xxLPX.
14LPU is still a ways out, though. Their second announcement, 10LPU, is expected to be the cost-reduction step for 10nm, which I interpret to mean they are skipping LPC for their 10nm production. You may think this is very soon, given that 10LPE only started mass production a few weeks ago, but this is actually quite an early announcement in terms of overall 10nm production. The process design kits (PDKs) for both 14LPU and 10LPU, which hardware vendors use to design their integrated circuits, won't ship until 2Q17, so products will be a while behind that.
To close out, Samsung reiterated that 7nm is planned to use extreme ultraviolet lithography (EUV). They have apparently created a wafer using 7nm EUV, but images do not seem to be provided.
Subject: Memory | November 5, 2016 - 01:42 AM | Tim Verry
Tagged: G.Skill Trident Z, G.Skill, ddr4
Yesterday, G.Skill announced what will soon be the fastest 64 GB DDR4 kit using 16 GB modules, running at 3600 MHz. The new Trident Z kit uses Samsung 8Gb chips and combines four 16 GB DIMMs supporting XMP 2.0 with timings of 17-19-19-39.
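For context on what those timings mean in absolute terms, you can convert the CAS latency into wall-clock time. DDR transfers twice per clock, so the memory clock is half the effective data rate (the comparison kit below uses made-up illustrative numbers, not a specific product):

```python
# Real (wall-clock) CAS latency in nanoseconds for a DDR kit.
def cas_ns(data_rate_mt_s, cl):
    clock_mhz = data_rate_mt_s / 2  # DDR: two transfers per clock
    return cl / clock_mhz * 1000    # cycles / MHz -> nanoseconds

# The new Trident Z kit:
print(f"DDR4-3600 CL17: {cas_ns(3600, 17):.2f} ns")
# A hypothetical slower-but-tighter kit for comparison:
print(f"DDR4-3200 CL15: {cas_ns(3200, 15):.2f} ns")
```

At 3600 MHz, CL17 works out to roughly 9.4 ns, which is why a lower-clocked kit with tighter timings can actually match or beat a faster-rated kit on first-word latency even while losing on bandwidth.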
The DDR4 memory kit features stylized brushed-aluminum heatspreaders with red accents, similar to those used on existing Trident Z kits. Out of the box the kit runs at 1.35 volts, though overclockers should be able to push it further to eke out a bit more speed beyond the stock 3600 MHz!
Beyond that, there is not much to the announcement other than G.Skill claiming the speed crown. Looking online, it seems the previous highest speed offered at this capacity was 3466 MHz, so the new modules are a decent bit faster.
According to G.Skill, the new Trident Z 64GB kit will be available in December. They have not yet released pricing, but I would expect an MSRP of at least $570, considering G.Skill and Corsair currently have 3466 MHz 64GB DDR4 kits priced at $540 and $530, respectively. If you are into overclocking, you can probably save a few bucks by overclocking lower-specced memory, but these might be good if you are building a workstation that doesn't need ECC (e.g. a video editing and streaming monster, heh).
Subject: General Tech | November 4, 2016 - 06:21 PM | Scott Michaud
Tagged: blizzard, pc gaming, diablo, diablo iii
Starting in the Public Test Realm next week, Diablo 3 will receive a campaign based on the original game, which turns 20 years old on New Year's Eve. Like the original, you descend the levels of a dungeon into Hell, where you fight and kill Diablo. Pardon the spoilers.
The patch, called The Darkening of Tristram, seems to be treated light-heartedly by the company. They announced that it will add a low-quality rendering mode to pay homage to the graphical limitations of 1996. More functionally, they also restrict the character to moving in eight directions, which I'm not sure is tongue-in-cheek or actually implemented for gameplay reasons.
Either way, you can check it out next week by joining the beta realm.
Subject: Graphics Cards | November 4, 2016 - 05:57 PM | Scott Michaud
Tagged: amd, graphics drivers, crimson
AMD has released another hotfix driver, just a day after releasing 16.11.1. This version has a single listed change: “Improved Shader Cache storage limit”. I'm not sure why the company decided to release this update so abruptly, since I'm not aware of any critical issue that relies upon it, but there's certainly nothing wrong with rapidly releasing optional software. I'm guessing at least one new game has a performance issue with the previous maximum, though.
If this has been an issue for you, and you're comfortable installing non-WHQL drivers, it's available now.