Subject: General Tech | February 8, 2017 - 10:46 PM | Scott Michaud
Tagged: webkit, webgpu, metal, vulkan, webgl
Apple’s WebKit team has just announced their proposal for WebGPU, which competes with WebGL to provide graphics and GPU compute to websites. Being from Apple, it is based on the Metal API, so it has a lot of potential, especially as a Web graphics API.
Okay, so I have mixed feelings about this announcement.
First, and most concerning, is that Apple has attempted to legally block open standards in the past. For instance, when The Khronos Group created WebCL based on OpenCL, to which Apple owns the trademark and several patents, Apple shut the door on extending their licensing agreement to the new standard. If the W3C considers Apple’s proposal, they should be very careful about what legal control they allow Apple to retain.
From a functionality standpoint, though, this is very interesting. With the aforementioned death of WebCL, as well as the sluggish progress of WebGL compute shaders, there’s a lot of room to use one (or more) GPUs in a system for high-end compute tasks. Even if you are not interested in gaming in a web browser (and many people are, especially if you count the market that Adobe Flash dominated for the last ten years), you might want to GPU-accelerate photo and video tasks. An API that allows for this would be very helpful going forward, although, as stated, others like The Khronos Group are working on it as well. On the other-other hand, an API that allows explicit multi-GPU would be even more interesting.
Further, it sounds like they’re even intending to ingest byte-code, like what DirectX 12 and Vulkan are doing with DXIL and SPIR-V, respectively, but it currently accepts shader code as a string and compiles it in the driver. This is interesting from a security standpoint, because it obfuscates what GPU-executed code consists of, but that’s up to the graphics and browser vendors to figure out... for now.
So when will we see it? No idea! There’s an experimental WebKit patch, which requires the Metal API, and an API proposal... a couple blog posts... a tweet or two... and that’s about it.
So what do you think? Does the API sound interesting? Does Apple’s involvement scare you? Or does getting scared about Apple’s involvement annoy you? Comment away! : D
Subject: Processors | February 8, 2017 - 09:38 PM | Josh Walrath
Tagged: Zen, Skylake, Samsung, ryzen, kaby lake, ISSCC, Intel, GLOBALFOUNDRIES, amd, AM4, 14 nm FinFET
Yesterday EE Times posted some interesting information that they had gleaned at ISSCC. AMD released a paper describing the design process and advances they were able to achieve with the Zen architecture, manufactured on Samsung’s/GF’s 14 nm FinFET process. AMD went over some of the basic measurements at the transistor scale and how they compare to what Intel currently has on their latest 14 nm process.
The first thing that jumps out is that AMD claims their 4 core/8 thread x86 core complex is about 10% smaller than what Intel has with one of their latest CPUs; we assume it is either Kaby Lake or Skylake. AMD did not go over exactly what they were counting when looking at the cores, and there are some significant differences between the two architectures. We are not sure if that 44 mm sq. figure includes the L3 cache or the L2 caches. My guess is that it probably includes L2 cache but not L3, though I could easily be wrong here.
Going down the table we see that AMD and Samsung/GF are able to get their SRAM sizes down smaller than what Intel is able to do. AMD has double the amount of L2 cache per core, yet that L2 is only about 60% larger in area than Intel’s 256 KB L2. AMD’s L3 is also much smaller in area than Intel’s: both are 8 MB units, but AMD comes in at 16 mm sq. while Intel is at 19.1 mm sq. There will be differences in how AMD and Intel set up these caches, and until we see L3 performance comparisons we cannot assume too much.
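Working out area per megabyte from those L3 numbers makes the density gap concrete. This is simple arithmetic on the figures quoted above, nothing more:

```python
# Implied L3 density from the ISSCC figures above (8 MB in each case).
amd_l3_mm2, intel_l3_mm2, l3_mb = 16.0, 19.1, 8

print(f"AMD:   {amd_l3_mm2 / l3_mb:.2f} mm sq. per MB")    # AMD:   2.00 mm sq. per MB
print(f"Intel: {intel_l3_mm2 / l3_mb:.2f} mm sq. per MB")  # Intel: 2.39 mm sq. per MB
```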
(Image courtesy of ISSCC)
In some of the basic measurements of the different processes we see that Intel has advantages throughout. This is not surprising, as Intel is well known for pushing process technology beyond what others are able to do. In theory their products will have denser logic throughout, including the SRAM cells. Looking at this information, we wonder how AMD has been able to make their cores and caches smaller; part of that is likely down to how the caches are controlled and accessed.
One of the most likely culprits for this smaller size is the less advanced FPU/SSE/AVX units that AMD has in Zen. Zen supports AVX-256, but each 256-bit operation has to be done over double the cycles. It can do single-cycle AVX-128, but Intel’s throughput is much higher than what AMD can achieve. AVX is not the be-all, end-all, but it is gaining importance in high performance computing and editing applications. David Kanter, in his article covering the architecture, explicitly said that AMD made this decision to lower the die size and power constraints for this product.
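To illustrate the cycle-count penalty (using made-up round numbers, not measured figures from either vendor), cracking each 256-bit operation into two 128-bit halves simply doubles the issue cost:

```python
def cycles_for_avx256_ops(n_ops, native_256):
    """Cycles to issue n_ops 256-bit operations: 1 cycle each when the
    hardware is natively 256 bits wide, 2 cycles each when every op is
    cracked into two 128-bit halves. Ignores pipelining and port counts."""
    return n_ops * (1 if native_256 else 2)

print(cycles_for_avx256_ops(1000, native_256=True))   # 1000
print(cycles_for_avx256_ops(1000, native_256=False))  # 2000
```

Real-world impact depends heavily on how much AVX code a workload actually runs, which is why the trade-off made sense for die size and power.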
Ryzen will undoubtedly be a pretty large chip overall once both core complexes and 16 MB of L3 cache are put together. My guess would be in the 220 mm sq. range once all is said and done (northbridge, southbridge, PCI-E controllers, etc.), but again that is only a guess. What is perhaps most interesting of all is that AMD has a part that, on the surface, is very close to Intel's Broadwell-E based i7 chips. The i7-6900K runs at 3.2 to 3.7 GHz, features 8 cores and 16 threads, and has around 20 MB of combined L2/L3 cache. AMD’s top end looks to run at 3.6 GHz, features the same number of cores and threads, and has 20 MB of L2/L3 cache. The Intel part is rated at 140 watts TDP while the AMD part will have a max of 95 watts TDP.
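For what it’s worth, that 220 mm sq. guess is just this arithmetic, using the ISSCC figures above plus an entirely assumed allowance for the uncore (the memory controller, PCI-E, and other I/O are my guess, not anything AMD has published):

```python
# Back-of-envelope Ryzen die-size estimate from the ISSCC figures above.
core_complex_mm2 = 44.0   # one 4-core/8-thread complex (per AMD's paper)
l3_slice_mm2 = 16.0       # one 8 MB L3 block (per the same paper)
uncore_mm2 = 100.0        # assumed: memory controller, PCI-E, I/O, etc.

total = 2 * core_complex_mm2 + 2 * l3_slice_mm2 + uncore_mm2
print(f"Estimated die size: {total:.0f} mm sq.")  # prints: Estimated die size: 220 mm sq.
```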
If Ryzen is truly competitive in this top end space (with a price to undercut Intel, yet not destroy their own margins) then AMD is going to be in a good position for the rest of this year. We will find out exactly what is coming our way next month, but all indications point to Ryzen being competitive in overall performance while being able to undercut Intel in TDPs for comparable cores/threads. We are counting down the days...
Subject: Mobile | February 8, 2017 - 07:26 PM | Scott Michaud
Tagged: zte, axon 7, google, nougat, Android, android 7.0
Well that was quick. About two weeks ago, we reported on ZTE Mobile Deutschland’s Facebook post that said Android 7.0 would miss January, but arrive some time in Q1. For North America, that apparently means the second week of February, because my device was just notified, about an hour ago, that A2017UV1.1.0B15 was available for over-the-air update. It just finished installing.
In my case, I needed to hold the power button a few times to get the phone to boot into the second stage of installation, but ZTE mentions this in the pre-install notes, so that’s good. Then, when the phone moved on to the new lock screen, my fingerprint reader didn’t work until after I typed in the lock screen password. I’m not sure why the phone didn’t accept the fingerprint reader until after I successfully logged in, especially since it used the fingerprints on file from Android 6.0 and I didn’t need to set them up again, but it’s a small inconvenience. Just don’t perform the update if you can’t access your password manager and you don’t remember the unlock code off the top of your head.
While I don’t have a Daydream VR headset, I’ll probably pick one up soon and give it a test. The Daydream app is installed on the device, though, so you can finally enjoy Android-based VR content if you pick one up.
If your phone hasn’t alerted you yet, find your unlock password and check for updates in the settings app.
Subject: General Tech | February 8, 2017 - 02:24 PM | Jeremy Hellstrom
Tagged: gaming, bards tale, inxile
inXile have been very busy recently, doing a stellar job of resurrecting Wasteland as a new, modern RPG which will soon see its third incarnation released. The long-anticipated Torment: Tides of Numenera arrives at the end of this month; the beta has been a tantalizing taste, as was the YouTube 'choose your own adventure' teaser. There is another project they have been working on as well: bringing the old Bard's Tale games into the modern era. A trailer showing in-game footage, including combat, has just been released, which you can see over at Rock, Paper, SHOTGUN. It certainly doesn't look like the Bard's Tale of old!
"On the game HUD, you can see your party occupying 2 rows of 4 spaces each. Enemies will line up on the opposite grid with the same number of slots. The exact positioning of enemies, as well as your own party, will determine which attacks can land, and which will swing wild past their mark."
Here is some more Tech News from around the web:
- Necromunda has bullet casings the size of tower blocks @ Rock, Paper, SHOTGUN
- Starship Commander VR game is purely voice controlled @ HEXUS
- Skyblivion looks even more like Oblivion rebuilt in Skyrim @ Rock, Paper, SHOTGUN
- The Radeon FreeSync 2 HDR Gaming Tech Report @ TechARP
- The Surge brings limb-theft to the Dark Souls party @ Rock, Paper, SHOTGUN
- Stellaris: Utopia expansion to blast off with Banks update @ Rock, Paper, SHOTGUN
Subject: Processors | February 8, 2017 - 01:16 PM | Jeremy Hellstrom
Tagged: kaby lake, i5-7600K, Intel
[H]ard|OCP followed up their series on replacing the TIM underneath the heatspreader on Kaby Lake processors with another series depicting the i5-7600K in the buff. They removed the heatspreader completely and tried watercooling the die directly. As you can see in the video, this requires more work than you might immediately assume; it was not simply a matter of shimming, as some of the socket on the motherboard needed to be trimmed with a knife in order to get the waterblock to sit directly on the core. In the end the results were somewhat depressing: the risks involved are high and the benefits almost non-existent. If you are willing to take a risk, replacing the TIM and reattaching the heatspreader is a far better choice.
"After our recent experiments with delidding and relidding our 7700K and 7600K to see if we could get better operating temperatures, we decided it was time to go topless! Popping the top on your CPU is one thing, and getting it to work in the current processor socket is another. Get out your pocket knife, we are going to have to make some cuts."
Here are some more Processor articles from around the web:
- Intel Celeron G3930 On Linux: A Dual-Core Kabylake CPU For $40 @ Phoronix
- Intel Pentium G4600: A Surprising 3.6GHz Kabylake CPU For $90 @ Phoronix
Subject: General Tech | February 8, 2017 - 12:44 PM | Jeremy Hellstrom
Tagged: amd, FreeSync2, David Glen, Syed Hussain
TechARP published a video of their interview with AMD's David Glen and Syed Hussain, in which they discussed what to expect from FreeSync 2. They also listed some key points for those who do not wish to watch the full video; either can be found right here. The question on most people's minds is answered immediately: this will not be a Vega-only product, and if your GPU supports the current version it will support the sequel. We will not see support until a new driver is released, but then we are also waiting for new monitors to hit the market, so it is hard to be upset at AMD for the delay.
"While waiting for AMD to finalise Radeon FreeSync 2 and its certification program for their partners, let’s share with you our Q&A session with the two key AMD engineers in charge of the Radeon FreeSync 2 project – David Glen and Syed Athar Hussain."
Here is some more Tech News from around the web:
- 72% of 'Anonymous' Browsing History Can Be Attached To the Real User @ Slashdot
- Google AI's zoom and enhance photo tech gives you nowhere to hide @ The Inquirer
- AMD's daring new money-making strategy: Sue everyone! Mwahaha @ The Register
Subject: Motherboards | February 8, 2017 - 10:15 AM | Sebastian Peak
Tagged: small form factor, SFF, PCI-E 3.0, MXM, motherboard, mobile gpu, mini-stx, H110-STX-MXM, asrock
ASRock has announced a new mini-STX motherboard with an interesting twist, as the H110-STX MXM motherboard offers support for current MXM (version 3.0b, up to 120W) mobile graphics cards.
Like the ECS H110 motherboard featured in our recent Mini-STX build, the ASRock H110-STX MXM is based on the LGA1151 socket (though supported CPU TDP was not in the source post) and offers a pair of SODIMM slots for up to 32GB of DDR4 notebook memory. Storage support is excellent with dual SATA ports and M.2 SSD support. Importantly, this ASRock board uses PCI Express 3.0 on both the MXM (PCIe 3.0 x16) and M.2 (PCIe 3.0 x4) slots. Display output capability is excellent as well, quoting the TechPowerUp post:
"Display connectivity includes one HDMI port that's wired to the CPU's onboard graphics, a second HDMI port wired to the MXM slot, a full-size DisplayPort wired to the MXM, and a Thunderbolt port with mini-DisplayPort wiring to the MXM."
There are some roadblocks to building up a gaming system with this motherboard, not the least of which is cost. Consider that compatible MXM 3.0b options (with a recent GPU) cost hundreds of dollars from a place like Eurocom (a GTX 980M is around $800, for example). Naturally, if you had a damaged gaming notebook with a usable MXM GPU, this board might be a nice option for re-purposing that graphics card. Cooling for the MXM card is another issue, though harvesting an MXM card from a notebook could potentially allow reusing the laptop's existing thermal solution.
Look closely and you will see a Z270 product name in this ASRock photo
Update: We now have full specifications from ASRock's product page, which include:
- Socket LGA1151 for Intel Core i7/i5/i3/Pentium/Celeron (Kabylake)
- Supports MXM Graphics Card (Type-B , Up to 120W)
- Supports DDR4 2400MHz, 2 x SO-DIMM, up to 32GB system memory
- 1 x HDMI (4K@60Hz), 1x HDMI, 1x DisplayPort, 1x Mini-DisplayPort
- 3x USB3.0 Type-A, 1x Thunderbolt 3 with USB 3.1 Type-C
- 1x M.2 (Key E), 2x M.2 (Key M)
- 1x Intel i219V Gigabit LAN
- DC 19V / 220W power input
Of note, the chipset is listed as Z270, though the product name and primary motherboard photo suggest H110. The H110-STX MXM is part of ASRock's industrial motherboard offerings (with signage and gaming the mentioned applications), and includes a 220W power supply. Pricing and availability were not mentioned.
Subject: General Tech | February 7, 2017 - 03:01 PM | Jeremy Hellstrom
Tagged: gaming, Star Wars, humble bundle
Do you like Star Wars games, PCPer, and UNICEF? If so, there is a Humble Bundle perfect for you, running for the next two weeks. Depending on how much you pay, you can get up to 15 games and an X-Wing versus TIE Fighter t-shirt, with a percentage of your purchase helping us continue to provide the content you love. There is some overlap with previous bundles you may have picked up, but for those of you missing KOTOR 1 or 2, The Force Unleashed 1 or 2, Shadows of the Empire, or even the second Star Wars Battlefront game, it is well worth the cost.
How can you resist that t-shirt?
Subject: Motherboards | February 7, 2017 - 02:35 PM | Jeremy Hellstrom
Tagged: z270 express, Maximus IX Formula, intel z270, ASUS ROG, asus
ASUS' Maximus Formula series has become familiar to high-end system builders, and the newest member looks to live up to our expectations. The list of features is comprehensive, including two M.2 slots and a U.2 slot, two USB 3.1 ports (one of them Type-C), and an ASUS 2T2R dual-band 802.11a/b/g/n/ac antenna. [H]ard|OCP had mixed results when overclocking: some testers had a perfect experience while others ran into hurdles. That may be down to the processors they used, so do not immediately write this motherboard off; take a look at the full review before you decide one way or the other.
"ASUS is nothing like Hollywood. ASUS can actually turn out sequels which not only match the originals, but surpass them. ASUS Republic of Gamers Maximus IX Formula is another sequel in the long line of Maximus motherboards. Can ASUS continue its long history of awesome sequels? One things for certain, it’s no Robocop 3."
Here are some more Motherboard articles from around the web:
- ASRock Z270 Extreme4 @ Kitguru
- MSI Z270 Gaming M7 @ [H]ard|OCP
- MSI B250M Mortar @ Modders-Inc
- MSI Z270 Gaming Pro Carbon Motherboard Review @ Hardware Canucks
- MSI Z270 XPower Titanium Review @ Neoseeker
- ASUS ROG STRIX Z270F Gaming @ Kitguru
- ASUS Maximus IX Hero Motherboard Review @ Hardware Canucks
Subject: General Tech | February 7, 2017 - 01:31 PM | Jeremy Hellstrom
Tagged: Intel, c2000, Avoton
"System May Experience Inability to Boot or May Cease Operation" is not the errata note you want to read, but for those running devices powered by an Intel Avoton C2xxx family Atom processor it is something to pay attention to. The Low Pin Count bus clock may stop functioning permanently after the chip has been in service for a time, rendering the device non-functional. Intel had little to say about the issue when approached by The Register but did state that there is a board level workaround available to resolve the issue.
The Avoton family of chips was released in 2013 and was designed to compete against ARM's new low-powered server chips. The flaw is likely responsible for the recently reported issues with Cisco routers; the chip can also be found in the Synology DS1815+ and some Dell server products. It will be interesting to see how Intel responds to this issue, as they have a history of reluctance to discuss flaws in their products' architecture.
"Intel's Atom C2000 processor family has a fault that effectively bricks devices, costing the company a significant amount of money to correct. But the semiconductor giant won't disclosed precisely how many chips are affected nor which products are at risk."
Here is some more Tech News from around the web:
- Giant ion-trapped quantum computer takes shape @ Nanotechweb
- Boffins create quantum cloning machine to intercept 'secure' messages @ The Inquirer
- 'Wet' metallic MoS2 makes ultrafast supercapacitor @ Nanotechweb
- Vizio fined $2.2m for selling screen-scraped data from Smart TVs @ The Inquirer
- Reverse Engineering Ikea’s New Smart Bulbs @ Hack a Day
- Canadian telco bans a little four-letter dirty word from texts: U B E R @ The Register
- Hacker hijacks 160,000 insecure printers to teach a lesson about security @ The Inquirer
- Western Digital Unveils First-Ever 512Gb 64-Layer 3D NAND Chip @ Slashdot
- If You Owned a PC With a DVD Drive You Might Be Able To Claim $10 @ Slashdot
- Wheel of Destiny Case Mods #1 Giveaway by EVGA and Modders-Inc
Subject: General Tech | February 7, 2017 - 04:31 AM | Scott Michaud
Tagged: logitech, webcam, brio, 4k, hdr
Today’s announcement of the Logitech BRIO rolls in many features that have been lacking in webcams. With it, you can record in 720p30, 720p60, 1080p30, 1080p60, and, the big reveal, 4K30. It is also capable of shooting in HDR using RightLight 3, although they don’t specify color space formats, so it’s unclear what you will be able to capture with video recording software.
On top of these interesting video modes, the camera also supports infrared for Windows Hello “or other facial recognition software”. Unlike Intel’s RealSense, the webcam claims support for the relatively ancient Core 2 and higher, which sounds promising for AMD users. I’m curious what open-source developers will be able to accomplish, especially if it’s general enough to do background rejection (and so forth). Obviously, this is just my speculation -- Logitech hasn’t even hinted at this in their documentation.
As you would expect for a 4K sensor, Logitech is also advertising quite a bit of digital zoom. They claim up to 5X, with a field of view user-configurable between 65 and 90 degrees.
Finally, the price is $199 USD / $249 CDN and it ships today.
Subject: General Tech | February 7, 2017 - 02:47 AM | Scott Michaud
Tagged: mozilla, firefox, web browser, Rust, llvm
Firefox 52 will be the company’s next Extended Support Release (ESR) branch of their popular web browser. After this release, Mozilla is planning a few changes that will break compatibility, especially if you’re building the browser from source. If you’re an end-user, the major one to look out for is Mozilla disabling NPAPI-based plugins (except Flash) unless you are using Firefox 52 ESR. This change will land in the consumer version of Firefox 52, though. It’s not really clear why they didn’t just wait until Firefox 53, rather than adding a soft-kill in Firefox 52 and hard-coding it in the next version, but that’s their decision. It really does not affect me in the slightest.
The more interesting change, however, is that Mozilla will begin requiring Rust (and LLVM) in an upcoming version. I’ve seen multiple sources claim Firefox 53, Firefox 54, and Firefox 55 as possible targets for this, but, at some point around those versions, critical components of the browser will be written in Rust. As more of the browser is migrated to this language, it should be progressively faster and more secure, as this language is designed to enforce memory safety and task concurrency.
Firefox 52 is expected in March.
Subject: General Tech | February 6, 2017 - 05:26 PM | Jeremy Hellstrom
Tagged: MK Fission, mechanical keyboard, input, Cherry MX
If you wanted MechanicalKeyboards.com then TechPowerUp has some bad news for you, as the name is already taken. When not brainstorming with Captain Obvious, that company is the North American retailer for Ducky keyboards, a name you might possibly have heard before. Their MK Fission comes in 18 flavours: you can only choose black or white keycaps, but you have your choice of the full range of Cherry switches. If you have lost track of the score, that includes Red, Brown, Blue, Black, Silent Red, Speed Silver, Green, Clear and White. The keyboard has blue backlighting, and the RGB disease has only infected the outer casing of the keyboard, giving it a look which might be familiar to anyone who knew someone in the '90s with questionable taste in car accessories.
"MechanicalKeyboards.com is a prominent retailer of mechanical keyboards, as the name would suggest, based in the USA. Today we get to take a look at their new MK Fission full size keyboard that comes in 18 possible options to choose from, Yes, there is RGB included but perhaps not the way you think."
Here is some more Tech News from around the web:
- G.SKILL Ripjaws KM570 MX Keyboard @ techPowerUp
- Razer BlackWidow Chroma V2 Mechanical Gaming Keyboard @ Custom PC Review
- Rosewill RK-9000 V2 RGB Keyboard @ techPowerUp
- SpeedLink Quinox Gamepad @ Kitguru
- Gamdias Zeus P1 RGB Mouse @ Benchmark Reviews
- Corsair Scimitar Pro RGB Gaming Mouse Review @ Techgage
Subject: Systems, Mobile | February 6, 2017 - 03:37 PM | Sebastian Peak
Tagged: xeon, Thinkpad, quadro, P71, P51s, P51, nvidia, notebook, mobile workstation, Lenovo, kaby lake, core i7
Lenovo has announced a trio of new ThinkPad mobile workstations, featuring updated Intel 7th-generation Core (Kaby Lake) processors and NVIDIA Quadro graphics, and among these is the thinnest and lightest ThinkPad mobile workstation to date in the P51s.
"Engineered to deliver breakthrough levels of performance, reliability and long battery life, the ThinkPad P51s features a new chassis, designed to meet customer demands for a powerful but portable machine. Developed with engineers and professional designers in mind, this mobile workstation features Intel’s 7th generation Core i7 processors and the latest NVIDIA Quadro dedicated workstation graphics, as well as a 4K UHD IPS display with optional IR camera."
Lenovo says that the ThinkPad P51s is more than a half pound lighter than the previous generation (P50s), stating that "the P51s is the lightest and thinnest mobile workstation ever developed by ThinkPad" at 14.4 x 9.95 x 0.79 inches, and weight starting at 4.3 lbs.
Specs for the P51s include:
- Up to a 7th Generation Intel Core i7 Processor
- NVIDIA Quadro M520M Graphics
- Choice of standard or touchscreen FHD (1920 x 1080) IPS, or 4K UHD (3840 x 2160) IPS display
- Up to 32 GB DDR4 2133 RAM (2x SODIMM slots)
- Storage options including up to 1 TB (5400 rpm) HDD and 1 TB NVMe PCIe SSDs
- USB-C with Intel Thunderbolt 3
- 802.11ac and LTE-A wireless connectivity
Lenovo also announced the ThinkPad P51, which is slightly larger than the P51s, but brings the option of Intel Xeon E3-v6 processors (in addition to Kaby Lake Core i7 CPUs), Quadro M2200M graphics, faster 2400 MHz memory up to 64 GB (4x SODIMM slots), and up to a 4K IPS display with X-Rite Pantone color calibration.
Finally there is the new VR-ready P71 mobile workstation, which offers up to an NVIDIA Quadro P5000M GPU along with Oculus and HTC VR certification.
"Lenovo is also bringing virtual reality to life with the new ThinkPad P71. One of the most talked about technologies today, VR has the ability to bring a new visual perspective and immersive experience to our customers’ workflow. In our new P71, the NVIDIA Pascal-based Quadro GPUs offer a stunning level of performance never before seen in a mobile workstation, and it comes equipped with full Oculus and HTC certifications, along with NVIDIA’s VR-ready certification."
Pricing and availability is as follows:
- ThinkPad P51s, starting at $1049, March
- ThinkPad P51, starting at $1399, April
- ThinkPad P71, starting at $1849, April
Subject: General Tech | February 6, 2017 - 02:46 PM | Jeremy Hellstrom
Tagged: turtle beach, microphone, audio, Stream Mic
Upgrading from the microphone found on your gaming headset can make a significant difference in the way you sound online. Being able to do so for around $50, and to use the same device on your PC, Xbox and PS4, might just convince some that the upgrade is worth it. The Turtle Beach Multi-Format Stream Microphone can be moved between devices with a simple switch and runs without any drivers. It also has a built-in headphone amplifier, so your headset can stay plugged into the mic as you move between devices. Drop by eTeknix for a look at it.
"While many of us only need a standard headset with a simple boom mic, there’s a growing demand for higher quality microphones for both gamers and streamers, on Twitch, YouTube Live and much more. Turtle Beach are not the first to make a dedicated streaming microphone, but they are one of the more affordable options too, and their new Stream Mic comes with support for Xbox One, PlayStation 4 and PC, making it a tempting solution for the multi-format gamer and streamer."
Here is some more Tech News from around the web:
- Antlion Audio ModMic 5 @ Kitguru
- Apple's AirPods Bring Wireless to New Heights @ Hardware Secrets
- OZONE EKHO H80 Origen Pro Gaming Headset Review @ NikKTech
- Razer Kraken Pro V2 eSports Gaming Headset @ eTeknix
Subject: General Tech | February 6, 2017 - 01:36 PM | Jeremy Hellstrom
Tagged: darpa, ai, security, Usenix Enigma 2017
DARPA hosted the first Cyber Grand Challenge last summer, in which the software from seven machine learning projects competed to find and patch vulnerabilities in a network, and to attack each other. While the specific vulnerabilities discovered have not been made public you can read a bit about what was revealed about the contest at Usenix Enigma 2017 over at The Register. For instance, one of the programs managed to find a flaw in the OS all the machines were running on and then hack into another to steal data. A different machine noticed this occurring and patched itself on the fly, making sure that it was protected from that particular attack. Also worth noting is that the entire contest was over in 20 minutes.
"The exact nature of these new bug types remains under wraps, although we hear that at least one involves exploitable vulnerabilities in data queues."
Here is some more Tech News from around the web:
- New SMB bug: How to crash Windows system with a 'link of death' @ The Register
- Windows Cloud: Microsoft's Chrome OS rival revealed in leaked screenshots @ The Inquirer
- Olimex Announces Their Open Source Laptop @ Hack a Day
- Google will restrict Gmail in Windows XP and Vista this year @ The Inquirer
- Denuvo: Our cracked RE7 protection is still better than nothing @ Ars Technica
- FYI: Ticking time-bomb fault will brick Cisco gear after 18 months @ The Register
Subject: Graphics Cards | February 6, 2017 - 11:43 AM | Sebastian Peak
Tagged: video card, silent, Passive, palit, nvidia, KalmX, GTX 1050 Ti, graphics card, gpu, geforce
Palit is offering a passively-cooled GTX 1050 Ti option with their new KalmX card, which features a large heatsink and (of course) zero fan noise.
"With passive cooler and the advanced powerful Pascal architecture, Palit GeForce GTX 1050 Ti KalmX - pursue the silent 0dB gaming environment. Palit GeForce GTX 1050 Ti gives you the gaming horsepower to take on today’s most demanding titles in full 1080p HD @ 60 FPS."
The specs are identical to a reference GTX 1050 Ti (4GB GDDR5 @ 7 Gb/s, Base 1290/Boost 1392 MHz, etc.), so expect the full performance of this GPU - with some moderate case airflow, no doubt.
We don't have specifics on pricing or availability just yet.
Subject: General Tech | February 4, 2017 - 11:54 PM | Tim Verry
Tagged: windows insider, Windows Game Mode, windows 10, pc gaming, creators update, beta
Last month Microsoft confirmed that a new "Game Mode" would be part of the upcoming Windows 10 Creators Update planned for a spring release ("early 2017"). Microsoft has recently started rolling out Game Mode to beta testers in the latest Windows Insider preview build (for those on the fast ring, anyway; I am currently on the slow ring and do not have Game Mode yet). Now that it is out in preview form, gamers have naturally started benchmarking it, and PC Games News has posted an article on their testing of the new feature across two Win32 games and one UWP game. Not to spoil the results, but at this point Game Mode does not appear to offer much, and can even result in fewer frames per second when turned on. Its only saving grace is that in some situations it does offer increased performance when the Game DVR feature is also being used to record gameplay. They tested both an NVIDIA GTX 1060 and an AMD RX 480, and Game Mode in its current preview software on a preview OS appears to have more benefits for NVIDIA, while the AMD card PC Games News tested mostly just did its thing regardless of whether Game Mode was turned on or off (heh, not necessarily a bad thing).
With Game Mode now rolling out to Windows Insiders, there is more information on how Microsoft plans to implement it. Rather than hiding it in the Xbox app, Microsoft has thankfully put it in the main Settings app under the Gaming category, and users access it by bringing up the Game Bar menu in-game for those games that support it (PC Games News noted Doom and GTA V did not work). Game Mode is an OS-level feature that, when turned on, dedicates a certain number of CPU threads to the game and leaves the remaining threads to background processes (which themselves are reportedly minimized). Currently, this seems to work better with multi-threaded games; older games that were coded to use only one or two threads may not see any benefit from turning Game Mode on (and may actually see lower FPS). To Microsoft's credit, they are not overpromising with Game Mode, noting that it should be good for around 2% better performance when enabled, with a bigger impact on UWP titles.
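The thread-dedication idea can be sketched in a few lines. To be clear, this is not how Microsoft implements Game Mode (its internals are not public); the sketch below just illustrates partitioning CPUs between a game and background work, using Python's Linux-only scheduler-affinity helpers as a stand-in, and `partition_cpus`/`game_share` are names I made up:

```python
import os

def partition_cpus(all_cpus, game_share=0.75):
    """Split the available CPU set into one reserved for the game and one
    left over for background processes. Purely illustrative of the concept."""
    cpus = sorted(all_cpus)
    n_game = max(1, min(len(cpus) - 1, int(len(cpus) * game_share)))
    return set(cpus[:n_game]), set(cpus[n_game:])

# Use the process's current CPU set where the platform exposes it.
cpus = (os.sched_getaffinity(0) if hasattr(os, "sched_getaffinity")
        else set(range(os.cpu_count() or 1)))
game_cpus, background_cpus = partition_cpus(cpus)
print("game:", game_cpus, "background:", background_cpus)
# Actually pinning a game process on Linux would look like (hypothetical pid):
#   os.sched_setaffinity(game_pid, game_cpus)
```

Windows presumably does something far more dynamic than a static split, but the basic trade-off is the same: CPUs reserved for the game are CPUs taken away from everything else.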
I encourage you to check out the PC Games News article, where their benchmark results are presented in a number of bar graphs. Most of the tests saw little to no benefit from using Game Mode, but no negative effect either. Some games like Hitman saw a 6% increase in average frames per second on the GTX 1060. On the other side of things, Forza Horizon 3 (ironically, a UWP game) actually loses performance when Game Mode is turned on, to the tune of 13% to 23% fewer FPS with the RX 480 and 9% to 15% fewer with the GTX 1060. As for Tomb Raider, the results are more in the middle: minimum frames per second stay the same or get slightly better when Game Mode and Game DVR are both turned on (though oddly there is a result in there that shows a performance drop with Game Mode on and Game DVR off).
It is also worth noting that, overall, the trend seems to be that Game Mode will be most beneficial at increasing minimum frame rates when the Game DVR feature is being used, more so than squeezing higher maximum or average FPS out of a game. The biggest hurdles are going to be game compatibility, especially for older games, and Microsoft tweaking things so that, at worst, Game Mode won't tank performance (like it currently does with Hitman minimum frame rates when Game Mode is on but DVR is off) and things stay the same as if Game Mode were not on at all, while at best gamers get slightly smoother gameplay.
Right now Game Mode is not compelling, but it is still a work in progress, and if Microsoft can get it right it could be a useful addition (and an incentive to upgrade to Windows 10, which is probably why they are interested in pursuing this feature), coming in handy especially on gaming laptops! I am not writing off the feature yet, and neither should you, but I do hope that compatibility is improved and the performance hits are reduced or eliminated when it is enabled. My guess is that the games that will work well with Game Mode are going to be almost all newer titles, especially games developed after the final Creators Update release with Game Mode in mind.
Hopefully we can use our Frame Rating methodology on the final product to see how well it truly works as far as user experience and smooth gameplay go. What are your thoughts on Windows 10's Game Mode?
Subject: Graphics Cards | February 4, 2017 - 03:29 PM | Tim Verry
Tagged: micron, graphics memory, gddr6
This year is shaping up to be a good one for memory, with the promise of 3D XPoint (Intel/Micron), HBM2 (SK Hynix and Samsung), and now GDDR6 graphics memory from Micron launching this year. While GDDR6 was originally planned to launch next year, Micron recently announced its intention to start producing the memory chips in the latter half of 2017, which would put it much earlier than previously expected.
Computerworld reports that Micron is citing the rise of e-sports and gaming driving a computer market that now sees three-year upgrade cycles rather than five-year cycles (I am not sure how accurate that is, however, as it seems like PCs are actually staying relevant longer between upgrades, but I digress) as the primary reason for shifting GDDR6 production into high gear and moving up the launch window. The company expects the e-sports market to grow to 500 million fans by 2020, and it is a growing market that Micron wants to stay relevant in.
If you missed our previous coverage, GDDR6 is the successor to GDDR5 and offers twice the bandwidth at 16 Gb/s (gigabits per second) per pin. It is also faster than GDDR5X (12 Gb/s) and uses 20% less power, which the gaming laptop market will appreciate. HBM2 still holds the bandwidth crown, though, as it offers 256 GB/s per stack and up to 1 TB/s with four stacks connected to a GPU on package.
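To put those per-pin rates in per-card terms, total bandwidth is just the data rate times the bus width. The 256-bit bus below is a hypothetical mid-range configuration chosen for illustration; actual products vary.

```python
# Rough memory-bandwidth math from per-pin data rates. The 256-bit bus
# width is an assumed example configuration, not a specific product.

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Total memory bandwidth in GB/s for a given per-pin rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8  # divide by 8: bits -> bytes

# GDDR5X at 12 Gb/s vs GDDR6 at 16 Gb/s on a hypothetical 256-bit bus
print(bandwidth_gb_s(12, 256))  # 384.0 GB/s
print(bandwidth_gb_s(16, 256))  # 512.0 GB/s

# Four HBM2 stacks at 256 GB/s each
print(4 * 256)  # 1024 GB/s, i.e. ~1 TB/s
```

Even on the same bus width, the jump from 12 to 16 Gb/s is a healthy 33% more bandwidth, though HBM2 with four stacks still roughly doubles that.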
As such, High Bandwidth Memory (HBM2 and then HBM3) will power high-end gaming and professional graphics cards, while GDDR6 will become the memory used for mid-range cards, and GDDR5X (which is actually capable of going faster, but will likely not be pushed much past 12 Gb/s if GDDR6 does come out this soon) will replace GDDR5 on most if not all of the lower-end products.
I am not sure whether Micron’s reasoning of e-sports, faster upgrade cycles, and VR as the motivating factor(s) for ramping up production early is sound, but I will certainly take the faster memory coming out sooner rather than later! Depending on exactly when in 2017 the chips start rolling off the fabs, we could see graphics cards using the new memory technology as soon as early 2018 (just in time for CES announcements? Oh boy, I can see the PR flooding in already! hehe).
Will Samsung change course as well and try for a 2017 release of its own GDDR6 memory?
Are you ready for GDDR6?
Subject: General Tech | February 4, 2017 - 02:25 AM | Tim Verry
Tagged: usb 3.0, sony, ps4 pro, ps4, gaming, console
Sony is taking the wraps off of its latest firmware with the release of the version 4.50 “Sasuke” beta firmware for the PS4. With the new firmware, Sony is rolling out a number of UI/UX improvements, and users will finally be able to use external storage with the game console. On the PS4 Pro front, Sony will be adding a “boost mode” in a future update (it may not be ready in time for a production 4.50 release) that lets legacy games access the additional GPU horsepower of the Pro version of the console to smooth out frame rates without needing any special patches from the game developers.
The new firmware adds support for USB 3.0 hard drives (or SSDs) up to 8TB. Users will be able to use the external storage to store games, downloaded applications, screenshots, and videos and have it all show up on the main system menu along with the local storage. Users will not need to shuffle game data back and forth in order to play their games either. Note that currently, the actual save game data is still stored locally even if the game itself is stored on the external hard drive. Fans of the PlayStation VR (PS VR) also get an update with firmware 4.50 in the form of support for watching 3D Blu-rays. Beyond those big feature updates, Sony is also changing up the interface slightly. The Quick Menu now takes up less screen space and will allow gamers to create and join parties right from there rather than going to a separate app. In the notification area, Sony has condensed all the various notification types into a single unified list. Further, users will be able to set in-game screenshots as the home screen wallpaper.
Perhaps most interesting is the planned “boost mode” for the PS4 Pro, which is currently in beta. Gamers are reporting that titles such as The Evil Within and Just Cause 3 run with significantly smoother frame rates and noticeably reduced stuttering. Reportedly, the boost mode will work with most PS4 games that were programmed with unlocked frame rates, though the exact benefits will vary. Games that have a hard cap on the frame rate will still need specific patches from the game developers to get any improvements. Ars Technica speculates that the “boost mode” is simply Sony removing blocks it put in place to force compatibility with older games that were developed with the base PS4 in mind. When boost mode is off, the PS4 Pro GPU has part of itself turned off such that it functions exactly as the PS4’s GPU, and activating boost mode removes the blocks and allows the full GPU (with its 36 CUs) to process the game data as best it can. Getting things like native higher resolutions or more detailed textures will still require patches, of course.
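Some back-of-the-envelope math shows why unlocking the full GPU matters. Using the widely reported specs for these consoles (18 CUs at 800 MHz for the base PS4 versus 36 CUs at 911 MHz for the Pro, with GCN's 64 shaders per CU and 2 FLOPs per shader per clock), the Pro has well over twice the theoretical compute available once the compatibility blocks are lifted:

```python
# Theoretical single-precision compute for the base PS4 vs the PS4 Pro,
# from widely reported specs: GCN CUs have 64 shaders, each doing 2 FLOPs
# per clock (fused multiply-add).

def tflops(cus: int, clock_mhz: float, shaders_per_cu: int = 64) -> float:
    """Peak FP32 throughput in TFLOPS."""
    return cus * shaders_per_cu * 2 * clock_mhz * 1e6 / 1e12

print(round(tflops(18, 800), 2))  # base PS4 (and Pro with boost off) -> 1.84
print(round(tflops(36, 911), 2))  # PS4 Pro with the full 36 CUs      -> 4.2
```

That extra headroom is why unlocked-frame-rate games can smooth out without a patch: the same workload simply finishes each frame sooner.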
If you have a PS4 or PS4 Pro, keep an eye on the beta Sasuke 4.50 firmware.
- Console Gaming on the PC: PS4 Remote Play vs. Xbox One Streaming
- PS4 Remote Play Now Available On PCs and Macs With 3.50 Firmware Update