Subject: General Tech | October 3, 2016 - 05:27 PM | Jeremy Hellstrom
Tagged: Windows 7, windows 10, microsoft, market share
A change of one percent may seem tiny at first glance, but historically it represents an incredibly large shift in market share for an operating system. Unfortunately for Microsoft, it is Windows 7 which has gained share, up to 48.27% of the market, with Windows 10 dropping half a point to 22.53% while the various flavours of Windows 8 sit at 9.61%. This would make it almost impossible for Microsoft to reach their goal of one billion machines running Windows 10 in the two years after release, and spells bad news for their income from consumers.
Enterprises have barely touched the new OS for a wide variety of reasons, though companies still provide significant income thanks to corporate licenses for Microsoft products and older operating systems. It should be very interesting to see how Microsoft will react to this information, especially if the trend continues. The data matches many of the comments we have seen here; the changes Microsoft made were not well received by their customer base, and the justifications they've used in the design of the new OS are not holding water. It shouldn't be long before we hear more out of Redmond; in the meantime you can pop over to The Inquirer to see Net Applications' data if you so desire.
"The latest figures from Net Applications’ Netmarketshare service show Windows 7, now over seven years old, gain a full percentage point to bolster its place as the world’s most popular desktop operating system with 48.27 per cent (+1.02 on last month)."
Here is some more Tech News from around the web:
- HUDWAY Glass Head-Up Display Review @ NikKTech
- AMD prepares Zen for CES 2017 launch; aggressively clearing inventory for platform transition @ DigiTimes
- How to steal the mind of an AI: Machine-learning models vulnerable to reverse engineering @ The Register
- Linus Torvalds Officially Announces the Release of Linux Kernel 4.8 @ Slashdot
- Security analyst says Yahoo!, Dropbox, LinkedIn, Tumblr all popped by same gang @ The Register
- Source code for 'record-breaking' Mirai IoT botnet released online @ The Inquirer
- iPhone 7 Finishes Last In New Test of Battery Life @ Slashdot
Subject: Graphics Cards | October 2, 2016 - 04:12 PM | Sebastian Peak
Tagged: rumor, report, pascal, nvidia, GTX 1050 Ti, graphics card, gpu, GP107, geforce
A report published by VideoCardz.com (via Baidu) contains pictures of an alleged NVIDIA GeForce GTX 1050 Ti graphics card, which is apparently based on a new Pascal GP107 GPU.
Image credit: VideoCardz
The card shown is also equipped with 4GB of GDDR5 memory, and contains a 6-pin power connector - though such a power requirement might be specific to this particular version of the upcoming GPU.
Image credit: VideoCardz
Specifications for the GTX 1050 Ti were previously reported by VideoCardz, with a reported GPU-Z screenshot. The card will apparently feature 768 CUDA cores and a 128-bit memory bus, with clock speeds (for this particular sample) of 1291 MHz base, 1392 MHz boost (with some room to overclock, from this screenshot).
Image credit: VideoCardz
An official announcement for the new GPU has not been made by NVIDIA, though if these PCB photos are real it probably won't be far off.
Subject: Motherboards | October 2, 2016 - 03:20 AM | Tim Verry
Tagged: Zen, micro ATX, Excavator, Bristol Ridge, b350, amd, AM4
Thanks to a recent leak over at Bodnara.co.kr (which has since been taken down), pictures emerged online that give a first look at an AMD socket AM4 motherboard using the mid-range B350 chipset. The Gigabyte B350M-DS3H is a Micro ATX motherboard supporting Bristol Ridge processors at launch and Zen-based processors next year.
The mid-range AM4 board has a very simple layout that leaves little mystery. There are no large heatsinks and no northbridge, thanks to AMD moving most of the connectivity to the SoC itself. In fact, there is only a small passively cooled chip in the bottom right corner (the B350 chipset); between it and the SoC, the platform can offer up PCI-E 3.0, SATA 6 Gbps, USB 3.1, USB 3.0, NVMe SSD, and DDR4 memory support. This post outlines how the duties are split between the processor and southbridge.
The B350M-DS3H is powered by 24-pin ATX and 8-pin EPS connectors, and Gigabyte is using a seven-phase VRM to power the processor and memory. The board hosts a 1331-pin AM4 socket up top with four DDR4 slots to the right. The CMOS battery is placed just above the PCI-E slots, in a position Morry would be proud of (so long as your CPU cooler is not too massive). Below that are two PCI-E 3.0 x16 slots (electrically x16/x4 or x8/x8), a single PCI-E 3.0 x1 slot, and an NVMe M.2 (PCI-E) slot. The bottom right corner of the board hosts six SATA 6 Gbps ports.
Rear I/O on the AMD motherboard includes:
- 2 x USB 2.0
- 1 x PS/2
- 3 x Video Outputs:
  - 1 x VGA
  - 1 x DVI
  - 1 x HDMI
- 4 x USB 3.0
- 2 x USB 3.1
- 1 x Gigabit Ethernet
- 3 x Audio Jacks
Several websites are reporting that AMD will be opening the floodgates of socket AM4 motherboards using the A320 and B350 chipsets in October (it is saving the launch of the enthusiast X370 chipset for next year alongside Summit Ridge). I have to say that it is nice to see an AMD motherboard with updated I/O, which is a welcome change from the ancient 990X AM3+ platform and even the FM2+ motherboards, which were newer but still not as full featured as the competition.
- AMD Officially Launches Bristol Ridge Processors And Zen-Ready AM4 Platform
- Report: AMD Socket AM4 Compatible with Existing AM2/AM3 Coolers
- AMD Zen Architecture and Performance Preview
- AMD Introduces 7th Generation APUs: Bristol Ridge Takes Center Stage
Subject: Processors | October 1, 2016 - 10:11 PM | Tim Verry
Tagged: xavier, Volta, tegra, SoC, nvidia, machine learning, gpu, drive px 2, deep neural network, deep learning
Earlier this week at its first GTC Europe event in Amsterdam, NVIDIA CEO Jen-Hsun Huang teased a new SoC code-named Xavier that will be used in self-driving cars and feature the company's newest custom ARM CPU cores and Volta GPU. The new chip will begin sampling at the end of 2017 with product releases using the future Tegra (if they keep that name) processor as soon as 2018.
NVIDIA's Xavier is promised to be the successor to the company's Drive PX 2 system, which uses two Tegra X2 SoCs and two discrete Pascal MXM GPUs on a single water-cooled platform. The claims are even more impressive considering that NVIDIA is not only promising to replace those four processors with a single chip, but will reportedly do so at 20W – less than a tenth of the Drive PX 2's TDP!
The company has not revealed all the nitty-gritty details, but it did tease out a few bits of information. The new processor will feature 7 billion transistors and will be based on a refined 16nm FinFET process while consuming a mere 20W. It can process two 8K HDR video streams and can hit 20 TOPS (NVIDIA's own rating for deep learning INT8 operations).
Specifically, NVIDIA claims that the Xavier SoC will use eight custom ARMv8 (64-bit) CPU cores (it is unclear whether these will be a refined Denver architecture or something else) and a GPU based on its upcoming Volta architecture with 512 CUDA cores. Also, in an interesting twist, NVIDIA is including a "Computer Vision Accelerator" on the SoC as well, though the company did not go into many details. This bit of silicon may explain how the ~300mm2 die with 7 billion transistors is able to match the 7.2 billion transistor Pascal-based Tesla P4 (2560 CUDA cores) graphics card at deep learning (tera-operations per second) tasks. That is, of course, in addition to the incremental improvements from moving to Volta and a new ARMv8 CPU architecture on a refined 16nm FF+ process.
| | Drive PX | Drive PX 2 | NVIDIA Xavier | Tesla P4 |
|---|---|---|---|---|
| CPU | 2 x Tegra X1 (8 x A57 total) | 2 x Tegra X2 (8 x A57 + 4 x Denver total) | 1 x Xavier SoC (8 x custom ARM + 1 x CVA) | N/A |
| GPU | 2 x Tegra X1 (Maxwell) (512 CUDA cores total) | 2 x Tegra X2 GPUs + 2 x Pascal GPUs | 1 x Xavier SoC GPU (Volta) (512 CUDA cores) | 2560 CUDA cores (Pascal) |
| TFLOPS | 2.3 TFLOPS | 8 TFLOPS | ? | 5.5 TFLOPS |
| DL TOPS | ? | 24 TOPS | 20 TOPS | 22 TOPS |
| TDP | ~30W (2 x 15W) | 250W | 20W | up to 75W |
| Process Tech | 20nm | 16nm FinFET | 16nm FinFET+ | 16nm FinFET |
| Transistors | ? | ? | 7 billion | 7.2 billion |
For comparison, the currently available Tesla P4 based on the Pascal architecture has a TDP of up to 75W and is rated at 22 TOPS. This would suggest that Volta is a much more efficient architecture (at least for deep learning and half precision)! I am not sure how NVIDIA is able to match its GP104 with only 512 Volta CUDA cores, though its definition of a "core" could have changed and/or the CVA processor may be responsible for closing the gap. Unfortunately, NVIDIA did not disclose what it rates Xavier at in TFLOPS, so it is difficult to compare, and it may not match GP104 at higher-precision workloads; it could be wholly optimized for INT8 operations rather than floating point performance. Beyond that, I will let Scott dive into those particulars once we have more information!
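As a quick back-of-envelope check on that efficiency claim, you can divide the quoted deep-learning TOPS figures by each platform's TDP. This sketch only uses the numbers NVIDIA has quoted; real-world efficiency will of course depend on the workload:

```python
# Back-of-envelope deep-learning efficiency (TOPS per watt), using the
# figures NVIDIA has quoted for each platform. These are marketing
# numbers, not measured results.
chips = {
    "Drive PX 2": (24, 250),  # (DL TOPS, TDP in watts)
    "Xavier":     (20, 20),
    "Tesla P4":   (22, 75),
}

for name, (tops, watts) in chips.items():
    print(f"{name}: {tops / watts:.2f} TOPS/W")
```

Xavier works out to 1.0 TOPS/W, roughly three times the Tesla P4's ~0.29 and more than ten times the Drive PX 2's ~0.096, which is why the 20W figure is the headline here.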
Xavier is more of a teaser than anything, and the chip could very well change dramatically and/or miss the claimed performance targets. Still, it sounds promising, and it is always nice to speculate over road maps. It is an intriguing chip and I am ready for more details, especially on the Volta GPU and just what exactly that Computer Vision Accelerator is (and will it be easy to program for?). I am a big fan of the "self-driving car" and I hope that it succeeds. The push certainly looks set to continue as Tesla, VW, BMW, and other automakers keep testing the limits of what is possible and plan future cars that will include smart driving assists and even cars that can drive themselves. The more local computing power we can throw at automobiles the better; while massive datacenters can be used to train the neural networks, local hardware to run them and make decisions is necessary (you don't want internet latency contributing to the decision of whether to brake or not!).
I hope that NVIDIA's self-proclaimed "AI Supercomputer" turns out to be at least close to the performance they claim! Stay tuned for more information as it gets closer to launch (hopefully more details will emerge at GTC 2017 in the US).
What are your thoughts on Xavier and the whole self-driving car future?
- NVIDIA Teases Xavier, a High-Performance ARM SoC for Drive PX & AI @ AnandTech
- Tegra Related News @ PC Perspective
- Tesla P4 Specifications @ NVIDIA
- CES 2016: NVIDIA Launches DRIVE PX 2 With Dual Pascal GPUs Driving A Deep Neural Network @ PC Perspective
Subject: General Tech | October 1, 2016 - 02:58 AM | Scott Michaud
Blender 2.78 has been a fairly anticipated release. First off, people who have purchased a Pascal-based graphics card will now be able to GPU-accelerate their renders in Cycles. Previously, it would outright fail, complaining that it didn't have a compatible CUDA kernel. At the same time, the Blender Foundation fixed a few performance issues, especially with Maxwell-based GM200 parts such as the GeForce GTX 980 Ti. Pre-release builds have included these fixes for over a month, but 2.78 is the first general-public build that includes them.
In terms of actual features, Blender 2.78 starts to expand the suite's feature set into the space currently occupied by Adobe Animate CC (Flash Professional). The Blender Foundation noticed that users were doing 2D animations with the Grease Pencil, so they have been evolving the tool in that direction. You can now simulate different types of strokes, parent them to objects, paint geometry along surfaces, and so forth. It also has onion skinning, to see how the current frame matches its neighbors, though I'm pretty sure that is not new to 2.78.
As you would expect, there are still many differences between the two applications. Blender does not output to Flash, and interactivity would need to be done through the Blender Game Engine. On the other hand, Blender allows the camera itself to be animated. In Animate CC, you would need to move, rotate, and scale objects around the stage by a number of pixels on an individual basis. In Blender, you would just fly the camera around.
This leads into what the Blender Foundation is planning for Blender 2.8x. The upcoming release focuses on common workflow issues. Asset management is one area, but the viewport renderer is a particularly interesting one. Blender 2.78 increases the functionality that materials can exhibit in the viewport, but Blender 2.8x is working toward a full physically-based renderer, such as the one seen in Unreal Engine 4. While it cannot handle the complex lighting effects that their full renderer, Cycles, can, some animations don't require them. Restricting yourself to the types of effects seen in current video games could cut your render time from seconds or minutes per frame to around real-time.
Subject: General Tech | October 1, 2016 - 02:07 AM | Scott Michaud
Tagged: microsoft, windows 10
I've been seeing a lot of people discussing how frequently Windows 10 seems to be getting updated. This discussion usually circles back to how many issues have been reported with the latest Anniversary Update, and how Microsoft has been slow in rolling it out. The thing is, while the slow roll-out is interesting, the way Windows 10 1607 is being patched is not too unusual.
The odd part is how Microsoft has been releasing the feature updates, themselves.
In the past, Microsoft has tried to release updates on the second Tuesday of every month. This provides a predictable schedule for administrators to test patches before deploying them to an entire enterprise, in case the update breaks something that is mission-critical. With Windows 10, Microsoft has declared that patches will be cumulative and can occur at any time. This led to discussion about whether or not “Patch Tuesday” is dead. Now, a little over a year has gone by, and we can actually quantify how the OS gets updated.
There seems to be a pattern that starts with each major version release, which has (thus far) been builds 10240, 10586, and 14393. Immediately before and after these builds start to roll out to the public, Microsoft releases a flurry of updates to fix issues.
For instance, Windows 10 version 1507 had seven sub-versions of build 10240 prior to general release, and five hotfixes pushed down Windows Update within the first month of release. The following month, September 2015, had an update on Patch Tuesday as well as an extra one on September 30th. The month after that also had two updates, the first of which arrived on October's Patch Tuesday. It was then patched once on every following Patch Tuesday.
The same trend occurred with Build 10586 (Windows 10 version 1511). Microsoft released the update to the public on November 12th, but pushed a patch through Windows Update on November 10th, and five more over Windows Update in the following month-and-a-bit. It mostly settled down to Patch Tuesday after that, although a few months had a second hotfix sometime in the middle.
We are now seeing the same trend happen with Windows 10 version 1607. Immediately after release, Microsoft pushed a bunch of hotfixes. If history repeats itself, we should start to see about two updates per month for the next couple of months, then we will slow down to Patch Tuesday until Redstone 2 arrives sometime in 2017.
So, while this seems to fit a recurring trend, I do wonder why this trend exists.
Part of it makes sense. When Microsoft is developing Windows 10, it is trying to merge additions from a variety of teams into a single branch, and do so once or twice each year. This likely means that Microsoft has a “last call” date for these teams to merge their additions into the public branch, and then QA needs to polish this up for the general public. While they can attempt to have these groups check in mid-way, pushing their work out to Windows Insiders in a pre-release build, you can't really know how the final build will behave until after the cut-off.
At the same time, the massive flood of patches within the first month would suggest that Microsoft is pushing the final build to the public about a month or two too early. If this trend continues, it would make the people who update within the first month basically another ring of the Insider program. The difference is that it is less opt-in, because you get it when Windows Update tells you to.
It will be interesting to see how this continues going forward, too. Microsoft has already delayed Redstone 2 until 2017, as I mentioned earlier. This could be a sign that Microsoft is learning from past releases and optimizing their release schedule based on those lessons. I wonder how soon before release Microsoft will settle on a “final build” next time. It seems like Microsoft could avoid many stability problems by simply setting an earlier merge date and aggressively performing QA for a longer period before release to the public.
Or I could be completely off. What do you all think?
Subject: General Tech | September 30, 2016 - 04:25 AM | Sebastian Peak
Tagged: webcam, skype, Pro Stream Webcam, logitech, C922x, C922, C920, 720p/60
Logitech has announced the successor to the popular C920 with the C922 Pro Stream Webcam, and this new model includes a 720p/60 mode, along with the 1080p/30 capability of its predecessor.
“C922 Pro Stream Webcam offers full HD quality and features for all streaming needs. At either 1080p 30 FPS or 720p 60 FPS, C922 is the perfect solution for streaming to Twitch, YouTube and any other video streaming application imaginable. Advanced 20-step autofocus through a full HD glass lens with F-stop F 2.8 and 78-degree field of view means no matter what action is happening, C922 can capture those crucial moments in perfect HD clarity.”
Logitech lists these specs for the C922:
- Video streaming or recording: 1080p30 FPS / 720p60 FPS / 720p30 FPS with supported apps
- Video calling: Full HD 1080p with the latest version of Skype for Windows or 720p with supported clients
- H.264 video compression (Skype only at this time)
- Full HD Glass lens (F=2.8) with 20-step autofocus
- 78° horizontal field of view
- Dual stereo microphone with automatic noise cancellation
- Automatic low light correction
- Tripod ready universal clip fits laptops and monitors (C922 SKU only)
- Width: 95mm
- Depth: 24mm - 71mm including clip
- Height: 29mm - 43.5mm including clip
- Weight: 162g
- USB cable: 6-ft
The C922 includes a tripod, while the C922x does not
There will be two SKUs of the C922, each of which retail for $99.99:
- C922 - exclusive to Best Buy and bestbuy.com, includes tripod and a 3 month XSplit license
- C922x - available on Amazon.com - does not include the tripod but includes a longer 6 month XSplit license
Both versions are available now.
Full press release after the break.
Subject: Systems | September 29, 2016 - 08:26 PM | Jeremy Hellstrom
Tagged: gigabyte, BRIX Gaming UHD
Gigabyte did not have a lot of space to fit components into the BRIX Gaming UHD, let alone cooling, as it measures 220x110x110mm, or 2.6L in volume. Inside this tiny tower you will find an i7-6700HQ with 16GB of dual-channel DDR4-2400 and a 512GB Samsung 950 PRO, with two M.2 slots for storage expansion; a third M.2 slot is on wireless duty. Gigabyte chose a 4GB GTX 950 to power the video, not new by any means but able to fulfill gaming duties at 1080p, and modest enough to allow the system to be powered by a 180W power brick. 4K gaming is a bit of a stretch for this machine, but it is impressively designed; check out the benchmarks at Kitguru to see its performance in games.
"Gigabyte’s BRIX line of barebones PCs are typically small and low-powered – at least, when compared with a mini-ITX desktop system, for example. However, the new BRIX Gaming UHD aims to change all of that."
Here is some more Tech News from around the web:
- Zoostorm EVOLVE @ eTeknix
- ECS LIVA One Mini-PC (H110/Skylake) @ techPowerUp
- Wired2Fire Diablo Elite GTX 1080 Gaming PC @ eTeknix
Subject: General Tech | September 29, 2016 - 07:14 PM | Jeremy Hellstrom
Tagged: nvidia, competition, jen-hsun huang, Founder's Edition
When Microsoft launched the Surface there were negative reactions from vendors who saw this as new competition from what was previously their partner. Today DigiTimes reports that certain unnamed GPU vendors have similar feelings about NVIDIA's Founder's Edition cards. Jen-Hsun responded to these comments today, stating that the Founders Editions were "purely to solve problems in graphics card design".
While he did not say that NVIDIA would not consider continuing the practice with future cards, he does correctly point out that they shared everything about the design and results with the vendors. Those vendors are still somewhat upset about the month in which only Founder's Edition cards were available for sale, as they feel they lost some possible profits by not being able to sell their custom-designed GPUs. Then again, considering the limited supply on the market, the amount of sales they could have made in that extra month would certainly have been limited. It will be interesting to see if we hear more about this directly from the vendors in the coming weeks.
"Since Nvidia has restricted its graphics card brand partners from releasing in-house designed graphics cards within a month after the releases of its Founders Edition card, the graphics card vendors are displeased with the decision as it had given Nvidia time to earn early profits without competition."
Here is some more Tech News from around the web:
- HP Inc: No DRM in our 3D printers, we swear (unlike our 2D ones) @ The Register
- HP offers optional patch to de-bork its printers after EFF rant @ The Inquirer
- macOS 10.12 Sierra vs. Ubuntu 16.04 Linux Benchmarks @ Phoronix
- Surprise! Leading 4-socket server vendor isn’t Dell or HPE @ The Register
- D-Link DWR-932 B owner? Trash it, says security bug-hunter @ The Register
- Microsoft hails pointless Privacy Shield status for its cloud services @ The Register
- Polish car mechanic is still load-balancing with a Commodore 64 after 25 years @ The Inquirer
Subject: General Tech | September 29, 2016 - 04:48 PM | Ryan Shrout
Tagged: video, toshiba, Silverstone, S340, rampage v edition 10, podcast, ocz, nzxt, gtx 1070, fsp, Evoluent, evga, asus, AOC, amd, A12-9800
PC Perspective Podcast #419 - 09/29/16
Join us this week as we discuss the Edition 10 of the Rampage V motherboard, a VerticalMouse, a shiny SilverStone case, the AMD A12-9800 and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Allyn Malventano, Josh Walrath and Jeremy Hellstrom
Program length: 1:05:25
Subject: General Tech, Mobile | September 29, 2016 - 06:15 AM | Scott Michaud
Tagged: mozilla, Firefox OS, firefox
Update: There has been a little confusion. The web browser, Firefox, is still going strong. In fact, they're focusing their engineering efforts more on it, by cutting back on these secondary projects.
Less than a year after their decision to stop developing and selling smartphones through carriers, Mozilla has decided to end all commercial development of Firefox OS. Releases after Firefox OS 2.6 will be handled by third parties, such as Panasonic, should they wish to continue using it for their smart TV platform. Further, source code for the underlying operating system, Boot-to-Gecko (B2G), will be removed from their repository, mozilla-central, so it doesn't hinder development of their other products.
Regardless, Mozilla needs to consider their long-term financial stability, and throwing resources at Firefox OS apparently doesn't return enough value for them, both directly and for its impact on society.
Subject: Cases and Cooling | September 29, 2016 - 04:38 AM | Tim Verry
Tagged: water cooling, liquid cooler, Intel, copper radiator, be quiet!, amd, AIO
Be Quiet!, a popular German manufacturer of PC cases and power supplies, is jumping into the liquid cooling game with the introduction of its new Silent Loop all-in-one (AIO) liquid coolers. Through a partnership with Alphacool, Be Quiet! is launching three new coolers with 120mm, 240mm, and 280mm radiators. It is not clear exactly when they will be arriving stateside, but pricing is approximately $124, $143, and $170 respectively.
The Silent Loop 280 AIO liquid CPU cooler.
The new coolers come clad in all black and feature a new pump design paired with copper cold plates and copper radiators. This is nice to see given the prevalence of aluminum radiators, because using the same metal throughout the loop mitigates the risk of galvanic corrosion that will eventually occur in loops that mix metals.
The AIO loop is paired with two Silent Wings 2 fans, which use rifle bearings and can spin up to 2,000 RPM. To further set the Silent Loop series apart, Be Quiet! uses a nickel-plated CPU cold plate, a radiator with a fill port to allow users to top up the fluid over time, and a reportedly innovative (read: not infringing on Asetek IP) "decoupled reverse flow pump" that spins at 2,200 RPM and allegedly reduces noise to nearly inaudible levels. Water is pulled into the block and over the cold plate, then through the pump, which sits in a sectioned-off area of the block.
As for the copper radiators, Be Quiet! is using 30mm radiators on the Silent Loop 240 and Silent Loop 280 coolers with two fans side by side, and a thicker 45mm radiator on the Silent Loop 120 with two fans in a push-pull configuration. Be Quiet! claims that the 120mm, 240mm, and 280mm coolers can handle heat loads of 270W, 350W, and 400W respectively (these numbers are likely with the fans cranked to their maximum speeds, heh). The included fans can be controlled via PWM, and Be Quiet! includes a Y-splitter that allows users to attach both fans to one PWM motherboard header – which is good, since the CPU_Fan header is sometimes the only "true" PWM header offered.
The liquid coolers use Phillips screws throughout for mounting the radiator, fans, and CPU mount, and they are compatible with all the usual Intel and AMD sockets.
Several sites already have reviews of the new coolers including Kit Guru and Guru3D. According to Leo Waldock from Kit Guru, the Be Quiet! Silent Loop 240 is a "funky and nice piece of hardware" and while it did not blow him away it is competitively priced and performs very closely to the Corsair H100i V2. Out of the box the cooler was reportedly inaudible but with lackluster cooling performance; however, once the fans were cranked up from their normal 1,100 RPM to 1,400 RPM cooling performance greatly improved without sound getting too out of control.
In all, it looks good aesthetically and appears to be easy to install. If you are in the market for an AIO and do not need fancy extras (LEDs, monitoring software, etc.), the Silent Loop coolers might be worth looking into. Hopefully we can get one in for review so that Sebastian or Morry can take it apart... I mean test it! (heh)
Subject: Mobile | September 29, 2016 - 01:04 AM | Scott Michaud
Tagged: Samsung, recall, galaxy note 7
Bloomberg is reporting that a 25-year-old customer from China, Hui Renjie, claims to have received a replacement Galaxy Note 7, and that it caught fire within 24 hours. A representative of the company immediately visited him and asked to take the phone to investigate, but the customer wished to go public first, assuming that he wouldn't get any answers if he just gave up the phone silently. The explosion allegedly caused minor burns to two of the customer's fingers, as well as damaging his MacBook.
Naturally, Samsung is very interested in what happened. The previous incident involved Samsung-developed batteries; the manufacturing process accidentally pushed the two terminals of some batteries in the batch together. Shorting out a battery causes it to release its energy quickly as heat, which is often undesirable, to say the least.
Samsung is waiting to examine the device before commenting further. If you have also received a replacement, then you might want to keep it powered off and disconnected from the charger until we find out what happened.
Subject: General Tech | September 28, 2016 - 11:36 PM | Scott Michaud
Machine translation is quite difficult, especially between certain pairs of languages that vary greatly in how they handle implied context and intonation. At Google, the current translation system picks out known words and phrases, converts them to the target language, and blindly outputs them. This, unfortunately, ignores how the phrases are structured together.
Google has been working toward a newer system, though. Google Neural Machine Translation (GNMT) considers whole sentences, rather than individual words and phrases. It lists all possible translations, and weighs them based on how humans rate their quality. These values are stored and used to better predict following choices, which should be a familiar concept to those who have been reading up on deep learning over the last couple of years.
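The core idea above, scoring whole candidate translations by a learned quality measure and keeping the best, can be illustrated with a toy sketch. To be clear, this is not Google's model: the candidates, scores, and function names here are all invented for illustration, and a real system like GNMT derives its scores from a trained neural network rather than a hard-coded list:

```python
# Toy illustration of ranking whole-sentence translation candidates by
# a quality score. The sentences and scores below are invented; GNMT
# learns such scores from data rather than storing them in a table.

def rank_candidates(candidates):
    """Return (sentence, score) pairs sorted by score, best first."""
    return sorted(candidates, key=lambda c: c[1], reverse=True)

# Hypothetical candidate translations of one source sentence.
candidates = [
    ("He is reading a book", 0.91),
    ("He reads one book", 0.64),
    ("It reads a book", 0.22),
]

best_sentence, best_score = rank_candidates(candidates)[0]
print(best_sentence)  # -> He is reading a book
```

The interesting part of the real system is that these scores also feed back into predicting the next choice, which is what makes it a deep learning approach rather than simple phrase lookup.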
This new system makes use of Google's “TensorFlow” library, released to the public last year under a permissive, Apache 2.0 license. It will also be compatible with Google's custom Tensor Processing Unit (TPU) ASICs that were announced last May at Google I/O. The advantage of TPUs is that they can reach extremely high parallelism because they operate on extremely low-precision values.
The GNMT announcement showed the new system attempting to translate English to and from Spanish, French, and Chinese. Each pairing, in both directions, showed a definite increase, with French to English almost matching a human translation according to their quality metric. GNMT is currently live to the public when attempting to translate between Chinese and English, and Google will expand this to other languages “over the coming months”.
Subject: General Tech | September 28, 2016 - 10:53 PM | Scott Michaud
Tagged: hp, DRM
Recently, HP released a firmware update for some inkjet printers that disabled certain third-party cartridges. The claim is that the customer “is exposed to quality and potential security risks” when using counterfeit cartridges. I'm curious why HP is claiming that users shouldn't trust HP's abilities to secure their devices against attacks from malicious cartridges, but that's probably not an implication that HP considered when publishing this press release.
Also, if the intent was to inform users about counterfeit and potentially malicious cartridges, you would think that they would have provided an override method from the start. Thankfully, they are now. HP is preparing an optional firmware update that does not check cartridges. They claim that it will be available in a couple of weeks, and provide a link to where it will be hosted.
So yeah, they are doing the right thing now. Still... come on.
Subject: General Tech | September 28, 2016 - 05:47 PM | Jeremy Hellstrom
Tagged: VR, sword master vr, htc vive, gaming
With the amount of VR benchmarks coming out of [H]ard|OCP lately, we wonder if they are in danger of becoming the world's first VR addicts. They tested the usual suite of two AMD cards and five NVIDIA cards to determine the number of dropped frames and average render times in this particular game. As it turns out, the game is harder on the player than it is on the GPU; all of the cards were able to provide decent experiences when swashbuckling. The developer recommends you clear a 2x1.5m area to play this game, and from what [H]ard|OCP experienced while playing, this is no joke; you will get exercise while duelling some of the harder opponents.
"Do you want to fight the Black Knight in a sword fight? There is not exactly a "Black Knight" in Sword Master VR, but you can certainly get that feeling. In fact, you can fight him and a couple of his friends at the same time if you are up to the challenge. Just pull the sword from the stone for $10."
Here is some more Tech News from around the web:
- Electric Heart: Deus Ex Story DLC System Rift Released @ Rock, Paper, SHOTGUN
- Battlefield 1 single player uses a 'war story' anthology format @ HEXUS
- Erected: Civilization VI System Requirements Finalised @ Rock, Paper, SHOTGUN
- Respawn provides detailed Titanfall 2 PC specs @ HEXUS
- For The Emp, Er, Uh: WH40k Eternal Crusade Released @ Rock, Paper, SHOTGUN
- Wasteland 3 will have multiplayer, XCOM-style cinematic camera @ Polygon
- Back to school sale @ GOG
- Warhammer 40,000: Dawn Of War 3 Shows Off Eldar @ Rock, Paper, SHOTGUN
Subject: Memory | September 28, 2016 - 05:07 PM | Jeremy Hellstrom
Tagged: DOMINATOR PLATINUM Special Edition, corsair, ddr4, ddr4-3200, DHX
Corsair's DOMINATOR PLATINUM Special Edition series comes in 32GB kits, either four 8GB DIMMs or a pair of 16GB DIMMs, in your choice of Chrome or Blackout finishes. All kits run at DDR4-3200 but, with a 10-layer PCB and DHX heatsinks, Corsair feels that reaching 3600MHz will be trivial and higher frequencies possible for talented tweakers. They will be available directly from Corsair, $330 for the quad-channel kit and $300 for the dual-channel one.
You can read the full PR by clicking below.
Subject: General Tech | September 28, 2016 - 04:30 PM | Jeremy Hellstrom
Tagged: USB 3 Type-C, headphones
Audio support on USB Type-C connections is set to improve, with the newly announced USB Audio Device Class 3.0 specification promising to decrease power demands. Compared to the 3.5mm headphone jack, USB audio is a power hog that shortens the battery life of a phone or other mobile device, but it seems the USB-IF have been working to overcome this issue. Product manufacturers are looking forward to the change as USB can be isolated from other internals far more effectively than the 3.5mm jack, which would make it easier to waterproof their devices.
Hopefully the new compliance testing regime, brought about after the consequences of using a bad cable to charge your laptop, will ensure we do not run into similar problems with audio devices. The Register also reminds us that Bluetooth 5 has yet to become common on mobile devices and could offer yet another nail in the 3.5mm jack's coffin.
"Hear that, children? That's the sound of another set of nails in the coffin of headphone jacks in mobile devices."
Here is some more Tech News from around the web:
- BlackBerry throws in the towel on building its own smartphones @ The Inquirer
- Official: Windows 10 has hit the 400 million device mark @ The Register
- Microsoft makes massive changes to MCSE and MCSD @ The Register
Subject: Cases and Cooling | September 28, 2016 - 03:16 PM | Sebastian Peak
Tagged: tempered glass, S340 Elite, S340, nzxt, enclosure, case, atx
NZXT has released a new, premium version of their excellent S340 mid-tower enclosure (which we reviewed last year). The S340 Elite features a tempered-glass side panel, while the top I/O panel now offers an HDMI port for VR builds.
"Expanding on the S340’s renowned durability, the S340 Elite features a tempered glass panel to showcase builds with crisp clarity. The top IO panel has been optimized with an HDMI port and additional USB ports for a streamlined VR experience. It includes a magnetic cable management puck to conveniently store VR or audio headsets with fast and flexible mounting access. The S340 Elite is strong, compact, and takes the S340 chassis to new heights."
NZXT lists the S340 Elite's main features, all new with this version of the enclosure:
- Tempered glass side panel: showcase your build
- VR cable management puck: move freely & clean cables
- Front VR accessibility: plugging in your VR headset is easy & convenient
- Interior cable management clamps: easy cable management
- Additional SSD tray: increase storage options
The original S340 was a strong performer considering its affordable $69.99 price tag, and for a case with a full tempered-glass side panel the Elite version is priced very competitively at $99.99. A $30 premium for the added features seems like a very good tradeoff, and we already have one of these new S340 Elite enclosures in for testing, so expect a full review soon!
Subject: Graphics Cards | September 28, 2016 - 02:04 AM | Tim Verry
Tagged: water cooling, pascal, hybrid cooler, gtx 1070, GP104, evga
EVGA is preparing to launch the GTX 1070 FTW Hybrid which is a water cooled card that pairs NVIDIA's GTX 1070 GPU with EVGA's Hybrid cooler and custom FTW PCB. The factory overclocked graphics card is currently up for pre-order for $500 on EVGA's website.
The GTX 1070 FTW Hybrid uses EVGA's custom PCB that features two 8-pin power connectors that drive a 10+2 power phase and dual BIOS chips. The Hybrid cooler includes a shrouded 100mm axial fan and a water block that directly touches both the GPU and the memory chips. The water block connects to an external 120mm radiator and a single fan that can be swapped out and/or powered by a motherboard using a standard four pin connector. Additionally, the cooler has a metal back plate and RGB LED back-lit EVGA logos on the side and windows on the front. Display outputs include one DVI, one HDMI, and three DisplayPort connectors.
As far as specifications go, EVGA did not get too crazy with the factory overclock, but users should be able to push the card quite far on their own, assuming they get a decent chip in the silicon lottery. The GP104 GPU has 1920 CUDA cores clocked at 1607 MHz base and 1797 MHz boost; for comparison, reference clock speeds are 1506 MHz base and 1683 MHz boost. The 8 GB of memory, however, remains at the stock 8,000 MHz.
Interestingly, EVGA rates the GTX 1070 FTW Hybrid at 215 watts versus the reference card's 150 watts, which matches the TDP rating of the GTX 1080 FTW Hybrid card.
The table below outlines the specifications of EVGA's water cooled card compared to the GTX 1070 reference GPU and the GTX 1080 FTW Hybrid.
|                | GTX 1070       | GTX 1070 FTW Hybrid | GTX 1080 FTW Hybrid |
|----------------|----------------|---------------------|---------------------|
| Rated Clock    | 1506 MHz       | 1607 MHz            | 1721 MHz            |
| Boost Clock    | 1683 MHz       | 1797 MHz            | 1860 MHz            |
| Memory Clock   | 8000 MHz       | 8000 MHz            | 10000 MHz           |
| TDP            | 150 watts      | 215 watts           | 215 watts           |
| MSRP (current) | $379 ($449 FE) | $500                | $730                |
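For a sense of scale, a quick back-of-the-envelope check in Python (using the clock figures above) shows the factory overclock works out to under seven percent over reference:

```python
def oc_percent(reference_mhz, overclocked_mhz):
    """Return the factory overclock as a percent increase over the reference clock."""
    return (overclocked_mhz - reference_mhz) / reference_mhz * 100

base_oc = oc_percent(1506, 1607)   # base clock bump, roughly 6.7%
boost_oc = oc_percent(1683, 1797)  # boost clock bump, roughly 6.8%

print(f"Base: +{base_oc:.1f}%, Boost: +{boost_oc:.1f}%")
```

A modest bump like this leaves plenty of the 2 GHz+ headroom mentioned below for manual overclocking.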
According to EVGA, the Hybrid cooler brings GPU and memory temperatures down to 45°C and 57°C respectively, compared to reference temperatures of 80°C and 85°C. Keeping in mind that these are EVGA's own numbers (you can see our Founders Edition temperature results here), the Hybrid cooler seems well suited to keeping Pascal GPUs in check even when overclocked. In reviews of the GTX 1080 FTW Hybrid, reviewers found that the Hybrid cooler allowed stable 2 GHz+ GPU clock speeds, letting the card hit its maximum boost clock and stay there under load. Hopefully the GTX 1070 version will see similar results. I am interested to see whether the memory chips EVGA is using will be capable of hitting at least the 10 GHz of the 1080 cards, if not more, since they are cooled by the water loop.
You can find more information on the factory overclocked water cooled graphics card on EVGA's website. The card is available for pre-order at $500 with a 3 year warranty.
Pricing does seem a bit high at first glance, but looking around at other custom GTX 1070 cards, it carries only about a $50 premium, which is not too bad in my opinion. I will wait for actual reviews before I believe it, but if I had to guess, the upcoming card should have a lot of headroom for overclocking and I'm interested to see how far people are able to push it!