Subject: Cases and Cooling | September 26, 2016 - 03:04 PM | Jeremy Hellstrom
Tagged: zalman, Z9 Neo, Z11 Neo, neo
Zalman's Z9 and Z11 NEO are fairly similar: the Z9 measures 205x490x482mm, while the Z11 is slightly larger at 205x520x515mm, which allows more cooling options to be installed. Using the default fan installation, Overclockers Club saw slightly better CPU temperatures on the Z9, while GPU temperatures measured the same in both cases; adding fans to the Z11 will obviously help it take the lead. Drop by to see their full review of both cases, including video.
"Reviewing both cases at the same time makes it interesting. You get to directly compare them with each other. Both of these cases are similar in size, and the feature sets are also fairly close. Neither case stood out much from the other - I like the style of the Z11 a little more, but the Z9 comes with a better complement of fans. The use of space is also similar in both cases, although I like the cable management a little better on the Z9 with the lower compartment that hides the power supply - but then you are covering up a power supply you may want to show off. And the Z11 has the cool, removable hard drive cages."
Here are some more Cases & Cooling reviews from around the web:
- AeroCool XPredator II Full Tower Chassis Review @ NikKTech
- SilverStone Redline RL05 Mid-Tower Review @ NikKTech
- MasterLiquid Pro 240 @ Benchmark Reviews
- Cooler Master MasterLiquid Pro 240mm AIO Liquid CPU Cooler Review @ Techgage
- Cooler Master Seidon 240V AIO @ eTeknix
- Arctic Liquid Freezer 240 AIO @ Kitguru
Subject: General Tech | September 28, 2016 - 07:36 PM | Scott Michaud
Machine translation is quite difficult, especially between certain pairs of languages that vary greatly in how they handle implied context and intonation. At Google, the current translation system picks out known words and phrases, converts them to the target language, and blindly outputs them. This, unfortunately, ignores how the phrases are structured together.
Google has been working toward a newer system, though. Google Neural Machine Translation (GNMT) considers whole sentences, rather than individual words and phrases. It lists all possible translations, and weighs them based on how humans rate their quality. These values are stored and used to better predict following choices, which should be a familiar concept to those who have been reading up on deep learning over the last couple of years.
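As a loose illustration of that idea (with made-up scores standing in for the learned quality ratings, not Google's actual model), a translation system might keep several candidate sentences alive at once and extend whichever scores best at each step:

```python
# Toy sketch of scoring-driven translation choices (hypothetical data,
# not Google's model): keep the best-scoring partial sentences at each step.

def beam_search(candidates_per_step, score, beam_width=2):
    """Expand partial translations step by step, keeping only the
    `beam_width` highest-scoring partial sentences each round."""
    beams = [([], 0.0)]  # (words so far, cumulative score)
    for options in candidates_per_step:
        expanded = [
            (words + [w], total + score(words, w))
            for words, total in beams
            for w in options
        ]
        # Prune to the best few candidates, as rated by the model.
        beams = sorted(expanded, key=lambda b: b[1], reverse=True)[:beam_width]
    return beams[0]  # best full translation and its score

# Hypothetical per-word quality scores standing in for learned weights.
SCORES = {"the": 0.9, "a": 0.4, "cat": 0.8, "feline": 0.3, "sleeps": 0.7, "naps": 0.6}

best_words, best_score = beam_search(
    [["the", "a"], ["cat", "feline"], ["sleeps", "naps"]],
    lambda prev, w: SCORES[w],
)
print(" ".join(best_words))
```

In a real neural system the score of each word depends on everything chosen before it, which is exactly why considering whole sentences beats translating phrases in isolation.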
This new system makes use of Google's “TensorFlow” library, released to the public last year under a permissive, Apache 2.0 license. It will also be compatible with Google's custom Tensor Processing Unit (TPU) ASICs that were announced last May at Google I/O. The advantage of TPUs is that they can reach extremely high parallelism because they operate on extremely low-precision values.
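The low-precision point is worth unpacking: representing values in 8 bits instead of 32 lets far more arithmetic units fit on a die. A simplified sketch of what 8-bit quantization looks like (illustrative only; real TPU/TensorFlow quantization schemes are more involved):

```python
# Simplified sketch of 8-bit quantization (illustrative only; real
# TPU/TensorFlow quantization schemes are more involved).

def quantize(values, lo, hi):
    """Map floats in [lo, hi] to unsigned 8-bit integers 0..255."""
    scale = (hi - lo) / 255.0
    return [round((v - lo) / scale) for v in values], scale

def dequantize(qvalues, lo, scale):
    """Recover approximate floats from the 8-bit representation."""
    return [lo + q * scale for q in qvalues]

weights = [-0.5, 0.0, 0.25, 1.0]
q, scale = quantize(weights, lo=-1.0, hi=1.0)
approx = dequantize(q, lo=-1.0, scale=scale)
# Each recovered value lands within about half a quantization step
# of the original, which is close enough for many neural workloads.
print(q, [round(a, 3) for a in approx])
```

The trade-off is a small rounding error per value, which deep networks tolerate surprisingly well at inference time.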
The GNMT announcement showed the new system attempting to translate English to and from Spanish, French, and Chinese. Each pairing, in both directions, showed a definite increase in translation quality, with French to English almost matching a human translation according to their quality metric. GNMT is currently live to the public when translating between Chinese and English, and Google will expand this to other languages “over the coming months”.
Introduction and Specifications
The JUSTICE from REEVEN is a tower cooler with six heatpipes and a 120 mm PWM fan with distinctive yellow-and-black styling. But what really matters is performance, and that’s what we’re going to find out as we pit it against the Intel Broadwell-E test system.
Have you heard of REEVEN? A search on Amazon reveals only a pair of older models, but Newegg carries the full range of coolers and fan controllers the Taiwanese company offers. Prices are low for this segment, with their CPU coolers starting at $24.99, and this JUSTICE cooler priced at $42 on Newegg. What you get for this price sounds impressive on paper, and I wasted no time in finding out how that translated into real-world results.
REEVEN sent along a second 120 mm COLDWING 12 fan for us to test with the JUSTICE, as the cooler includes installation hardware for a dual-fan setup, and I tested the cooler with my Core i7-6800K in both configurations - with both stock and overclocked CPU loads.
Subject: Cases and Cooling | September 29, 2016 - 12:38 AM | Tim Verry
Tagged: water cooling, liquid cooler, Intel, copper radiator, be quiet!, amd, AIO
Be Quiet!, a popular German manufacturer of PC cases and power supplies, is jumping into the liquid cooling game with the introduction of its new Silent Loop all-in-one (AIO) liquid coolers. Through a partnership with Alphacool, Be Quiet! is launching three new coolers with 120mm, 240mm, and 280mm radiators. It is not clear exactly when they will arrive stateside, but pricing is approximately $124, $143, and $170 respectively.
The Silent Loop 280 AIO liquid CPU cooler.
The new coolers come clad in all black and feature a new pump design paired with copper cold plates and copper radiators. This is nice to see given how common aluminum radiators have become, because using the same metal throughout the loop mitigates the risk of galvanic corrosion that will eventually occur in loops that mix metals.
The AIO loop is paired with two Silent Wings 2 fans which use rifle bearings and can spin up to 2,000 RPM. To further set the Silent Loop series apart, Be Quiet! uses a nickel plated CPU cold plate, a radiator with a fill port to allow users to top up the fluids over time, and a reportedly innovative (read: not infringing on Asetek IP) "decoupled reverse flow pump" that spins at 2,200 RPM and allegedly reduces noise to nearly inaudible levels. The pump pulls water into the block and over the cold plate and then pulls it through the pump which is in a sectioned off area of the block.
As for the copper radiators, Be Quiet! is using 30mm radiators on the Silent Loop 240 and Silent Loop 280 coolers with two fans side by side, and a thicker 45mm radiator on the Silent Loop 120 with two fans in a push-pull configuration. Be Quiet! claims that the 120mm, 240mm, and 280mm coolers can handle wattages of 270W, 350W, and 400W respectively (these numbers are likely with the fans cranked to their maximum speeds heh). The included fans can be controlled via PWM, and Be Quiet! includes a Y splitter that allows users to attach both fans to one PWM motherboard header – which is good since the CPU_Fan header is sometimes the only "true" PWM header offered.
The liquid coolers use Phillips screws throughout for mounting the radiator, fans, and CPU mount, and they are compatible with all the usual Intel and AMD sockets.
Several sites already have reviews of the new coolers including Kit Guru and Guru3D. According to Leo Waldock from Kit Guru, the Be Quiet! Silent Loop 240 is a "funky and nice piece of hardware" and while it did not blow him away it is competitively priced and performs very closely to the Corsair H100i V2. Out of the box the cooler was reportedly inaudible but with lackluster cooling performance; however, once the fans were cranked up from their normal 1,100 RPM to 1,400 RPM cooling performance greatly improved without sound getting too out of control.
All in all, it looks good aesthetically and appears to be easy to install. If you are in the market for an AIO and do not need fancy extras (LEDs, monitoring software, etc.), the Silent Loop coolers might be worth looking into. Hopefully we can get one in for review so that Sebastian or Morry can take it apart... I mean test it! (heh).
Subject: General Tech | September 30, 2016 - 10:58 PM | Scott Michaud
Blender 2.78 has been a fairly anticipated release. First off, people who have purchased a Pascal-based graphics card will now be able to GPU-accelerate their renders in Cycles. Previously, it would outright fail, complaining that it didn't have a compatible CUDA kernel. At the same time, the Blender Foundation fixed a few performance issues, especially with Maxwell-based GM200 parts, such as the GeForce GTX 980 Ti. Pre-release builds included these fixes for over a month, but 2.78 is the first general-public build that includes them.
In terms of actual features, Blender 2.78 starts to expand the suite's feature set into the space currently occupied by Adobe Animate CC (Flash Professional). The Blender Foundation noticed that users were doing 2D animations using the Grease Pencil, so they have been evolving the tool in that direction. You can now simulate different types of strokes, parent them to objects, paint geometry along surfaces, and so forth. It also has onion skinning, to see how the current frame matches its neighbors, though I'm pretty sure that is not new to 2.78.
As you would expect, there are still many differences between these two applications. Blender does not output to Flash, and interactivity would need to be done through the Blender Game Engine. On the other hand, Blender allows the camera itself to be animated. In Animate CC, you would need to move, rotate, and scale objects around the stage by a given number of pixels on an individual basis. In Blender, you would just fly the camera around.
This leads into what the Blender Foundation is planning for Blender 2.8x. This upcoming release focuses on common workflow issues. Asset management is one area, but the Viewport Renderer is a particularly interesting one. Blender 2.78 increases the functionality that materials can exhibit in the viewport, but Blender 2.8x is working toward a full physically-based renderer, such as the one seen in Unreal Engine 4. While it cannot handle the complex lighting effects that their full renderer, Cycles, can, some animations don't require them. Restricting yourself to the types of effects seen in current video games could cut your render time from seconds or minutes per frame to around real-time.
Subject: General Tech | September 26, 2016 - 01:01 PM | Jeremy Hellstrom
Tagged: iot, security, upnp
Over the weekend you might have noticed some issues on your favourite interwebs, as there was a rather impressively sized DDoS attack going on. The attack was a mix of old and new techniques; it leveraged the UPnP protocol, which has always been a favourite vector, but the hijacked equipment was IoT appliances. The processing power available in toasters, DVRs, and even webcams is now sufficient to be utilized, and is generally a damned sight easier to control than even an old unpatched XP machine. This does not spell the end of the world, as the cable news networks will likely predict, but it does further illustrate the danger of companies producing inherently insecure IoT devices. If you are not sure what UPnP is, or are aware of it but do not currently need it, consider disabling it on your router or think about setting up something along the lines of ye olde three router solution.
"Brace yourselves. The rest of the media is going to be calling this an “IoT DDOS” and the hype will spin out of control. Hype aside, the facts on the ground make it look like an extremely large distributed denial-of-service attack (DDOS) was just carried out using mostly household appliances (145,607 of them!) rather than grandma’s old Win XP system running on Pentiums."
Here is some more Tech News from around the web:
- Sad reality: It's cheaper to get hacked than build strong IT defenses @ The Register
- ITRI cooperates with Nvidia to develop self-driving technology @ DigiTimes
- Surface Pro 3 branded battery borkage continues @ The Register
- OpenSSL swats a dozen bugs, one notable nasty @ The Register
- iOS 10 makes it easier to crack iPhone back-ups, says security firm @ The Register
- Double KO! Capcom's Street Fighter V installs hidden rootkit on PCs @ The Register
- Ig Nobel Prizes: GoatMan, Volkswagen, and the Personalities of Rocks @ Hack a Day
Subject: Displays | September 27, 2016 - 03:35 PM | Jeremy Hellstrom
Tagged: pimax, vr headset, steam vr
As Rock, Paper, SHOTGUN asks in the title, can the $300 Pimax VR headset be too good to be true? It ships without headphones, or you can buy the $350 version, which includes audio of moderate quality, or provide your own if they fit comfortably under the headset. It also does not ship with any controllers, which means that Steam games requiring anything other than a mouse and keyboard simply will not work; that's not an empty catalogue of games, but it is definitely more limited than the two more expensive competitors.
The headset does offer better resolution, 1920x2160 per eye, which the reviewer immediately noticed as being clearer than the competition ... as long as you were looking directly at the text or object. There were issues at the edges of your view, however, as well as when quickly turning your head, which is likely due to the 60Hz refresh rate. This is less than the 90Hz the Vive or Rift can manage, and it raises concerns about reprojection and dropped frames. There were a few other concerns mentioned in the review which you should familiarize yourself with, but the Pimax is very interesting: a light VR headset with great resolution and only two connecting cords for $300.
"In the interim, here’s Chinese outfit Pimax, who are selling what they label as the first 4K VR headset for PC, which works with SteamVR. It’s also $350 (or $300 without headphones), compared to the Rift’s $599 and Vive’s $799"
Here are some more Display articles from around the web:
- From The Wirecutter: The best 4K monitors (so far) @ Ars Technica
- BenQ XR3501 Curved Gaming Monitor @ Kitguru
- Dell UltraSharp 24 InfinityEdge U2417H 24in Monitor @ Kitguru
Subject: General Tech | September 28, 2016 - 06:53 PM | Scott Michaud
Tagged: hp, DRM
Recently, HP released a firmware update for some inkjet printers that disabled certain third-party cartridges. The claim is that the customer “is exposed to quality and potential security risks” when using counterfeit cartridges. I'm curious why HP is claiming that users shouldn't trust HP's abilities to secure their devices against attacks from malicious cartridges, but that's probably not an implication that HP considered when publishing this press release.
Also, if the intent was to inform users about counterfeit and potentially malicious cartridges, you would think that they would have provided an override method from the start. Thankfully, they are doing so now. HP is preparing an optional firmware update that does not check cartridges. They claim that it will be available in a couple of weeks, and provide a link to where it will be hosted.
So yeah, they are doing the right thing now. Still... come on.
Subject: General Tech | September 28, 2016 - 01:47 PM | Jeremy Hellstrom
Tagged: VR, sword master vr, htc vive, gaming
With the amount of VR benchmarks coming out of [H]ard|OCP lately, we wonder if they are in danger of becoming the world's first VR addicts. They tested the usual suite of two AMD cards and five NVIDIA cards to determine the amount of dropped frames and average render times in this particular game. As it turns out, the game is harder on the player than it is on the GPU; all cards were able to provide decent experiences when swashbuckling. The developer recommends you clear a 2x1.5m area to play this game, and from what [H]ard|OCP experienced while playing, this is no joke; you will get exercise while duelling some of the harder opponents.
"Do you want to fight the Black Knight in a sword fight? There is not exactly a "Black Knight" in Sword Master VR, but you can certainly get that feeling. In fact, you can fight him and a couple of his friends at the same time if you are up to the challenge. Just pull the sword from the stone for $10."
Here is some more Tech News from around the web:
- Electric Heart: Deus Ex Story DLC System Rift Released @ Rock, Paper, SHOTGUN
- Battlefield 1 single player uses a 'war story' anthology format @ HEXUS
- Erected: Civilization VI System Requirements Finalised @ Rock, Paper, SHOTGUN
- Respawn provides detailed Titanfall 2 PC specs @ HEXUS
- For The Emp, Er, Uh: WH40k Eternal Crusade Released @ Rock, Paper, SHOTGUN
- Wasteland 3 will have multiplayer, XCOM-style cinematic camera @ Polygon
- Back to school sale @ GOG
- Warhammer 40,000: Dawn Of War 3 Shows Off Eldar @ Rock, Paper, SHOTGUN
Subject: General Tech | September 29, 2016 - 12:48 PM | Ryan Shrout
Tagged: video, toshiba, Silverstone, S340, rampage v edition 10, podcast, ocz, nzxt, gtx 1070, fsp, Evoluent, evga, asus, AOC, amd, A12-9800
PC Perspective Podcast #419 - 09/29/16
Join us this week as we discuss the Edition 10 of the Rampage V motherboard, a VerticalMouse, a shiny SilverStone case, the AMD A12-9800 and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Allyn Malventano, Josh Walrath and Jeremy Hellstrom
Program length: 1:05:25
Subject: Storage | September 26, 2016 - 01:42 PM | Jeremy Hellstrom
Tagged: tlc, Phison PS3110-S10, AS330 Panther, apacer, 960GB SSD
Almost everyone seems to be making SATA SSDs these days; the market is much more crowded than at this time last year, which can make your purchasing decisions more complicated. If you cannot afford the new M.2 and PCIe SSDs and are instead looking for a SATA SSD, then your choices are varied, and you cannot necessarily depend on price when you make your decision.
The internals are what really determine the value you are getting from an SSD. In this case, the AS330 uses the four-channel Phison PS3110-S10 controller, 15nm Toshiba TLC NAND, and a 512MB DDR3L-1600 cache. This puts it in the same class as many other value-priced SSDs from companies like PNY and Kingston. Hardware Canucks' testing proves this to be true: the drive is a bit slower than the OCZ Trion 150 but sits solidly in the middle of the pack of comparable SSDs. The price you can find the drive at will be the deciding factor; the 960GB model should sell for around $200, and the 480GB model is currently $120 on Newegg.
"Apacer's AS330 Panther SSD is inexpensive, offers good performance and has capacity to burn. But can this drive roar or will a lack of brand recognition cause it to purr out to obscurity? "
Here are some more Storage reviews from around the web:
- Samsung 850 EVO 4TB SSD @ Custom PC Review
- Kingston SSDnow UV400 480GB SSD Review @ NikKTech
- SK hynix Canvas SL308 500GB @ Kitguru
- Asustor AS3104T 4-bay NAS @ Kitguru
- TerraMaster D5-300 USB 3.0 External Hard Drive RAID Enclosure Review @ NikKTech
Subject: Graphics Cards | September 27, 2016 - 01:57 PM | Jeremy Hellstrom
Tagged: VR, trickster vr, amd, nvidia, htc vive
[H]ard|OCP continues their look into the performance of VR games on NVIDIA's Titan X, GTX 1080, 1070, 1060, and 970 as well as AMD's Fury X and RX 480. This particular title allowed AMD to shine; they saw the RX 480 come within a hair of matching the GTX 1060, which is a first for them and shows that AMD can be a contender in the VR market. Pop by to see their review in full.
"Arm yourself with a bow and arrows, a magic sword that flies, or if you prefer, a handful of throwing darts. Then get ready to take on the procedurally generated fantasy world full of cartoonish Orcs, and more Orcs, and some other Orcs. Headshots count as well as chaining your shots so aim is critical. Did I mention the Orcs?"
Here are some more Graphics Card articles from around the web:
- MSI GeForce GTX 1080 GAMING X 8G @ [H]ard|OCP
- AMD Radeon RX 480 CrossFire Performance Comparison @ TechARP
Subject: Processors | October 1, 2016 - 06:11 PM | Tim Verry
Tagged: xavier, Volta, tegra, SoC, nvidia, machine learning, gpu, drive px 2, deep neural network, deep learning
Earlier this week at its first GTC Europe event in Amsterdam, NVIDIA CEO Jen-Hsun Huang teased a new SoC code-named Xavier that will be used in self-driving cars and feature the company's newest custom ARM CPU cores and Volta GPU. The new chip will begin sampling at the end of 2017 with product releases using the future Tegra (if they keep that name) processor as soon as 2018.
NVIDIA's Xavier is promised to be the successor to the company's Drive PX 2 system, which uses two Tegra X2 SoCs and two discrete Pascal MXM GPUs on a single water cooled platform. These claims are even more impressive when considering that NVIDIA is not only promising to replace those four processors with a single chip, but reportedly to do so at 20W – less than a tenth of the TDP!
The company has not revealed all the nitty-gritty details, but they did tease out a few bits of information. The new processor will feature 7 billion transistors and will be based on a refined 16nm FinFET process while consuming a mere 20W. It can process two 8K HDR video streams and can hit 20 TOPS (NVIDIA's own rating for deep learning INT8 operations).
Specifically, NVIDIA claims that the Xavier SoC will use eight custom ARMv8 (64-bit) CPU cores (it is unclear whether these cores will be a refined Denver architecture or something else) and a GPU based on its upcoming Volta architecture with 512 CUDA cores. Also, in an interesting twist, NVIDIA is including a "Computer Vision Accelerator" on the SoC as well, though the company did not go into many details. This bit of silicon may explain how the ~300mm2 die with 7 billion transistors is able to match the 7.2 billion transistor Pascal-based Tesla P4 (2560 CUDA cores) graphics card at deep learning (tera-operations per second) tasks. Of course, this is in addition to the incremental improvements from moving to Volta and a new ARMv8 CPU architecture on a refined 16nm FF+ process.
| | Drive PX | Drive PX 2 | NVIDIA Xavier | Tesla P4 |
|---|---|---|---|---|
| CPU | 2 x Tegra X1 (8 x A57 total) | 2 x Tegra X2 (8 x A57 + 4 x Denver total) | 1 x Xavier SoC (8 x Custom ARM + 1 x CVA) | N/A |
| GPU | 2 x Tegra X1 (Maxwell) (512 CUDA cores total) | 2 x Tegra X2 GPUs + 2 x Pascal GPUs | 1 x Xavier SoC GPU (Volta) (512 CUDA cores) | 2560 CUDA cores (Pascal) |
| TFLOPS | 2.3 TFLOPS | 8 TFLOPS | ? | 5.5 TFLOPS |
| DL TOPS | ? | 24 TOPS | 20 TOPS | 22 TOPS |
| TDP | ~30W (2 x 15W) | 250W | 20W | up to 75W |
| Process Tech | 20nm | 16nm FinFET | 16nm FinFET+ | 16nm FinFET |
| Transistors | ? | ? | 7 billion | 7.2 billion |
For comparison, the currently available Tesla P4 based on the Pascal architecture has a TDP of up to 75W and is rated at 22 TOPS. This would suggest that Volta is a much more efficient architecture (at least for deep learning and half precision)! I am not sure how NVIDIA is able to match its GP104 with only 512 Volta CUDA cores, though their definition of a "core" could have changed and/or the CVA processor may be responsible for closing that gap. Unfortunately, NVIDIA did not disclose what it rates the Xavier at in TFLOPS, so it is difficult to compare, and it may not match GP104 at higher precision workloads. It could be wholly optimized for INT8 operations rather than floating point performance. Beyond that I will let Scott dive into those particulars once we have more information!
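Taking the quoted numbers at face value, the back-of-the-envelope efficiency comparison works out like this:

```python
# Quick TOPS-per-watt comparison using the figures NVIDIA quoted.
# These are vendor claims, not measured results.
xavier_tops, xavier_tdp = 20, 20  # Xavier: 20 TOPS at 20W
p4_tops, p4_tdp = 22, 75          # Tesla P4: 22 TOPS at up to 75W

xavier_eff = xavier_tops / xavier_tdp  # 1.00 TOPS/W
p4_eff = p4_tops / p4_tdp              # ~0.29 TOPS/W

print(f"Xavier: {xavier_eff:.2f} TOPS/W, Tesla P4: {p4_eff:.2f} TOPS/W")
print(f"Efficiency ratio: {xavier_eff / p4_eff:.1f}x")
```

Roughly a 3.4x jump in TOPS per watt, which is why the 20W claim is the headline number here.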
Xavier is more of a teaser than anything, and the chip could very well change dramatically and/or not hit the claimed performance targets. Still, it sounds promising, and it is always nice to speculate over road maps. It is an intriguing chip and I am ready for more details, especially on the Volta GPU and just what exactly that Computer Vision Accelerator is (and will it be easy to program for?). I am a big fan of the "self-driving car" and I hope that it succeeds. The trend certainly looks set to continue as Tesla, VW, BMW, and other automakers push the envelope of what is possible and plan future cars that will include smart driving assists and even cars that can drive themselves. The more local computing power we can throw at automobiles the better, and while massive datacenters can be used to train the neural networks, local hardware to run them and make decisions is necessary (you don't want internet latency contributing to the decision of whether to brake or not!).
I hope that NVIDIA's self-proclaimed "AI Supercomputer" turns out to be at least close to the performance they claim! Stay tuned for more information as it gets closer to launch (hopefully more details will emerge at GTC 2017 in the US).
What are your thoughts on Xavier and the whole self-driving car future?
- NVIDIA Teases Xavier, a High-Performance ARM SoC for Drive PX & AI @ AnandTech
- Tegra Related News @ PC Perspective
- Tesla P4 Specifications @ NVIDIA
- CES 2016: NVIDIA Launches DRIVE PX 2 With Dual Pascal GPUs Driving A Deep Neural Network @ PC Perspective