The Tiniest Fiji
Way back on June 16th, AMD held a live stream event during E3 to announce a host of new products. In that group were the AMD Radeon R9 Fury X, R9 Fury and R9 Nano. Of the three, the Nano was the most intriguing to the online press, as it was the one we knew the least about. AMD promised a full Fiji GPU in a package with a 6-inch PCB and a 175 watt TDP. Well today, AMD is, uh, re-announcing (??) the AMD Radeon R9 Nano with more details on specifications, performance and availability.
First, let’s get this out of the way: AMD is making this announcement today because they publicly promised the R9 Nano for August. And with the final days of summer creeping up on them, rather than answer questions about another delay, AMD is instead going the route of a paper launch, but one with a known end date. We will apparently get our samples of the hardware in early September with reviews and the on-sale date following shortly thereafter. (Update: AMD claims the R9 Nano will be on store shelves on September 10th and should have "critical mass" of availability.)
Now let’s get to the details that you are really here for. And rather than start with the marketing spin on the specifications that AMD presented to the media, let’s dive into the gory details right now.
| | R9 Nano | R9 Fury | R9 Fury X | GTX 980 Ti | TITAN X | GTX 980 | R9 290X |
| --- | --- | --- | --- | --- | --- | --- | --- |
| GPU | Fiji XT | Fiji Pro | Fiji XT | GM200 | GM200 | GM204 | Hawaii XT |
| Rated Clock | 1000 MHz | 1000 MHz | 1050 MHz | 1000 MHz | 1000 MHz | 1126 MHz | 1000 MHz |
| Memory Clock | 500 MHz | 500 MHz | 500 MHz | 7000 MHz | 7000 MHz | 7000 MHz | 5000 MHz |
| Memory Interface | 4096-bit (HBM) | 4096-bit (HBM) | 4096-bit (HBM) | 384-bit | 384-bit | 256-bit | 512-bit |
| Memory Bandwidth | 512 GB/s | 512 GB/s | 512 GB/s | 336 GB/s | 336 GB/s | 224 GB/s | 320 GB/s |
| TDP | 175 watts | 275 watts | 275 watts | 250 watts | 250 watts | 165 watts | 290 watts |
| Peak Compute | 8.19 TFLOPS | 7.20 TFLOPS | 8.60 TFLOPS | 5.63 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 5.63 TFLOPS |
AMD wasn’t fooling around: the Radeon R9 Nano graphics card does indeed include a full implementation of the Fiji GPU and HBM, including 4096 stream processors, 256 texture units and 64 ROPs. The GPU core clock is rated “up to” 1.0 GHz, nearly the same as the Fury X (1050 MHz), and the only difference I can see in the specifications on paper is that the Nano is rated at 8.19 TFLOPS of theoretical compute performance while the Fury X is rated at 8.60 TFLOPS.
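The peak compute and memory bandwidth figures in the table are straightforward back-of-envelope math: stream processors times two FP32 ops per clock times clock speed, and bus width times effective memory transfer rate. A quick sketch (the helper functions are our own naming, not anything from AMD):

```python
# Back-of-envelope GPU spec math. Assumptions: one FMA = 2 FP32 ops per
# stream processor per clock, and HBM runs at 500 MHz double-pumped.

def peak_tflops(stream_processors, clock_ghz, ops_per_clock=2):
    """Theoretical single-precision throughput in TFLOPS."""
    return stream_processors * ops_per_clock * clock_ghz / 1000.0

def bandwidth_gbs(bus_width_bits, clock_mhz, pumps=2):
    """Theoretical memory bandwidth in GB/s (memory clock in MHz)."""
    return bus_width_bits * clock_mhz * pumps / 8 / 1000.0

print(peak_tflops(4096, 1.00))   # R9 Nano at its "up to" 1.0 GHz clock
print(peak_tflops(4096, 1.05))   # Fury X at 1050 MHz
print(bandwidth_gbs(4096, 500))  # HBM: 4096-bit bus at 500 MHz, double-pumped
```

The 8.19 vs. 8.60 TFLOPS gap is exactly the 1000 MHz vs. 1050 MHz clock difference; everything else about the two chips is identical on paper.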
Subject: Graphics Cards | August 31, 2015 - 07:19 PM | Scott Michaud
Tagged: nvidia, graphics drivers, geforce, drivers
Unlike last week's 355.80 Hotfix, today's driver is fully certified by both NVIDIA and Microsoft (WHQL). According to users on GeForce Forums, this driver includes the hotfix changes, although I am still seeing a few users complain about memory issues under SLI. The general consensus seems to be that a number of bugs were fixed, and that driver quality is steadily increasing. This is also a “Game Ready” driver for Mad Max and Metal Gear Solid V: The Phantom Pain.
NVIDIA's GeForce Game Ready 355.82 WHQL Mad Max and Metal Gear Solid V: The Phantom Pain drivers (inhale, exhale, inhale) are now available for download at their website. Note that Windows 10 drivers are separate from Windows 7 and Windows 8.x ones, so be sure to not take shortcuts when filling out the “select your driver” form. That, or just use GeForce Experience.
Subject: Displays | August 28, 2015 - 10:02 AM | Sebastian Peak
Tagged: wqhd, TN, S2716DG, gaming monitor, G-Sync Gen II, g-sync, dell, 27-inch, 2560x1440
Dell announced a new 27-inch WQHD gaming monitor yesterday, and while the 2560x1440 resolution and TN panel are nothing new, the real story appeared to be the inclusion of NVIDIA G-Sync Gen II. As it turns out, there was a typo in the release.
Dell provides these details about the S2716DG monitor:
- Nvidia’s G-Sync Gen II support feature synchronizes GPU and monitor to minimize graphic distortions and screen tearing
- Quad HD resolution of 2560 x 1440 with close to 2 times more onscreen details than Full HD
- A full range of adjustability features, like tilt, pivot, swivel and height-adjustable stand allow for long hours of comfortable gameplay
- A wide range of connectivity features, including DisplayPort 1.2, HDMI 1.4, four USB 3.0 ports, USB 3.0 upstream, Audio line-out & Headphone-out
- 144 Hz maximum refresh rate and 1ms response time
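Dell's "close to 2 times more onscreen details" bullet is worth a quick sanity check; by raw pixel count, QHD works out to roughly 1.78x Full HD:

```python
# Pixel-count comparison behind Dell's "close to 2 times" marketing claim.
qhd = 2560 * 1440   # 3,686,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels
print(qhd / fhd)    # ~1.78x the pixels of Full HD
```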
Pricing is listed as $799 and the S2716DG will be available October 20.
UPDATE: Looking at the Dell announcement page, the company links to a Quadro PDF describing a technology called G-Sync II. The problem is that technology was released in 2011 and served a very different purpose than the G-Sync we use for gaming monitors today. We always knew that re-using that name would haunt NVIDIA in some ways...this is one of them. So either Dell's reference to a second generation of G-Sync here is simply a typo, or the naming is correct and the writer of the press release linked to something unrelated.
It is possible that a new version of the G-Sync module is on its way with updated features and possibly support over other display outputs, but I haven't heard anything official as of yet. I'll keep digging!
UPDATE 2: Just confirmed with Dell, this was a typo! The S2716DG "was incorrectly listed as 'G-Sync Gen II' and the accurate name of the technology is NVIDIA® G-SYNC™." There you have it. False alarm!
Retail Card Design
AMD is in an interesting spot right now. The general consensus is that both the AMD Radeon R9 Fury X and the R9 Fury graphics cards had successful launches into the enthusiast community. We found that the performance of the Fury X was slightly under that of the GTX 980 Ti from NVIDIA, but also that the noise levels and power draw were so improved on Fiji over Hawaii that many users would dive head first into the new flagship from the red team.
The launch of the non-X AMD Fury card was even more interesting – here was a card with a GPU performing better than the competition at a price point where NVIDIA didn’t have an exact answer. The performance gap between the GTX 980 and GTX 980 Ti left room for a $550 graphics card, and AMD claimed a victory with it. Add in the third Fiji-based product due out in a few short weeks, the R9 Nano, and you have a robust family of products that don’t exactly dominate the market but do put AMD in a positive position unlike any it has seen in recent years.
But there are some problems. First and foremost for AMD: continuing drops in market share. With the most recent reports from multiple sources claiming that AMD’s Q2 2015 share has dropped to 18%, an all-time low for the last decade or so, AMD needs growth and it needs it now. Here’s the catch: AMD can’t make enough of the Fiji chip to affect that number at all. The Fury X, Fury and Nano are going to be hard to find for the foreseeable future thanks to production limits on the HBM (high bandwidth memory) integration, the same feature that helps make Fiji the compelling product it is. I have been keeping an eye on the stock of the Fury and Fury X products and found that they often can’t be found anywhere in the US for purchase. Maybe even more damning is the fact that the Radeon R9 Fury, the card that is supposed to be the model customizable by AMD board partners, still only has two options available: the Sapphire card, which we reviewed when it launched, and the ASUS Strix R9 Fury that we are reviewing today.
AMD’s product and financial issues aside, the fact is that the Radeon R9 Fury 4GB and the ASUS Strix iteration of it are damned good products. ASUS has done its usual job of improving on the design of the reference PCB and cooler, added in some great features and packaged it up at a price that is competitive and well worth the investment for enthusiast gamers. Our review today will only lightly touch on the out-of-box performance of the Strix card, mostly because it is so similar to that of the initial Fury review we posted in July. Instead I will look at changes to the positioning of the AMD Fury product (if any) and at how the cooler and design of the Strix card help it stand out. Overclocking, power consumption and noise will all be evaluated as well.
Subject: Graphics Cards | August 27, 2015 - 05:23 PM | Scott Michaud
Tagged: windows 10, nvidia, geforce, drivers, graphics drivers
While GeForce Hotfix driver 355.80 is not certified, or even beta, I know that a lot of our readers have issues with SLI in Windows 10. Especially in games like Battlefield 4, memory usage would expand until, apparently, a crash occurred. Since I run a single GPU, I have not experienced this issue and cannot comment on what happens. I just know that it was very common in the GeForce forums and in our comment section, so it was probably a big problem for many users.
If you are not experiencing this problem, then you probably should not install this driver. This is a hotfix that, as stated above, was released outside of NVIDIA's typical update process. You might experience new, unknown issues. Affected users, on the other hand, have the choice to install the fix now, which could very well be stable, or wait for a certified release later.
You can pick it up from NVIDIA's support site.
Subject: Mobile | August 27, 2015 - 03:31 PM | Jeremy Hellstrom
Tagged: asus, ZenBook UX305
That is correct: the 12mm-thick ZenBook UX305 from ASUS does not have a LAN port; it is wireless or nothing for this ultrabook. It does have three USB 3.0 ports, a micro HDMI port, a 3.5mm audio jack and an SD card reader, so you will be able to use some wired peripherals with this ultramobile device. At a mere 1.2 kg the machine is very light, and with a Core M-5Y10 that can clock from 800MHz up to 2GHz with Turbo Boost it will run when you need it and be gentle on your battery when you do not. KitGuru has posted a review of the UX305 here.
"The ZenBook UX305 is the latest Ultrabook offering from Asus. When I last reviewed one of their products – the hybrid T300 Chi – it greatly impressed me. The UX305 is a similar device, with a Core M processor, 8GB RAM and another SanDisk M.2 SSD. This time, however, it is a conventional laptop, and is priced at £649.95."
Here are some more Mobile articles from around the web:
- Asus ZenBook UX305 @ The Inquirer
- Vodafone Smart Prime 6 Smartphone @ Kitguru
- Galaxy Note 5 vs S6 @ The Inquirer
- Wileyfox Swift hands-on @ The Inquirer
- SISWOO C55 Longbow Smartphone Review @ Madshrimps
Subject: General Tech, Cases and Cooling | August 27, 2015 - 12:17 AM | Tim Verry
Tagged: water cooling, liquid cooling, Intel, ek, AIO
EK (EK Water Blocks) is pouncing on the AIO liquid cooling market with its new EK-Predator series. The new cooler series combines the company's enthusiast parts into pre-filled and pre-assembled loops ready to cool Intel CPUs (AMD socket support is slated for next year). Specifically, EK is offering up the EK-Predator 240 and EK-Predator 360 which are coolers with a 240mm radiator and a 360mm radiator respectively.
The new coolers use copper radiators and EK Supremacy MX CPU blocks, the latter of which have a polished copper base, so there is no risk associated with mixed metals in the loop. A 6W DDC pump drives the loop, with the pump and a small reservoir attached to one side of the radiator (allegedly using a vibration-dampening mounting system). EK ZMT (Zero Maintenance Tubing) 10/16mm tubing connects the CPU block to the pump/radiator/reservoir combo, which uses standard G1/4 threaded ports.
EK pairs the radiator with two or three (depending on the model) EK-Vardar high static pressure fans. The fans and pump are PWM controlled and connect to a hub which is then connected to the PC motherboard's CPU fan header over a single cable. Then, a single SATA power cable from the power supply provides the necessary power to drive the pump and fans.
The EK-Predator 360 further adds quick disconnect (QDC) fittings to allow users to expand the loop to include, for example, GPU blocks. EK Water Blocks is reportedly working on compatible GPU blocks which will be available later this year that users will be able to easily tie into the EK-Predator 360 cooling loop.
Available for pre-order now, the EK-Predator 240 will be available September 23rd with an MSRP of $199 while the larger EK-Predator 360 is slated for an October 19th release at $239 MSRP.
If the expected performance is there, these units look to be a decent value that will allow enthusiasts to (pun intended) get their feet wet with liquid cooling, with the opportunity to expand the loop as their knowledge and interest in water cooling grows. The EK-Predators are not a unique or new idea (other companies have offered water cooling kits for a while), but coming pre-assembled and pre-filled makes it dead simple to get started, and the parts should be of reputable quality. The one drawback I can see from the outset is that users will need to carefully measure their cases: with the pump and reservoir attached to the radiator, users will need more room than usual to fit it. EK states in the PR that the 240mm rad should fit most cases, and that it is working with vendors on compatible cases for the 360mm version, for what that's worth. Considering I spent a bit under $300 for my custom water cooling loop used, this new kit doesn't seem like a bad value so long as the parts are up to normal EK quality (barring that whole GPU block flaking thing, which I luckily have not run into...).
What do you think about EK's foray into AIO water cooling? Are the new coolers predators or prey? (okay, I'll leave the puns to Scott!).
Subject: Cases and Cooling | August 31, 2015 - 05:25 PM | Sebastian Peak
Tagged: matx case, Indiegogo, enclosures, crowdfunding, Crono Labs, cases, C1 Computer Case
Crono Labs of Galway, Ireland is a startup that hopes to “declutter your desk” with their C1 Computer Case, a unique enclosure that allows you to mount a VESA compliant monitor to the case itself, creating your own all-in-one system.
The C1 is a slim micro-ATX enclosure with support for standard ATX power supplies and graphics cards up to 10.5”, and it sits on a stand that looks like that of a standard monitor.
Here’s a list of compatible components from Crono Labs:
- mATX or ITX motherboard
- ATX PSU
- Two 3.5″ drives
- Two 2.5″ drives
- GPUs up to 10.5″
- Low profile CPU coolers
- Four 120mm fans
- Water cooling: one 120mm and one 240mm cooler can be used at the same time (water coolers will not fit if an mATX motherboard is used)
The Indiegogo page is now up, and with a modest goal of $2000 they hope to create their initial prototypes before moving to the next phase of funding for production. It’s an interesting concept, and it looks like they have thought this design through with some nice touches:
- A short VGA, HDMI and branching power cable come with the case for reduced cable clutter. Less mess, less stress.
- Rotated motherboard points the IO ports downwards for tidier cables. The motherboard is also raised up into the case to allow cables to go beneath it.
- Carry handle makes transporting the case easy, from desk to desk or room to room.
- The case has a very small footprint, leaving you with a much more pleasing work area, for all that important stuff you do.
The idea of creating a portable all-in-one type system is appealing for the space-constrained or for LAN gaming, and the ability to use full-sized components would allow for a more powerful, and lower cost, build. What do you think of this design?
Subject: General Tech | September 1, 2015 - 04:24 PM | Scott Michaud
Tagged: unreal engine 4, unreal engine, ue4.9, ue4, epic games, dx12
Epic has been updating Unreal Engine 4 frequently since its release in late March 2014. Unreal Engine 4.9 is, as the number suggests, the tenth release (including 4.0) in just 17 months, which averages out to less than two months per release. Each release is fairly sizable, too. This one has about 232 pages of release notes, plus a page and a half of credits, and includes changes for basically every system that I can think of.
The two most interesting features, for me, are Area Shadows and Full Scene Particle Collision.
Area Shadows simulates lights that are physically big and relatively close. At the edges of a shadow, the object that casts the shadow blocks only part of the light. Wherever that shadow falls will be partially lit by the fraction of the light that can see it. The further the shadow falls from the caster, the larger that soft edge gets.
On paper, you can calculate this by drawing rays from either edge of each shadow-casting light to either edge of each shadow-casting object, continued to the objects that receive the shadows. If both sides of the light can see the receiver? No shadows. If both sides of the light cannot see the receiver? That light is blocked, which is a shadow. If some percent of a uniform light can see the receiver, then it will be shadowed by 100% minus that percentage. This is costly to do, unless neither the light nor any of the affected objects move. In that case, you can just store the result, which is how “static lighting” works.
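That percentage-visibility idea can be sketched in a few lines. This toy 2D version (the geometry and names are purely illustrative, not Epic's code) samples points across a line-segment light and counts how many can see the receiving point past a segment occluder:

```python
# Toy 2D area-shadow estimate: the shadow strength at a receiver is the
# fraction of samples across the light that the occluder blocks.

def segments_intersect(p1, p2, p3, p4):
    """True if segment p1-p2 strictly crosses segment p3-p4 (2D)."""
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    d1, d2 = cross(p3, p4, p1), cross(p3, p4, p2)
    d3, d4 = cross(p1, p2, p3), cross(p1, p2, p4)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def shadow_fraction(light_a, light_b, occ_a, occ_b, receiver, samples=64):
    """Fraction of the area light blocked from the receiver by the occluder."""
    blocked = 0
    for i in range(samples):
        t = i / (samples - 1)
        s = (light_a[0] + t*(light_b[0]-light_a[0]),
             light_a[1] + t*(light_b[1]-light_a[1]))
        if segments_intersect(s, receiver, occ_a, occ_b):
            blocked += 1
    return blocked / samples

# Light spans x = -1..1 at y = 10; occluder spans x = 0..5 at y = 5.
# A receiver under the occluder is fully shadowed; one far to the side
# is fully lit; a receiver near the edge gets a partial value.
print(shadow_fraction((-1, 10), (1, 10), (0, 5), (5, 5), (2, 0)))   # 1.0
print(shadow_fraction((-1, 10), (1, 10), (0, 5), (5, 5), (-9, 0)))  # 0.0
```

A real renderer obviously cannot afford dozens of visibility rays per receiver per light every frame, which is exactly why the result is stored ahead of time for static lighting.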
Another interesting feature is Full Scene Particle Collision with Distance Fields. GPU-simulated particles, which are required for extremely high particle counts, could already collide; distance fields allow them to collide with objects off screen. Since the user will likely be able to move the camera, this allows for longer simulations, as the user cannot cause the effect to glitch out by, well, playing the game. It requires SM 5.0 though, which limits it to higher-end GPUs.
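Conceptually, distance-field collision boils down to sampling a field and pushing penetrating particles back out along its gradient. Here is a hedged 2D sketch that uses a simple analytic circle field in place of the precomputed distance-field volumes a real engine would sample:

```python
# Sketch of distance-field collision. A particle whose sampled distance
# goes negative (inside a surface) is pushed back out along the field's
# gradient. The sphere field here is a stand-in for illustration only.

def sphere_sdf(p, center=(0.0, 0.0), radius=1.0):
    """Signed distance to a circle: negative means inside."""
    dx, dy = p[0] - center[0], p[1] - center[1]
    return (dx*dx + dy*dy) ** 0.5 - radius

def resolve(p, sdf, eps=1e-4):
    """If p penetrates the surface, push it out along the gradient."""
    d = sdf(p)
    if d >= 0:
        return p
    # Central-difference gradient of the field at p.
    gx = (sdf((p[0]+eps, p[1])) - sdf((p[0]-eps, p[1]))) / (2*eps)
    gy = (sdf((p[0], p[1]+eps)) - sdf((p[0], p[1]-eps))) / (2*eps)
    return (p[0] - d*gx, p[1] - d*gy)

print(resolve((0.5, 0.0), sphere_sdf))  # inside: pushed back to the surface
print(resolve((2.0, 0.0), sphere_sdf))  # outside: unchanged
```

Because the field exists everywhere in the volume, not just where the camera looks, the collision keeps working off screen, which is the whole point of the feature.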
This is also the first release to support DirectX 12. That said, when I used a preview build, I noticed a net performance loss with my 9,000-draw-call (which is a lot) map on my GeForce GTX 670. Epic calls it “experimental” for a reason, and I expect that a lot of work must be done to deliver tasks from an existing engine to the new, queue-based system. I will try it again just in case something changed from the preview builds. I mean, I know something did -- it had a different command line parameter before.
UPDATE (Sept 1st, 10pm ET): An interesting question was raised in the comments that we feel could be a good aside for the news post.
Anonymous asked: I don't have any experience with game engines. I am curious as to how much of a change there is for the game developer with the switch from DX11 to DX12. It seems like the engine would hide the underlying graphics APIs. If you are using one of these engines, do you actually have to work directly with DX, OpenGL, or whatever the game engine is based on? With moving to DX12 or Vulkan, how much is this going to change the actual game engine API?
Modern, cross-platform game engines are basically an API and a set of tools atop it.
For instance, I could want the current time in seconds to a very high precision. As an engine developer, I would make a function -- let's call it "GetTimeSeconds()". If the engine is running on Windows, this would likely be ((PerformanceCounter - Initial) / PerformanceFrequency), where PerformanceCounter is grabbed from QueryPerformanceCounter() and PerformanceFrequency is grabbed from QueryPerformanceFrequency(). If the engine is running on Web standards, this would be window.performance.now() / 1000, because it is provided in milliseconds.
Regardless of where GetTimeSeconds() pulls its data from, the engine's tools and the rest of its API would use GetTimeSeconds() -- unless the developer is low on performance or development time and made a block of platform-dependent junk in the middle of everything else.
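The same layering can be sketched in a few lines. The two "clocks" below are mock stand-ins for QueryPerformanceCounter and window.performance.now(), not real platform calls, but they show how unit differences stay hidden behind the one engine-facing function:

```python
# Toy version of the engine's platform-abstraction layer: two mock
# platform clocks with different units behind one get_time_seconds().

class WindowsClock:
    """Mimics QueryPerformanceCounter/Frequency: ticks plus a tick rate."""
    def __init__(self):
        self.frequency = 10_000_000   # ticks per second
        self.counter = 25_000_000     # current tick count

    def seconds(self):
        return self.counter / self.frequency

class WebClock:
    """Mimics window.performance.now(), which reports milliseconds."""
    def __init__(self):
        self.now_ms = 2500.0

    def seconds(self):
        return self.now_ms / 1000.0   # milliseconds -> seconds

def get_time_seconds(platform_clock):
    """Engine-facing API: the same answer no matter which backend runs."""
    return platform_clock.seconds()

print(get_time_seconds(WindowsClock()))  # 2.5
print(get_time_seconds(WebClock()))      # 2.5
```

The engine's tools and higher layers only ever call get_time_seconds(), so porting to a new platform means writing one new backend rather than touching every caller.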
The same is true for rendering. The engines should abstract all the graphics API stuff unless you need to do something specific. There is usually even a translation for the shader code, be it an intermediate language (or visual/flowchart representation) that's transpiled into HLSL and GLSL, or written in HLSL and transpiled into GLSL (eventually SPIR-V?).
One issue is that DX12 and Vulkan are very different from DX11 and OpenGL. Fundamentally. The latter say "here's the GPU, bind all the attributes you need and call draw" while the former say "make little command messages and put them in the appropriate pipe".
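A toy sketch of that difference (no real API names, just the shape of the two models):

```python
# "Immediate" vs. "explicit" submission in miniature. The immediate style
# mutates global state and draws right away; the explicit style records
# small command records and submits them as a batch.

# DX11/OpenGL style: bind state, then draw, one call at a time.
bound_state = {}
draw_log = []

def bind(slot, resource):
    bound_state[slot] = resource

def draw(mesh):
    # The draw consumes whatever happens to be bound right now.
    draw_log.append((mesh, dict(bound_state)))

bind("texture0", "bricks")
draw("wall")

# DX12/Vulkan style: record commands into a list, then submit as a unit.
def record_commands():
    cmds = []
    cmds.append(("bind", "texture0", "bricks"))
    cmds.append(("draw", "wall"))
    return cmds

def submit(queue, cmds):
    queue.extend(cmds)

gpu_queue = []
submit(gpu_queue, record_commands())
print(len(gpu_queue))  # 2 commands queued, nothing executed yet
```

In the explicit model nothing runs until the batch is submitted, which is what makes multi-threaded command recording possible, and also what makes retrofitting a DX11-era engine non-trivial.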
Now, people who license an engine like Unity or Unreal probably won't need to touch that stuff. They'll just make objects and place them in the level using the engine developer's tools, and occasionally call various parts of the engine API that they need.
Devs with a larger budget might want to dive in and tweak stuff themselves, though.
Unreal Engine 4.9 is now available. It is free to use until your revenue reaches the threshold of the royalty clauses.
Subject: General Tech | August 31, 2015 - 04:48 PM | Jeremy Hellstrom
Tagged: wireless router, idiots, dd-wrt
In the next installment of poorly planned moves by a US government agency attempting to solve a problem that does not exist, we shall see an attempt to make illegal the modification of the firmware on any device which contains a radio. This is likely to prevent you from using open source software to modify your wireless router into a death ray which will allow you to take over the planet.
Specifically, it will make illegal the modification of any device which can broadcast on the U-NII bands, which happen to include the 5GHz band that WiFi broadcasts on. While most firmware changes, such as dd-wrt, only modify what runs on the processor, these routers are built around SoCs, which means the radio is technically part of the same device you are modifying when applying custom firmware. Hack a Day has links to the FCC proposal; you might want to consider emailing your congress critters about it.
"Because of the economics of cheap routers, nearly every router is designed around a System on Chip – a CPU and radio in a single package. Banning the modification of one inevitably bans the modification of the other, and eliminates the possibility of installing proven Open Source firmware on any device."
Here is some more Tech News from around the web:
- Win10 Insider build 10532: Avoid if you run Chrome 64-bit @ The Register
- Nvidia GRID 2.0 doubles performance of its virtual GPU @ The Inquirer
- Dropbox DROPS BOX as service GOES TITSUP worldwide @ The Register
- Unearthed E.T. Atari Game Cartridges Score $108K At Auction @ Slashdot
Subject: Motherboards | August 27, 2015 - 03:41 PM | Sebastian Peak
Tagged: Z170i Gaming Pro AC, Z170, msi, motherboard, mini-itx, Intel Skylake
MSI has announced a new mini-ITX motherboard for Intel's latest chipset, the Z170I Gaming Pro AC.
Mini-ITX boards have been hard to come by for Skylake thus far, with very few models and limited availability in the first month (though not quite as elusive as the i7-6700K). With this new gaming-oriented board MSI offers another option, and it looks pretty impressive with 5-phase power delivery, 802.11ac wireless, an Intel onboard NIC, and M.2 support from a slot on the back of the PCB.
Pricing isn't immediately available, but the existing Mini-ITX Z170 motherboards (EVGA and ASRock each have one) have been selling for $199 so I'd expect something in that vicinity.
Subject: General Tech | August 28, 2015 - 04:40 PM | Jeremy Hellstrom
Tagged: google, chrome, flash, apple
The good news from Google is that, as of next month, Flash ads will be 'Click to Play' when you are browsing in Chrome. This will be nice for the moving ads, but even better for defeating those sick-minded advertisers who think audio ads are acceptable. However, this will hurt websites which depend on ad revenue ... as in all of the ones that are not behind a paywall and have Flash-based ads. The move will make your web browsing somewhat safer, as it will prevent the drive-by infections which Flash spreads like a plague-infested flea, and as long as advertisers switch to HTML5 their ads will play and revenue will continue to come in.
The news of Chrome's refusal to play Flash ads is tempered somewhat by Google's decision to put advertising ahead of security for Apple devices. The new iOS 9 uses HTTPS for all connectivity, providing security and making it more difficult for websites to gather personalized data, but as anyone who uses HTTPS Everywhere already knows, not all advertisements are compliant and they are often completely blocked from displaying. To ensure that advertisers can display on your iOS 9 device, Google has provided a tool to get around Apple's App Transport Security, thus rendering the protection HTTPS offers inoperative. Again, while sites do depend on advertisements to exist, sacrificing security to display those ads is hard to justify.
"The web giant has set September 1, 2015 as the date from which non-important Flash files will be click-to-play in the browser by default – effectively freezing out "many" Flash ads in the process."
Here is some more Tech News from around the web:
- BitTorrent kills bug that turns networks into a website-slaying weapon @ The Register
- Windows 10 download Build 10532 arrives but Chrome borkage continues @ The Inquirer
- Turning a typewriter into a mechanical keyboard @ Hack a Day
Introduction and First Impressions
The Enthoo Pro M is the new mid-tower version of the Enthoo Pro, previously a full-tower ATX enclosure from the PC cooler and enclosure maker. This new enclosure adds another option to the $79 case market, which already has a number of solid options. Let's see how it stacks up!
I was very impressed by the Phanteks Enthoo EVOLV ATX enclosure, which received our Editor’s Choice award when reviewed earlier this year. The enclosure was very solidly made and had a number of excellent features, and even with a primarily aluminum construction and premium design it can be found for $119, rather unheard-of for this combination in the enclosure market. So what changes from that design might we expect to see with the $79 Enthoo Pro M?
The Pro M is a very businesslike design, constructed of steel and plastic, and with a very understated appearance. Not exactly “boring”, as it does have some personality beyond the typical rectangular box, with a brushed finish to the front panel which also features a vented front fan opening, and a side panel window to show off your build. But I think the real story here is the intelligent internal design, which is nearly identical to that of the EVOLV ATX.
That is a lotta SKUs!
The slow, gradual release of information about Intel's Skylake-based product portfolio continues. We have already tested and benchmarked the flagship desktop Core i7-6700K processor and have a better understanding of the microarchitectural changes the new design brings. But today Intel's 6th Generation Core processors get a major reveal, with detailed specifications for all the mobile and desktop CPU variants from 4.5 watts up to 91 watts. Not only that, but it also marks the first day that vendors can announce and begin selling Skylake-based notebooks and systems!
All indications are that vendors like Dell, Lenovo and ASUS are still some weeks away from having any product available, but expect to see your feeds and favorite tech sites flooded with new product announcements. And of course with a new Apple event coming up soon...there should be Skylake in the new MacBooks this month.
Since I have already talked about the architecture and the performance changes from Haswell/Broadwell to Skylake in our 6700K story, today's release is just a bucket of specifications and information surrounding 46 different 6th Generation Skylake processors.
Intel's 6th Generation Core Processors
At Intel's Developer Forum in August, the media learned quite a bit about the new 6th Generation Core processor family including Intel's stance on how Skylake changes the mobile landscape.
Skylake is being broken up into four different lines of Intel processors: S-series for desktop DIY users, H-series for mobile gaming machines, U-series for your everyday Ultrabooks and all-in-ones, and Y-series for tablets and 2-in-1 detachables. (Side note: Intel does not reference an "Ultrabook" anymore. Huh.)
As you would expect, Intel has some impressive gains to claim with the new 6th Generation processor. However, it is important to put them in context. All of the claims above, including 2.5x performance, 30x graphics improvement and 3x longer battery life, are comparing Skylake-based products to CPUs from 5 years ago. Specifically, Intel is comparing the new Core i5-6200U (a 15 watt part) against the Core i5-520UM (an 18 watt part) from mid-2010.
Subject: Storage | September 1, 2015 - 08:00 AM | Allyn Malventano
Tagged: Seagate, hdd, Enterprise NAS, Enterprise Capacity 3.5, 8TB
Just when we were starting to get comfortable with the thought of 6TB hard drives, Seagate goes and announces their lineup of 8TB HDDs:
Now before you get too excited about throwing one of these into your desktop, realize that these models are meant for enterprise and larger NAS environments:
As you can see from the above chart, Seagate will be moving to 8TB maximum capacities on their 'Enterprise NAS' and 'Enterprise Capacity 3.5' models, which are meant for larger storage deployments.
Home and small business users opting to go with Seagate for their storage will remain limited to 4TB per drive for the time being.
For those curious about Kinetic, this is Seagate's push to connect arrays of drives via standard Ethernet, which would allow specialized storage applications to speak directly to the raw storage via standard network gear. Kinetic HDDs are currently limited to 4TB, with 8TB planned this coming January.
Seagate's full press blast appears after the break.
Subject: Graphics Cards, Processors | August 30, 2015 - 09:14 PM | Scott Michaud
Tagged: amd, carrizo, Fiji, opencl, opencl 2.0
Apart from manufacturers with a heavy first-party focus, such as Apple and Nintendo, hardware is useless without developer support. In this case, AMD has updated their App SDK to include support for OpenCL 2.0, with code samples. It also updates the SDK for Windows 10, Carrizo, and Fiji, but it is not entirely clear how.
That said, OpenCL is important to those two products. Fiji has a very high compute throughput compared to any other GPU at the moment, and its memory bandwidth is often even more important for GPGPU workloads. It is also useful for Carrizo, because parallel compute and HSA features are what make it a unique product. AMD has been creating first-party software and helping popular third-party developers such as Adobe, but a little support for the world at large could bring a killer application or two, especially from the open-source community.
The SDK has been available in pre-release form for quite some time now, but it has finally graduated out of beta. OpenCL 2.0 allows for work to be generated on the GPU itself, which is especially useful for tasks that depend on previous results, as they no longer need to contact the CPU again.
Subject: General Tech | August 26, 2015 - 01:52 PM | Jeremy Hellstrom
Tagged: gaming, The Witcher 3, VLAN party, fragging frogs
[H]ard|OCP has taken the guesswork out of GPU performance on the current version of The Witcher 3 in this round-up featuring 10 GPUs, five from each company. Of course, only NVIDIA supports lips occluded by PhysX-powered mustachios, but not everyone is obsessed with perfect hair. Indeed, when it takes a $1000 video card just to enable the lowest HairWorks options at 1440p without disabling every other feature, one wonders why HairWorks had gamers tied up in knots. Check out the full review for performance comparisons and even some HairWorks nitpicking.
This weekend also marks the 11th Fragging Frogs VLAN party, which kicks off on Saturday, August 29 at 10:00 AM ET and will go until the last frog has been fragged. Sign up in this thread if you haven't already, and if you are new to the Fragging Frogs, follow the links to the FAQ threads for information on which patches or mods you will need to apply to your games to get playing as soon as possible.
"We take The Witcher 3: Wild Hunt, using the 1.08.2 patch and latest drivers, find the highest playable settings and examine apples-to-apples performance with and without GameWorks across 10 video cards. We put a focus on NVIDIA HairWorks and how it impacts performance and find out which video cards provide the best gaming value."
Here is some more Tech News from around the web:
- 14-Way AMD vs. NVIDIA Linux Gaming Performance For DiRT Showdown @ Phoronix
- Super Useful Skyrim Script Extender Now On Steam @ Rock, Paper, SHOTGUN
- The RPG Scrollbars: The Long Night Of Vampire: The Masquerade: Bloodlines (With Clan Quests) @ Rock, Paper, SHOTGUN
- Customers start to receive Nvidia SHIELD tablet replacements @ HEXUS
- Warm Up The Cerebral Bore: Turok 1&2 Being Revamped @ Rock, Paper, SHOTGUN
- YouTube Gaming site and app to launch later today @ HEXUS
- Windows 10 Won’t Run Games Using SafeDisc Or Securom DRM @ Rock, Paper, SHOTGUN
- Do Corpses Make Darkest Dungeon Too Difficult? @ Rock, Paper, SHOTGUN
Introduction and Technical Specifications
Courtesy of ASUS
The Z170-A motherboard is among the initial offerings in ASUS' channel line of Intel Z170 chipset boards. It features the new Channel line aesthetics, with white and black coloration to differentiate it from ASUS' gold-themed Z97 offerings. ASUS uses the Z170-A to redefine what a baseline motherboard can be, integrating many upper-tier features not normally found on lower-tier offerings. The board's Intel Z170 chipset supports the latest Intel LGA1151 Skylake processor line as well as dual-channel DDR4 memory. Offered at a price-competitive MSRP of $165, the Z170-A threatens to give the rest of the Z170-based boards a run for their money.
Courtesy of ASUS
The Z170-A shares the same DIGI+ style power system as its higher-priced siblings, featuring an 8-phase digital power delivery system. ASUS integrated the following features into the Z170-A board: four SATA 3 ports; one SATA Express port; one M.2 PCIe x4-capable port; an Intel I219-V Gigabit NIC; three PCI-Express x16 slots; two PCI-Express x1 slots; one PCI slot; on-board power and MemOK! buttons; EZ XMP and TPU switches; the Crystal Sound 3 audio subsystem; integrated DisplayPort, HDMI, DVI, and VGA video ports; and USB 2.0, 3.0, and 3.1 Type-A and Type-C port support.
Courtesy of ASUS
The Z170-A motherboard comes standard with ASUS' latest iteration of their sound technology, dubbed Crystal Sound 3. Like its predecessors, Crystal Sound 3 places the audio components on a PCB section isolated from the other main board components, minimizing noise generated by those other integrated devices. ASUS designed the audio subsystem with high-quality Japanese-sourced audio and power circuitry for a top-notch audio experience.
Subject: General Tech | September 1, 2015 - 02:19 PM | Jeremy Hellstrom
Tagged: Lenovo, Thinkpad E Series, Realsense 3D, windows 10
The new 14" and 15.6" Lenovo ThinkPad E Series laptops were revealed recently, and The Inquirer got a sneak peek at them. They offer a choice of Intel and AMD models, somewhat good news for the much-beleaguered processor company, along with up to 16GB of RAM and an SSD. The most interesting upgrade is the Intel RealSense 3D camera on some models, which you may remember Ryan testing on the Dell Venue 8; it should make conference calls more interesting as well as letting you measure your room. Lenovo also announced updated M, B, and E series laptops as well as S series desktops; read more about it at The Inquirer.
"The E Series laptops come with a host of features "ideal for business users", Lenovo said, including fingerprint scanning security and up to nine hours of battery life."
Here is some more Tech News from around the web:
- Muted HAMR blow from Seagate: 4TB whizzbang drive coming 2016 @ The Register
- Hands on with Windows Server 2016 Containers @ The Register
- Better crypto, white-box switch support in Linux 4.2 @ The Register
- Tricks For Using Desktop-Integrated Calendars @ Linux.com
- Windows 10 is the world's fourth biggest OS after a month @ The Inquirer
- Worldwide server shipments grew 8% in 2Q15, while revenue increased 7.2%, says Gartner @ DigiTimes
- Amkov AMK5000S Sports Action Camera @ Kitguru
- EnGenius ENS1750 Outdoor Access Point @ Benchmark Reviews
Subject: Cases and Cooling | August 26, 2015 - 01:06 PM | Jeremy Hellstrom
Tagged: SFF, micro-atx, mini-itx, SG12, Silverstone
The SilverStone SG12 is an SFF case that dreams big: built for Mini-ITX through Micro-ATX motherboards, it is still large enough to fit a GPU over a foot long. Overall it measures 266x210x407mm (10.5x8.3x16"), small enough to fit in a living room or cart around with you thanks to the built-in handle, but large enough to hold high-end components. Bjorn3D installed an i7-4790K on an ASUS Z97M-PLUS with a GTX 970, powered by a SilverStone SST-ST55F-G PSU, which is about 40mm shorter than the majority of PSUs. For a cooler they used a SilverStone model whose 140x82x139mm dimensions come close to the maximum size you can fit into the case. Check out their full review here.
"Here at Bjorn3D we are no strangers to the SilverStone brand. They have been creating awesome cases, power supplies, coolers and more since 2003, and we have been fortunate enough to take a look at many of their offerings over the years. Early on in their history, they created the Sugo series of cases, a line which caters to those that wish to build a small form factor PC."
Here are some more Cases & Cooling reviews from around the web:
- Silverstone Sugo SG12 Case Review @ Hardware Asylum
- SilverStone Sugo SG12 @ Benchmark Reviews
- Element Gaming Hyperian Micro-ATX Chassis @ eTeknix
- Cougar QBX Mini-ITX Gaming Chassis @ eTeknix
- Phanteks Enthoo Evolv ITX SE @ Modders-Inc
- Cooler Master MasterCase 5 @ techPowerUp
- Rosewill WolfAlloy Case Review @ Hardware Asylum
- Cooler Master MasterCase 5 & Pro 5 @ Kitguru
- MAINGEAR Shift @ Modders-Inc
- Thermaltake Suppressor F51 Midi Tower Review @ NikKTech
- Alphacool Custom 480mm Watercooling Kit Review @ NikKTech
- Enermax Liqmax II 240mm AIO CPU Cooler @ eTeknix
- Optimized CPU Cooling with Top-Down Heatsinks @ Benchmark Reviews
- be quiet! Shadow Rock LP @ techPowerUp
- Deepcool Assassin II Review @ OCC