Manufacturer: AMD

The Tiniest Fiji

Way back on June 16th, AMD held a live stream event during E3 to announce a host of new products. In that group were the AMD Radeon R9 Fury X, R9 Fury and the R9 Nano. Of the three, the Nano was the most intriguing to the online press, as it was the one we knew the least about. AMD promised a full Fiji GPU in a package with a 6-inch PCB and a 175 watt TDP. Well today, AMD is, uh, re-announcing (??) the AMD Radeon R9 Nano with more details on specifications, performance and availability.

r9nano-2.jpg

First, let’s get this out of the way: AMD is making this announcement today because they publicly promised the R9 Nano for August. And with the final days of summer creeping up on them, rather than answer questions about another delay, AMD is instead going the route of a paper launch, but one with a known end date. We will apparently get our samples of the hardware in early September with reviews and the on-sale date following shortly thereafter. (Update: AMD claims the R9 Nano will be on store shelves on September 10th and should have "critical mass" of availability.)

Now let’s get to the details that you are really here for. And rather than start with the marketing spin on the specifications that AMD presented to the media, let’s dive into the gory details right now.

                   | R9 Nano        | R9 Fury        | R9 Fury X      | GTX 980 Ti  | TITAN X     | GTX 980     | R9 290X
GPU                | Fiji XT        | Fiji Pro       | Fiji XT        | GM200       | GM200       | GM204       | Hawaii XT
GPU Cores          | 4096           | 3584           | 4096           | 2816        | 3072        | 2048        | 2816
Rated Clock        | 1000 MHz       | 1000 MHz       | 1050 MHz       | 1000 MHz    | 1000 MHz    | 1126 MHz    | 1000 MHz
Texture Units      | 256            | 224            | 256            | 176         | 192         | 128         | 176
ROP Units          | 64             | 64             | 64             | 96          | 96          | 64          | 64
Memory             | 4GB            | 4GB            | 4GB            | 6GB         | 12GB        | 4GB         | 4GB
Memory Clock       | 500 MHz        | 500 MHz        | 500 MHz        | 7000 MHz    | 7000 MHz    | 7000 MHz    | 5000 MHz
Memory Interface   | 4096-bit (HBM) | 4096-bit (HBM) | 4096-bit (HBM) | 384-bit     | 384-bit     | 256-bit     | 512-bit
Memory Bandwidth   | 512 GB/s       | 512 GB/s       | 512 GB/s       | 336 GB/s    | 336 GB/s    | 224 GB/s    | 320 GB/s
TDP                | 175 watts      | 275 watts      | 275 watts      | 250 watts   | 250 watts   | 165 watts   | 290 watts
Peak Compute       | 8.19 TFLOPS    | 7.20 TFLOPS    | 8.60 TFLOPS    | 5.63 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 5.63 TFLOPS
Transistor Count   | 8.9B           | 8.9B           | 8.9B           | 8.0B        | 8.0B        | 5.2B        | 6.2B
Process Tech       | 28nm           | 28nm           | 28nm           | 28nm        | 28nm        | 28nm        | 28nm
MSRP (current)     | $649           | $549           | $649           | $649        | $999        | $499        | $329

AMD wasn’t fooling around: the Radeon R9 Nano graphics card does indeed include a full implementation of the Fiji GPU and HBM, including 4096 stream processors, 256 texture units and 64 ROPs. The GPU core clock is rated “up to” 1.0 GHz, nearly the same as the Fury X (1050 MHz), and the only difference that I can see in the specifications on paper is that the Nano is rated at 8.19 TFLOPS of theoretical compute performance while the Fury X is rated at 8.60 TFLOPS.

Continue reading our preview of the AMD Radeon R9 Nano graphics card!!

Manufacturer: PC Perspective

To the Max?

Much of the PC enthusiast internet, including our comments section, has been abuzz with “Asynchronous Shader” discussion. Normally, I would explain what it is and then outline the issues that surround it, but I would like to swap that order this time. Basically, the Ashes of the Singularity benchmark utilizes Asynchronous Shaders in DirectX 12, but they disable it (by Vendor ID) for NVIDIA hardware. They say that this is because, while the driver reports compatibility, “attempting to use it was an unmitigated disaster in terms of performance and conformance”.

epic-2015-ue4-dx12.jpg

AMD's Robert Hallock claims that NVIDIA GPUs, including Maxwell, cannot support the feature in hardware at all, while all AMD GCN graphics cards do. NVIDIA has yet to respond to our requests for an official statement, although we haven't poked every one of our contacts yet. We will certainly update and/or follow up if we hear from them. For now though, we have no idea whether this is a hardware or software issue. Either way, it seems to be more than just politics.

So what is it?

Simply put, Asynchronous Shaders allow a graphics driver to cram extra workloads into portions of the GPU that would otherwise sit idle. For instance, if a graphics task is hammering the ROPs, the driver would be able to toss an independent physics or post-processing task into the shader units alongside it. Kollock from Oxide Games used the analogy of Hyper-Threading, which allows two CPU threads to be executed on the same core at the same time, as long as the core has the capacity for them.
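To make the idea concrete, here is a minimal sketch (my own, not Oxide's code) of how DirectX 12 exposes it: alongside the normal graphics queue, an application can create a separate compute queue and feed it independent work, and the hardware and driver decide whether the two actually overlap. Error handling is omitted and the device is assumed to already exist.

    #include <d3d12.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    // Create one "direct" (graphics) queue and one compute queue on the same device.
    // Work submitted to the compute queue can, on capable hardware, execute
    // concurrently with graphics work instead of waiting in line behind it.
    void CreateQueues(ID3D12Device* device,
                      ComPtr<ID3D12CommandQueue>& graphicsQueue,
                      ComPtr<ID3D12CommandQueue>& computeQueue)
    {
        D3D12_COMMAND_QUEUE_DESC desc = {};
        desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;    // graphics + compute + copy
        device->CreateCommandQueue(&desc, IID_PPV_ARGS(&graphicsQueue));

        desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;   // compute + copy only
        device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

        // Command lists recorded for each queue are submitted independently;
        // any ordering between the two queues must be expressed explicitly
        // with ID3D12Fence objects.
    }

Whether that second queue actually buys you anything is exactly the hardware question being argued about above.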

Kollock also notes that compute is becoming more important in the graphics pipeline, to the point where it is possible to bypass the fixed-function graphics hardware altogether. Those fixed-function bits may never go away, but it's possible that at least some engines will eventually skip them entirely -- maybe even their engine, several years down the road.

I wonder who would pursue something so silly, whether for a product or even just research.

But, as always, you will not get an infinite amount of performance by reducing waste. You are always bound by the theoretical limits of your components, and you cannot optimize past that (except, obviously, by changing the workload itself). The interesting part is that you can measure it. You can absolutely observe how long a GPU is idle, and represent it as a percentage of a time-span (typically a frame).

And, of course, game developers profile GPUs from time to time...

According to Kollock, he has heard of some console developers getting up to 30% increases in performance using Asynchronous Shaders. Again, this is on console hardware, so that amount may increase or decrease on the PC. In an informal chat with a developer at Epic Games (so a massive grain of salt is required), his late-night, ballpark, “totally speculative” guesstimate was that, on the Xbox One, the GPU could theoretically accept a maximum of ~10-25% more work in Unreal Engine 4, depending on the scene. He also said that memory bandwidth gets in the way, which Asynchronous Shaders would be fighting against. It is something that they are interested in and investigating, though.

AMD-2015-MantleAPI-slide1.png

This is where I speculate on drivers. When Mantle was announced, I looked at its features and said “wow, this is everything that a high-end game developer wants, and a graphics developer absolutely does not”. From the OpenCL-like multiple GPU model taking much of the QA out of SLI and CrossFire, to the memory and resource binding management, this should make graphics drivers so much easier.

It might not be free, though. Graphics drivers might still have a bunch of games to play to make sure that work is stuffed through the GPU as tightly packed as possible. We might continue to see “Game Ready” drivers in the coming years, even though much of that burden has been shifted to the game developers. On the other hand, maybe these APIs will level the whole playing field and let all players focus on chip design and efficient ingestion of shader code. As always, painfully always, time will tell.

NVIDIA Releases 355.82 WHQL Drivers

Subject: Graphics Cards | August 31, 2015 - 07:19 PM |
Tagged: nvidia, graphics drivers, geforce, drivers

Unlike last week's 355.80 Hotfix, today's driver is fully certified by both NVIDIA and Microsoft (WHQL). According to users on GeForce Forums, this driver includes the hotfix changes, although I am still seeing a few users complain about memory issues under SLI. The general consensus seems to be that a number of bugs were fixed, and that driver quality is steadily increasing. This is also a “Game Ready” driver for Mad Max and Metal Gear Solid V: The Phantom Pain.

nvidia-2015-drivers-35582.png

NVIDIA's GeForce Game Ready 355.82 WHQL Mad Max and Metal Gear Solid V: The Phantom Pain drivers (inhale, exhale, inhale) are now available for download at their website. Note that Windows 10 drivers are separate from Windows 7 and Windows 8.x ones, so be sure to not take shortcuts when filling out the “select your driver” form. That, or just use GeForce Experience.

Source: NVIDIA

Dell 27-inch S2716DG Gaming Monitor Announced with NVIDIA G-Sync

Subject: Displays | August 28, 2015 - 10:02 AM |
Tagged: wqhd, TN, S2716DG, gaming monitor, G-Sync Gen II, g-sync, dell, 27-inch, 2560x1440

Dell announced a new 27-inch WQHD gaming monitor yesterday, and while the 2560x1440 resolution and TN panel are nothing new, the real story appeared to be the inclusion of "NVIDIA G-Sync Gen II", which, as the updates below explain, turned out to be a typo in the release.

dell-monitor-27-S2716DG.jpg

Dell provides these details about the S2716DG monitor:

  • Nvidia’s G-Sync Gen II support feature synchronizes GPU and monitor to minimize graphic distortions and screen tearing
  • Quad HD resolution of 2560 x 1440 with close to 2 times more onscreen details than Full HD
  • A full range of adjustability features, like tilt, pivot, swivel and height-adjustable stand allow for long hours of comfortable gameplay
  • A wide range of connectivity features, including DisplayPort 1.2, HDMI 1.4, four USB 3.0 ports, USB 3.0 upstream, Audio line-out & Headphone-out
  • 144 Hz maximum refresh rate and 1ms response time

Pricing is listed as $799 and the S2716DG will be available October 20.

UPDATE: Looking at the Dell announcement page, the company links to a Quadro PDF using a technology called G-Sync II. The problem is that technology was released in 2011 and served a very different purpose than the G-Sync we use for gaming monitors today. We always knew that re-using that name would haunt NVIDIA in some ways...this is one of them. So, that means that Dell's reference to a second generation of G-Sync here is either simply a typo, or the naming scheme is correct but the writer of the press release linked to something unrelated.

It is possible that a new version of the G-Sync module is on its way with updated features and possibly support over other display outputs, but I haven't heard anything official as of yet. I'll keep digging!

UPDATE 2: Just confirmed with Dell, this was a typo! The S2716DG "was incorrectly listed as 'G-Sync Gen II' and the accurate name of the technology is NVIDIA® G-SYNC™." There you have it. False alarm!

-Ryan

Source: Dell

NVIDIA 355.80 Hotfix for Windows 10 SLI Memory Issues

Subject: Graphics Cards | August 27, 2015 - 05:23 PM |
Tagged: windows 10, nvidia, geforce, drivers, graphics drivers

While GeForce Hotfix driver 355.80 is not certified, or even beta, I know that a lot of our readers have issues with SLI in Windows 10. Especially in games like Battlefield 4, memory usage would expand until, apparently, a crash occurred. Since I run a single GPU, I have not experienced this issue and so I cannot comment on what happens. I just know that it was very common in the GeForce forums and in our comment section, so it was probably a big problem for many users.

nvidia-geforce.png

If you are not experiencing this problem, then you probably should not install this driver. This is a hotfix that, as stated above, was released outside of NVIDIA's typical update process. You might experience new, unknown issues. Affected users, on the other hand, have the choice to install the fix now, which could very well be stable, or wait for a certified release later.

You can pick it up from NVIDIA's support site.

Source: NVIDIA
Subject: Processors
Manufacturer: Intel

That is a lotta SKUs!

The slow, gradual release of information about Intel's Skylake-based product portfolio continues forward. We have already tested and benchmarked the desktop variant flagship Core i7-6700K processor and also have a better understanding of the microarchitectural changes the new design brings forth. But today Intel's 6th Generation Core processors get a major reveal, with all of the mobile and desktop CPU variants, from 4.5 watts up to 91 watts, getting detailed specifications. Not only that, but it also marks the first day that vendors can announce and begin selling Skylake-based notebooks and systems!

All indications are that vendors like Dell, Lenovo and ASUS are still some weeks away from having any product available, but expect to see your feeds and favorite tech sites flooded with new product announcements. And of course with a new Apple event coming up soon...there should be Skylake in the new MacBooks this month.

Since I have already talked about the architecture and the performance changes from Haswell/Broadwell to Skylake in our 6700K story, today's release is just a bucket of specifications and information surrounding 46 different 6th Generation Skylake processors.

Intel's 6th Generation Core Processors

intel6th-6.jpg

At Intel's Developer Forum in August, the media learned quite a bit about the new 6th Generation Core processor family including Intel's stance on how Skylake changes the mobile landscape.

intel6th-7.jpg

Skylake is being broken up into 4 different lines of Intel processors: S-series for desktop DIY users, H-series for mobile gaming machines, U-series for your everyday Ultrabooks and all-in-ones, and Y-series for tablets and 2-in-1 detachables. (Side note: Intel does not reference an "Ultrabook" anymore. Huh.)

intel6th-8.jpg

As you would expect, Intel has some impressive gains to claim with the new 6th Generation processor. However, it is important to put them in context. All of the claims above, including 2.5x performance, 30x graphics improvement and 3x longer battery life, are comparing Skylake-based products to CPUs from 5 years ago. Specifically, Intel is comparing the new Core i5-6200U (a 15 watt part) against the Core i5-520UM (an 18 watt part) from mid-2010.

Continue reading our overview of the 46 new Intel Skylake 6th Generation Core processors!!

Crono Labs C1 Computer Case Hits Indiegogo: DIY AIO

Subject: Cases and Cooling | August 31, 2015 - 05:25 PM |
Tagged: matx case, Indiegogo, enclosures, crowdfunding, Crono Labs, cases, C1 Computer Case

Crono Labs of Galway, Ireland is a startup that hopes to “declutter your desk” with their C1 Computer Case, a unique enclosure that allows you to mount a VESA compliant monitor to the case itself, creating your own all-in-one system.

C1_00.jpg

The C1 is a slim micro-ATX enclosure with support for standard ATX power supplies and graphics cards up to 10.5”, and it sits on a stand that looks like that of a standard monitor.

Here’s a list of compatible components from Crono Labs:

  • mATX or ITX motherboard
  • ATX PSU
  • Two 3.5″ drives
  • Two 2.5″ drives
  • GPUs up to 10.5″
  • Low profile CPU coolers
  • Four 120mm fans
  • Water Cooling: 1x 120mm cooler and 1x 240mm cooler can be used at the same time (water coolers will not fit if an mATX motherboard is used)

C1_01.PNG

The Indiegogo page is now up, and with a modest goal of $2000 they hope to create their initial prototypes before moving to the next phase of funding for production. It’s an interesting concept, and it looks like they have thought this design through with some nice touches:

  • A short VGA, HDMI and branching power cable come with the case for reduced cable clutter. Less mess, less stress.
  • Rotated motherboard points the IO ports downwards for tidier cables. The motherboard is also raised up into the case to allow cables to go beneath it.
  • Carry handle makes transporting the case easy, from desk to desk or room to room.
  • The case has a very small footprint, leaving you with a much more pleasing work area, for all that important stuff you do.

The idea of creating a portable all-in-one type system is appealing for the space-constrained or for LAN gaming, and the ability to use full-sized components would allow for a more powerful, and lower cost, build. What do you think of this design?

Source: Indiegogo

What, no ethernet?!? ASUS trims the ZenBook UX305

Subject: Mobile | August 27, 2015 - 03:31 PM |
Tagged: asus, ZenBook UX305

That is correct, the 12mm thick ZenBook UX305 from ASUS does not have a LAN port; it is wireless or nothing for this ultrabook.  It does have three USB 3.0 ports, a micro HDMI port, a 3.5mm audio jack and an SD card reader, so you will be able to use some wired peripherals with this ultramobile device.  At a mere 1.2 kg the machine is very light, and with a Core M-5Y10 that can clock from 800MHz up to 2GHz with Turbo Boost, it will have the performance when you need it and be gentle on your battery when you do not.  KitGuru has posted a review of the UX305 here.

angle1.jpg

"The ZenBook UX305 is the latest Ultrabook offering from Asus. When I last reviewed one of their products – the hybrid T300 Chi – it greatly impressed me. The UX305 is a similar device, with a Core M processor, 8GB RAM and another SanDisk M.2 SSD. This time, however, it is a conventional laptop, and is priced at £649.95."

Here are some more Mobile articles from around the web:

Mobile

Source: KitGuru

Epic Games Releases Unreal Engine 4.9

Subject: General Tech | September 1, 2015 - 04:24 PM |
Tagged: unreal engine 4, unreal engine, ue4.9, ue4, epic games, dx12

For an engine that was first released in late March 2014, Epic has been updating Unreal Engine 4 frequently. Unreal Engine 4.9 is, as the number suggests, the tenth release (including 4.0) in just 17 months, which works out to less than two months per release on average. Each release is fairly sizable, too. This one has about 232 pages of release notes, plus a page and a half of credits, and includes changes for basically every system that I can think of.

The two most interesting features, for me, are Area Shadows and Full Scene Particle Collision.

Area Shadows simulates lights that are physically big and relatively close. At the edges of a shadow, the object that casts the shadow is blocking only part of the light. Wherever that shadow falls will be partially lit by the fraction of the light that can still see it, and as the shadow gets further from the shadow caster, that soft edge gets larger.

pcper-2015-softshadows.png

On paper, you can calculate this by drawing rays from either edge of each shadow-casting light to either edge of each shadow-casting object, continued on to the objects that receive the shadows. If both sides of the light can see the receiver? No shadow. If neither side of the light can see the receiver? The light is fully blocked, which is a full shadow. If some percentage of a uniform light can see the receiver, then it will be shadowed by 100% minus that percentage. This is costly to do unless neither the light nor any of the affected objects move; in that case, you can just store the result, which is how “static lighting” works.
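As a purely illustrative sketch of that percentage idea (not Epic's algorithm), you can approximate the same thing by sampling points across the area light and counting how many of them can see the receiving point; isOccluded() below is a hypothetical stand-in for whatever visibility test an engine actually uses.

    struct Vec3 { float x, y, z; };

    // Hypothetical visibility test: true if geometry blocks the segment
    // from a sample point on the light to the receiving point.
    bool isOccluded(const Vec3& lightSample, const Vec3& receiver);

    // Returns the shadow amount in [0,1]: 0 = fully lit, 1 = fully in shadow.
    float AreaShadowFactor(const Vec3* lightSamples, int sampleCount, const Vec3& receiver)
    {
        int blocked = 0;
        for (int i = 0; i < sampleCount; ++i)
            if (isOccluded(lightSamples[i], receiver))
                ++blocked;
        // "100% minus the percentage of the light that can see the receiver."
        return static_cast<float>(blocked) / static_cast<float>(sampleCount);
    }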

Another interesting feature is Full Scene Particle Collision with Distance Fields. GPU-computed particles, which are required for extremely high particle counts, can already collide with the scene, but distance fields allow them to collide with objects that are off screen. Since the user will likely be able to move the camera, this allows for longer simulations, because the user cannot cause the effect to glitch out by, well, playing the game. It requires SM 5.0 though, which limits it to higher-end GPUs.
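For a rough idea of how distance-field collision works (again, a sketch under my own assumptions, not Epic's code): each particle samples the global signed distance field at its position, and if it is closer to a surface than its radius it gets pushed back out along the field's gradient, with the inward part of its velocity reflected.

    struct Vec3 { float x, y, z; };

    // Hypothetical engine hooks: signed distance to the nearest surface and
    // the (normalized) direction pointing away from it.
    float SampleDistanceField(const Vec3& p);
    Vec3  SampleDistanceFieldGradient(const Vec3& p);

    void CollideParticle(Vec3& position, Vec3& velocity, float radius, float restitution)
    {
        float d = SampleDistanceField(position);
        if (d < radius) {                              // particle is touching or inside a surface
            Vec3 n = SampleDistanceFieldGradient(position);
            float push = radius - d;                   // move it back onto the surface
            position.x += n.x * push;
            position.y += n.y * push;
            position.z += n.z * push;

            float vn = velocity.x * n.x + velocity.y * n.y + velocity.z * n.z;
            if (vn < 0.0f) {                           // heading into the surface
                velocity.x -= (1.0f + restitution) * vn * n.x;
                velocity.y -= (1.0f + restitution) * vn * n.y;
                velocity.z -= (1.0f + restitution) * vn * n.z;
            }
        }
    }

Because the field covers the whole scene rather than just what the camera can see, particles keep colliding correctly even when their obstacles are off screen.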

epic-2015-ue4-dx12.jpg

This is also the first release to support DirectX 12. That said, when I used a preview build, I noticed a net performance loss with my 9,000 draw call (which is a lot) map on my GeForce GTX 670. Epic calls it “experimental” for a reason, and I expect that a lot of work must be done to deliver tasks from an existing engine to the new, queue-based system. I will try it again just in case something changed from the preview builds. I mean, I know something did -- it had a different command line parameter before.

UPDATE (Sept 1st, 10pm ET): An interesting question was raised in the comments that we feel could be a good aside for the news post.

Anonymous asked: I don't have any experience with game engines. I am curious as to how much of a change there is for the game developer with the switch from DX11 to DX12. It seems like the engine would hide the underlying graphics APIs. If you are using one of these engines, do you actually have to work directly with DX, OpenGL, or whatever the game engine is based on? With moving to DX12 or Vulcan, how much is this going to change the actual game engine API?

Modern, cross-platform game engines are basically an API and a set of tools atop it.

For instance, I could want the current time in seconds to a very high precision. As an engine developer, I would make a function -- let's call it "GetTimeSeconds()". If the engine is running on Windows, this would likely be ((PerformanceCounter - Initial) / PerformanceFrequency) where PerformanceCounter is grabbed from QueryPerformanceCounter() and PerformanceFrequency is grabbed from QueryPerformanceFrequency(). If the engine is running on Web standards, this would be window.performance.now() / 1000, because it is provided in milliseconds.
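A minimal sketch of that abstraction might look like the following (the name and structure are just illustrative, not Unreal's actual code): one engine-facing function, with the platform detail hidden behind a preprocessor check.

    #if defined(_WIN32)
    #include <windows.h>

    // Windows: convert QueryPerformanceCounter ticks into seconds.
    double GetTimeSeconds()
    {
        static LARGE_INTEGER frequency = [] { LARGE_INTEGER f; QueryPerformanceFrequency(&f); return f; }();
        static LARGE_INTEGER initial   = [] { LARGE_INTEGER c; QueryPerformanceCounter(&c);   return c; }();
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        return double(now.QuadPart - initial.QuadPart) / double(frequency.QuadPart);
    }
    #else
    #include <chrono>

    // Everywhere else: lean on the standard library's monotonic clock.
    double GetTimeSeconds()
    {
        using Clock = std::chrono::steady_clock;
        static const Clock::time_point initial = Clock::now();
        return std::chrono::duration<double>(Clock::now() - initial).count();
    }
    #endif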

Regardless of where GetTimeSeconds() pulls its data from, the engine's tools and the rest of its API would use GetTimeSeconds() -- unless the developer is low on performance or development time and made a block of platform-dependent junk in the middle of everything else.

The same is true for rendering. The engines should abstract all the graphics API stuff unless you need to do something specific. There is usually even a translation for the shader code, be it an intermediate language (or visual/flowchart representation) that's transpiled into HLSL and GLSL, or written in HLSL and transpiled into GLSL (eventually SPIR-V?).

One issue is that DX12 and Vulkan are very different from DX11 and OpenGL. Fundamentally. The latter say "here's the GPU, bind all the attributes you need and call draw" while the former say "make little command messages and put them in the appropriate pipe".
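To illustrate the "command message" half of that (a hypothetical fragment, assuming the allocator, pipeline state and queue already exist): in DirectX 12 the draw is recorded into a command list, sealed, and only then handed to a queue, rather than being issued directly against one implicit device context.

    #include <d3d12.h>

    void RecordAndSubmit(ID3D12GraphicsCommandList* cmdList,
                         ID3D12CommandAllocator* allocator,
                         ID3D12PipelineState* pipelineState,
                         ID3D12CommandQueue* queue)
    {
        cmdList->Reset(allocator, pipelineState);     // start a fresh recording
        // ... set the root signature, viewport, and vertex buffers here ...
        cmdList->DrawInstanced(3, 1, 0, 0);           // recorded, not executed yet
        cmdList->Close();                             // seal the "message"

        ID3D12CommandList* lists[] = { cmdList };
        queue->ExecuteCommandLists(1, lists);         // drop it into the pipe
    }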

Now, for people who license an engine like Unity and Unreal, they probably won't need to touch that stuff. They'll just make objects and place them in the level using the engine developer's tools, and occasionally call various parts of the engine API that they need.

Devs with a larger budget might want to dive in and tweak stuff themselves, though.

Unreal Engine 4.9 is now available. It is free to use until your revenue reaches the point where the royalty clauses kick in.

Source: Epic Games

IFA 2015: ASUS Reveals RT-AC5300U Router: 8 Antenna Beast

Subject: Networking | September 2, 2015 - 07:00 AM |
Tagged: RT-AC5300U, router, mu-mimo, IFA 2015, dual band, asus, 802.11ac

This is a seriously imposing-looking router, and the specs are just as huge.

ASUS_RTAC5300.png

Here are some highlights from ASUS:

  • AC5300 speeds
  • Tri-band wireless up to 1000 Mbit/s on 2.4 GHz and up to 2167 Mbit/s on each 5 GHz band
  • Up to 5333 Mbit/s combined on the 5GHz band
  • NitroQAM technology for low-latency gaming and 4K/UHD streaming
  • Eight external antennas in a 4x4 config
  • Ultra-wide area coverage
  • Award-winning ASUS AiProtection Network Security Services

5333 Mbps on the 5 GHz band alone? So how does the RT-AC5300U router provide so much bandwidth? It’s powered by a staggering array of radios! Looking at the chipset specs, we see that it's made up of a BCM4709 + BCM4366 (2.4 GHz) + 2x BCM4366 (5 GHz), with 256MB of DDR3 memory and 128MB of flash. And we can’t forget the 8 external dual-band antennas! Yes, eight. Truly, this is a beast (though it looks like an overturned spider).

Pricing and exact availability were not revealed, but ASUS says it will be coming in Q4 2015.

Source: ASUS

EK Jumps Into AIO Water Cooling With New EK-Predator Coolers

Subject: General Tech, Cases and Cooling | August 27, 2015 - 12:17 AM |
Tagged: water cooling, liquid cooling, Intel, ek, AIO

EK (EK Water Blocks) is pouncing on the AIO liquid cooling market with its new EK-Predator series. The new cooler series combines the company's enthusiast parts into pre-filled and pre-assembled loops ready to cool Intel CPUs (AMD socket support is slated for next year). Specifically, EK is offering up the EK-Predator 240 and EK-Predator 360 which are coolers with a 240mm radiator and a 360mm radiator respectively.

EK-Predator 240 AIO Water Cooler.jpg

The new coolers use copper radiators and EK Supremacy MX CPU blocks, the latter of which has a polished copper base, so there is no risk associated with using mixed metals in the loop. A 6W DDC pump drives the loop, with the pump and a small reservoir attached to one side of the radiator (allegedly using a vibration-dampening mounting system). EK ZMT (Zero Maintenance Tubing) 10/16mm tubing connects the CPU block to the pump/radiator/reservoir combo, which uses standard G1/4 threaded ports.

EK pairs the radiator with two or three (depending on the model) EK-Vardar high static pressure fans. The fans and pump are PWM controlled and connect to a hub which is then connected to the PC motherboard's CPU fan header over a single cable. Then, a single SATA power cable from the power supply provides the necessary power to drive the pump and fans.

EK-Predator 360 AIO Water Cooler.jpg

The EK-Predator 360 further adds quick disconnect (QDC) fittings to allow users to expand the loop to include, for example, GPU blocks. EK Water Blocks is reportedly working on compatible GPU blocks which will be available later this year that users will be able to easily tie into the EK-Predator 360 cooling loop.

Available for pre-order now, the EK-Predator 240 will be available September 23rd with an MSRP of $199 while the larger EK-Predator 360 is slated for an October 19th release at $239 MSRP.

My thoughts:

If the expected performance is there, these units look to be a decent value that will allow enthusiasts to (pun intended) get their feet wet with liquid cooling, with the opportunity to expand the loop as their knowledge and interest in water cooling grows. The EK-Predators are not a unique or new idea (other companies have offered water cooling kits for a while), but coming pre-assembled and pre-filled makes it dead simple to get started, and the parts should be of reputable quality. The one drawback I can see from the outset is that users will need to carefully measure their cases, as the pump and reservoir being attached to the radiator means users will need more room than usual to fit it. EK states in the PR that the 240mm rad should fit most cases, and that it is working with vendors on compatible cases for the 360mm radiator version, for what that's worth. Considering I spent a bit under $300 for my custom water cooling loop used, this new kit doesn't seem like a bad value so long as the parts are up to normal EK quality (barring that whole GPU block flaking thing, which I luckily have not run into...).

What do you think about EK's foray into AIO water cooling? Are the new coolers predators or prey? (okay, I'll leave the puns to Scott!).

MSI Announces Z170I Gaming Pro AC Mini-ITX Motherboard

Subject: Motherboards | August 27, 2015 - 03:41 PM |
Tagged: Z170i Gaming Pro AC, Z170, msi, motherboard, mini-itx, Intel Skylake

MSI has announced a new mini-ITX motherboard for Intel's latest chipset, the Z170I Gaming Pro AC.

126a.jpg

Mini-ITX boards have been hard to come by for Skylake thus far, with very few models and limited availability in the first month (though not quite as elusive as the i7-6700K). With this new gaming-oriented board MSI offers another option, and it looks pretty impressive with 5-phase power delivery, 802.11ac wireless, an Intel onboard NIC, and M.2 support from a slot on the back of the PCB.

126c.jpg

Pricing isn't immediately available, but the existing Mini-ITX Z170 motherboards (EVGA and ASRock each have one) have been selling for $199 so I'd expect something in that vicinity.

Source: TechPowerUp

Now they are coming for your dd-wrt

Subject: General Tech | August 31, 2015 - 04:48 PM |
Tagged: wireless router, idiots, dd-wrt

In the next installment of poorly planned moves by a US government agency attempting to solve a problem that does not exist, we shall see an attempt to make illegal the modification of the firmware on any device which contains a radio. This is likely to prevent you from using open source software to modify your wireless router into a death ray which will allow you to take over the planet.

Specifically, it would make illegal the modification of any device which can broadcast on the U-NII bands, which happen to include the 5GHz spectrum that WiFi broadcasts on. While most firmware changes, such as DD-WRT, only touch the processor, these routers are built on SoCs, which means that the radio is technically part of the same device you modify when applying custom firmware. Hack a Day has links to the FCC proposal; you might want to consider emailing your congress critters about it.

ddwrt-alt-logo-large.jpg

"Because of the economics of cheap routers, nearly every router is designed around a System on Chip – a CPU and radio in a single package. Banning the modification of one inevitably bans the modification of the other, and eliminates the possibility of installing proven Open Source firmware on any device."

Here is some more Tech News from around the web:

Tech Talk

 

Source: Hack a Day

Google giveth with one hand whilst taking with the other

Subject: General Tech | August 28, 2015 - 04:40 PM |
Tagged: google, chrome, flash, apple

The good news from Google is that, as of next month, Flash ads will be 'Click to Play' when you are browsing in Chrome.  This will be nice for the moving ads, but even better for defeating those sick-minded advertisers who think audio ads are acceptable.  However, this will hurt websites which depend on ad revenue ... as in all of the ones that are not behind a paywall and run Flash-based ads.  The move will make your web browsing somewhat safer, as it will prevent the drive-by infections which Flash spreads like a plague-infested flea, and as long as advertisers switch to HTML5 their ads will play and revenue will continue to come in.

The news of Chrome's refusal to play Flash ads is tempered somewhat by Google's decision to put advertising ahead of security for Apple devices.  The new iOS 9 uses HTTPS for all app connectivity, providing security and making it more difficult for websites to gather personalized data, but as anyone who uses HTTPS Everywhere already knows, not all advertisements are compliant and they are often completely blocked from displaying.  To ensure that advertisers can display on your iOS 9 device, Google has provided a way to get around Apple's App Transport Security, thus rendering the protection HTTPS offers inoperative.  Again, while sites do depend on advertisements to exist, sacrificing security to display those ads is hard to justify.

adobe-flash-player-icon.jpg

"The web giant has set September 1, 2015 as the date from which non-important Flash files will be click-to-play in the browser by default – effectively freezing out "many" Flash ads in the process."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

ASUS Announces ROG GX700 Gaming Notebook that's Water Cooled

Subject: Mobile | September 2, 2015 - 07:00 AM |
Tagged: ROG, notebook, ifa, gx700, gaming notebook, gaming laptop, asus

IFA is turning out to be an odd place full of weird announcements focused on PC gaming and enthusiasts rather than just mobile phones and electronics. ASUS has gone in the completely opposite direction today, announcing not just a series of gaming notebooks but a new series that is water cooled. I'm not making that up.

GX700.jpg

That is the new ASUS ROG (Republic of Gamers) GX700 series of gaming notebooks, coming in the 4th quarter of 2015. Looking for a price? You won't find it here but you will find a lot of interesting technology. This is what ASUS claims about the GX700:

  • All-new flagship gaming laptop
  • 4K 17-inch display
  • Water-cooling system with pump/radiator
  • Mobile K-series CPU with overclocking
  • NVIDIA GeForce GTX graphics (TBD)

A 4K screen in a 17-inch form factor is going to...have exceptionally small pixels. Clearly this is going to need quite a bit of Windows-based text and format scaling to make sure the desktop experience is usable. ASUS is using the new K-series Skylake processor that is unlocked and allows for overclocking in the same way you do so in the desktop market.

Oh, and what's this? An unannounced mobile GeForce GTX GPU? I doubt this is anything more than a currently shipping Maxwell GPU with some additional horsepower behind it, possibly more closely matching performance of the desktop GTX 980 Ti.

And of course, let's talk about the water cooling system. I asked for more details but ASUS wasn't budging. Clearly, if you market this as a notebook there has to be portability to the device, so expect that large portion that is front and center in the above picture to detach, with quick connections to the notebook housing. That large external base will likely hold the pump, radiator, reservoir and even some docking functions like display connections, USB, etc. With water cooling and an unlocked Skylake processor, you should expect some impressive overclocking capability considering the form factor!

I would assume that if you disconnect the machine to take on the road without the water cooling base the hardware would run at slower speeds with normal in-case fans as we see with other designs on the market today.

This sounds amazing, crazy and kind of senseless, but I need to try it right away. Expect to pay top dollar for something like this, especially considering the component cost of the screen, CPU, GPU, etc., not to mention the specific engineering for the new housing and design. I'll keep my eyes out for more information on the ASUS ROG GX700!

Source: ASUS
Manufacturer: Phanteks

Introduction and First Impressions

The Enthoo Pro M is the new mid-tower version of the Enthoo Pro, a full-tower ATX enclosure from the PC cooler and enclosure maker. This new enclosure adds another option to the $79 case market, which already has a number of solid options. Let's see how it stacks up!

pro_m_cover.jpg

I was very impressed by the Phanteks Enthoo EVOLV ATX enclosure, which received our Editor’s Choice award when reviewed earlier this year. The enclosure was very solidly made and had a number of excellent features, and even with a primarily aluminum construction and premium design it can be found for $119, rather unheard-of for this combination in the enclosure market. So what changes from that design might we expect to see with the $79 Enthoo Pro M?

The Pro M is a very businesslike design, constructed of steel and plastic, and with a very understated appearance. Not exactly “boring”, as it does have some personality beyond the typical rectangular box, with a brushed finish to the front panel which also features a vented front fan opening, and a side panel window to show off your build. But I think the real story here is the intelligent internal design, which is nearly identical to that of the EVOLV ATX.

Continue reading our review of the Phanteks Enthoo Pro M enclosure!!

Seagate Pushes in to 8TB Territory with New Enterprise HDD Models

Subject: Storage | September 1, 2015 - 08:00 AM |
Tagged: Seagate, hdd, Enterprise NAS, Enterprise Capacity 3.5, 8TB

Just when we were starting to get comfortable with the thought of 6TB hard drives, Seagate goes and announces their lineup of 8TB HDDs:

lineup.png

Now before you get too excited about throwing one of these into your desktop, realize that these models are meant for enterprise and larger NAS environments:

selector.png

As you can see from the above chart, Seagate will be moving to 8TB maximum capacities on their 'Enterprise NAS' and 'Enterprise Capacity 3.5' models, which are meant for larger storage deployments.

Home and small business users opting to go with Seagate for their storage will remain limited to 4TB per drive for the time being.

kinetic.png

For those curious about Kinetic, this is Seagate's push to connect arrays of drives via standard Ethernet, which would allow specialized storage applications to speak directly to the raw storage via standard network gear. Kinetic HDDs are currently limited to 4TB, with 8TB planned this coming January.

Seagate's full press blast appears after the break.

Source: Seagate

AMD Releases App SDK 3.0 with OpenCL 2.0

Subject: Graphics Cards, Processors | August 30, 2015 - 09:14 PM |
Tagged: amd, carrizo, Fiji, opencl, opencl 2.0

Apart from manufacturers with a heavy first-party focus, such as Apple and Nintendo, hardware is useless without developer support. In this case, AMD has updated their App SDK to include support for OpenCL 2.0, with code samples. It also updates the SDK for Windows 10, Carrizo, and Fiji, but it is not entirely clear how.

amd-new2.png

That said, OpenCL is important to those two products. Fiji has a very high compute throughput compared to any other GPU at the moment, and its memory bandwidth is often even more important for GPGPU workloads. It is also useful for Carrizo, because parallel compute and HSA features are what make it a unique product. AMD has been creating first-party software and helping popular third-party developers such as Adobe, but a little support for the world at large could bring a killer application or two, especially from the open-source community.

The SDK has been available in pre-release form for quite some time now, but it has finally graduated out of beta. OpenCL 2.0 allows work to be generated on the GPU itself, which is especially useful for follow-up tasks that depend on previous results, since they can be launched without contacting the CPU again.
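As a rough sketch of what that looks like (my own example, not taken from the SDK): the host creates an on-device queue, and the kernel itself enqueues a follow-up grid whose size it just computed, with no round trip to the CPU. The program would need to be built with -cl-std=CL2.0, and error handling is omitted.

    #include <CL/cl.h>

    // OpenCL C 2.0 kernel: the parent enqueues child work sized by a value
    // that only exists on the GPU, using the device-side default queue.
    static const char* kSource = R"CLC(
    kernel void parent(global int* data, global const int* count) {
        if (get_global_id(0) == 0) {
            enqueue_kernel(get_default_queue(),
                           CLK_ENQUEUE_FLAGS_WAIT_KERNEL,
                           ndrange_1D(count[0]),               // size decided on the GPU
                           ^{ data[get_global_id(0)] *= 2; }); // child work as a block
        }
    }
    )CLC";

    // Host side: a device-side queue must be out-of-order, and ON_DEVICE_DEFAULT
    // makes it the queue returned by get_default_queue() inside kernels.
    cl_command_queue MakeDeviceQueue(cl_context context, cl_device_id device)
    {
        cl_queue_properties props[] = {
            CL_QUEUE_PROPERTIES,
            (cl_queue_properties)(CL_QUEUE_OUT_OF_ORDER_EXEC_MODE_ENABLE |
                                  CL_QUEUE_ON_DEVICE | CL_QUEUE_ON_DEVICE_DEFAULT),
            0
        };
        cl_int err = CL_SUCCESS;
        return clCreateCommandQueueWithProperties(context, device, props, &err);
    }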

Source: AMD

Microsoft is a little fuzzy on what the word 'no' means

Subject: General Tech | September 2, 2015 - 06:37 PM |
Tagged: microsoft, KB3080149

It seems that not only are people not leaping to Windows 10 and granting Microsoft permission to collect their metadata, but far too many who use Windows 7 or 8 are opting out of the program as well.  KB3080149 is a recent 'Update for customer experience and diagnostic telemetry' which will enable Microsoft to track your usage even though you explicitly opted out of the Customer Experience Improvement Programme.  At least the data sent is encrypted, though that is little consolation for users, as The Inquirer points out.

Customer_Experience_Improvement_Program_Settings.png

"MICROSOFT HAS BEGUN retrofitting some of the more controversial aspects of the new Windows 10 operating system to predecessors 7 and 8."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

IFA 2015: Acer Predator Z35 and XB1 G-SYNC Gaming Monitors

Subject: Displays | September 2, 2015 - 06:00 AM |
Tagged: Predator Z35, IFA 2015, gaming monitor, g-sync, curved, acer, 2560x1080, 21:9

Acer has announced a pair of gaming monitors, beginning with their first curved NVIDIA G-SYNC monitor, the Predator Z35.

Predator Z35_wp_game_02.jpg

This 21:9 UltraWide display features a 2560x1080 resolution and supports overclocking to a refresh rate of up to 200 Hz. The Predator Z35 certainly looks the part, with angular styling and a dramatically curved (2000R curvature) screen that promises to help provide immersive gameplay.

Next up is the Predator XB1 Series, which consists of both 27-inch and 28-inch models.

Predator XB281HK_wp_04.jpg

All monitors in the Predator XB1 Series feature NVIDIA G-SYNC technology, with resolution being the differentiating factor between the two 27-inch models.

From Acer:

The 27-inch models (XB271HK / XB271HU) feature a ZeroFrame edge-to-edge design with 4K UHD (3840 x 2160) or WQHD (2560 x 1440) IPS panels that support 100% of the sRGB color gamut, while the XB271HU supports NVIDIA ULMB and refresh rates of up to 144Hz. The 28-inch model (XB281HK) features a 4K UHD panel that has a fast GTG (gray to gray) response time of 1ms, rendering fast-moving actions or dramatic transitions smoothly without smearing or ghosting. 

Pricing for the Predator Z35 will be $1199, with XB1 starting at $799. The Z35 will be available in the U.S. in December, while the XB1 will be available in November.

Source: Acer