Make yourself a WiFi camera remote

Subject: General Tech | January 25, 2016 - 12:40 PM |
Tagged: wifi, camera, DIY, iot

Hack a Day has posted a perfect example of how inexpensive and easy it can be to build useful things yourself instead of shopping for expensive electronics.  If you have looked at the prices of cameras or adapters that let you take a picture wirelessly you have probably been disappointed, but you don't have to stay that way.  Instead, take an existing manual remote trigger, add a WiFi-enabled SoC module like the ESP8266 used in the video, then download and compile the code; the next thing you know you will have wireless focus and shutter control for your camera.  Not too shabby for a ~$5 investment.

index.jpg

"It’s just ridiculous how cheap and easy it is to do some things today that were both costly and difficult just two or three years ago. Case in point: Hackaday.io user [gamaral] built a WiFi remote control for his Canon E3 camera out of just three parts"

Here is some more Tech News from around the web:

Tech Talk

 

Source: Hack a Day

AMD Shows Dual-Fiji Graphics Card in a Falcon Northwest PC at VRLA

Subject: Graphics Cards | January 25, 2016 - 11:51 AM |
Tagged: fury x2, Fiji, dual fiji, amd

Lo and behold! The dual-Fiji card that we previously dubbed the AMD Radeon Fury X2 still lives! Based on a tweet from AMD PR dude Antal Tungler, a PC from Falcon Northwest at the VRLA convention was utilizing a dual-GPU Fiji graphics card to power some demos.

This prototype Falcon Northwest Tiki system was housing the GPU beast, though no images were shown of the interior of the system. Still, it's good to see AMD at least acknowledge that this piece of hardware still exists at all, since it was initially promised to the enthusiast market by "fall of 2015."  Back in October we had hints that the card might be coming soon, after some shipping manifests leaked out to the web. 

dualfuryken_0.jpg

Better late than never, right? One theory floating around inside the offices here is that AMD is going to release the Fury X2 alongside the VR headsets coming out this spring, with hopes of making it THE VR graphics card of choice. The value of multi-GPU for VR is interesting, with one GPU dedicated to each eye, though the pitfalls that could haunt both AMD and NVIDIA in this regard (latency, frame time consistency) leave the technology's real-world merit up for debate. 

Source: Twitter

Report: Intel Tigerlake Revealed; Company's Third 10nm CPU

Subject: Processors | January 24, 2016 - 12:19 PM |
Tagged: Tigerlake, rumor, report, processor, process node, Intel, Icelake, cpu, Cannonlake, 10 nm

A report from financial website The Motley Fool discusses Intel's plan to introduce three architectures at the 10 nm node, rather than the expected two. This comes after news that Kaby Lake will remain on the present 14 nm process, interrupting Intel's two-year manufacturing cadence.

intel_10nm.jpg

(Image credit: wccftech)

"Management has told investors that they are pushing to try to get back to a two-year cadence post-10-nanometer (presumably they mean a two-year transition from 10-nanometer to 7-nanometer), however, from what I have just learned from a source familiar with Intel's plans, the company is working on three, not two, architectures for the 10-nanometer node."

Intel's first 10 nm processor architecture will be known as Cannonlake, with Icelake expected to follow about a year afterward. With Tigerlake expected to be the third architecture built on 10 nm, and not arriving until "the second half of 2019", we probably won't see 7 nm from Intel until the second half of 2020 at the earliest.

It appears that the days of two-year, two-product process node changes are numbered for Intel, as the report continues:

"If all goes well for the company, then 7-nanometer could be a two-product node, implying a transition to the 5-nanometer technology node by the second half of 2022. However, the source that I spoke to expressed significant doubts that Intel will be able to return to a two-years-per-technology cycle."

intel-node-density_large.png

(Image credit: The Motley Fool)

It will be interesting to see how players like TSMC, themselves "planning to start mass production of 7-nanometer in the first half of 2018", will fare moving forward as Intel's process development (apparently) slows.

High-End Surface Pro 4 and Surface Book Are Available

Subject: Systems | January 23, 2016 - 02:26 AM |
Tagged: microsoft, surface, surface pro 4, surface book

The Microsoft Surface Book and Surface Pro 4 launched back in October, and Ryan published a review of them in December. He didn't really make reference to it, but the highest-end model of each was unavailable until a later date. As it turns out, that time is roughly now. I say “roughly” because, while Microsoft has launched the devices, Amazon's landing page doesn't list them, and searching for the product directly shows a price tag of just under $10,000. I assume Amazon hasn't pushed the appropriate buttons yet.

microsoft-2016-surfacebook.JPG

The only real improvement that you will see, versus the second-highest SKU, is a jump in SSD capacity from 512GB to 1TB. This extra storage will cost roughly $1/GB, but it is also a very fast NVMe SSD. If 512GB was too small, and you were holding out for availability of the 1TB model, then your wait should (basically) be over.

Although, since you waited this long, you might want to hold off a little longer. Microsoft is supposed to be correcting (some say severe) issues with upcoming firmware. You may want to see whether the problems are solved before dropping two-and-a-half to three grand.

Source: Microsoft

If you want great audio and don't care about the price: HiFiMAN HE-1000

Subject: General Tech | January 22, 2016 - 01:47 PM |
Tagged: hifiman, headphones, HE-1000, audio

HiFiMAN have been producing mid-level and high-end audio products for quite some time, straddling the line between affordable and audiophile quality.  The HE-1000 sit firmly at the audiophile level; at $3000 you really have to have discerning ears to want to pick up these cans.  The headset is quite pretty, built with leather, wood, and aluminium, with soft cloth for the earcups and a window-blind design on the exterior which HiFiMAN claims has a positive effect on the audio quality.  techPowerUp tested these headphones out; you can read the description of their experience with the soundstage these headphones create in their review ... or not.

he-1000.jpg

"HiFiMAN is constantly developing their planar technology, and today, we will take a look at their latest state-of-the-art headphone. It is dubbed the HE-1000 and features a nanometer thick diaphragm, leather headband, and milled aluminum. We take HiFiMAN's most audacious and pricey headphone for a ride!"

Here is some more Tech News from around the web:

Audio Corner

 

Source: techPowerUp

Cortana, where is my app?

Subject: General Tech | January 22, 2016 - 12:15 PM |
Tagged: microsoft, Windows Store, windows 10

If the complaints of the developers in this story over at The Register are accurate, then the problem with the Windows Store might not be that there are no good apps, but that you simply can't find them.  If a developer can't find their own app in the store using keywords from its title or description, or even the keywords they submitted to the store, then how can you expect to?  If the only reliable way to find an app is to know its exact name, how many apps are there in the store that no one but the developer has ever seen?  It is still possible that an improved search function will not save the Windows Store, but at this point its reputation could not get all that much worse.

all-search-results-windows-store.jpg

"Looking at the developer forums though, it seems that official guidance and assistance for this issue is not easy to find, which will not help Microsoft in its efforts to establish a strong Windows 10 app ecosystem."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Register

GDDR5X Memory Standard Gets Official with JEDEC

Subject: Graphics Cards, Memory | January 22, 2016 - 11:08 AM |
Tagged: Polaris, pascal, nvidia, jedec, gddr5x, GDDR5, amd

Though information about the technology has been making the rounds over the last several weeks, GDDR5X finally became official with an announcement from JEDEC this morning. The JEDEC Solid State Technology Association is, as Wikipedia tells us, an "independent semiconductor engineering trade organization and standardization body" that is responsible for creating memory standards. Getting the official nod from the org means we are likely to see implementations of GDDR5X in the near future.

The press release is short and sweet. Take a look.

ARLINGTON, Va., USA – JANUARY 21, 2016 – JEDEC Solid State Technology Association, the global leader in the development of standards for the microelectronics industry, today announced the publication of JESD232 Graphics Double Data Rate (GDDR5X) SGRAM.  Available for free download from the JEDEC website, the new memory standard is designed to satisfy the increasing need for more memory bandwidth in graphics, gaming, compute, and networking applications.

Derived from the widely adopted GDDR5 SGRAM JEDEC standard, GDDR5X specifies key elements related to the design and operability of memory chips for applications requiring very high memory bandwidth.  With the intent to address the needs of high-performance applications demanding ever higher data rates, GDDR5X  is targeting data rates of 10 to 14 Gb/s, a 2X increase over GDDR5.  In order to allow a smooth transition from GDDR5, GDDR5X utilizes the same, proven pseudo open drain (POD) signaling as GDDR5.

“GDDR5X represents a significant leap forward for high end GPU design,” said Mian Quddus, JEDEC Board of Directors Chairman.  “Its performance improvements over the prior standard will help enable the next generation of graphics and other high-performance applications.”

JEDEC claims that, while using the same signaling type as GDDR5, GDDR5X is able to double the per-pin data rate to 10-14 Gb/s. In fact, based on leaked slides about GDDR5X from October, JEDEC actually calls GDDR5X an extension to GDDR5 rather than a new standard. How does GDDR5X reach these new speeds? By doubling the prefetch from 32 bytes to 64 bytes. This will require a redesign of the memory controller for any processor that wants to integrate it. 

gddr5x.jpg

Image source: VR-Zone.com

As for usable bandwidth, though no figures are quoted directly, the increase would likely be much lower than the per-pin statements in the press release suggest. Because the memory bus width would remain unchanged, and GDDR5X just grabs twice the chunk size in prefetch, we should expect a more incremental change in practice. Power efficiency is not mentioned either, and that was one of the driving factors in the development of HBM.
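
To put the per-pin figures in perspective, here is a quick back-of-envelope calculation. The 256-bit bus width and the 7 Gb/s GDDR5 baseline are our own illustrative assumptions (the press release only quotes per-pin targets), and sustained real-world bandwidth depends on access patterns rather than this peak math:

    # Peak theoretical bandwidth implied by per-pin data rates.
    # Bus width and the GDDR5 baseline are assumptions for illustration.
    def peak_bandwidth_gbs(bus_width_bits, gbps_per_pin):
        # Each pin moves gbps_per_pin gigabits per second; divide by
        # eight to convert gigabits to gigabytes.
        return bus_width_bits * gbps_per_pin / 8

    print(peak_bandwidth_gbs(256, 7))   # GDDR5 today:  224.0 GB/s
    print(peak_bandwidth_gbs(256, 10))  # GDDR5X floor: 320.0 GB/s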

07-bwperwatt.jpg

Performance efficiency graph from AMD's HBM presentation

I am excited about any improvement in memory technology that will increase GPU performance, but I can tell you that from my conversations with both AMD and NVIDIA, no one appears to be jumping at the chance to integrate GDDR5X into upcoming graphics cards. That doesn't mean it won't happen with some version of Polaris or Pascal, but it seems that there may be concerns other than bandwidth that keep it from taking hold. 

Source: JEDEC

Crystal Dynamics Reveals Minimum Specifications for Rise of the Tomb Raider on the PC

Subject: General Tech | January 22, 2016 - 02:06 AM |
Tagged: tomb raider, pc gaming, min specs, can it run

Crystal Dynamics has revealed the minimum system requirements for Rise of the Tomb Raider on PC. This latest Lara Croft adventure sees the ever-resilient tomb raider following in the footsteps of her father in search of an artifact, said to grant immortality, amidst the lost city of Kitezh. Fortunately for gamers, Rise of the Tomb Raider has quite a low bar for entry. You will need more powerful hardware than its 2013 predecessor (Tomb Raider) required, but the demands are still quite manageable.

Rise of the Tomb Raider.jpg

PC gamers will need a 64-bit version of Windows, a dual-core Intel Core i3-2100 (2 cores, 4 threads at 3.1 GHz) or, for example, an AMD FX 4100 processor, 6 GB of system memory, 25 GB of storage space for all the game files, and, of course, a graphics card with 2 GB of video memory such as the NVIDIA GTX 650 or AMD Radeon HD 7770. Naturally, hardware with higher specifications/capabilities will get you better performance and visuals, but the above is what you will need to play.

Minimum PC System Requirements:

  • Dual Core Processor (e.g. Core i3-2100 or FX 4100)
  • 6 GB RAM
  • 25 GB Available Storage Space
  • 2 GB Graphics Card (e.g. GTX 650 or Radeon HD7770)

For those curious, Tomb Raider (2013) required XP SP3 32-bit, a dual core Intel Core 2 Duo E6300 or Athlon 64 X2 4050+ CPU, 1 GB of RAM (2 GB for Vista), and an NVIDIA 8600 or AMD HD2600 XT GPU with 512MB of video memory.

Rise of the Tomb Raider will reportedly add new stealth and crafting components along with new weapons and options for close quarters combat. Further, the game will feature day and night cycles with realistic weather which should make for cold snow-filled nights in Siberia as well as opportunities to sneak up on unwitting guards freezing their buns off!

The game is set to release on January 28th for the PC, joining the Xbox One version that launched back in November 2015 as a timed exclusive (it will come to the PS4 later this year).

Personally, I am excited for this game. I picked up its predecessor during a Steam sale for super cheap only to let it sit in my inventory for about a year. It was one of those 'I'll play it eventually, but it's not really a priority' things where the price finally got me (heh). Little did I know how wrong I was, because once I finally got around to firing up the game, I played it near constantly until I beat it! It was a surprisingly fun reboot of the series, and I am hopeful that RotTR will be more of the same! 

Source: Bit-Tech

Gigabyte Teases New Power Supplies

Subject: Cases and Cooling | January 21, 2016 - 10:44 PM |
Tagged: modular psu, gigabyte, ATX PSU

Gigabyte made an announcement last week teasing two new power supplies. The G750H and B700H are 80 PLUS rated models topping out at 750W and 700W respectively. Gigabyte is a company best known for its motherboards, so it was somewhat surprising to see it tease power supplies, and to discover that these PSUs are not even the first to be sold under Gigabyte branding.

The G750H and B700H are ATX form factor and use a semi-modular design that leaves the 24-pin ATX and 8-pin CPU power cables permanently attached while using modular cables for all other connections (see below). One neat touch is that Gigabyte is using all-black, flat, individually sleeved cables, which may make it easier to hide and route them behind the motherboard tray (which on some cases can be an especially narrow channel). Both models are rated for SLI and CrossFire multi-GPU setups, use at least some Japanese capacitors (the G750H uses all Japanese capacitors), have an MTBF of 100,000 hours, and carry five-year warranties.

Gigabyte G750H PSU.png

In addition to the motherboard and CPU power, users can attach two eight-pin PCI-E, five SATA power, three Molex, and one floppy power connector. The modular cable configuration is the same on both PSU models.

The G750H is up to 90% efficient (80+ Gold) and uses a 140mm temperature controlled fan to keep noise levels low and the internal components cool (and efficient). Gigabyte has opted for a single rail design that sees the 12V rail rated at up to 62 amps.

On the other hand, the B700H is up to 85% efficient (80+ Bronze) at typical loads. It has a smaller 120mm temperature controlled fan for cooling. This model also uses a single 12V rail, but it tops out at 54 amps.
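
For scale, a single-rail rating like that places almost the entire unit's capacity on the 12V rail, as some quick arithmetic shows:

    # Wattage implied by each unit's 12V rail rating (12 V x amps).
    for model, amps, rated_watts in (('G750H', 62, 750), ('B700H', 54, 700)):
        print(model, 12 * amps, 'W of', rated_watts, 'W total')
    # G750H: 744 W of 750 W; B700H: 648 W of 700 W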

Several sites around the Internet (including Maximum PC) have indicated that Gigabyte has made the G750H and B700H available now, but they do not seem to be for sale yet in the US. I have tried to unearth pricing as well as the identity of the ODM Gigabyte is using for these new units, but no such luck so far. From my research, it appears that Gigabyte has used a number of different ODM/OEMs of varying quality for its past power supplies. It seems that we will have to wait for reviews to know for sure how these new PSUs perform. I hope that Gigabyte has stepped up its power supply game, as it has quite a bit of competition these days!

Source: Gigabyte

Windows 10 Build 11102 Released to Fast Ring Insiders

Subject: General Tech | January 21, 2016 - 06:39 PM |
Tagged: windows 10, microsoft

I wasn't planning on reporting every Windows 10 Insider build, but I actually have something to say about this one. The last couple of builds were examples of Microsoft switching to a faster release cycle for preview users, although their known issues lists were quite benign. Semi-frequent upgrade cycles risk shaking users away from the Insider program, or at least pushing them to the Slow ring.

Microsoft now seems ready to dial the QA back down for Fast ring users.

windows-10-bandaid.png

Five issues are known about build 11102. First, internal changes to the Windows graphics architecture cause some games to crash on launch, when going full screen, or when changing resolutions. The Witcher 3, Fallout 4, Tomb Raider, Assassin's Creed, and Metal Gear Solid V are known to be affected by this bug, but other games (and maybe even other software) could be affected too. Second, screen readers and other accessibility software may crash randomly. If you require those accommodations, then this build could make your device functionally unusable to you.

Beyond those two big issues, three others are present: an error message on login (which has a workaround), a breaking change for old wireless drivers, which you should upgrade beforehand if you rely upon wireless to download drivers, and “The Connect button does not show up in Action Center.”

Microsoft is currently updating the deep insides of the OS, which means that they will be poking around the source code in weird places. Once completed, this should make Windows more maintainable, especially across multiple types of hardware. But again, if you don't want to be a part of this, switch to Slow or leave the Insider program.

Source: Microsoft

Not the Toshiba Satellite of yore, the Satellite Radius 12 has new tricks

Subject: Mobile | January 21, 2016 - 06:14 PM |
Tagged: toshiba, Satellite Radius 12

Ah, the old Toshiba Satellite; like a Volvo, it was never the best nor the prettiest, but short of a major collision nothing could kill it.  Since those times Toshiba has had a rough go of it; The Inquirer notes the company has forecast a $4.5bn loss, coming just after it was caught cooking the books.  That has not stopped it from improving the Satellite lineup, and the Satellite Radius 12 ultraportable is a great example of that.

The screen on this 300x209x15.4mm (11.8x8.2x0.6"), 1.32kg (2.9lb), 12.1" convertible laptop is an impressive 3840x2160 IPS display, and the machine can be fully flipped open into a tablet-like form factor.  An i7-6500U, 8GB of RAM, and an unspecified 256GB SSD offer great performance, although battery life does suffer somewhat due to the screen and components.  Toshiba also included a dedicated Cortana button, a cellphone-like volume rocker, a 0.9MP webcam, and an infrared camera which works with Windows Hello but is not a RealSense camera.  The Inquirer found a lot to like about this laptop as well as some fairly serious shortcomings; read about them all in their review.

satellite-radius-12-design-540x334.jpeg

"This is the latest in Toshiba's rotating display convertible line, and the first of its kind to include a so-called 4K screen, making it an interesting proposition regardless of its creator's misfortunes."

Here are some more Mobile articles from around the web:

Mobile

 

Source: The Inquirer

Podcast #383 - Acer Predator X34, ASUS X99-M, AMD Q4 Earnings and more!

Subject: General Tech | January 21, 2016 - 02:34 PM |
Tagged: x99-m, X170, X150, video, Silent Base 800, Q4 2015, Predator X34, podcast, gigabyte, g-sync, freesync, earnings, be quiet, asus, amd, acer

PC Perspective Podcast #383 - 01/21/2016

Join us this week as we discuss the Acer Predator X34, ASUS X99-M, AMD Q4 Earnings and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

 

Know anyone who uses the Intel Driver Update Utility? Update the updater ASAP

Subject: General Tech | January 21, 2016 - 12:52 PM |
Tagged: Intel, intel driver update utility, security

The Intel Driver Update Utility is not the most commonly found application on PCs, but someone you know may have stumbled upon it or had it installed by Geek Squad or the local equivalent.  The tool has been available since Windows Vista; it checks your system for any Intel parts, from your CPU to your NIC, and then looks for any applicable driver updates.  Unfortunately it was doing so over a non-SSL URL, which left the utility wide open to a man-in-the-middle attack, and you really do not want a compromised NIC driver.  The Inquirer reports today that Intel quietly updated the tool on January 19th to resolve the issue, ensuring all communication and downloads happen over SSL.  If you know anyone using this tool, recommend they update it immediately.

intel-driver-update.jpg

"Intel has issued a fix for a major security vulnerability in a driver utility tool that could have allowed a man-in-the-middle attack and a malware maelstrom on victims' computers."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Inquirer

Google Releases Chrome 48 with Interesting Features

Subject: General Tech | January 21, 2016 - 02:59 AM |
Tagged: google, chrome

Web browsers are typically on rapid release cycles so they can ship features frequently. The Web is changing on a constant basis to become an effective application platform, one that is cross-compatible between competing implementations. A common complaint is that the rapid cycle exists to inflate version numbers for marketing, giving a false sense of maturity, but I'd argue that frequent, breaking changes are kind-of necessary to synchronize features between implementations. If Google lands a feature a month after Mozilla publishes a new version, should they really wait two years for their next one? Granted, they probably knew about it pre-release, but you get the idea. Also, even if the theory is true, artificially high version numbers are one of the most benign things a company could do.

Google_Chrome_icon_(2011).png

Some versions introduce fairly interesting features, though. This one, Google Chrome 48, deprecates RC4 encryption for HTTPS, which forces web servers to use newer ciphers or have their pages fail to load.

Another major one, and probably more interesting for our audience, is the introduction of VP9 to WebRTC. This video codec is Google's open competitor to H.265. At similar quality settings, VP9 uses about half of the bandwidth (or storage) of VP8. WebRTC is mostly used for video conferencing, but it's really an open platform for webcam, microphone, audio, video, and raw, peer-to-peer data connections. There are even examples of it being used to synchronize objects in multiplayer video games, which has nothing to do with video or audio streaming. I'm not sure what this support will enable, but it might even lead to web applications that can edit video.

Google Chrome 48 is available today. Also, as a related note, Firefox 44 should release next week with its own features, like experimental rendering of WebGL images offscreen and multi-threaded. The full changelog for Google Chrome 48 from Git is about 42 MB in size and, ironically, tends to crash Firefox.

Source: VentureBeat

GDC 2016 Sessions Are Up and DirectX 12 / Vulkan Are There

Subject: General Tech | January 20, 2016 - 07:06 PM |
Tagged: vulkan, ue4, nvidia, Intel, gdc 2016, GDC, epic games, DirectX 12, Codemasters, arm, amd

The 30th Game Developers Conference (GDC) will take place March 14th through March 18th, with the expo itself starting on March 16th. The session list has now been published, with DX12 and Vulkan prominently featured. While the technologies have not been adopted as quickly as advertised, the direction is definitely forward. In fact, NVIDIA, Khronos Group, and Valve have just finished hosting a developer day for Vulkan. It is coming.

gdc-2016-logo.png

One interesting session will be hosted by Codemasters and Intel, discussing how the F1 2015 engine was brought to DirectX 12. It will highlight a few features they implemented, such as voxel-based ray tracing using conservative rasterization. Conservative rasterization deliberately overestimates the coverage of each triangle, so a pixel still counts as covered even when an edge cuts through only a tiny, but not negligible, portion of it, which avoids artifacts along those edges. Sites like Game Debate (Update: Whoops, forgot the link) wonder if these features will be patched in to older titles, like F1 2015, or if they're just R&D for future games.

Another keynote will discuss bringing Vulkan to mobile through Unreal Engine 4. This one will be hosted by ARM and Epic Games. Mobile processors have quite a few cores, albeit ones that are slower at single-threaded tasks, and decent GPUs. Being able to keep them loaded will bring their gaming potential up closer to the GPU's theoretical performance, which has surpassed both the Xbox 360 and PlayStation 3, sometimes by a factor of 2 or more.

Many (most?) slide decks and video recordings are available for free after the fact, but we can't really know which ones ahead of time. It should be an interesting year, though.

Source: GDC

Phoronix Tests Almost a Decade of GPUs

Subject: Graphics Cards | January 20, 2016 - 03:26 PM |
Tagged: nvidia, linux, tesla, fermi, kepler, maxwell

It's nice to see long-term roundups every once in a while. They do not really provide useful information for someone looking to make a purchase, but they show how our industry is changing (or not). In this case, Phoronix tested twenty-seven NVIDIA GeForce cards across four architectures: Tesla, Fermi, Kepler, and Maxwell. In other words, from the GeForce 8 series all the way up to the GTX 980 Ti.

phoronix-2016-many-nvidia-roundup.jpg

Image Credit: Phoronix

Nine years of advancements in ASIC design, with a doubling time-step of 18 months, should yield a 64-fold improvement. The number of transistors falls short, showing about a 12-fold improvement between the Titan X and the largest first-wave Tesla, although transistor count is largely out of the hands of a fabless semiconductor designer. The main reason why I include this figure is to show the actual Moore's Law trend over this time span, but it also highlights the slowdown in process technology.

Performance per watt does depend on NVIDIA, though, and the ratio between the GTX 980 Ti and the 8500 GT is about 72:1. While this is slightly better than the target 64:1 ratio, these parts are from very different locations in their respective product stacks. Swap the 8500 GT for the following year's 9800 GTX, which makes it a comparison between top-of-the-line GPUs of their respective eras, and the GTX 980 Ti's advantage shrinks to a 6.2x improvement in performance per watt. On the other hand, that part was outstanding for its era.
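
For reference, the 64-fold expectation is just the doubling cadence worked out:

    # Expected improvement from an 18-month doubling cadence over 9 years.
    doublings = 9 / 1.5                # 6 doublings
    print(2 ** doublings)              # 64.0-fold expected
    print(72 / 64)                     # observed 72:1 is ~1.13x the trend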

I should note that each of these tests takes place on Linux. It might not perfectly reflect the landscape on Windows, but again, it's interesting in its own right.

Source: Phoronix

The Azza Nova 8000 certainly stands out in the crowd

Subject: Cases and Cooling | January 20, 2016 - 02:51 PM |
Tagged: azza, Nova 8000

You have to look at several pictures of the Azza Nova 8000 before you truly understand how the orange and black case looks, and only then can you decide if it is hot or not.  The swing-out doors which allow you to access your drives are a unique feature, but arguably one of limited use. Leaving the aesthetics behind, the 589x221x574mm (21.6x8.7x22.6") case supports up to E-ATX boards, up to four 120mm fans, and up to a 360mm radiator at the top with a 240mm one on the bottom.  With up to 13 drives supported, the case is certainly aimed at the data pack rat, which helps to explain the drive chambers somewhat, but not so much the colour scheme.  Check out the full review at Overclockers Club if the picture below doesn't immediately scare you off.

13.jpg

"The fit and finish of this case is top notch. All the panels lined up and fit together nicely. The top I/O panel gives you two USB 3.0 and two USB 2.0 ports, which is fairly standard on a case this size, and the colorful LEDs break up the monotony you find with many cases that use single-color LEDs. And while I am talking about LEDs, the gentle orange glow from the front fan adds a nice touch."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

 

"Imagine the original Deus Ex combined with the film Die Hard."

Subject: General Tech | January 20, 2016 - 01:42 PM |
Tagged: gaming, Kickstarter, consortium

That is a hell of a tagline, and it will be hard to live up to; matching the fond memories of the original Deus Ex while figuring out how to make Die Hard playable is not a trivial task.  The original ($2.99 on GoG right now) had mixed reviews, with some loving the way you are dumped into an immersive story with no introduction and others frustrated by the lack of a tutorial.  It was certainly different from your usual game in 2014.  The second game in the developer's planned trilogy is Consortium: The Tower, and it seems more focused on being a single-person stealth/action game than the team-based experience of the first, but the trailer does seem to have some of the same flavour when it comes to story and dialogue.  You don't often see a player choose to drop his weapons and then attempt to talk his way past the bad guys, except in cut scenes, of which there will be none in this game.  Check out the trailer, read what the gang at Rock, Paper, SHOTGUN had to say about it, and see if you think it is worth tossing in on the Kickstarter campaign.

We could have talked about Far Cry Primal, but Ubisoft is too busy being themselves, taking down trailers and generally making it hard to have nice things.

"As pitches go, Consortium: The Tower has a bloody good line up its sleeve. Like its predecessor, it’s a science fiction game set in a single environment. You can talk, fight or sneak your way past or through encounters, and many events will happen even if you’re not there to see or influence them."

Here is some more Tech News from around the web:

Gaming

 

Just fondle your mouse to log into Windows?

Subject: General Tech | January 20, 2016 - 12:19 PM |
Tagged: fingerprint, synaptics, ironveil, security

Synaptics, the company most likely responsible for the trackpad on your laptop, has released a new product: a 4x10mm fingerprint sensor which goes by the name of IronVeil.  The idea behind the product is to incorporate it into peripherals and pair it with Windows Passport, allowing you to log in by touching your mouse or keyboard, similar to the current generation of cellphones.  Synaptics also suggests it could be used in eSports to ensure that the person behind the mouse is indeed who they claim to be.  The Tech Report tried out a Thermaltake Black V2 mouse with the sensor embedded; in their recent article they talk about their experiences with the mouse, introduce the FIDO Alliance, and walk through some of the authentication process which occurs behind the scenes.

One cannot help but point out that while passwords can be hashed and salted, the same cannot be said for fingerprints, which leads us back to previously mentioned concerns about the security of the online databases these prints would be stored in.  The eternal battle of convenience versus security rages on.
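
To make that contrast concrete, here is roughly what "hashed and salted" means in practice, sketched with Python's standard library (our own illustration, not any vendor's actual scheme). A leaked password hash can be retired by changing the password; a leaked fingerprint can never be reissued:

    import hashlib
    import hmac
    import os

    ITERATIONS = 100_000  # deliberately slow to resist brute force

    def hash_password(password):
        # A fresh random salt per user means identical passwords yield
        # different hashes, defeating precomputed rainbow tables.
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, ITERATIONS)
        return salt, digest

    def verify(password, salt, digest):
        candidate = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, ITERATIONS)
        # Constant-time comparison avoids leaking a match via timing.
        return hmac.compare_digest(candidate, digest)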

Synaptics-IronVeil-Specs.png

"Synaptics' IronVeil is a tiny fingerprint sensor module that serves as the foundation for a variety of new authentication techniques for home and business users alike. We've spent a couple weeks with a pre-production IronVeil mouse, and we've explored how it might be used in practice."

Here is some more Tech News from around the web:

Tech Talk

TSMC Allegedly Wants 5nm by 2020

Subject: Graphics Cards, Processors | January 19, 2016 - 11:38 PM |
Tagged: TSMC

Digitimes is reporting on statements that were allegedly made by TSMC co-CEO, Mark Liu. We are currently seeing 16nm parts come out of the foundry, which is expected to be used in the next generation of GPUs, replacing the long-running 28nm node that launched with the GeForce GTX 680. (It's still unannounced whether AMD and NVIDIA will use 14nm FinFET from Samsung or GlobalFoundries, or 16nm FinFET from TSMC.)

Update (Jan 20th, @4pm EST): Couple minor corrections. Radeon HD 7970 launched at 28nm first by a couple of months. I just remember NVIDIA getting swamped in delays because it was a new node, so that's probably why I thought of the GTX 680. Also, AMD announced during CES that they will use GlobalFoundries to fab their upcoming GPUs, which I apparently missed. We suspect that NVIDIA will use TSMC, and have assumed that for a while, but it hasn't been officially announced yet (if ever).

tsmc.jpg

According to their projections, which (again) are filtered through Digitimes, the foundry expects to have 7nm in the first half of 2018. They also expect to introduce extreme ultraviolet (EUV) lithography with 5nm in 2020. Given that solid silicon has a lattice spacing of ~0.54nm at room temperature, 7nm features will span only about 13 atoms, and 5nm features about 9.
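
The atom estimate is simple division against the lattice spacing:

    # Feature sizes expressed in silicon lattice spacings (~0.54 nm).
    lattice_nm = 0.54
    for node_nm in (7, 5):
        print(node_nm, 'nm ->', round(node_nm / lattice_nm), 'lattice spacings')
    # 7 nm -> 13, 5 nm -> 9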

We continue the march toward the end of silicon lithography.

Even if the statement is correct, much can happen between now and then. It wouldn't be the first time that I've seen a major foundry believe a node would be available on schedule, only to have it delayed. I wouldn't hold my breath, but I might cross my fingers if my hands were free.

At the very least, we can assume that TSMC's roadmap is 16nm, 10nm, 7nm, and then 5nm.

Source: Digitimes