
A friendly reminder about your OneDrive storage amount

Subject: General Tech | February 5, 2016 - 05:06 PM |
Tagged: onedrive, microsoft, cloud storage

Remember the good old days when OneDrive moved from offering you 1TB of storage to an unlimited amount?  That did not last long: Microsoft changed its mind and dropped the paid service back to 1TB and the free tier from 15GB to 5GB, with a chance to grandfather in the additional storage if you followed up with them.

A viewer recently encountered this for the first time, and it seems appropriate to remind everyone about the change.  If you have the paid service and are storing over 1TB you may have already heard from Microsoft; if not, consider this your warning that you had better trim down the amount of data you store on OneDrive, as the changes are going to happen in the latter half of this year.  The same goes for free users who have 15GB, or 30GB if you opted into the camera roll bonus: get the amount of data you have stored on OneDrive under 5GB or risk losing files you would rather keep.  The standalone 100GB and 200GB plans will be reduced to 50GB, though the price will remain $1.99 per month.
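
If you are not sure where you stand, a few lines of Python can total up a local sync folder against the new cap. This is just a convenience sketch; the folder path is a placeholder and the decimal-gigabyte assumption is mine, not Microsoft's:

```python
import os

# Sketch: total the size of a local OneDrive sync folder and compare it
# against the new 5GB free-tier cap. Path and decimal-GB math are assumptions.
ONEDRIVE_PATH = os.path.expanduser("~/OneDrive")  # adjust to your sync folder
CAP_BYTES = 5 * 1000**3                           # assuming decimal gigabytes

total = 0
for root, dirs, files in os.walk(ONEDRIVE_PATH):
    for name in files:
        try:
            total += os.path.getsize(os.path.join(root, name))
        except OSError:
            pass  # skip files that vanish or are unreadable mid-walk

print(f"Using {total / 1000**3:.2f} GB of 5 GB")
if total > CAP_BYTES:
    print(f"Over the cap by {(total - CAP_BYTES) / 1000**3:.2f} GB -- time to trim")
```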

The whole situation is reminiscent of a teacher punishing an entire classroom of kids for the actions of a few individuals; in this case, the tiny percentage of users who exceeded 75TB of usage.  Make sure to clean up your OneDrive as soon as possible; this is not something you want to leave until the last minute.

OneDrive-Logo_large.png

"If you are using more than 5 GB of free storage, you will continue to have access to all files for at least 12 months after these changes go into effect in early 2016. In addition, you can redeem a free one-year Office 365 Personal subscription (credit card required), which includes 1 TB of OneDrive storage."

Here is some more Tech News from around the web:

Tech Talk

Source: OneDrive

MSI GS72 Stealth Pro Details Released

Subject: Systems | February 6, 2016 - 11:30 PM |
Tagged: msi, gs72, gaming laptop, laptop

This laptop was announced at CES, but only barely. MSI has now released full specifications, including options, which are actually quite interesting. The 4K panel, in particular, has a color gamut that fully covers AdobeRGB (100%). This means that, if the hardware and software are properly calibrated, it is compatible with the color spaces that both video and print professionals tend to target. The latter is quite difficult, because magazine publishers work with an unusually large palette. Even the Wacom Cintiq 22HD only covers around 72% of AdobeRGB.
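
As a back-of-the-envelope illustration of why 100% AdobeRGB coverage matters, you can compare the raw areas of the sRGB and AdobeRGB triangles in CIE 1931 xy space. Real coverage percentages are measured by intersecting the display's gamut with the target space, so treat this only as a rough sketch:

```python
# Rough sketch: compare raw gamut-triangle areas in CIE 1931 xy space.
# Proper coverage figures intersect the two gamuts; this only ratios areas.

def tri_area(p1, p2, p3):
    """Area of a triangle from three (x, y) chromaticity points (shoelace formula)."""
    return abs(p1[0] * (p2[1] - p3[1]) +
               p2[0] * (p3[1] - p1[1]) +
               p3[0] * (p1[1] - p2[1])) / 2

# Published CIE xy primaries (red, green, blue) for each color space
srgb     = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
adobergb = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]

ratio = tri_area(*srgb) / tri_area(*adobergb)
print(f"sRGB triangle is about {ratio:.0%} the area of AdobeRGB")  # ~74%
```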

msi-2016-gs72.jpg

Outside of this, the laptop has one processor choice: a Skylake-based Intel Core i7-6700HQ backed by up to 32GB of DDR4 RAM. There are three GPU choices: the NVIDIA GeForce GTX 960M, 965M, and 970M. This could be disappointing for those hoping for desktop-class performance, although the 970M is pretty close to a GTX 680. It should handle games like Just Cause 3 and Rainbow Six Siege at around 50-60 FPS in 1080p mode. Basically, you are going to be dropping the 4K resolution down to about 1080p in games, but it's also a laptop, and 4K in professional applications is quite nice. It also uses M.2 SSDs with PCIe 3.0 x4 bandwidth that communicate using the NVMe standard. MSI didn't say which drive, or how large, but they claim read speeds of about 2.2GB/s.

MSI did not state pricing or availability. The headlining feature is thinness -- just 1.99cm for a 17-inch laptop. This explains the mobile-class GPUs, but also suggests a premium price.

Source: MSI
Subject: Editorial
Manufacturer: ARM

28HPCU: Cost Effective and Power Efficient

Have you ever been approached about something and, upon first hearing about it, the opportunity just did not seem very exciting?  Then, upon digging into things, it became much more interesting?  This happened to me with this announcement.  At first blush, who really cares that ARM is partnering with UMC at 28 nm?  Well, once I was able to chat with the people at ARM, it became much more interesting than I initially expected.

icon_arm.jpg

The new hotness in fabrication is the latest 14 nm and 16 nm processes from Samsung/GF and TSMC, respectively.  It has been a good 4+ years since we last had a new process node that actually performed as expected.  The planar 22/20 nm products just were not entirely suitable for mass production; Apple was one of the few to develop a part for TSMC's 20 nm process that actually sold in the millions.  The main problem was a lack of power and speed scaling as compared to 28 nm processes.  Planar was a bad choice at those geometries, but FinFET technology was not ready in time for third-party manufacturers to offer it at that node.

There is a problem with the latest process generations, though.  They are new, expensive, and production constrained.  They also may not be entirely appropriate for the applications being developed for them.  By comparison, 28 nm has several strengths: these are mature processes with an excess of line space, and the major fabs are offering very competitive pricing for 28 nm as they see capacity free up when higher end SOCs, GPUs, and assorted ASICs migrate to the new process nodes.

umc_01.png

TSMC has typically been at the forefront of R&D on advanced nodes.  UMC is not as aggressive with development, tending instead to let others do the heavy lifting and then integrate new nodes when they fit its pricing and business models.  TSMC is on its third generation of 28 nm; UMC is on its second, but that generation encompasses many of the advanced features of TSMC's third, so it is actually quite competitive.

Click here to continue reading about ARM, UMC, and the 28HPCU process!

Swiftech's H320 X2, bigger, better and ready for your personal touches

Subject: Cases and Cooling | February 5, 2016 - 06:44 PM |
Tagged: swiftech, H320 X2, AIO, watercooling

The Swiftech H320 X2 is obviously designed for those who like to show off the insides of their system: parts of both the reservoir and waterblock are clear, as is the tubing, and there are indeed LEDs on the cooler.  It is larger than the previous generation, with a 127 x 375 x 28mm radiator and a 109ml reservoir; three Swiftech Helix 120mm PWM fans are installed to pull heat from the radiator.  Modders Inc loved that, while this is an AiO cooler, it is designed with modding in mind, letting you add or swap out components, which is a rarity among AiO watercoolers.  The performance was also impressive; you can read about that and more in their full review.

DSC_7910.jpg

"All-in-one (AIO) water cooling units have brought the performance and silence of water cooling to the masses with the simplicity of installing an air cooler. AIOs offer simple installation without the need to bleed the loop. Simply attach the hardware and power cables and you are all set."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING


Source: Modders Inc

Unreal Editor for Unreal Engine 4 in VR

Subject: General Tech, Shows and Expos | February 4, 2016 - 07:47 PM |
Tagged: GDC, gdc 2016, epic games, ue4, VR, vive vr

Epic Games released Unreal Engine 4 at GDC two years ago, and removed its subscription fee at the next year's show. This year, one of the things they will show is Unreal Editor in VR with the HTC Vive. Using the system's motion controllers, you will be able to move objects and access UI panels within the virtual environment. They open the video by declaring that this is not an experimental project.

epicgames-2016-tim-vr.jpg

Without using this technology, it's hard to comment on its usability. It definitely looks interesting, and might be useful for creating VR experiences: you can see what your experience will look like as you create it, and you probably even save a bit of time during rapid iteration by not continuously putting on and removing the equipment. I wonder how precise it will be, though, since the laser pointers and objects seemed to snap and jitter a bit. That said, it might be just as precise, and in the end what really matters is how the result looks and behaves; it shouldn't prevent minor tweaks after the fact anyway.

Epic Games expects to discuss the release plans at the show.

Source: Epic Games

Next on the list of companies which should know better is Malwarebytes, but it is not as bad as some say

Subject: General Tech | February 3, 2016 - 12:46 PM |
Tagged: security, Malwarebytes

Considering the business Malwarebytes is in, you can expect to see a lot of negative press about a gaping security hole in the near future, and while there is a vulnerability, it is not as bad as many will make it out to be.  The issue is that signature updates are fetched over HTTP and are unsigned: very bad practice, but something that would be exploited on a single client connection as opposed to something you could use to create a widespread infection.  The Register links to the Google Project Zero entry, which was released today because the vulnerability was first reported to Malwarebytes 90 days ago and has not been addressed on the client side.

The actual concern you should have is that the original bug report also found vulnerabilities on the server side.  Malwarebytes did correct the server side issues almost immediately but neglected to follow through on the client side.  It is good of them to patch and offer bug bounties, but complete follow-through is necessary if you are a security software peddler who wants their reputation to stay intact.
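
The textbook fix is to deliver definitions over an authenticated channel and verify a publisher signature before applying them. As a rough illustration of that pattern (this is not Malwarebytes' actual update mechanism), here is how a client might check an RSA signature over a downloaded payload using Python's cryptography package:

```python
# Illustrative only -- NOT Malwarebytes' update mechanism, just the textbook
# pattern their client reportedly lacks: fetch over HTTPS, then verify a
# publisher signature before trusting the payload.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.exceptions import InvalidSignature

def verify_update(pubkey_pem: bytes, payload: bytes, signature: bytes) -> bool:
    """Return True only if `signature` over `payload` checks out against the vendor key."""
    public_key = serialization.load_pem_public_key(pubkey_pem)
    try:
        public_key.verify(
            signature,
            payload,
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256(),
        )
        return True
    except InvalidSignature:
        return False  # tampered or corrupted update: refuse to apply it
```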

mb-logo.png

"The antivirus firm says it has addressed server-side vulnerabilities that were reported by Google Project Zero researcher Tavis Ormandy in November. However, security holes remain in the client-side software that runs on people's Windows PCs."

Here is some more Tech News from around the web:

Tech Talk


Source: The Register

Monoprice Graphics Tablets Are Available

Subject: Displays | February 6, 2016 - 10:41 PM |
Tagged: monoprice, pen display, touch screen, drawing

A couple of CESes ago, Monoprice launched a couple of 22-inch pen displays to compete with the Wacom Cintiq 22HD. Shortly afterward, the products disappeared from their website and line-up, so I assumed, at the time, that they had changed their minds or otherwise refocused.

monoprice-2016-pen-display-22.jpg

Turns out, it was only temporary. There are now two models on their product list, one for $499.99 and another for $599.99, although I have a feeling that the cheaper model might be discontinued. The only real, concrete difference that I can see is that the $599.99 model uses “battery-free” pens, which I assume are powered by induction from the display surface. The cheaper model, which uses rechargeable pens, is out of stock with an estimated availability of “TBD”. The $599.99 model also lists Linux drivers, has a slower response time (12ms vs 5ms), and has wider viewing angles, although both panels are listed as IPS.

Whether or not the $499.99 model becomes available again, the $599.99 one is still about a third of the price of the Wacom Cintiq 22HD. Also, unlike the Wacom, it supports Linux, as mentioned above. Monoprice used to offer a pen display with a ten-finger capacitive touchscreen, which competed with the Wacom Cintiq 22HD Touch, but that has not been relaunched, at least not yet.

Source: Monoprice

Microsoft Lets Anyone "Git" Their Deep Learning On With Open Source CNTK

Subject: General Tech | February 4, 2016 - 01:18 PM |
Tagged: open source, microsoft, machine learning, deep neural network, deep learning, cntk, azure

Microsoft has been using deep neural networks for a while now to power the speech recognition technologies bundled into Windows and Skype, to identify and follow commands and to translate speech, respectively. This technology is part of Microsoft's Computational Network Toolkit. Last April, the company made the toolkit available to academic researchers on CodePlex, and it is now opening it up even more by moving the project to GitHub and placing it under an open source license.

Led by chief speech and computer scientist Xuedong Huang, a team of Microsoft researchers built the Computational Network Toolkit (CNTK) to power all of their speech related projects. CNTK is a deep neural network toolkit for machine learning that is built to be fast and scalable across multiple systems and, more importantly, multiple GPUs, which excel at these kinds of parallel processing workloads and algorithms. Microsoft heavily focused on scalability with CNTK, and according to the company's own benchmarks (which is to say, take them with a healthy dose of salt), while the major competing neural network toolkits offer similar performance running on a single GPU, CNTK is vastly more efficient when adding more graphics cards, with almost four times the performance of Google's TensorFlow and a bit more than 1.5 times that of Torch 7 and Caffe.

Where CNTK gets a bit deep learning crazy is in its ability to scale beyond a single system and easily tap into Microsoft's Azure GPU Lab to access numerous GPUs in remote datacenters. It's not free, but you don't need to purchase, store, and power the hardware locally, and you can ramp the GPU count up and down based on how much muscle you need. The example Microsoft provided showed two similarly spec'd Linux systems with four GPUs each, running on Azure cloud hosting, getting close to twice the performance of a single 4 GPU system (a 75% increase). Microsoft claims that "CNTK can easily scale beyond 8 GPUs across multiple machines with superior distributed system performance."
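
For a sense of what that 75% figure means in scaling terms, here is a quick back-of-the-envelope calculation; the throughput values are normalized placeholders chosen to match the quoted gain, not Microsoft's raw data:

```python
# Back-of-the-envelope check on the quoted scaling numbers: one 4-GPU Azure
# box versus two of them (8 GPUs total). Throughput is normalized to the
# single box; 1.75 matches the article's ~75% gain.

def parallel_efficiency(baseline: float, scaled: float, factor: int) -> float:
    """Fraction of the ideal linear speedup that was actually achieved."""
    return (scaled / baseline) / factor

one_box = 1.00   # normalized throughput, 4 GPUs on one machine
two_box = 1.75   # ~75% faster with 8 GPUs across two machines

print(f"Speedup: {two_box / one_box:.2f}x (ideal would be 2.00x)")
print(f"Parallel efficiency: {parallel_efficiency(one_box, two_box, 2):.0%}")  # ~88%
```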

cntk-speed-comparison.png

Using GPU-based Azure machines, Microsoft was able to increase the performance of Cortana's speech recognition by ten times compared to the local systems it was previously using.

It is always cool to see GPU compute in practice, and now that CNTK is available to everyone, I expect to see a lot of new uses for the toolkit beyond speech recognition. Moving to an open source license is certainly good PR, but I think it was actually done more for Microsoft's own benefit than for users, which isn't necessarily a bad thing since both get to benefit from it. I am really interested to see what researchers are able to do with a deep neural network toolkit that reportedly offers so much performance thanks to GPUs, and I'm curious what new kinds of machine learning opportunities the extra speed will enable.

If you are interested, you can check out CNTK on GitHub!

Source: Microsoft

BitTorrent Talks Encryption, Improved Linux Support For Sync 2.3

Subject: General Tech | February 2, 2016 - 05:11 PM |
Tagged: file syncing, encryption, bittorrent sync, bittorrent

BitTorrent continues to support its file sharing and syncing application with the recent release of Sync 2.3.1. The 2.3.x update contains a number of stability bug fixes, but the important news is the added support for encrypted folders and, finally, selective file syncing on Linux systems. Additionally, the company put out a short brief, available as a PDF, on the information it collects and how it secures the files you sync.

BitTorrent Sync 2_3 Encrypted Folders.png

Sync 2.3 allows Windows users to run Sync as a service, and Android users can move data to and from an SD card from within the app so long as they are running Android 5.0 or newer. Linux users also get a bit of love with support for selective file syncing (where you choose which files to download locally and which to keep on remote peers), though it appears that BitTorrent has limited this feature to its paid Sync Pro tier, in line with other platforms. According to BitTorrent Inc., beyond the performance and bug fixes, the biggest UI change is a redesigned process for adding new folders.

On the security and privacy front, BitTorrent claims that it employs several security measures to keep your data safe. First, though, the company allegedly only collects benign data, including the program version, add-folder errors, the amount of data transferred (directly and via relay server), the number of peers, and share link and tracker statistics, as well as a few more things you can see in the brief linked above. All the data that they collect is reportedly sent in the clear so that users can verify what is being collected about them.

To secure your files, BitTorrent uses SSL and AES-128 encryption for transfers. In the case of Advanced folders, it generates an X.509 certificate (each folder is given its own certificate) using a certificate authority, and then uses a certificate chain to control user access and file modification permissions, with a mechanism to revoke access. In the case of encrypted folders, Sync generates storage and session keys, with the session keys complying with perfect forward secrecy standards such that a future session key being cracked does not compromise past sessions. When using the encrypted folders option (which is useful when using a VPS as an off-site backup, or any machine that you do not fully own and control for that matter), data from your local machines is encrypted with AES-128 before being sent to the remote machine (I wish they had gone with at least AES-256, but it's something). The data is then sent over SSL. Thus, the data on the remote machine is never in an unencrypted state, which is a good thing for a secure off-site backup. The encrypted folder can still be used as part of the mesh to speed up syncing among your machines while remaining secure.
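
To make the encrypted folder idea concrete, here is a minimal sketch of the encrypt-before-it-leaves-the-machine pattern, assuming AES-128 in an authenticated (GCM) mode. It is illustrative only, not BitTorrent's actual implementation:

```python
# Minimal sketch of the "encrypt before it leaves the machine" idea behind
# encrypted folders -- NOT BitTorrent's actual implementation. AES-128 in GCM
# mode, so the remote peer stores only ciphertext it cannot read or silently alter.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_remote(storage_key: bytes, plaintext: bytes) -> bytes:
    """Encrypt locally; only this blob is ever handed to the untrusted peer."""
    nonce = os.urandom(12)                      # unique nonce per message
    return nonce + AESGCM(storage_key).encrypt(nonce, plaintext, None)

def decrypt_from_remote(storage_key: bytes, blob: bytes) -> bytes:
    """Recover plaintext on a machine you trust; raises if the blob was altered."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(storage_key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=128)       # the 128-bit key length the article mentions
blob = encrypt_for_remote(key, b"family photos")
assert decrypt_from_remote(key, blob) == b"family photos"
```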

I think the encrypted folders are a good addition to Sync, though the key length could be improved (a weak VPS processor doesn't need to decrypt the data anyway, so the CPU time needed for a beefier algorithm should not matter...). In past coverage, users have mentioned issues when syncing folders that they encrypted themselves before adding to Sync, where data could get corrupted when the peers became confused about what had changed and what to sync. Hopefully encrypted folders will help avoid that, though BitTorrent still needs to fix user-chosen pre-sync encryption. I am still using Sync to back up my photos and sync documents between my laptop and desktop, and it works well for that, sans the storage limits imposed by OneDrive (and the uncertainty of my once-promised 25GB of free storage).

What do you think of the changes, and is their security good enough?

Source: BitTorrent