
Extreme Overclocking of Skylake (7.02566 GHz)

Subject: Processors | February 6, 2016 - 09:00 PM |
Tagged: Skylake, overclocking, asrock, Intel, gskill

I recently came across a post at PC Gamer that looked at the extreme overclocking leaderboard of the Skylake-based Intel Core i7-6700K. Obviously, these competitions will probably never end as long as higher numbers are possible on parts that are interesting for one reason or another. Skylake is the new chip on the liquid nitrogen block. It cannot reach frequencies as high as its predecessors, but teams still compete to get as high as possible on that specific SKU.

overclock-2016-skylake6700k7ghz.jpg

The current world record for a single-threaded Intel Core i7-6700K is 7.02566 GHz, which was achieved with a voltage of 4.032V. For comparison, the i7-6700K is typically around 1.3V at load. This record was apparently set about a month ago, on January 11th.

This is obviously a huge increase: roughly three times the voltage for an extra 3 GHz. For comparison, the current world record across all known CPUs belongs to the AMD FX-8370, with a clock of 8.72278 GHz. Pentium 4-era processors make up many of the top 15 places too, as those parts were designed for high clock rates with relatively low IPC.
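For a rough sense of scale, here is a quick back-of-the-envelope check on those figures (assuming the 6700K's 4.0 GHz stock base clock alongside the ~1.3 V load voltage mentioned above):

```typescript
// Quick sanity check on the record-run numbers quoted above.
// Assumption: 4.0 GHz stock base clock and ~1.3 V load voltage for a retail i7-6700K.
const stockClockGHz = 4.0;
const stockVoltage = 1.3;

const recordClockGHz = 7.02566;
const recordVoltage = 4.032;

const extraClockGHz = recordClockGHz - stockClockGHz;   // ~3.03 GHz gained
const voltageRatio = recordVoltage / stockVoltage;      // ~3.1x the stock voltage

console.log(`Extra clock: ${extraClockGHz.toFixed(2)} GHz`);
console.log(`Voltage scaling: ${voltageRatio.toFixed(2)}x stock`);
```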

The rest of the system used G.SKILL Ripjaws 4 DDR4 RAM, an ASRock Z170M OC Formula motherboard, and an Antec 1300W power supply. It used an NVIDIA GeForce GT 630 GPU, which offloaded graphics from the integrated chip, but otherwise interfered as little as possible. They also used Windows XP, because why not I guess? I assume that it does the least amount of work to boot, allowing a quicker verification, but that is only a guess.

Source: HWBot

Swiftech's H320 X2, bigger, better and ready for your personal touches

Subject: Cases and Cooling | February 5, 2016 - 06:44 PM |
Tagged: swiftech, H320 X2, AIO, watercooling

The Swiftech H320 X2 is obviously designed for those who like to show off the insides of their system: parts of both the reservoir and waterblock are clear, as is the tubing, and there are indeed LEDs on the cooler.  It is larger than the previous generation; the radiator measures 127 x 375 x 28mm, the reservoir holds 109ml, and three Swiftech Helix 120mm PWM fans are installed to pull heat from the radiator.  Modders Inc loved the fact that, while this is an AiO cooler, it is designed with modding in mind, as you can add or swap out components, which is a rarity in AiO watercoolers.  The performance was also impressive; you can read about that and more in their full review.

DSC_7910.jpg

"All-in-one (AIO) water cooling units have brought the performance and silence of water cooling to the masses with the simplicity of installing an air cooler. AIOs offer simple installation without the need to bleed the loop. Simply attach the hardware and power cables and you are all set."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

 

Source: Modders Inc
Author:
Manufacturer: Various

Early testing for higher end GPUs

UPDATE 2/5/16: Nixxes released a new version of Rise of the Tomb Raider today with some significant changes. I have added another page at the end of this story that looks at results with the new version of the game and a new AMD driver, and I've also included some SLI and CrossFire results.

I will fully admit to being jaded by the industry on many occasions. I love my PC games and I love hardware, but it takes a lot for me to get genuinely excited about anything. After hearing game reviewers talk up the newest installment of the Tomb Raider franchise, Rise of the Tomb Raider, since its release on the Xbox One last year, I've been waiting for its PC release to give it a shot with real hardware. As you'll see in the screenshots and video in this story, the game doesn't appear to disappoint.

rotr-screen1.jpg

Rise of the Tomb Raider takes the exploration and "tomb raiding" aspects that made the first games in the series successful and applies them to the visual quality and character design brought in with the reboot of the series a couple of years back. The result is a PC game that looks stunning at any resolution (even more so in 4K) and pushes your hardware to its limits. For single-GPU performance, even the GTX 980 Ti and Fury X struggle to keep their heads above water.

In this short article we'll look at the performance of Rise of the Tomb Raider with a handful of GPUs, leaning towards the high end of the product stack, and offer up my view on whether each hardware vendor is living up to expectations.

Continue reading our look at GPU performance in Rise of the Tomb Raider!!

A friendly reminder about your OneDrive storage amount

Subject: General Tech | February 5, 2016 - 05:06 PM |
Tagged: onedrive, microsoft, cloud storage

Remember the good old days when OneDrive moved from offering you 1TB of storage to an unlimited amount?  That did not last long; they changed their minds and dropped the paid service back to 1TB and the free tier from 15GB to 5GB, with a chance to grandfather in the additional storage if you followed up with them.

A viewer recently encountered this for the first time, and it seems appropriate to remind everyone about the change.  If you have the paid service and are storing over 1TB you may have already heard from Microsoft, but if not, consider this your warning that you had better trim down the amount of data you store on OneDrive, as the changes are going to happen in the latter half of this year.  The same goes for free users who have 15GB, or 30GB if you opted into the camera roll service: get the amount of files you have stored on OneDrive under 5GB or risk losing data you would rather keep.  The standalone 100GB and 200GB plans will be reduced to 50GB, though the price will remain at $1.99 per month.

The whole situation is reminiscent of a teacher choosing to punish an entire classroom of kids for the actions of a few individuals; in this case, the tiny percentage of users who exceeded 75TB of usage.  Make sure to clean up your OneDrive as soon as possible; this is not something you want to leave until the last minute.

OneDrive-Logo_large.png

"If you are using more than 5 GB of free storage, you will continue to have access to all files for at least 12 months after these changes go into effect in early 2016. In addition, you can redeem a free one-year Office 365 Personal subscription (credit card required), which includes 1 TB of OneDrive storage."

Here is some more Tech News from around the web:

Tech Talk

Source: OneDrive

ASRock Releases BIOS to Disable Non-K Skylake Overclocking

Subject: Processors | February 5, 2016 - 11:44 AM |
Tagged: Intel, Skylake, overclocking, cpu, Non-K, BCLK, bios, SKY OC, asrock, Z170

ASRock's latest batch of motherboard BIOS updates removes the SKY OC function, which permitted overclocking of non-K Intel processors via BCLK (base clock).

20151215-8.jpg

The news comes amid speculation that Intel had pressured motherboard vendors to remove such functionality. Intel's unlocked K parts (i5-6600K, i7-6700K) will once again be the only options for Skylake overclocking on Z170 ASRock boards (assuming prior BIOS versions are no longer available), and with no Pentium G3258 equivalent this generation, Intel no longer offers a budget-friendly option for enthusiasts looking to push their CPU past factory specs.
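For anyone unfamiliar with why BCLK overclocking of locked parts was such a big deal, the math is simple: non-K chips keep a fixed multiplier, so raising the base clock is the only lever left. The figures below are just an illustration (an i3-6100 with its stock 37x multiplier), not a specific SKY OC result:

```typescript
// Locked (non-K) Skylake parts keep a fixed multiplier, so features like SKY OC
// raise the base clock (BCLK) instead. Core frequency = BCLK x multiplier.
// Illustrative numbers only: an i3-6100 runs a 37x multiplier at the stock 100 MHz BCLK.
const multiplier = 37;

for (const bclkMHz of [100, 110, 120, 127]) {
  const coreGHz = (bclkMHz * multiplier) / 1000;
  console.log(`BCLK ${bclkMHz} MHz -> ${coreGHz.toFixed(2)} GHz`);
}
// 100 MHz gives the stock 3.70 GHz; pushing BCLK to 127 MHz lands around 4.70 GHz,
// which is exactly the kind of free performance Intel reportedly wanted gone.
```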

3386ebeb-34f8-4a83-9909-0e29985f4712.jpg

(Image credit: Hexus.net)

It sounds like now would be a good time to archive that SKY OC-enabled BIOS file if you've downloaded it - or simply refrain from applying this BIOS update. What remains to be seen, of course, is whether other vendors will follow suit and disable BCLK overclocking of non-K processors. This had become a popular feature on a number of Z170 motherboards on the market, but ASRock may have been in too weak a position to battle Intel on this issue.

Source: Hexus

Unreal Editor for Unreal Engine 4 in VR

Subject: General Tech, Shows and Expos | February 4, 2016 - 07:47 PM |
Tagged: GDC, gdc 2016, epic games, ue4, VR, vive vr

Epic Games released Unreal Engine 4 at GDC two years ago, and removed its subscription fee at the following year's show. This year, one of the things they will show is Unreal Editor in VR with the HTC Vive. Using the system's motion controllers, you will be able to move objects and access UI panels in the virtual environment. They open the video by declaring that this is not an experimental project.

epicgames-2016-tim-vr.jpg

Without using this technology, it's hard to comment on its usability. It definitely looks interesting, and might be useful for VR experiences. You can see what your experience will look like as you create it, and you probably even save a bit of time in rapid iteration by not continuously putting on and removing the headset. I wonder how precise it will be, though, since the laser pointers and objects seemed to snap and jitter a bit. That said, it might be just as precise as traditional input, and in the end what really matters is how the result looks and behaves; it shouldn't prevent minor tweaks after the fact anyway.

Epic Games expects to discuss the release plans at the show.

Source: Epic Games

Mods like memory; the Gainward GTX 960 Phantom 4GB

Subject: Graphics Cards | February 4, 2016 - 05:51 PM |
Tagged: gainward, GTX 960 Phantom 4GB, gtx 960, NVIDIA, 4GB

If you don't have a lot of cash on hand for games or hardware, a 4K adaptive sync monitor with two $600 GPUs and a collection of $80 AAA titles simply isn't on your radar.  That doesn't mean you have to toss in your love of gaming for occasional free-to-play sessions; you just have to adapt.  A prime example is the die-hard Skyrim fans who have modded the game to oblivion over the past few years, and there are many other games and communities that may not be new but are still thriving.  Chances are that you are playing at 1080p, so a high-powered GPU is not needed; however, mods that upscale textures (and many others) do love huge tracts of RAM.

So for those outside of North America looking for a card they can afford after a bit of penny pinching, check out Legion Hardware's review of the 4GB version of the Gainward GTX 960 Phantom.  It won't break any benchmarking records but it will let you play the games you love and even new games as their prices inevitably decrease over time.

Image_03S.jpg

"Today we are checking out Gainward's premier GeForce GTX 960 graphics card, the Phantom 4GB. Equipped with twice the memory buffer of standard cards, it is designed for extreme 1080p gaming. Therefore it will be interesting to see how the Phantom 4GB compares to a 2GB GTX 960..."

Here are some more Graphics Card articles from around the web:

Graphics Cards

 

Are you going to phish or cut clickbait?

Subject: General Tech | February 4, 2016 - 02:08 PM |
Tagged: security, google

Remember the thrill of finding the actual download button for the software you need, hidden on a webpage featuring at least four other large download buttons leading to unrelated and generally nasty software?  Well those horrible people at Google want to take that joy away from you!  Instead of practicing your skills at slapping the monkey, shooting the duck or pretending you are on an online version of Let's Make a Deal trying to pick the right download button to reveal the prize you want, they will present you with a bright red warning screen. 

For some reason those hacks over at The Inquirer think it is a good idea to take away the hours of time spent with your family, and all the interesting things that "just appeared" on their machines.

index.png

"Google is still chipping away at creating a secure online experience and has just unearthed a new element for safe browsing that stops click-happy idiots doing click-stupid things."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Inquirer
Author:
Subject: General Tech
Manufacturer: Logitech G

A mix of styles

Logitech continues its push and re-entry into the gaming peripherals market in 2016, this time adding another keyboard under the Orion brand to the mix. The Logitech G G810 Orion Spectrum is, as the name implies, an RGB mechanical keyboard using the company's proprietary Romer-G switches. But despite the similarity in model numbers to the G910 Orion Spark announced in late 2014, the G810 has some significant design and functionality changes.

11.jpg

This new offering is cleaner and less faceted (both in key caps and overall design), but comes much closer to traditional keyboard feel and function than last year's tenkeyless G410. Let's take a look at how the G810 changes things up for Logitech G.

Keyboard Design

The G810 Orion Spectrum is a full-size keyboard with a tenkey (also known as the numeric keypad) that has sleeker, more professional lines than its big brother. The black finish is matte on the keys and framing, but the outside edges of the keyboard have a gloss to them. It's a very minimal part of the design, though, so you shouldn't have to worry about fingerprints.

01.jpg

At first glance, you can see that Logitech toned down some of the gamer-centric accents when compared to either the G910 or the G410. There is no wrist rest, no PCB-trace inspired lines, no curves and no sharp edges. What you get instead is a keyboard that is equally well placed in a modern office or in an enthusiast's gaming den. There are a lot of touches that remind me of the Das Keyboard - understated design that somehow makes it more appealing to the educated consumer.

02.jpg

This marks the first keyboard with the new Logitech G logo on it, though you are likely more concerned about the lack of G-Keys, the company's name for its macro-capable buttons on the G910. For users that still want that capability, Logitech G allows you to reprogram the function keys along the top for macro capability, and has a pretty simple switch in software to enable or disable those macros. This means you can maintain the F-row of keys for Windows applications but still use macros for gaming.

Continue reading our review of the Logitech G810 Orion Spectrum keyboard!!

Microsoft Lets Anyone "Git" Their Deep Learning On With Open Source CNTK

Subject: General Tech | February 4, 2016 - 01:18 PM |
Tagged: open source, microsoft, machine learning, deep neural network, deep learning, cntk, azure

Microsoft has been using deep neural networks for a while now to power the speech recognition technologies bundled into Windows and Skype, which identify and follow commands and translate speech, respectively. This technology is part of Microsoft's Computational Network Toolkit. Last April, the company made the toolkit available to academic researchers on CodePlex, and it is now opening it up even more by moving the project to GitHub and placing it under an open source license.

Led by chief speech and computer scientist Xuedong Huang, a team of Microsoft researchers built the Computational Network Toolkit (CNTK) to power all of their speech-related projects. CNTK is a deep neural network toolkit for machine learning that is built to be fast and scalable across multiple systems and, more importantly, multiple GPUs, which excel at these kinds of parallel processing workloads and algorithms. Microsoft focused heavily on scalability with CNTK, and according to the company's own benchmarks (which is to say, to be taken with a healthy dose of salt), the major competing neural network toolkits offer similar performance running on a single GPU, but when adding more than one graphics card CNTK is vastly more efficient, with almost four times the performance of Google's TensorFlow and a bit more than 1.5 times that of Torch 7 and Caffe.

Where CNTK gets a bit deep-learning crazy is its ability to scale beyond a single system and easily tap into Microsoft's Azure GPU Lab to access numerous GPUs in remote datacenters -- it's not free, but you don't need to purchase, store, and power the hardware locally, and you can ramp the GPU count up and down based on how much muscle you need. The example Microsoft provided showed two similarly spec'd Linux systems with four GPUs each, running on Azure cloud hosting, getting close to twice the performance of a single 4-GPU system (a 75% increase). Microsoft claims that "CNTK can easily scale beyond 8 GPUs across multiple machines with superior distributed system performance."
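To put that multi-machine claim in perspective, the quoted figures work out to a pretty respectable scaling efficiency (a rough calculation based only on the numbers above):

```typescript
// Rough scaling-efficiency check using the figures quoted above.
// Going from one 4-GPU system to two 4-GPU Azure systems (8 GPUs total)
// reportedly yielded close to a 75% performance increase.
const baselineGpus = 4;
const scaledGpus = 8;
const reportedSpeedup = 1.75;                       // "close to twice", i.e. +75%

const idealSpeedup = scaledGpus / baselineGpus;     // 2.0x if scaling were perfect
const efficiency = reportedSpeedup / idealSpeedup;  // ~0.875

console.log(`Ideal speedup: ${idealSpeedup}x, reported: ${reportedSpeedup}x`);
console.log(`Cross-machine scaling efficiency: ${(efficiency * 100).toFixed(1)}%`);
```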

cntk-speed-comparison.png

Using GPU-based Azure machines, Microsoft was able to increase the performance of Cortana's speech recognition by 10 times compared to the local systems they were previously using.

It is always cool to see GPU compute in practice, and now that CNTK is available to everyone, I expect to see a lot of new uses for the toolkit beyond speech recognition. Moving to an open source license is certainly good PR, but I think it was actually done more for Microsoft's own benefit than for users, which isn't necessarily a bad thing since both get to benefit from it. I am really interested to see what researchers are able to do with a deep neural network toolkit that reportedly offers so much performance thanks to GPUs, and I'm curious what new kinds of machine learning opportunities the extra speed will enable.

If you are interested, you can check out CNTK on GitHub!

Source: Microsoft

Podcast #385 - Rise of the Tomb Raider Performance, 3x NVMe M.2 RAID-0, AMD Q1 Offerings

Subject: General Tech | February 4, 2016 - 11:53 AM |
Tagged: video, Trion 150, tesla, steam os, Samsung, rise of the tomb raider, podcast, ocz, NVMe, Jim Keller, amd, 950 PRO

PC Perspective Podcast #385 - 02/04/2016

Join us this week as we discuss Rise of the Tomb Raider performance, a triple RAID-0 NVMe array and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

NVIDIA Stops SHIELD Android 6.0 Update After Wi-Fi Issues

Subject: Mobile | February 4, 2016 - 09:39 AM |
Tagged: wi-fi, shield tablet, shield, ota update, nvidia, android 6.0

NVIDIA has pulled the Android 6.0 OTA update for the original SHIELD (pre-K1) tablet after users experienced wi-fi connection issues. A post on NVIDIA's official forums explains:

"We have temporarily turned off the OTA update until we understand why a few users are losing WiFi connection after updating their tablet to OTA 4.0."

shield.jpg

(Image: Android Police)

The post is authored by Manuel Guzman of NVIDIA Customer Care, and includes a list of potential fixes:

  • Reboot your tablet 2-3 times. If this fails, power cycle your tablet 3-4 times (not reboot but complete power off). If this does not work, charge your tablet to 100% and attempt again a couple of times or so.
  • Factory reset your tablet. Make sure you backup any important files before you perform this step.
  • A couple of users have reported their WiFi coming back after leaving their tablet powered off for a few hours. Try leaving your tablet powered off for a few hours and then turn the device back on.

Users who still have issues connecting are asked to navigate to the Advanced Wi-Fi page on their tablet, and then to "take a screenshot and email the picture to driverfeedback@nvidia.com".

Windows 95 in a Web Browser

Subject: General Tech | February 4, 2016 - 02:15 AM |
Tagged: windows 95, javascript

This one is quite interesting. We've seen DOSBox cross-compiled into JavaScript using emscripten before. For instance, The Internet Archive has been publishing a huge catalog of DOS-era games on their site, including John Carmack's Catacomb II. In case you're wondering, memory management is handled in emscripten by reserving a large, contiguous chunk of memory as an ArrayBuffer. The C application can do its typical memory management tricks because it sees an unmanaged chunk of memory.
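If you're curious what that looks like in practice, here's a much-simplified sketch of the idea (the HEAP views and the toy malloc below are illustrative stand-ins, not emscripten's actual runtime):

```typescript
// Simplified sketch of how a compiled C program "sees" its memory in the browser:
// one big ArrayBuffer acts as the entire address space, and typed-array views
// let the generated code read and write it as raw bytes or 32-bit words.
const HEAP_SIZE = 16 * 1024 * 1024;            // 16 MB of linear memory
const buffer = new ArrayBuffer(HEAP_SIZE);
const HEAPU8 = new Uint8Array(buffer);         // byte-level view
const HEAP32 = new Int32Array(buffer);         // word-level view

// Toy bump allocator standing in for the malloc() implementation that is
// compiled along with the application itself.
let heapTop = 1024;                            // leave low addresses unused
function malloc(bytes: number): number {
  const ptr = heapTop;
  heapTop += (bytes + 7) & ~7;                 // keep 8-byte alignment
  return ptr;                                  // a "pointer" is just an offset
}

// The "C code" then works with plain integer pointers into the buffer.
const p = malloc(4);
HEAP32[p >> 2] = 42;                           // *(int*)p = 42;
console.log(HEAPU8[p]);                        // low byte of 42 on a little-endian layout
```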

microsoft-2016-safetoturnoff.png

This example is an image of Windows 95, complete with its default applications such as Minesweeper. It was ported by Andrea Faulds, who is a major contributor to PHP. The Windows 95 demo was apparently created in 2015, according to her personal website, but I just found out about it.

OCZ Launches Trion 150, Successor to Trion 100 SATA SSD, Now Using 15nm Flash

Subject: Storage | February 3, 2016 - 03:31 PM |
Tagged: Trion 150, toshiba, tlc, ssd, slc, sata, ocz, 15nm

*Note* This piece originally stated 'A15nm'; however, this was an error in the Trion 150 spec sheet at OCZ. It has been corrected in this article (as well as at the OCZ web site).

2015 was a bit of a rough year for OCZ, as their integration with parent company Toshiba ran into a few performance bumps in the road. First was the Vector 180 launch, which saw some particularly troublesome stalls during writes and TRIM operations. The Trion 100 launch went a bit smoother, but we did note some inconsistencies in caching performance of those TLC/SLC caching SSDs.

OCZ hopes to turn things around by kicking off 2016 with some updates to their product lines. First up is the just announced Trion 150:

trion150_lrg_sp.png

Looking at the spec sheets of the Trion 100 and 150, it may be difficult to spot any differences. I'll save you the trouble and say that only *one digit* changes, but it is an important one. The Trion 150 will use Toshiba 15nm TLC flash (the Trion 100 used A19nm). What is interesting is that the Trion 150 carries the same endurance rating as its predecessor. A flash memory die shrink typically comes with a corresponding reduction in endurance, so it is good to see Toshiba squeeze all of the endurance they can out of this likely final die shrink of their planar flash. Further backing up that endurance claim, the Trion 150 will carry OCZ's ShieldPlus warranty, which offers shipping-paid advance RMA (without receipt) on this product line for three years!

OCZ has Trion 150 samples on the way to us, and we will get a full performance review of them up as soon as we can! Full press blast follows after the break.

Source: OCZ

Who's a pretty boy? Is it you Fallout?

Subject: General Tech | February 3, 2016 - 02:46 PM |
Tagged: modding, gaming, fallout 4

[H]ard|OCP has put together a little guide on improving your Fallout 4 experience with the help of modders and the great people at Nexus Mods.  They describe the basics of how to install mods, as there are steps you need to follow to ensure your mods apply successfully, whether installed manually or with the Nexus Mod Manager tool.  They explore several mods that greatly increase the size of textures, making them much better looking, as well as adding weather and storms to the mix.  As long as you meet the graphics memory requirements they mention, you should not see much performance degradation when using these mods.  Soon Fallout 4 may be meeting or surpassing Skyrim's impressive mod community.

Of course, immediately after [H] covered this topic, Bethesda released a new patch which enables HBAO+ for all GPUs and extra debris effects specifically for NVIDIA GPUs.

14543944144L5SDRyj0t_1_3_l.gif

"Fallout 4 has been out for several months and it is possible that you might find the image quality lacking overall. We take some of the most popular and highly downloaded image quality mods and find out how we can improve the environment in Fallout 4. We modify for visual improvements to give you more immersive gameplay."

Here is some more Tech News from around the web:

Gaming

Source: [H]ard|OCP
Subject: Systems
Manufacturer: PC Perspective

That Depends on Whether They Need One

Ars Technica UK published an editorial called "Hey Valve: What's the point of Steam OS?" The article does not actually pose the question in its text -- it mostly rants about technical problems with a Zotac review unit -- but the headline is interesting nonetheless.

Here's my view of the situation.

steam-os.png

The Death of Media Center May Have Been...

There are two parts to this story, and both center around Windows 8. The first was addressed in an editorial that I wrote last May, titled The Death of Media Center & What Might Have Been. Microsoft wanted to expand the PC platform into the living room. Beyond the obvious support for movies, TV, and DVR, they also pushed PC gaming in a few subtle ways. The Games for Windows certification required games to be launchable from Media Center and support Xbox 360 peripherals, which pressured game developers to make PC games comfortable to play on a couch. They also created Tray and Play, an optional feature that allowed PC games to be played from the disc while they installed in the background. Back in 2007, before Steam and other digital distribution services really took off, this eliminated install time, which was a major user experience problem with PC gaming (and a major hurdle for TV-connected PCs).

It also had a few nasty implications. Games for Windows Live tried to eliminate modding by requiring all content to be certified (or severely limiting the tools as seen in Halo 2 Vista). Microsoft was scared about the content that users could put into their games, especially since Hot Coffee (despite being locked, first-party content) occurred less than two years earlier. You could also argue that they were attempting to condition PC users to accept paid DLC.

Windows_Media_Center_Logo.png

Regardless of whether it would have been positive or negative for the PC industry, the Media Center initiative launched with Windows Vista, which is another way of saying "exploded on the launch pad, leaving no survivors." Windows 7 cleared the wreckage with a new team, who aimed for the stars with Windows 8. They ignored the potential of the living room PC, preferring devices and services (i.e. Xbox) over an ecosystem provided by various OEMs.

If you look at the goals of Steam OS, they align pretty well with the original, Vista-era ambitions. Valve hopes to create a platform that hardware vendors could compete on. Devices, big or small, expensive or cheap, could fill all of the various needs that users have in the living room. Unfortunately, unlike Microsoft, they cannot be (natively) compatible with the catalog of Windows software.

This may seem like Valve is running toward a cliff, but keep reading.

What If Steam OS Competed with Windows Store?

Windows 8 did more than just abandon the vision of Windows Media Center. Driven by the popularity of the iOS App Store, Microsoft saw a way to end the public perception that Windows is hopelessly insecure. With the Windows Store, all software needs to be reviewed and certified by Microsoft. Software based on the Win32 API, which covers essentially all software written for Windows 7 and earlier, was only allowed within the "Desktop App," which was a second-class citizen and could be removed at any point.

mozilla-2016-donothurt.png

This potential made the PC software industry collectively crap themselves. Mozilla was particularly freaked out, because the Windows Store demanded (at the time) that all web browsers become reskins of Internet Explorer. This meant that Firefox would not be able to implement any new Web standards on Windows, because it could only present what Internet Explorer (Trident) drew. Mozilla's mission is to develop a strong, standards-based web browser that forces all others to interoperate or die.

Remember: “This website is best viewed with Internet Explorer”?

Executives from several PC gaming companies, including Valve, Blizzard, and Mojang, spoke out against Windows 8 at the time (along with browser vendors and so forth). Steam OS could be viewed as a fire escape for Valve if Microsoft decided to try its luck and kill, or further deprecate, Win32 support. In the meantime, Windows PCs could stream to it until Linux gained a sufficient catalog of software.

microsoft-2016-windowsrt.png

Image Credit: Wikipedia

This is where Steam OS gets interesting. Its software library cannot compete against Windows with its full catalog of Win32 applications, at least not for a long time. On the other hand, if Microsoft continues to support Win32 as a first-class citizen, and returns to the level of openness with software vendors that it had in the Windows XP era, then Valve doesn't really have a reason to care about Steam OS as anything more than a hobby anyway. Likewise, if doomsday happens and something like Windows RT ends up being the future of Windows, as many feared, then Steam OS wouldn't need to compete against Windows at all. Its only competition from Microsoft would be Windows Store apps and first-party software.

I would say that Valve might even have a better chance than Microsoft in that case.

Next on the list of companies which should know better is Malwarebytes, but it is not as bad as some say

Subject: General Tech | February 3, 2016 - 12:46 PM |
Tagged: security, Malwarebytes

Considering the business that Malwarebytes is in, you can expect to see a lot of negative press about a gaping security hole in the near future, and while there is a vulnerability, it is not as bad as many will make it out to be.  The issue is that signature updates are fetched over HTTP and are unsigned - very bad practice, but something that would be exploited against a single client connection as opposed to something you could use to create a widespread infection.  The Register links to the Google Project Zero entry, which was released today because the vulnerability was first reported to Malwarebytes 90 days ago and has not been addressed on the client side.

The actual concern you should have is that the original bug report also found vulnerabilities on the server side.  Malwarebytes did correct the server-side issues almost immediately but neglected to follow through on the client side.  It is good of them to patch and offer bug bounties, but complete follow-through is necessary if you are a security software peddler who wants their reputation to stay intact.
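For context, the textbook fix for this class of problem is to sign the definitions server-side and have the client verify that signature before applying anything, so a man-in-the-middle on plain HTTP can't slip in a malicious payload. Here is a minimal sketch of the client-side check (the file names and key handling are hypothetical, not Malwarebytes' actual update code):

```typescript
import { createVerify } from "node:crypto";
import { readFileSync } from "node:fs";

// Hypothetical artifacts: the downloaded definitions file, a detached signature
// produced by the vendor's private key, and the vendor's public key shipped
// with the product (so it never travels over the insecure channel).
const update = readFileSync("definitions.db");
const signature = readFileSync("definitions.db.sig");
const publicKey = readFileSync("vendor-public.pem", "utf8");

// Verify the detached RSA-SHA256 signature before the update is ever applied.
const verifier = createVerify("RSA-SHA256");
verifier.update(update);

if (verifier.verify(publicKey, signature)) {
  console.log("Signature OK - safe to apply the definitions update.");
} else {
  throw new Error("Signature mismatch - discard the update.");
}
```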

mb-logo.png

"The antivirus firm says it has addressed server-side vulnerabilities that were reported by Google Project Zero researcher Tavis Ormandy in November. However, security holes remain in the client-side software that runs on people's Windows PCs."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Register

AMD FirePro S-Series Introduces Hardware-Based GPU Virtualization

Subject: Graphics Cards | February 3, 2016 - 02:37 AM |
Tagged: virtual machines, virtual graphics, mxgpu, gpu virtualization, firepro, amd

AMD made an interesting enterprise announcement today with the introduction of new FirePro S-Series graphics cards that integrate hardware-based virtualization technology. The new FirePro S7150 and S7150 x2 are aimed at virtualized workstations, render farms, and cloud gaming platforms where each virtual machine has direct access to the graphics hardware.

The new graphics cards use a GCN-based Tonga GPU with 2,048 stream processors paired with 8GB of ECC GDDR5 memory on the single-slot FirePro S7150. The dual-slot FirePro S7150 x2, as the name suggests, is a dual-GPU card that features a total of 4,096 shaders (2,048 per GPU) and 16GB of ECC GDDR5 (8GB per GPU). The S7150 has a TDP of 150W while the dual-GPU S7150 x2 variant is rated at 265W, and either can be passively cooled.

AMD FirePro S1750 x2 Hardare-based virtualized GPU MxGPU.png

Where the graphics cards carve out their niche is the inclusion of what AMD calls MxGPU (Multi-User GPU) technology, which is derived from the SR-IOV (Single Root Input/Output Virtualization) PCI Express standard. According to AMD, the new FirePro S-Series allows virtual machines direct access to the full range of GPU hardware (shaders, memory, etc.) along with OpenCL 2.0 support on the software side. The S7150 supports up to 16 simultaneous users and the S7150 x2 tops out at 32 users. Each virtual machine is allocated an equal slice of the GPU, and as you add virtual machines those slices get smaller. AMD's solution to that predicament is to add more GPUs to spread out the users and allocate each VM more hardware horsepower. It is worth noting that AMD has elected not to charge companies any per-user licensing fees for the VMs the hardware supports, which should make these cards more competitive.
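To put that equal-slice partitioning in rough numbers (a back-of-the-envelope sketch based on the specs above; the actual MxGPU provisioning granularity may differ):

```typescript
// Back-of-the-envelope look at how the fixed hardware gets divided as VMs are added.
const streamProcessors = 2048;     // per Tonga GPU
const memoryMB = 8 * 1024;         // 8 GB ECC GDDR5 per GPU

for (const users of [2, 4, 8, 16]) {
  const spSlice = streamProcessors / users;
  const memSlice = memoryMB / users;
  console.log(`${users} VMs per GPU -> ~${spSlice} stream processors and ${memSlice} MB each`);
}
// At the 16-user maximum each VM sees roughly a 128-SP, 512 MB slice,
// which is why AMD suggests adding GPUs rather than piling on more users.
```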

The graphics cards use ECC memory to correct errors when dealing with very large numbers and calculations, and every VM is reportedly protected and isolated such that one VM cannot access another VM's data stored in graphics memory.

I am interested to see how these stack up against NVIDIA's GRID and VGX virtualization-focused graphics cards. The difference between software- and hardware-based virtualization may not amount to much, but AMD's approach may be ever so slightly more efficient thanks to the removal of a layer between the virtual machine and the hardware. We'll have to wait and see, however.

Enterprise users will be able to pick up the new cards installed in systems from server manufacturers sometime in the first half of 2016. Pricing for the cards themselves appears to be $2,399 for the single-GPU S7150 and $3,999 for the dual-GPU S7150 x2.

Needless to say, this is all a bit more advanced (and expensive!) than the somewhat finicky 3D acceleration option desktop users can turn on in VMWare and VirtualBox! Are you experimenting with remote workstations and virtual machines for thin clients that can utilize GPU muscle? Does AMD’s MxGPU approach seem promising?

Source: AMD

Blender Foundation Releases Caminandes 3: Llamigos

Subject: General Tech | February 2, 2016 - 11:34 PM |
Tagged: Blender, open-source

The Blender Foundation guides development with a series of first-party short films, each of which is created with open-source software and released under a Creative Commons license. Despite their purpose, to promote open source software and highlight ways to improve Blender, they each have engaging traits that are uncommon in commercial films. Cosmos Laundromat opens with a fairly long shot of a sheep's attempt at hanging itself, while Sintel's ending will make you feel hollow when it reveals its meaning.

This short, Caminandes 3: Llamigos, above, is much lighter than Cosmos Laundromat or Sintel. It has more of the ironic, mischievous cartoon feel of Big Buck Bunny, their second Blender short film. It is about a Llama and a Penguin who are trying to eat some berries; unfortunately, they are both trying to eat the same ones.

blender-2016-caminandes3-llama.jpg

The two-and-a-half-minute short film can be downloaded and is free to use under a Creative Commons Attribution license. Its assets are also available, but only under a Blender Cloud subscription.

BitTorrent Talks Encryption, Improved Linux Support For Sync 2.3

Subject: General Tech | February 2, 2016 - 05:11 PM |
Tagged: file syncing, encryption, bittorrent sync, bittorrent

BitTorrent continues to support its file sharing and syncing application with the recent release of Sync 2.3.1. The 2.3.x update contains a number of bug fixes for stability, but the important news is the added support for encrypted folders and, finally, selective file syncing on Linux systems. Additionally, the company put out a short brief, available as a PDF, on the information it collects and how it secures the files synced by Sync.

BitTorrent Sync 2_3 Encrypted Folders.png

Sync 2.3 allows Windows users to run Sync as a service, and Android users can now move data to and from an SD card from within the app so long as they are running Android 5.0 or newer. Linux users also get a bit of love with support for selective file syncing (where you can choose which specific files to download locally and which to keep on the remote peers), though it appears that BitTorrent has limited this feature to its paid Sync Pro tier, which is in line with the other platforms. According to BitTorrent Inc., beyond the performance and bug fixes, the biggest UI change is a redesigned process for adding new folders.

On the security and privacy front, BitTorrent claims that it employs several security measures to keep your data safe. First, though, the company allegedly only collects benign data, including the program version, add-folder errors, the amount of data transferred (directly and via relay server), number of peers, and share link and tracker statistics, as well as a few more things you can see in the brief linked above. All the data that they collect is reportedly sent in the clear so that users can verify what is being collected on them.

To secure your files, BitTorrent uses SSL and AES-128 encryption to transfer files. In the case of Advanced folders, it generates an X.509 certificate (each folder is given its own certificate) using a certificate authority, and then uses a certificate chain to control user access and file modification permissions, as well as a mechanism to revoke access. In the case of encrypted folders, Sync generates storage and session keys, with the session keys complying with perfect forward secrecy standards such that a future session key being cracked does not compromise past sessions. When using the encrypted folders option (which is useful when using a VPS as an off-site backup, or any machine that you do not fully own and control for that matter), data from your local machines is encrypted with AES-128 before being sent to the remote machine (I wish they had gone with at least AES-256, but it's something). The data is then sent over SSL. Thus, the data on the remote machine is never in an unencrypted state, which is a good thing for a secure off-site backup. The encrypted folder can still be used as part of the mesh to speed up syncing among your machines while remaining secure.
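The key property described above is that encryption happens on the machine you trust before anything leaves it. Here's a minimal sketch of that client-side idea using AES-128-GCM (illustrative only - the key handling and file plumbing are stand-ins, not Sync's actual implementation):

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Illustrative storage key for the folder; Sync derives and manages its own keys.
const storageKey = randomBytes(16);            // 128-bit key, matching AES-128

// Encrypt a file's contents locally before it is handed to the transport layer,
// so the untrusted remote peer only ever stores ciphertext.
function encryptForRemote(plaintext: Buffer) {
  const iv = randomBytes(12);                  // unique nonce per file/chunk
  const cipher = createCipheriv("aes-128-gcm", storageKey, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  return { iv, ciphertext, tag: cipher.getAuthTag() };
}

// Decrypt on another trusted machine that holds the same storage key.
function decryptFromRemote(iv: Buffer, ciphertext: Buffer, tag: Buffer) {
  const decipher = createDecipheriv("aes-128-gcm", storageKey, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]);
}

const blob = encryptForRemote(Buffer.from("vacation-photo.jpg bytes..."));
console.log(decryptFromRemote(blob.iv, blob.ciphertext, blob.tag).toString());
```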

I think the encrypted folders are a good addition to Sync, though the key length could be improved (a weak VPS processor doesn't need to decrypt the data anyway, so the CPU time required by a beefier algorithm should not matter...). In past coverage, users have mentioned issues when syncing folders that they encrypted themselves before adding to Sync, where data could get corrupted when the peers became confused about which changes to sync. Hopefully this new feature will help avoid that, though BitTorrent does still need to fix user-chosen pre-sync encryption. I am still using Sync to back up my photos and sync documents between my laptop and desktop, and it works well for that, without the storage limits imposed by OneDrive (and the uncertainty of my once-promised 25GB of free storage).

What do you think of the changes, and is their security good enough?

Source: BitTorrent