
Remember when competition wasn't a bad word?

Subject: General Tech | October 2, 2015 - 12:41 PM |
Tagged: google, chromecast, AT&T, apple tv, amd, amazon

There is more discouraging news out of AMD: another 5% of its roughly 10,000-strong workforce will be let go by the end of 2016.  The move will hurt the bottom line before the end of this year, with $42 million in severance, benefit payouts and other restructuring costs, but should save around $60-70 million by the end of next year.  This comes on top of the 8% workforce cut earlier this year and shows just how deep AMD needs to cut to stay alive; unfortunately, reducing costs is never as effective as raising revenue.  Before you laugh, point fingers or otherwise disparage AMD, consider for a moment a world in which Intel has absolutely no competition selling high-powered desktop and laptop parts.  Do you really think the already slow product refreshes would speed up, or that prices would remain the same?
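As a back-of-the-envelope check on those restructuring numbers (the $42 million charge and the $60-70 million savings range come from the announcement above; the even-accrual assumption and the function name are my own), the cuts pay for themselves in well under a year:

```python
# Rough break-even for AMD's one-time restructuring charge, assuming
# the projected annual savings accrue evenly month by month.
ONE_TIME_COST_M = 42.0              # $M: severance, benefits, etc.
ANNUAL_SAVINGS_M = (60.0, 70.0)     # $M: projected yearly savings range

def breakeven_months(cost_m: float, annual_savings_m: float) -> float:
    """Months until cumulative monthly savings cover the one-time cost."""
    return cost_m / (annual_savings_m / 12.0)

best = breakeven_months(ONE_TIME_COST_M, ANNUAL_SAVINGS_M[1])
worst = breakeven_months(ONE_TIME_COST_M, ANNUAL_SAVINGS_M[0])
print(f"break-even in roughly {best:.1f} to {worst:.1f} months")
```

At $60-70 million per year, the $42 million charge is recouped in roughly seven to eight and a half months, which is presumably why AMD considers the hit to this year's bottom line worth taking.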

Consider the case of AT&T, which has claimed numerous times that it provides the best broadband service it is capable of at the lowest price it can sustain.  Yet if you live in a city blessed with Google Fiber, AT&T somehow manages to charge $40/month less than it does in a city whose only "competition" is Comcast or Time Warner Cable.  Interesting how the presence of Google in a market has an effect that the other two supposed competitors do not.

There is, of course, another way to deal with competition, and both Amazon and Apple have it down pat.  Apple removed the iFixit app, which showed you the insides of your phone and had the temerity to suggest ways to fix hardware issues yourself.  Today Amazon started kicking both Apple TV and Chromecast devices off of its online store: no new listings can be added as of today, and on the 29th of this month anything unsold will disappear.  Apparently not enough people are choosing Amazon's Prime Video streaming, and so instead of making the service compatible with Apple's and Google's products, Amazon has opted to prevent, or at least hinder, the sale of those products.

The topics of competition, liquidity and other market forces are far too complex to deal with in a short post such as this, but it is worth asking yourself: do you, as a customer, feel like competition is still working in your favour?

The Hand


"AMD has unveiled a belt-tightening plan that the struggling chipmaker hopes will get its finances back on track to profitability."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Android to iPhone Day 3: Widgets and Live Photos

Subject: Editorial, Mobile | September 28, 2015 - 09:57 AM |
Tagged: iphone 6s, iphone, ios, google, apple, Android

PC Perspective’s Android to iPhone series explores the opinions, views and experiences of the site’s Editor in Chief, Ryan Shrout, as he moves from the Android smartphone ecosystem to the world of the iPhone and iOS. Having been entrenched in the Android smartphone market for 7+ years, the editorial series is less a review of the new iPhone 6s than an exploration of how the current smartphone market compares to each side’s expectations.

Full Story Listing:


Day 1

Opening and setting up a new iPhone is still an impressive experience. The unboxing process makes it feel like you are taking part in the reveal of a product worth its cost, and the included accessories are organized and presented well. Having never used an iPhone 6 or iPhone 6 Plus beyond the cursory “let me hold that”, it was immediately obvious to me that the iPhone's build quality exceeded that of any recent Android-based smartphone I have used, including the new OnePlus 2, LG G4 and Droid Turbo. The rounded edges sparked some debate in terms of aesthetics, but they definitely make the phone FEEL slimmer than other smartphone options. The buttons were firm and responsive, though there is more noise in the click of the home button than I expected.

The setup process for the phone was pretty painless, but Ken, our production editor who has been an iPhone user every generation, did comment that the number of steps you have to go through to get to a working phone has increased quite a bit. Set up Siri, set up Touch ID, set up Wi-Fi, have you heard about iCloud? The list goes on. I did attempt to use the “Move to iOS” application from the Android Play Store on my Droid Turbo, but I was never able to get it to work – the devices kept complaining about a disconnection of some sort in their peer-to-peer network, and after about 8 tries I gave up. I’m hoping to try it again with the incoming iPhone 6s Plus next week to see if it was a temporary issue.


After getting to the iPhone 6s home screen I spent the better part of the next hour doing something I do every time I get a new phone: installing apps. The process is painful – go to the App Store, search for the program, download it, open it, log in (and try to remember the login information), repeat. With the Android Play Store I appreciate the ability to “push” application downloads to a phone from the desktop website, making it much faster to search for and acquire all the software you need. Apple would definitely benefit from some version of this that doesn’t require installing iTunes.

I am a LastPass user, and one of the first changes I had to get used to was how differently that software works on Android and iOS. On my Droid Turbo I could give LastPass deeper system-level access than iOS allows, so when using a third-party app like Twitter, LastPass could insert itself into the process and automatically fill in the username and/or password for the website or service. With the iPhone you don’t have that ability, and there was a lot of password copying and pasting to get everything set up. This is an area where the openness of the Android platform can benefit users.

That being said, the benefits of Touch ID were immediately apparent. After going through the setup process, using my fingerprint in place of my 15+ digit Apple ID password is a huge benefit and time saver. Every time I download a new app from the App Store and simply place my thumb on the home button, I grin, knowing this is how it should be for all passwords, everywhere. I was even able to set up my primary LastPass password to utilize Touch ID, removing one of the biggest annoyances of using the password-keeping software on Android. Logging into the phone with your finger or thumb print rather than a pattern or PIN is great too. And though I know new phones like the OnePlus 2 use a fingerprint reader for this purpose, the implementation just isn’t as smooth.

My final step before leaving the office and heading for home was to download my favorite podcasts and get them set up on the phone for the drive. Rather than use the Apple Podcasts app, it was recommended that I try out Overcast, which has been solid so far. I set up the Giant Bombcast, My Brother, My Brother and Me and a couple of others, let them download on Wi-Fi and set out for home. Pairing the iPhone 6s with my Chevy Volt was as easy as with any other phone, but I did notice that the Bluetooth-based information passed to the entertainment system (icons, current time stamps, etc.) was more accurate with the iPhone 6s than with my Droid Turbo (starting times and time remaining worked when they previously did not). That could be a result of the podcast application itself (I used doubleTwist on Android).

Day 2

On Saturday, with a bit more free time to set up the phone and install the applications I had previously forgotten, I started to miss a couple of Android features. First, the lack of widgets on the iPhone home screens means the mass of icons on the iPhone 6s is much less useful than the customized screens I had on my Droid Turbo. With my Droid I had a page dedicated to social media widgets I could scroll through without opening any specific application. Another page showed my current to-do list from Google Keep and my next 15 items from Google Calendar, all at a glance.


I know that the drag-down menu at the top of iOS, with its Today and Notifications tabs, is supposed to offer some of that functionality, but apps like Google Keep and Twitter don’t take advantage of it. And though it is cliché at this point, why in the hell doesn’t the Apple Weather application icon show the current temperature and weather status yet??

The second item I miss is the dedicated “back” button that Android devices have, which behaves universally across the entire system. Always knowing that you can move to the previous screen, or jump from the current app back to the home screen or the program you just switched from, is a great safety net that is missing in iOS. With only a single “always there” button on the phone, some software puts back-button functionality in the top left-hand corner and other software offers an X or Close button somewhere else. I found myself constantly hunting around each new app on the iPhone 6s for the way back to a previous screen, and I would sometimes hit the home button out of habit, which obviously doesn’t have the intended effect. Swiping from the left edge of the screen toward the middle works in some applications, but not all.

Also, though my Droid Turbo was about the same size as the iPhone 6s, the screen size makes it hard to reach the top of the display when using only one hand. Android's back button along the bottom of the phone meant it was always within reach. iOS apps that put the return function in the top left of the screen make it much harder, often risking a dropped phone as you reposition it in your hand. And double tapping (not clicking) the home button and THEN reaching for the back button in any particular app just takes too long.

On Saturday I went camping with my family at an early Halloween event we attend annually. This made for a great chance to test out the iPhone 6s camera, and without a doubt, it is the best phone camera I have used. The images were clear, the shutter speed was fast, and the ability to take high-frame-rate or 4K video is a nice touch. Enough people have demonstrated the advantages of the iPhone camera system over almost anything else on the smartphone market, and as a user of seemingly slow and laggy Android-based phone cameras, the move to the iPhone 6s is a noticeable change. As a parent of a 3-month-old baby girl, these photos are becoming ever more important to me.


The new Live Photos feature, where essentially a few frames before and a few frames after the picture you actually took are captured (with audio included), is pretty much a gimmick but the effect is definitely eye-catching. When flipping through the camera roll you actually see a little bit of movement (someone’s face for example) which caused me to raise an eyebrow at first. It’s an interesting idea, but I’m not sure what use they will have off of the phone itself – will I be able to “play” these types of photos on my PC? Will I be able to share them to other phone users that don’t have the iPhone 6s?

Day 3

Most of Sunday was spent watching football and using the iPhone 6s to monitor fantasy football and to watch football through our Wi-Fi network when I needed to leave the room for laundry. The phone was able to keep up, as you would expect, with these mostly lightweight tasks without issue. Switching between applications was quick and responsive, and despite the disadvantage that the iPhone 6s has over many Android flagship phones in terms of system memory, I never felt like the system was penalized for it.

Browsing the web through either Safari or Google Chrome did demonstrate a standard complaint about iOS – pages reload when you come back into the browser application, even if you never navigated away from them. With Android you are able to load up a webpage and then just…leave it there for reference later. The iPhone 6s, even with the added memory this model ships with, will reload a page after some amount of time away from the browser app because the operating system has decided it needs that memory for another purpose.


I haven’t had a battery life crisis with the iPhone yet, but I am worried about the lack of Quick Charging or Turbo Charging support on the iPhone 6s. This was a feature I definitely fell in love with on the Droid Turbo, especially when travelling for work or going on extended outings without access to power. I’ll have to monitor how this issue does or does not pop its head up.

Speaking of power and battery life – so far I have been impressed with how the iPhone 6s has performed. As I write this editorial up at 9:30pm on Sunday night, the battery level sits at 22%. Considering I have been using the phone for frequent speed tests (6 of them today) and just general purpose performance and usability testing, I consider this a good result. I only took one 5 minute phone call but texting and picture taking was plentiful. Again, this is another area where this long-term test is going to tell the real story, but for my first impressions the thinness of the iPhone 6s hasn’t created an instant penalty for battery life.


The journey is still beginning – tomorrow is my first full work day with the iPhone 6s and I have the final installment of my summer evening golf league. Will the iPhone 6s act as my golf GPS like my Droid Turbo did? Will it make it through the full day without having to resort to car charging or using an external battery? What other features and capabilities will I love or hate in this transition? More soon!

NVIDIA Announces GeForce NOW Streaming Gaming Service

Subject: General Tech | September 30, 2015 - 09:00 AM |

In a continued evolution of the streaming gaming product previously known as GRID, NVIDIA is taking the wraps off of the final, consumer-ready version of the technology now called GeForce NOW. This streaming gaming service brings games from the cloud to NVIDIA SHIELD devices at up to 1920x1080 resolution and 60 FPS for fluid gameplay. This has been an announcement that we have been expecting for a very long time, with NVIDIA teasing GeForce NOW in the form of GRID private and public betas.


GeForce NOW, which shares a similar goal to services like PlayStation Now and OnLive, plans to stand out through a few key points.

  1. 1080p 60 FPS Support – Supporting higher resolutions and frame rates than any other service, GeForce NOW could deliver a better result than anything else on the market for streaming gaming.
  2. Affordability – At $7.99 per month, NVIDIA believes the combination of included free games and purchase-and-play titles makes a great package for a minimal monthly cost.
  3. Speed of Access – NVIDIA claims that GeForce NOW can start up new games as much as 2x faster than PlayStation Now, with titles like The Witcher 3 loading up and streaming in as little as 30 seconds.
  4. Global – GeForce NOW will be available in North America, the European Union, Western Europe, Western Russia, and Japan.


Before we talk about the games list, let’s first discuss some of the technical requirements for GeForce NOW. The first, and most important, requirement is a SHIELD device. GeForce NOW will only work with the SHIELD Android TV device or SHIELD Tablet. That will definitely limit the audience for the streaming service, and I am very curious if and when NVIDIA will decide to open this technology and capability to general PC users or other Android/Apple devices. Part of the SHIELD requirement is definitely to promote its own brand, but it might also help gate access to GeForce NOW as the technology ramps up in capacity, etc.

Other than the host device, you’ll also need a speedy broadband connection. The bare minimum is 12 Mbps, though you will need 20 Mbps of downstream for 720p60 and 50 Mbps for 1080p60. In terms of latency, a ping of 60 ms or less to the nearest NVIDIA server location is required, and 40 ms or less is recommended for the best experience.
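Summarized as a sketch, the quality tier a given connection can sustain follows directly from those numbers (the 12/20/50 Mbps and 60 ms figures are NVIDIA's quoted requirements above; the function name, tier labels and tiering logic are my own illustration):

```python
def geforce_now_tier(downstream_mbps: float, ping_ms: float) -> str:
    """Pick the best stream quality a connection supports, using the
    minimums quoted above: 12 Mbps to play at all, 20 Mbps for 720p60,
    50 Mbps for 1080p60, and a ping of 60 ms or better throughout."""
    if ping_ms > 60 or downstream_mbps < 12:
        return "unsupported"
    if downstream_mbps >= 50:
        return "1080p60"
    if downstream_mbps >= 20:
        return "720p60"
    return "baseline"

print(geforce_now_tier(55, 35))   # comfortably within the 1080p60 minimums
print(geforce_now_tier(25, 58))   # enough for 720p60 but not 1080p60
print(geforce_now_tier(100, 75))  # fast pipe, but latency disqualifies it
```

Note how latency is a hard gate here: no amount of downstream bandwidth compensates for a ping above the 60 ms requirement.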

All the GeForce NOW servers are based on NVIDIA Kepler GPUs which is what enables the platform to offer up impressive resolutions and image quality settings for a streaming service. Bandwidth and latency are still a concern, of course, but we’ll touch on that aspect of the service when we have more time with it this week or the next.


Finally, let’s talk about the game library. The included library holds roughly 60 games, playable without limit as part of the $7.99 monthly membership, and NVIDIA says more games will be added as the service continues.

Continue reading our overview of the new NVIDIA GeForce NOW game streaming service!!

Source: NVIDIA

'Learn to trust us, because we're not about to stop.'

Subject: Editorial, General Tech | September 29, 2015 - 03:30 PM |
Tagged: trust, security, rant, microsoft, metadata, fud

Privacy of any nature on an internet-connected device is quickly becoming a joke, and not a very funny one. To name just a few examples: Apple tracks your devices, Google scans every email you send, Lenovo actually ships two programs to track your usage, and of course there is Windows 10 and the data it collects and sends.  Thankfully, in some of these cases the programs which track and send your data can be disabled, but the fact of the matter is that they are turned on by default.

The Inquirer hits the nail on the head: "Money is simply a by-product of data" – a fact which online companies such as Amazon and Facebook have known for a while, and which software and hardware providers are now figuring out.  In some cases an informed choice to share personal data is made, but this is not always true. When you share to Facebook or post your Fitbit results to the web, you should be aware that you are giving companies valuable data; the real question is about the data and metadata you share without knowing it.


Should you receive compensation for the data you provide to these companies?  Should you always be able to opt out of sharing and still retain use of a particular service?  Perhaps the cost of using that service is your data instead of your money?  There are a lot of questions, and a lot of different uses for this data, but there is certainly no single answer.

Microsoft has been collecting data from BSoDs for decades, and Windows users have all benefited from it even though there is no opting out of sending that data.  On the other hand, is a debt incurred toward Lenovo or other companies when you purchase a machine from them?  Does the collection of usage patterns benefit Lenovo users in a similar way to the data generated by a Windows BSoD, or does the risk of this monitoring software being subverted by others for nefarious purposes outweigh any possible benefit?


Of course this is only the beginning: the Internet of Things is poised to become a nightmare for those who value their security, and there are numerous exploits for tracking your cellphone that have nothing to do with your provider – and that really is only the tip of the iceberg.  Just read through the Security tag here on PCPer for more examples, if you have a strong stomach.

Please, take some time to think about how much you value your privacy and what data you are willing to share in exchange for products and services.  Integrate that concern into your purchasing decisions, social media and internet usage.  Hashtags are nice, but nothing speaks as loudly as your money; never forget that.

"MICROSOFT HAS SPOKEN out about its oft-criticised privacy policies, particularly those in the newly released Windows 10, which have provoked a spike in Bacofoil sales over its data collection policies."

Here is some more Tech News from around the web:

Tech Talk


Source: The Register

Apple Dual Sources A9 SOCs with TSMC and Samsung: Some Extra Thoughts

Subject: Processors | September 30, 2015 - 09:55 PM |
Tagged: TSMC, Samsung, FinFET, apple, A9, 16 nm, 14 nm

So the other day the nice folks over at Chipworks got word that Apple is in fact sourcing its A9 SoC from both TSMC and Samsung.  This is really interesting news on multiple fronts.  From the information gleaned, the two parts are the APL0898 (Samsung-fabbed) and the APL1022 (TSMC-fabbed).

These process technologies have been in the news quite a bit.  As we well know, it has been hard for any foundry not named Intel to move below 28 nm effectively.  Even Intel has had some pretty hefty issues with its march to sub-32 nm parts, but it has the resources and financial ability to push through those hurdles.  One of the bigger problems for the foundries was the decision to push FinFETs back beyond their initial plans: the idea was to hit 22/20 nm with planar transistors and defer FinFET technology to 16/14 nm.


The Chipworks graphic that explains the differences between Samsung's and TSMC's A9 products.

There were many reasons why this did not work effectively for the majority of products the foundries were looking to service with a 22/20 nm planar process.  Yes, many parts were fabricated on these nodes, but none of them were the higher-power, higher-performance parts that typically garner headlines.  No CPUs, no GPUs, and only a handful of lower-power SoCs (most notably Apple's A8, which was around 89 mm² and consumed up to 5 to 10 watts at maximum).  The node just did not scale power effectively: it provided a smaller die size, but it did not significantly improve power efficiency or switching performance compared to 28 nm high-performance nodes.

The information Chipworks has provided also verifies that Samsung's 14 nm FF process is more size-optimized than TSMC's 16 nm FF.  There was originally some talk of both nodes being very similar in overall transistor size and density, but Samsung has a slightly tighter design.  Neither of them is smaller than Intel's latest 14 nm process, which is now in its second-generation form; Intel still has a significant performance and size advantage over everyone else in the field.  Going back to size, the Samsung chip is around 96 mm² while the TSMC chip is 104.5 mm².  That is not a huge difference, but it shows that the Samsung process is a little tighter and can squeeze more transistors into each square millimetre than TSMC's.
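In relative terms the gap is modest; a quick calculation with Chipworks' die areas from above (the variable names are mine) puts it at just under nine percent:

```python
# Die areas reported by Chipworks for the two A9 variants.
samsung_a9_mm2 = 96.0     # APL0898, Samsung 14 nm FinFET
tsmc_a9_mm2 = 104.5       # APL1022, TSMC 16 nm FinFET

pct_larger = (tsmc_a9_mm2 - samsung_a9_mm2) / samsung_a9_mm2 * 100.0
print(f"The TSMC die is {pct_larger:.1f}% larger than the Samsung die")
```

A single-digit-percentage area difference matters for cost per wafer, but it is small enough that it says little on its own about power or clocks.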

In terms of actual power consumption and clock scaling we have nothing to go on yet.  Both chips ship in the iPhone 6s and 6s Plus, and testing so far has not shown significant differences between the two SoCs.  In theory one could perform better than the other, but we have not tested these chips at a low enough level to discern any major performance or power difference.  My gut feeling is that Samsung's process is more mature and running slightly better than TSMC's, but the differences are going to be minimal at best.

The next piece of information we can glean is that there simply isn't enough line space for all the chip companies that want to fabricate parts at Samsung or TSMC.  From a chip standpoint, a lot of work must be done to port a design to two different process nodes.  While 14 nm and 16 nm are similar in overall size and both use FinFETs, the standard cells and design libraries at Samsung and TSMC are very different.  It is not a simple thing to port a design; a lot of work has to be done at the design stage to make a chip work on both nodes.  I can tell you there is no way the two chips are identical in layout.  This is not a "dumb port" where they simply adjust the optics, reuse the same masks and magically get working chips right off the bat: each fab needs its own mask sets, each design needs separate verification, and troubleshooting yields through metal-layer changes will differ between the two manufacturers.

In the end, this means there simply was not enough capacity at either TSMC or Samsung to handle the demand Apple was expecting.  Because Apple has deep pockets, it contracted both TSMC and Samsung to produce two very similar, but still different, parts.  Apple also likely outbid other major chip firms and locked down much of the wafer capacity Samsung and TSMC have, much to those firms' dismay.  I have no idea what is going on in the background with the likes of NVIDIA and AMD when it comes to line space for their next-generation parts; at least for AMD, it seems that its partnership with GLOBALFOUNDRIES and that firm's version of 14 nm FF is having a hard time taking off.  Eventually more production capacity will come online, and yields and bins will improve.  Apple will stop taking up so much space, and other products can start rolling off the line.  In the meantime, enjoy that cutting-edge iPhone 6s/Plus with the latest 14/16 nm FF chips.

Source: Chipworks

Android to iPhone Day 6: Battery Life and Home Screens

Subject: General Tech, Mobile | October 1, 2015 - 02:45 PM |
Tagged: iphone 6s, iphone, ios, google, apple, Android

PC Perspective’s Android to iPhone series explores the opinions, views and experiences of the site’s Editor in Chief, Ryan Shrout, as he moves from the Android smartphone ecosystem to the world of the iPhone and iOS. Having been entrenched in the Android smartphone market for 7+ years, the editorial series is less a review of the new iPhone 6s than an exploration of how the current smartphone market compares to each side’s expectations.

Full Story Listing:


Day 4

It probably won’t come as a shock to the millions of iPhone users around the globe, but the more days I keep the 6s in my pocket, the more accepting I am becoming of the platform. The phone has been fast and reliable – I have yet to come across any instability or application crashes despite my incessant installation of new apps. And while I think it’s fair to say that even new Android-based phones feel snappy out of the box, the iPhone is about a week in without me ever thinking about performance – which is exactly what you want from a device like this.

There are some quirks, and features missing from the iPhone 6s that I had on my Droid Turbo, that I wish I could add through settings or third-party applications. I fell in love with the double wrist rotation on the Droid as a shortcut to opening the camera. It helped me capture quite a few photos when I only had one hand free, without having to unlock the phone, find an icon, etc. The best the iPhone offers is a “drag up from the bottom” motion on the lock screen, but I find myself taking several thumb swipes before successfully activating it one-handed. Trying to use the home button to access the lock screen, and thus the camera shortcut, is actually hindered because Touch ID is TOO FAST, taking me straight to a home screen (which may not have the camera app icon on it) where I then need to navigate around.

I have been a user of the Pebble Time since it was released earlier this year and I really enjoy the extended battery life (measured in days not hours) when compared to Android Wear devices or the Apple Watch. However, the capabilities of the Pebble Time are more limited with the iPhone 6s than they are with Android – I can no longer use voice dictation to reply to text messages or emails and the ability to reply with easy templates (yes, no, I’ll be there soon, etc.) is no longer available. Apple does not allow the same level of access to the necessary APIs as Android does and thus my Time has effectively become a read-only device.


Finally, my concern about missing widgets continues to stir within me; it is something that I think the iPhone 6s could benefit from greatly. I also don’t understand the inability to arrange the icons on the home screens in an arbitrary fashion. Apple will not let me move icons to the bottom of the page without first filling up every other spot on the screen – there can be no empty spaces!! So while my organizational style would like to have a group of three icons in the bottom right hand corner of the screen with some empty space around it, Apple doesn’t allow me to do that. If I want those icons in that location I need to fill up every empty space on the screen to do so. Very odd.

Continue reading my latest update on my Android to iPhone journey!!

Google's Pixel C Is A Powerful Convertible Tablet Running Android 6.0

Subject: Mobile | October 2, 2015 - 04:09 PM |
Tagged: Tegra X1, tablet, pixel, nvidia, google, android 6.0, Android

During its latest keynote event, Google unveiled the Pixel C, a powerful tablet with optional keyboard that uses NVIDIA’s Tegra X1 SoC and runs the Android 6.0 “Marshmallow” operating system.

The Pixel C was designed by the team behind the Chromebook Pixel. Pixel C features an anodized aluminum body that looks (and reportedly feels) smooth with clean lines and rounded corners. The tablet itself is 7mm thick and weighs approximately one pound. The front of the Pixel C is dominated by a 10.2” display with a resolution of 2560 x 1800 (308 PPI, 500 nits brightness), wide sRGB color gamut, and 1:√2 aspect ratio (which Google likened to the size and aspect ratio of an A4 sheet of paper). A 2MP front camera sits above the display while four microphones sit along the bottom edge and a single USB Type-C port and two stereo speakers sit on the sides of the tablet. Around back, there is an 8MP rear camera and a bar of LED lights that will light up to indicate the battery charge level after double tapping it.
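That 1:√2 aspect ratio is the same self-similar shape as A-series paper: halve the long side and each half has the same proportions, just rotated 90 degrees. A quick sketch with the Pixel C's numbers (the resolution is from the spec above; the halving demo and variable names are my own illustration, and the match is approximate because 2560 x 1800 is only close to a true 1:√2):

```python
import math

# Pixel C panel: 2560 x 1800, which Google pitches as a 1:sqrt(2) shape.
w, h = 2560, 1800
full_ratio = w / h                 # ~1.422 vs. sqrt(2) ~ 1.414

# Halve the long side and rotate: each 1280 x 1800 pane, viewed with its
# long side horizontal, has nearly the same shape as the full panel.
half_ratio = h / (w / 2)           # 1800 / 1280

print(f"full panel:  {full_ratio:.3f}")
print(f"half panel:  {half_ratio:.3f}")
print(f"sqrt(2):     {math.sqrt(2):.3f}")
```

An exact 1:√2 panel would split perfectly; 2560 x 1800 is close enough that each half still looks like a miniature of the whole, which is what makes splitting the screen into same-shaped panes practical.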


The keyboard is an important part of the Pixel C, and Google has given it special attention to make it part of the package. It attaches to the tablet using self-aligning magnets that are powerful enough to keep the display attached while you hold it upside down and shake it (not that you'd want to do that, mind you). It can be attached to the bottom of the tablet for storage so the device can be used like a slate, or you can attach the tablet to the back of the keyboard and lift the built-in hinge to use the Pixel C in laptop mode (the hinge holds the display anywhere from 100 to 135 degrees). The internal keyboard battery is good for two months of use and recharges inductively from the tablet when you simply close the Pixel C like a laptop. The keyboard is around 2mm thick and nearly full size, with an 18.85mm key pitch, and the chiclet keys have 1.4mm of travel similar to that of the Chromebook Pixel. There is no trackpad, but it does offer a padded palm rest, which is nice to see.


Internally, the Pixel C is powered by the NVIDIA Tegra X1 SoC, 3GB of RAM, and 32GB or 64GB of storage (depending on model). The 20nm Tegra X1 consists of four ARM Cortex A57 and four Cortex A53 CPU cores paired with a 256-core Maxwell GPU. The Pixel C is a major design win for NVIDIA, and the built in GPU will be great for gaming on the go.

The Pixel C will be available in December ("in time for the holidays") for $499 for the base 32 GB model, $599 for the 64 GB model, and $149 for the keyboard.

First impressions, such as this hands-on by Engadget, seem to be very positive, describing sturdy yet sleek hardware that is comfortable to type on. While the hardware looks more than up to the task, the choice of operating system is a concern for me. Android is not the most productivity- and multitasking-friendly software. Some versions of Android enable multiple windows or side-by-side apps, but it has always felt rather clunky and limited in its usefulness. With that said, Computerworld's JR Raphael seems hopeful. He points out that the Pixel C is, in Batman fashion, not the hardware Android wants but the hardware that Android needs (to move forward), and that it is primed for a future version of Android friendlier to such productive endeavors. Development versions of Android 6.0 included support for multiple apps running simultaneously side by side, and while that feature will not make the initial production code cut, it does show that Google is looking into pursuing it and possibly enabling it at some point. The Pixel C's aspect ratio is well suited to such app splitting, with the ability to display four windows that each keep the same aspect ratio.

I am not sure how well received the Pixel C will be by business users who have several convertible tablet options running Windows and Chrome OS. It certainly gives the iPad-and-keyboard combination a run for its money and is a premium alternative to devices like the Asus Transformers.

What do you think about the Pixel C, and in particular, it running Android?

Even if I end up being less-than-productive using it, I think I'd still want the sleek-looking hardware as a second machine, heh.

Source: Google
Subject: General Tech
Manufacturer: Supermicro

Choosing the Right Platform

Despite what most people may think, our personal workstations here at the PC Perspective offices aren't exactly built from cutting-edge hardware. Just as in every other production environment, we place a premium on stability for the machines that we write, photo edit, and, in this case, video edit on.

The current video editing workstation for the PC Perspective offices is quite old when you consider the generations upon generations of hardware we have reviewed in the years since it was built. In fact, it has hardly been touched since early 2011. Built around the then-$1000 Intel Core i7-990X, 24GB of DDR3, a Fermi-based NVIDIA Quadro 5000, and a single 240GB SandForce 2-based SSD, this machine has edited a lot of 1080p video for us with few problems.


However, after we started to explore the Panasonic GH4 and 4K video a few months ago, the age of this machine became quite apparent. Real-time playback of high-bitrate 4K content was choppy at best, and scrubbing through the timeline next to impossible. Transcoding to a lower-resolution mezzanine file, or turning down the playback quality in Premiere Pro, worked to some extent, but undercut the visual quality gains we were after. It was clear that we were going to need a new workstation sooner rather than later.

The main question was what platform to build upon. My initial thought was to build around the 8-core Intel Core i7-5960X and the X99 platform. The main application we use, Adobe Premiere Pro (and its associated Media Encoder app), is heavily multithreaded. But going from 6 cores with the i7-990X to 8 cores with the i7-5960X, with only a modest improvement in IPC, didn't seem like a big enough gain, nor very future proof.


Luckily, we had a pair of Xeon E5-2680 v2s around from another testbed that had been replaced. These processors each provide 10 cores (20 threads with Hyper-Threading enabled) at a base frequency of 2.8GHz, with the ability to boost up to 3.6GHz. By going with two of these processors in a dual-CPU configuration, we significantly increase our compute power and hopefully gain some degree of future proofing. Plus, we already use the slightly higher-clocked Xeon E5-2690 v2s in our streaming server, so we have experience with a very similar setup.
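A back-of-the-envelope comparison makes the jump concrete. Multiplying physical cores by base clock is a crude throughput proxy (it ignores IPC, turbo behavior, and memory bandwidth, so treat it as a sketch, not a benchmark):

```python
# Rough aggregate-throughput proxy: physical cores x base clock (GHz).
# Ignores IPC, turbo, and memory bandwidth differences entirely.
configs = {
    "Core i7-990X (6C @ 3.46 GHz)":         6 * 3.46,
    "Core i7-5960X (8C @ 3.0 GHz)":         8 * 3.0,
    "2x Xeon E5-2680 v2 (20C @ 2.8 GHz)":  20 * 2.8,
}
for name, score in configs.items():
    print(f"{name}: {score:.1f} core-GHz")
```

By this measure the dual-Xeon setup offers well over twice the aggregate compute of either single-socket option, which is the gap that matters for a heavily threaded encoder.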

Continue reading an overview of our 2015 Editing Workstation Upgrade!!

Logitech G Announces G410 Atlas Spectrum TKL Mechanical Gaming Keyboard

Subject: General Tech | September 29, 2015 - 04:00 AM |
Tagged: TKL, tenkeyless, logitech g, logitech, g410, atlas spectrum

Logitech continues to release new products aimed at the PC gaming market, following up the announcement of the G633 and G933 headsets with a new gaming keyboard, the G410 Atlas Spectrum. Using Logitech's exclusive Romer-G mechanical switches, it apparently will have 25% faster actuation than "standard" mechanical keyboards as well as improved durability.

The most unique part of the G410 Atlas Spectrum is that it is a TKL (tenkeyless) design, removing the number pad to shorten the length of the keyboard. Many gamers in today's market covet TKL designs for their form factor as well as their weight and portability. During a live stream, Logitech G's Chris Pate hinted that many gamers had been requesting a tenkeyless keyboard and to look forward to future releases. The Atlas Spectrum is the result of exactly that kind of feedback to Logitech!


For those technical keyboard fans that want a bit more information, Logitech G provided details for us:

  • The Logitech G410 Atlas Spectrum features exclusive Romer-G mechanical switches that register your key presses up to 25 percent faster than competing mechanical switches. With an actuation point of 1.5 mm, Romer-G switches receive commands more quickly, giving you an edge in competitive games where every millisecond matters. With improved durability at 70 million keystrokes, up to 40 percent longer than others on the market, you can play with confidence knowing that your keyboard can survive.
  • With all the vital keys for gaming, the Logitech G410 Atlas Spectrum can be easily carried to LAN events or a friend’s house, and fits into smaller gaming spaces. Without the number pad or macro keys, you get extra space to make wide motions with your mouse. Plus, the compact design brings your hands closer together for improved comfort, which is particularly important for low-DPI gamers.
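The "25 percent faster" actuation figure lines up with simple travel arithmetic if you assume competing switches actuate at the common 2.0mm point; that baseline is my assumption, as Logitech's copy doesn't name it:

```python
# Romer-G switches actuate at 1.5 mm; typical mechanical switches
# actuate at ~2.0 mm (an assumed baseline for this comparison).
romer_g_mm = 1.5
baseline_mm = 2.0

# Shorter travel to the actuation point => the key registers sooner,
# in proportion to the travel saved.
improvement = (baseline_mm - romer_g_mm) / baseline_mm
print(f"{improvement:.0%} shorter travel to actuation")  # 25%
```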

And let's not forget that, as the Spectrum name implies, the G410 has full RGB backlighting that can be configured using the Logitech Gaming Software package. You can customize each key to the full palette of 16.8 million colors and even synchronize lighting patterns across Logitech mice and headphones.

The keycaps on the G410 are not cupped and formed in the same way that they are with the G910 Orion Spark - those keys have a bevel on them that I liked for gaming but wasn't ideal for typing out emails and articles. The G410 uses standard molded keycaps that all users should be comfortable with.

Finally, the G410 includes an Arx Control dock, a phone and tablet dock that you can remove from the keyboard and place anywhere on your desk. You can use it simply for convenience, or you can install the Logitech iOS and Android apps to display in-game information or system statistics including CPU utilization and more. This differs from the integration on the larger G910 keyboard, which has a fixed-location Arx Control dock.

The G410 Atlas Spectrum will be available in early October in the US and Europe with a starting MSRP of $129.99. In a market where pricing for high-end keyboards has exploded, that price is very competitive and should help the G410 find its way into many PC gamers' hands.

I currently am typing up this news post on a sample of the G410 Atlas Spectrum, so expect more coverage of this mini but powerful keyboard in the near future!!

Read on for the full press release after the break!!

$700 for 2TB of SSD goodness

Subject: Storage | September 29, 2015 - 07:07 PM |
Tagged: tlc, ssd, Samsung 850 EVO 2 TB, 850 EVO, 2TB

That's right, currently $713 will pick you up a 2TB Samsung 850 EVO SSD, but how does it perform?  The Tech Report is on the case with their latest review, checking out how 32-layer 128Gbit 3D V-NAND with 2GB of DRAM cache and an upgraded Samsung MHX controller performs.  It took some doing, but once they had filled its over-provisioned area the drive levelled out at 7,252 IOPS on the random write test, though the peak of 84,423 was certainly impressive.  Check out the full review to see if this is the large-sized SSD for you, or if you prefer smaller, more agile SSDs which do not use TLC NAND.

If you are like me and running out of mental storage space, you may have already forgotten about Al's review of this drive.


"Samsung now offers its popular and affordable 850 EVO SSD in an enormous 2TB configuration. We put the EVO to the test to see how this behemoth performs"

Here are some more Storage reviews from around the web:


Samsung 840 EVO mSATA Gets Long Awaited EXT43B6Q Firmware, Fixes Read Speed Issue

Subject: Storage | October 1, 2015 - 05:42 PM |
Tagged: Samsung, firmware, 840 evo, msata

It took them a while to get it right, but Samsung did manage to fix the read degradation issue in many of their TLC-equipped 840 Series SSDs. I say many because some models were left out when firmware EXT0DB6Q was rolled out via Magician 4.6. The big exception was the mSATA variant of the 840 EVO, which is essentially the same SSD in a more compact form. The omission was rather confusing, as the previous update had applied to both the 2.5" and mSATA form factors simultaneously.


The Magician 4.7 release notes included a bullet for Advanced Performance Optimization support on the 840 EVO mSATA model, but it took Samsung some time to push out the firmware update that actually enabled it. We know from our previous testing that the Advanced Performance Optimization feature came alongside other changes that enabled full-speed reads of 'stale' data, compensating for the natural drift of the flash cell voltages that represent the stored data.


Now that the firmware has been made available (it came out early this week but was initially throttled), I was able to apply it to our 840 EVO 1TB mSATA sample without issue, and could perform the Advanced Performance Optimization and observe the expected effects. However, my sample had recently been used for other testing and did not contain data old enough to show a solid improvement with the firmware applied *and before* running the Optimization. Luckily, an Overclock.net forum member was able to perform just that test on his 840 EVO 500GB mSATA model:


Kudos to that member for being keen enough to re-run his test just after the update.


It looks like the only consumer 840 TLC model left to fix is the original 840 SSD (not the 840 EVO, just the 840). This was the initial model, launched with pure TLC flash and no SLC TurboWrite cache capability. We hope to see it patched in the near future. There were also some enterprise units that used the same planar 19nm TLC flash, but I fear Samsung may not update those, as most workloads seen by those drives constantly refresh the flash, never giving it a chance to become stale and suffer from slowing read speeds. The newer and faster V-NAND equipped models (850 / 950 Series) have never been susceptible to this issue.

Source: Samsung
Subject: General Tech
Manufacturer: NVIDIA

Setup, Game Selection

Yesterday NVIDIA officially announced the new GeForce NOW streaming game service, the conclusion of the years-long beta and development process known as NVIDIA GRID. As I detailed in my story about the reveal yesterday, GeForce NOW is a $7.99/mo. subscription service offering on-demand, cloud-streamed games to NVIDIA SHIELD devices, including a library of 60 games for that monthly fee in addition to 7 titles in the “purchase and play” category. NVIDIA claims several advantages that make GeForce NOW a step above other streaming gaming services such as PlayStation Now and OnLive, including load times, resolution and frame rate, combined local PC and streaming game support, and more.


I have been able to use and play with the GeForce NOW service on our SHIELD Android TV device in the office for the last few days and I thought I would quickly go over my initial thoughts and impressions up to this point.

Setup and Availability

If you have an NVIDIA SHIELD Android TV (or a SHIELD Tablet) then the setup and getting started process couldn’t be any simpler for new users. An OS update is pushed that changes the GRID application on your home screen to GeForce NOW and you can sign in using your existing Google account on your Android device, making payment and subscription simple to manage. Once inside the application you can easily browse through the included streaming games or look through the smaller list of purchasable games and buy them if you so choose.


Playing a game is as simple as selecting a title from the grid and hitting play.

Game Selection

Let’s talk about that game selection first. For $7.99/mo. you get access to 60 titles for unlimited streaming. I have included a full list below, originally posted in our story yesterday, for reference.

Continue reading my initial thoughts and an early review of GeForce NOW!!

The fast and the Fury(ous): 4K

Subject: Graphics Cards | September 28, 2015 - 04:45 PM |
Tagged: R9 Fury, asus strix r9 fury, r9 390x, GTX 980, crossfire, sli, 4k

Bring your wallet to this review from [H]ard|OCP, which pits multiple AMD and NVIDIA GPUs against each other at 4K resolution; no matter the outcome, it won't be cheap!  They used the Catalyst 15.8 Beta and GeForce 355.82 WHQL drivers, the latest available at the time of writing, as well as Windows 10 Pro x64.  There were some interesting results: for instance, you want an AMD card when driving in the rain in Project Cars, as the GTX 980s immediately slowed down in inclement weather.  In Witcher 3, AMD again provided frames faster, but unfortunately the old spectre of stuttering appeared; those of you familiar with our Frame Rating tests will understand its source.  Dying Light proved to be a game that likes VRAM, with the 390X taking the top spot, though sadly neither AMD card could handle CrossFire in Far Cry 4.  There is a lot of interesting information in the review and AMD's cards certainly show their mettle, but the overall winner is not perfectly clear; [H] chose the R9 Fury, with a caveat about CrossFire support.


"We gear up for multi-GPU gaming with AMD Radeon R9 Fury CrossFire, NVIDIA GeForce GTX 980 SLI, and AMD Radeon R9 390X CrossFire and share our head-to-head results at 4K resolution and find out which solution offers the best gameplay experience. How well does Fiji game when utilized in a CrossFire configuration?"

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Powerline Networking is maturing and makes going wireless easy

Subject: General Tech | September 28, 2015 - 02:39 PM |
Tagged: TP-LINK TL-PA8010P, powerline networking, powerline ethernet

Ryan tried out powerline networking quite a while ago and found that while it worked, there were certain scenarios where it was not quite as good as advertised, though the idea of transmitting network signals without additional wiring and terminations was certainly welcome.  The Tech Report have just concluded a test of the TP-LINK TL-PA8010P adapters, a newer product for transmitting Ethernet over your dwelling's power lines, and even added WiFi to boot.  With a laptop wired in, and no setup beyond installing the adapters, they saw speeds of 120Mbps; the WiFi router, however, was not quite as amenable to this configuration.  Once the router had been beaten into submission (it was stuck in WDS mode, having previously been used as an AP), speeds of 75-80Mbps were available throughout the house.  That seems much easier than setting up wireless APs, and a nice maturation of powerline Ethernet technology.


"I decided to try a new spin on a disappointing older technology, home power-line networking, as a means of improving coverage in my home Wi-Fi network. Kinda worked out. Here's what happened:"

Here is some more Tech News from around the web:

Tech Talk


We interrupt the Apple coverage for a look at a high end Android

Subject: Mobile | October 1, 2015 - 07:13 PM |
Tagged: nexus 6p, google, Android

The Nexus 6P is new from Google and is designed to compete directly against the new Apple devices, with an all-aluminium body and a new USB Type-C connection for charging.  One of the benefits of the new USB is charging speed: Google claims you will get 7 hours of usage from a 10-minute charge.  They have also added a 12.3MP camera complete with a 1.55μm-pixel sensor, though, in the style of the Nokia Lumia 1520, that camera does project beyond the casing.  The 5.7in 2560x1440 AMOLED screen is made of Gorilla Glass 4, and the phone is powered by a 2GHz Qualcomm Snapdragon 810 v2.1 octa-core processor, which may display that chip's tendency to get a little warm during use.  The Inquirer has not had a chance to fully review the Nexus 6P, but you can catch their preview right here.


"THE NEXUS 6P is the first truly premium Android device from Google. Last year's Nexus 6 divided opinion with its bulky design and lacklustre features, but the firm is hoping that its successor, with the premium case and next-gen specs, will finally fill the void for those after a stock Android device."

Here are some more Mobile articles from around the web:



Source: The Inquirer

Snapdragon 820 Features Qualcomm's New X12 Modem: Fastest LTE To Date

Subject: Mobile | September 30, 2015 - 02:33 PM |
Tagged: X12 Modem, SoC, snapdragon 820, qualcomm, phones, mu-mimo, mobile, LTE, cell phones

The upcoming Snapdragon 820 is shaping up to be a formidable SoC after the disappointing response to the previous flagship, the Snapdragon 810, which appeared in far fewer devices than expected for reasons still shrouded in mystery and speculation. One of the biggest aspects of the upcoming 820 is Qualcomm’s new X12 modem, which will provide the most advanced LTE connectivity seen to date when the SoC launches. The X12 features Category 12 LTE downlink speeds of up to 600 Mbps, and Category 13 on the uplink for up to 150 Mbps.

LTE connectivity isn’t the only new thing here, as we see from this slide there is also tri-band Wi-Fi supporting 2x2 MU-MIMO.


“This is the first publicly announced processor for use in mobile devices to support LTE Category 12 in the downlink and Category 13 in the uplink, providing up to 33 percent and 200 percent improvement over its predecessor’s download and upload speeds, respectively.”

The specifications for this new modem are densely packed:

  • Cat 12 (up to 600 Mbps) in the downlink
  • Cat 13 (up to 150 Mbps) in the uplink
  • Up to 4x4 MIMO on one downlink LTE carrier
  • 2x2 MU-MIMO (802.11ac)
  • Multi-gigabit 802.11ad
  • LTE-U and LTE+Wi-Fi Link Aggregation (LWA)
  • Next Gen HD Voice and Video calling over LTE and Wi-Fi
  • Call Continuity across Wi-Fi, LTE, 3G, and 2G
  • RF front end innovations
  • Advanced Closed Loop Antenna Tuner
  • Qualcomm RF360™ front end solution with CA
  • Wi-Fi/LTE antenna sharing
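The quoted 33 and 200 percent gains are consistent with the predecessor X10 modem's peak rates of 450 Mbps down and 50 Mbps up (those X10 figures are my assumption; Qualcomm's quote above doesn't name them):

```python
# X12 peak rates from the spec list above.
x12_down, x12_up = 600, 150  # Mbps
# Predecessor X10 modem peak rates (assumed: 450/50 Mbps).
x10_down, x10_up = 450, 50   # Mbps

down_gain = x12_down / x10_down - 1  # ~0.33
up_gain = x12_up / x10_up - 1        # 2.00
print(f"downlink +{down_gain:.0%}, uplink +{up_gain:.0%}")
```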

Rumored phones that could end up running the Snapdragon 820 with this X12 modem include the Samsung Galaxy S7 and around 30 other devices, though final word is of course pending on shipping hardware.

Source: Qualcomm

Oh Hey! Skylake and Broadwell Stock Levels Replenish

Subject: Processors | September 27, 2015 - 07:01 AM |
Tagged: Skylake, iris pro, Intel, Broadwell

Thanks to the Tech Report for pointing this out, but some recent stock level troubles with Skylake and Broadwell have been overcome. Both Newegg and Amazon have a few Core i7-6700Ks that are available for purchase, and both also have the Broadwell Core i7s and Core i5s with Iris Pro graphics. Moreover, Microcenter has stock of the Skylake processor at some of their physical stores with the cheapest price tag of all, but they do not have the Broadwell chips with Iris Pro (they are not even listed).


You'll notice that Skylake is somewhat cheaper than the Core i7 Broadwell, especially on Newegg. That is to be expected, as Broadwell with Iris Pro is a larger die than Skylake with its Intel HD 530. A bigger die means that fewer can be cut from a wafer, and thus each costs more (unless the smaller die has a relatively high defect rate to compensate, of course). Also, if you go with Broadwell, you will miss out on the Z170 chipset, because those chips still use Haswell's LGA-1150 socket.
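The die-size economics can be sketched with the standard first-order dies-per-wafer approximation; the die areas below are illustrative assumptions for the comparison, not Intel's published figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order approximation: area term minus an edge-loss term
    for partial dies around the wafer's circumference."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Illustrative areas on a 300 mm wafer: a smaller Skylake-style die
# vs. a larger Broadwell-with-Iris-Pro-style die (numbers assumed).
small = dies_per_wafer(300, 125)
large = dies_per_wafer(300, 180)
print(small, large)  # the smaller die yields noticeably more per wafer
```

Since wafer processing cost is roughly fixed, more candidate dies per wafer translates directly into a lower cost per chip, before yield is even considered.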

On the other hand, despite being based on an older architecture and having much less thermal headroom, you can find some real-world applications that really benefit from the 128 MB of L4 cache that Iris Pro brings, even if the iGPU itself is unused; the graphics cache can be used by the main processor. In Project Cars, again according to The Tech Report, the i7-5775C measured a 5% increase in frame rate over the newer i7-6700K -- when using a GeForce GTX 980. Granted, this was before the FCLK tweak on Skylake, so there are a few oranges mixed with our apples; PCIe rates might be slightly different now.

Regardless, they're all available now. If you were awaiting stock, have fun.

A good contender for powering your system, the CoolerMaster V750W

Subject: Cases and Cooling | October 2, 2015 - 03:29 PM |
Tagged: PSU, modular psu, coolermaster, V750W, 80+ gold

Cooler Master's V750W PSU is fully modular and comes with a nice selection of cabling, including four PCIe 6+2 connectors and eight SATA power connectors. At 150x140x86mm (5.9x5.5x3.4") it also takes up less space than many PSUs, though not little enough to fit in a truly SFF case. A single 12V rail can provide 744W at 62A, which is enough to power more than one mid- to high-range GPU, and Bjorn3D's testing shows that it maintains its 80+ Gold rating under load.  The five-year warranty is another good reason to pick up this PSU, assuming you are not in the market for something in the kilowatt range.
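The single-rail figure is simple arithmetic: 62A at 12V works out to 744W, meaning nearly the unit's entire 750W label is deliverable on the 12V rail alone:

```python
# 12V rail capacity: volts x amps, compared against the 750 W label.
rail_watts = 12 * 62
print(rail_watts, f"= {rail_watts / 750:.0%} of the 750 W rating")
```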


"One available option soon to be available on the market for <179$, and our specimen of review today, is the CoolerMaster V750. CoolerMaster has partnered with Seasonic to produce the high quality compact “V” series PSUs which made a huge statement for CoolerMaster and told the world they were ready to push some serious power."

Here are some more Cases & Cooling reviews from around the web:



Source: Bjorn3D

AMD goes Pro with Carrizo and Godavari

Subject: General Tech | September 30, 2015 - 01:04 PM |
Tagged: amd, carrizo pro, Godavari Pro, 28nm, hp, elitebook

The Carrizo-based AMD Pro A12 APU will be familiar to anyone who read our coverage of the non-Pro Carrizo models.  The A12 has a boost clock of 3.4GHz, eight 800MHz Radeon R7 cores, 2MB of L2 cache, and hardware-based HEVC decoding, exactly like the FX-8800P.  Indeed, nothing obvious differentiates the two processors apart from AMD's tag line that the Pro models are designed for corporate desktops and laptops.  The Inquirer lists three laptops using the new mobile processor which should already be available: the HP EliteBook 725, 745 and 755.  No news yet on Godavari Pro-powered desktops.


"AMD HAS ANNOUNCED its "most powerful" line of Pro A-Series mobile and desktop processors, formerly codenamed Carrizo Pro and Godavari Pro."

Here is some more Tech News from around the web:

Tech Talk


Source: The Inquirer
Subject: Motherboards
Manufacturer: GIGABYTE

Introduction and Technical Specifications



Courtesy of GIGABYTE

As the flagship board in their Intel Z170-based G1 Gaming Series product line, GIGABYTE integrated all the premium features you could ever want into the Z170X-Gaming G1 motherboard. The board features a black PCB with red and white accents spread throughout its surface, making for a very appealing aesthetic. GIGABYTE chose to integrate plastic shields covering the rear panel assembly, the VRM heat sinks, the audio components, and the chipset and SATA ports. Thanks to the Intel Z170 chipset, the motherboard supports the latest Intel LGA1151 Skylake processor line as well as dual-channel DDR4 memory. Offered at a premium MSRP of $499, the Z170X-Gaming G1 is priced to appeal to premium enthusiast users.


Courtesy of GIGABYTE


Courtesy of GIGABYTE

GIGABYTE over-engineered the Z170X-Gaming G1 to take anything you could think of throwing at it, with a massive 22-phase digital power delivery system featuring 4th-gen IR digital controllers and 3rd-gen IR PowerIRStage ICs, as well as Durable Black solid capacitors rated at 10k operational hours. GIGABYTE integrated the following features into the board: four SATA 3 ports; three SATA Express ports; two M.2 PCIe x4-capable ports; dual Qualcomm® Atheros Killer E2400 NICs; a Killer™ Wireless-AC 1535 802.11ac WiFi controller; four PCI-Express x16 slots; three PCI-Express x1 slots; a 2-digit diagnostic LED display; on-board power, reset, CMOS clear, ECO, and CPU overclock buttons; Dual-BIOS and active BIOS switches; an audio gain control switch; a Sound Blaster Core 3D audio solution; a removable audio OP-AMP; integrated voltage measurement points; Q-Flash Plus BIOS updating; an integrated HDMI video port; and USB 2.0, 3.0, and 3.1 Type-A and Type-C support.

Continue reading our review of the GIGABYTE Z170X-Gaming G1 motherboard!