Subject: General Tech
Manufacturer: Logitech

Meet the M320

Logitech is a brand synonymous with mice, joysticks, and other peripherals, having provided handy ways to interact with your computer for over 20 years.  Anyone who has used a computer for any length of time knows Logitech and has likely used a variety of their products.  Their peripheral lineup has come a long way from its beginnings, now including washable keyboards, webcams, and mice with over two dozen programmable buttons. 

100_1103.JPG

In this case we are looking at the M320 Wireless Mouse, which features three buttons and a scroll wheel, a rubberized grip shaped for the right hand, and an offset optical sensor with 1000 dpi resolution.

100_1122.JPG

The Logitech M320 comes in a user-friendly clamshell package with a cut-out flap on the back, which is actually effective at opening the packaging without the need for a utility knife or a couple of stitches on your hand.  Perhaps even more impressive is the fact that it ships with a battery included; not the rechargeable kind, but certainly a nice touch for those of us who remember receiving toys that were unusable until someone made a trip to the store to pick up the required mix of AAA's, D's, or 9V's.  The documentation claims the battery will last for two years, and while there was obviously no way to put that to the test, the automatic sleep mode and physical power switch will ensure that your battery life is not inconveniently short. 

Continue reading our review of the Logitech M320 Wireless mouse!!

Manufacturer: Multiple

Finding Your Clique

One of the difficulties with purchasing a mechanical keyboard is that they are quite expensive and vary greatly in subtle but important ways. First and foremost, we have the different types of keyswitches. These are the components responsible for how each button behaves, and thus varying them will change how those buttons react and feel.

cherrymx-barekeyswitch.png

Until recently, the Cherry MX line of switches were the basis of just about every major gaming mechanical keyboard, although we will discuss recent competitors later on. Its manufacturer, Cherry Corp / ZF Electronics, maintained a strict color code to denote the physical properties of each switch. These attributes range from the stiffness of the spring to the bumps and clicks felt (or heard) as the key travels toward its bottom and returns back up again.

Force     | Linear                     | Tactile                     | Clicky
45 cN     | Cherry MX Red              | Cherry MX Brown             |
          |                            | Razer Orange                |
          |                            | Omron/Logitech Romer-G      |
50 cN     |                            |                             | Cherry MX Blue
          |                            |                             | Cherry MX White (old B)
          |                            |                             | Razer Green
55 cN     |                            | Cherry MX Clear             |
60 cN     | Cherry MX Black            |                             |
80 cN     | Cherry MX Linear Grey (SB) | Cherry MX Tactile Grey (SB) | Cherry MX Green (SB)
          |                            |                             | Cherry MX White (old A)
          |                            |                             | Cherry MX White (2007+)
90 cN     |                            |                             | IBM Model M (not mechanical)
105 cN    |                            |                             | Cherry MX Click Grey (SB)
150+ cN   | Cherry MX Super Black      |                             |

(SB) Denotes switches with stronger springs that are primarily for, or only for, Spacebars. The Click Grey is intended for spacebars on Cherry MX White, Green, and Blue keyboards. The MX Green is intended for spacebars on Cherry MX Blue keyboards (but a few rare keyboards use these for regular keys). The MX Linear Grey is intended for spacebars on Cherry MX Black keyboards.

The four main Cherry MX switches are: Blue, Brown, Black, and Red. Other switches are available, such as the Cherry MX Green, Clear, three types of Grey, and so forth. You can separate (I believe) all of these switches into three categories: Linear, Tactile, and Clicky. From there, the only difference is the force curve, usually from the strength of the spring but also possibly from the slider features (you'll see what I mean in the diagrams below).
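Those three categories are really descriptions of a switch's force curve. A rough Python sketch of how the families diverge as the key travels (the numbers and bump positions are idealized illustrations, not measured Cherry data):

```python
# Idealized force curves for the three switch families (illustration only --
# simplified shapes and made-up bump values, not measured Cherry data).
def force_at(travel_mm, kind, actuation_cn=45.0):
    """Return an approximate spring force (cN) at a given key travel (0-4 mm)."""
    base = actuation_cn * (0.6 + 0.4 * travel_mm / 4.0)  # plain linear spring ramp
    if kind == "linear":
        return base                      # e.g. MX Red/Black: force rises smoothly
    bump = 10.0 if 1.5 <= travel_mm <= 2.0 else 0.0
    if kind == "tactile":
        return base + bump               # e.g. MX Brown: a bump near actuation
    if kind == "clicky":
        return base + bump * 1.5         # e.g. MX Blue: sharper bump plus a click
    raise ValueError(kind)

for kind in ("linear", "tactile", "clicky"):
    curve = [round(force_at(t / 2, kind), 1) for t in range(9)]  # 0-4 mm, 0.5 mm steps
    print(f"{kind:>8}: {curve}")
```

The point of the sketch is that all three families share the same underlying spring ramp; tactile and clicky switches just add a localized force bump near the actuation point.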

Read on to see a theoretical comparison of various mechanical keyswitches.

Author:
Subject: General Tech
Manufacturer: Plex

Plex Overview

If you’re a fan of digital video and music, you’ve likely heard the name “Plex” floating around. Plex (not to be confused with EVE Online’s in-game subscription commodity) is free media center software that lets users manage and stream a wide array of videos, audio files, and pictures to virtually any computer and a growing number of mobile devices and electronics. As a Plex user from the very beginning, I’ve seen the software change and evolve over the years into the versatile and powerful service it is today.

plex-1.jpg

My goal with this article is twofold. First, as an avid Plex user, I’d like to introduce the software to users who have yet to hear about or try it. Second, for those already using or experimenting with Plex, I hope that I can provide some “best practices” when it comes to configuring your servers, managing your media, or just using the software in general.

Before we dive into the technical aspects of Plex, let’s look at a brief overview of the software’s history and the main components that comprise the Plex ecosystem today.

History

Although now widely supported on a range of platforms, Plex was born in early 2008 as an OS X fork of the Xbox Media Center project (XBMC). Lovingly named “OSXBMC” (get it?) by its creators, the software was initially a simple media player for Mac, with roughly the same capabilities as the XBMC project from which it was derived. (Note: XBMC changed its name to “Kodi” in August, although you’ll still find plenty of people referring to the software by its original name).

A few months into the project, the OSXBMC team decided to change the name to “Plex” and things really started to take off for the nascent media software. Unlike the XBMC/Kodi community, which focused its efforts primarily on the playback client, the Plex team decided to bifurcate the project with two distinct components: a dedicated media server and a dedicated playback client.

plex-2.png

The dedicated media server made Plex unique among its media center peers. Once properly set up, it gave users with very little technical knowledge the ability to maintain a server that was capable of delivering their movies, TV shows, music, and pictures on demand throughout the house and, later, the world. We'll take a more detailed look at each of the Plex components next.

Plex Media Server

The “brains” behind the entire Plex ecosystem is Plex Media Server (PMS). This software, available for Windows, Linux, and OS X, manages your media database, metadata, and any necessary transcoding, which is one of its best features. Although far from error-free, the PMS encoding engine can convert virtually any video codec and container on the fly to a format requested by a client device. Want to play a high-bitrate 1080p MKV file with a 7.1 DTS-HD MA soundtrack on your Roku? No problem; Plex will seamlessly transcode that high quality source file to the proper format for Roku, as well as your iPad, or your Galaxy S5, and many other devices, all without having to store multiple copies of your video files.
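The core of that transcoding decision can be sketched roughly like this (the client profiles below are hypothetical illustrations for the sake of the example, not Plex's actual profile database):

```python
# A toy sketch of the per-stream decision PMS makes: play the file directly
# when the client supports its container and codecs, otherwise transcode.
# These client profiles are hypothetical, not Plex's real profile data.
CLIENT_PROFILES = {
    "roku": {"containers": {"mp4", "mkv"}, "video": {"h264"}, "audio": {"aac", "ac3"}},
    "ipad": {"containers": {"mp4"},        "video": {"h264"}, "audio": {"aac"}},
}

def plan_stream(client, container, vcodec, acodec):
    """Decide whether a (container, video codec, audio codec) file needs converting."""
    p = CLIENT_PROFILES[client]
    if container in p["containers"] and vcodec in p["video"] and acodec in p["audio"]:
        return "direct play"
    # A real server converts only what the client can't handle; Plex calls the
    # container-only case "direct stream".
    return f"transcode to mp4/h264/aac for {client}"

print(plan_stream("roku", "mkv", "h264", "dts"))  # DTS audio forces a transcode
print(plan_stream("ipad", "mp4", "h264", "aac"))  # plays as-is
```

The MKV/DTS-HD example from the paragraph above is exactly the second branch: the video codec may be fine, but the unsupported audio track (or container) forces a conversion on the fly.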

Continue reading our story on setting up the ultimate Plex media server!!

Author:
Subject: General Tech
Manufacturer: Various

PC Components

It's that time of year again!  The time when those of us lucky enough to have the ability get to share the best in technology with our friends and family.  You are already the family IT manager, so why not help spread the holiday cheer by picking up some items for them, and hey...maybe for you too.  :)

This year we are going to break up the guide into categories.  We'll have a page dedicated to PC components, one for mobile devices like notebooks and tablets, and one for PC accessories.  Then, after those specific categories, we'll have an open-ended collection of pages where each PC Perspective team member can throw in some wildcards.

We thank you for your support of PC Perspective through all of 2014. The links included below embed our affiliate code to Amazon.com (when applicable), and if you are doing other shopping for the holidays this year we would appreciate it if you used the button above before perusing Amazon.com. In case you want to know the affiliate code directly, it is: pcper04-20.

Enjoy!!

 

Intel Core i7-4790K Haswell Processor

4790k.jpg

Last year our pick for the best high-performance processor was the Core i7-4770K, and it sold for $379. This year we have a part running 500 MHz faster, though at higher power, for $80 less. If you are still waiting for the right time to upgrade your processor (and hey, games will need more cores sooner rather than later!), the Core i7-4790K looks like a great option, and now looks like a great time.

 

NVIDIA GeForce GTX 980 4GB

gtx980.jpg

Likely the most controversial selection in our gift guide, the GeForce GTX 980 is an interesting product. It's expensive compared to other options from AMD like the Radeon R9 290X or even the R9 290, but it is also the better-performing part, just not by much. The case for the GTX 980 rests on other factors: G-Sync support, the available game bundles with Far Cry 4 and The Crew, GeForce Experience, driver stability and release frequency, etc. The GTX 970 is another good choice along these lines, but as you'll see below...AMD has a strong contender as well.

Continue reading our 2014 Holiday Gift Guide!!

Subject: General Tech
Manufacturer: Microsoft

It could be a good... start.

So this is what happens when you install pre-release software on a production machine.

Sure, I only trusted it as far as keeping a second SSD with Windows 7 installed, but it would be fair to say that I immersed myself in the experience. It was also not the first time that I evaluated an upcoming Microsoft OS on my main machine, having done the same for Windows Vista and Windows 7 while both were in development. Windows 8 was the odd one out; it was given my laptop instead. In this case, I was in the market for a new SSD anyway and was thus willing to give Windows 10 a chance, versus installing Windows 7 again.

windows-10.png

So far, my experience has been roughly positive. The first two builds have been glitchy. In the first three days, I have rebooted my computer more times than I have all year (which is about 1-2 times per month). It could be the Windows Key + Arrow Key combinations dropping randomly, Razer Synapse deciding to go on strike a couple of times until I reinstall it, the four-or-so reboots required to install a new build, and so forth. You then also have the occasional issue of a Windows service (or DWM.exe) deciding that it would max out a core or two.

But it is pre-release software! That is all stuff to ignore. The only reason I am even mentioning it is so people do not follow in my footsteps and install it on their production machines, unless they are willing to have pockets of downtime here or there. Even then, the latest build, 9879, has been fairly stable. It has been installed all day and has not given me a single issue. This is good, because it is the last build we will get until 2015.

What we will not ignore is the features. For the first two builds, Windows 10 was annoying to use with multiple monitors. Supposedly to make it easier to align items, the mouse cursor would remain locked inside each monitor's boundary until you provided enough velocity for it to escape to the next one. This was the case with Windows 8.1 as well, but there you were given registry entries to disable the feature. Those keys did not work with Windows 10. But, with Build 9879, the behavior seems to be disabled unless you are currently dragging a window. In that case, a quick movement will pull windows between monitors, while a slow movement will perform a Snap.
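For reference, these are the Windows 8.1-era registry entries that community guides typically cited for taming the "sticky" monitor edges (the value names come from third-party documentation rather than Microsoft, so treat them as an assumption; and as the original Windows 8.1 workaround, they did not carry over to the early Windows 10 builds):

```
Windows Registry Editor Version 5.00

; Commonly cited Windows 8.1 tweaks for cursor behavior at monitor edges
; (unofficial value names from community documentation):
[HKEY_CURRENT_USER\Control Panel\Desktop]
"MouseCornerClipLength"=dword:00000000
"MouseMonitorEscapeSpeed"=dword:00000001
```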

microsoft-windows10-snap-on-monitor.png

This is me getting ready to snap a window on the edge between two monitors with just my mouse.

In a single build, they turned this feature from something I wanted to disable, to something that actually performs better (in my opinion) than Windows 7. It feels great.

Now on to a not-so-pleasant experience: updating builds.

Simply put, you can click "Check Now" and "Download Update" all you want, but it will just sit there doing nothing until it feels like acting. During the update from 9860 to 9879, I waited with the PC Settings app open for three hours. At some point, I got suspicious and decided to monitor network traffic: nothing. So I did the close-app, open-app, re-check dance a few times, and eventually gave up. About half an hour after I closed PC Settings for the last time, my network traffic spiked to the maximum my connection allows, which Task Manager attributed to a Windows service.

Shortly after, I was given the option to install the update. After finishing what I was doing, I clicked the install button and... it didn't seem to do anything. After about half an hour, it prompted me to restart my computer with a full-screen message that you cannot click past to save your open windows; you either do it now or postpone it by one or more hours, with no in-between. About another twenty minutes (and four or five reboots) after I chose to reboot, I was back up and running.

microsoft-windows10-preview-builds.png

Is that okay? Sure. When you update, you clearly need to do stuff and that could take your computer several minutes. It would be unrealistic to complain about a 20-minute install. The only real problem is that it waits for extended periods of time doing nothing (measured, literally nothing) until it decides that the time is right, and that time is NOW! It may have been three hours after you originally cared, but the time is NOW!

Come on Microsoft, let us know what is going on behind the scenes, and give us reliable options to pause or suspend the process before the big commitment moments.

So that is where I am, one highly positive experience and one slightly annoying one. Despite my concerns about Windows Store (which I have discussed at length in the past and are still valid) this operating system seems to be on a great path. It is a work in progress. I will keep you up to date, as my machine is kept up to date.

Author:
Manufacturer: Apple

One Small Step

While most articles surrounding the iPhone 6 and iPhone 6 Plus thus far have focused on user experience and the larger screen sizes, performance, and in particular the effect of Apple's transition to the 20nm process node for the A8 SoC, has been our main question regarding these new phones. Naturally, I decided to put my personal iPhone 6 through our usual round of benchmarks.

applea83.jpg

First, let's start with 3DMark.

3dmark-iceunlimited.png

Comparing the 3DMark scores of the new Apple A8 to even the last-generation A7 reveals a smaller improvement than we are used to seeing generation-to-generation with Apple's custom ARM implementations. And when you compare the A8 to something like the NVIDIA Tegra K1, which utilizes desktop-class GPU cores, the K1's overall score blows Apple out of the water. Even looking at the CPU-bound physics score, the K1 is still the winner.

A 78% advantage in overall score over the A8 shows just how much of a powerhouse NVIDIA has in the K1. (Though clearly power envelopes are another matter entirely.)

octane.png

If we look at more CPU benchmarks, like the browser-based Google Octane and SunSpider tests, the A8 starts to shine more.

sunspider.png

While the A8 edges out the A7 to be the best performing device and 54% faster than the K1 in SunSpider, the A8 and K1 are neck and neck in the Google Octane benchmark.

gfxbench-manhattan.png

Moving back to a graphics-heavy benchmark, GFXBench's Manhattan test, the Tegra K1 holds a 75% performance advantage over the A8, though the A8 is 36% faster than the previous A7 silicon.

These early results are certainly a disappointment compared to the usual generation-to-generation performance increase we see with Apple SoCs.

However, the other aspect to look at is power efficiency. In normal use I have noticed a substantial increase in the battery life of my iPhone 6 over the last-generation iPhone 5S. While this may be due in part to a small (about 1 Wh) increase in battery capacity, I think more of it can be credited to this being an overall more efficient device. Choices like sticking with a highly optimized dual-core CPU design and quad-core GPU, as well as the shrink to the 20nm process node, all contribute to increased battery life while still surpassing the performance of the last-generation Apple A7.

apple-a8-dieshot-chipworks.png

In that way, the A8 moves the bar forward for Apple and is a solid first attempt at 20nm silicon from TSMC. There is strong potential that with further refined parts (like the expected A8X for the iPad revisions), Apple will be able to further surpass 28nm silicon in both performance and efficiency.

Author:
Subject: General Tech
Manufacturer: Logitech G

Optical + Accelerometer

When I met with Logitech while setting up for our Hardware Workshop at QuakeCon this year, they wanted to show me a new mouse they were coming out with. Of course I was interested but, to be honest, mice have gotten to a point where I can very rarely tell them apart in terms of performance. Logitech promised me this one would be different. The catch? The G402 Hyperion Fury includes not just an optical sensor but an accelerometer and gyroscope combo.

IMG_9323.JPG

Pretty much all mice today use optical sensors to generate movement data. The sensor essentially takes hundreds or thousands of photos per second of the surface of your desk or mouse pad and compares them to each other to measure how far and how fast you have moved the mouse. Your PC then reads that data from the mouse at the USB polling rate, up to 1000 Hz with this mouse, and translates it into cursor movement on your desktop and in games.

There is an issue though - at very high speeds of mouse movement, the optical sensor can fail. It essentially loses track of where it is on the surface and can no longer provide accurate data back to the system. At this point, depending on the design of the mouse and driver, the mouse may just stop sending data altogether or attempt to "guess" for a short period of time. Clearly that's not ideal, and it means that gamers (or any user for that matter) are getting inaccurate measurements. Boo.
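The image-comparison idea behind that tracking can be illustrated with a toy example (a deliberately simplified brute-force search over tiny made-up "surface images", not the algorithm real sensor silicon runs):

```python
# Toy illustration of optical sensor tracking: compare two tiny "surface
# photos" and find the (dx, dy) shift that best aligns them. Real sensors
# do a hardware version of this thousands of times per second.
def best_shift(prev, curr, max_shift=2):
    """Brute-force the integer shift minimizing mean squared difference."""
    h, w = len(prev), len(prev[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = n = 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:   # only compare the overlap
                        err += (prev[y][x] - curr[yy][xx]) ** 2
                        n += 1
            if err / n < best_err:
                best_err, best = err / n, (dx, dy)
    return best

frame1 = [[0, 0, 0, 0],
          [0, 9, 5, 0],
          [0, 3, 7, 0],
          [0, 0, 0, 0]]
# The same surface patch, seen one pixel to the right a moment later:
frame2 = [[0, 0, 0, 0],
          [0, 0, 9, 5],
          [0, 0, 3, 7],
          [0, 0, 0, 0]]
print(best_shift(frame1, frame2))  # -> (1, 0): one pixel of movement in x
```

When the mouse moves too fast, consecutive frames no longer overlap at all, which is exactly the "loses track of where it is" failure described above.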

top.jpg

To be quite honest though, that doesn't happen with modern mice at standard speeds, or even standard "fast" gaming motions. According to Logitech, the optical sensor will start to lose tracking somewhere in the 150-180 IPS (inches per second) range. That's quite a lot; 150 IPS works out to 3.8 meters per second, or 8.5 miles per hour.

Continue reading our overview of the Logitech G402 Hyperion Fury Gaming Mouse!!

Manufacturer: EVGA

Introduction, Hardware, and Subjective Feel

This review comes before the end of the pre-order period. The reason I targeted that deadline is that the pre-order perks are quite significant. First, either version of the mouse is listed for about $50 off of its MSRP (which is half price for the plastic version). EVGA also throws in a mouse pad for registering your purchase. The plastic mouse is $49.99 during its pre-order period ($99.99 MSRP) and its carbon fiber alternative is $79.99 ($129.99 MSRP). EVGA has supplied us with the plastic version for review.

EVGA-TORQ_Page_Revised.jpg

Being left-handed really puts a damper on my choice of gaming mice. If the peripheral is designed to contain thumb buttons, it needs to either be symmetric (because a right hand's thumb buttons would be controlled by my pinky or ring finger) or be an ergonomic, curved mouse which comes in a special version for lefties that is mirrored horizontally (which is an obvious risk, especially when the market of left-handed gamers is further split by those who learned to force themselves to use right-handed mice).

Please read on to see my thoughts on the EVGA Torq X10!

Author:
Manufacturer: Anker

Upgrades from Anker

Last year we started to accumulate a large number of mobile devices around the office, including smartphones, tablets, and even convertibles like the ASUS T100, all of which charge over USB connections. While not a hassle when you are charging one or two units at a time, having 6+ on our desks on any given day started to become a problem for our less numerous wall outlets. Our solution last year was Anker's E150 25 watt wall charger, which we covered in a short video overview.

It was great but had limitations, including different charging rates depending on the port you connected to, a limited total output of 5 amps across all five ports, and fixed outputs per port. Today we are taking a look at a pair of new Anker devices that implement smart ports, called PowerIQ, which enable the battery and wall charger to send as much power to the charging device as it requests, regardless of which physical port it is attached to.

We'll start with the updated Anker 40 watt 5-port wall charger and then move on to discuss the 3-port mobile battery charger, both of which share the PowerIQ feature.

Anker 40 watt 5-Port Wall Charger

The new Anker 5-port wall charger is actually smaller than the previous generation but offers superior specifications on every feature point. This unit can push out more than 40 watts total combined through all five USB ports: 5 volts at as much as 8 amps. We are told all 8 amps can in fact go through a single USB charging port if a device requests that much, though nothing in our offices seems to draw more than 2.3 A.

wall1.jpg

Any USB port can be used for any device on this new model; it doesn't matter where it plugs in. This greatly simplifies things from a user experience point of view, as you don't have to hold the unit up to your face to read the tiny text found on the E150. With 8 amps spread across all five ports, you should have more than enough power to charge all your devices at full speed. If you happen to have five iPads charging at the same time, that would exceed 8 A and all of the devices' charge rates would be a bit lower.
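That five-iPad scenario is simple arithmetic over the shared 5 V / 8 A budget; a quick sketch (the per-device current draws are illustrative assumptions, e.g. a 2.4 A iPad request, not measured values):

```python
# Back-of-the-envelope model of the charger's shared 5 V / 8 A budget.
# Device current requests below are illustrative, not measured figures.
BUDGET_A = 8.0

def per_port_current(requests_a):
    """Scale each port's requested current down when the total exceeds 8 A."""
    total = sum(requests_a)
    if total <= BUDGET_A:
        return requests_a
    scale = BUDGET_A / total
    return [round(a * scale, 2) for a in requests_a]

# Five iPads asking for 2.4 A each request 12 A total, so each gets less:
print(per_port_current([2.4] * 5))        # -> [1.6, 1.6, 1.6, 1.6, 1.6]
# Two phones at 1 A plus one tablet at 2.4 A fit inside the budget:
print(per_port_current([1.0, 1.0, 2.4]))  # -> [1.0, 1.0, 2.4]
```

This even-scaling model is a simplification; the actual sharing behavior under overload depends on Anker's PowerIQ circuitry.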

Continue reading our review of the Anker 40 watt 5-port Wall Charger and 2nd Gen Astro3 12000 mAh Battery!!

Author:
Subject: General Tech
Manufacturer: PC Perspective

AM1 Walks New Ground

After Josh's initial review of the AMD AM1 Platform and the Athlon 5350, we received a few requests to look at gaming performance with a discrete GPU installed. Even though this platform isn't being aimed at gamers looking to play demanding titles, we started to investigate this setup anyway.

While Josh liked the ASUS AM1I-A Mini ITX motherboard he used in his review, its single PCI-E x1 slot makes it less than ideal for this situation.

71F0Lmi7WgL._SL1500_.jpg

Luckily we had the Gigabyte AM1M-S2H Micro ATX motherboard on hand, which features a full-length PCI-E x16 slot as well as two x1 slots.

Don't be misled by the shape of the slot though; the AM1 chipset still only offers four lanes of PCI-Express 2.0. This, of course, means that the graphics card will not be running at full bandwidth. However, the physical x16 slot makes it much easier to connect a discrete GPU without having to resort to the ribbon-cable risers that miners use.

Continue reading AMD AM1 Platform and Athlon 5350 with GTX 750 Ti - 1080p at under $450!!

Author:
Manufacturer: Various

Athlon and Pentium Live On

Over the past year or so, we have taken a look at a few budget gaming builds here at PC Perspective. One of our objectives with these build guides was to show people that PC gaming can be cost competitive with console gaming, and at a much higher quality.

However, we haven't stopped pursuing our goal of the perfect inexpensive gaming PC, which is still capable of maxing out image quality settings on today's top games at 1080p.

Today we take a look at two new systems, featuring some parts which have been suggested to us after our previous articles.

                | AMD System                             | Intel System
Processor       | AMD Athlon X4 760K - $85               | Intel Pentium G3220 - $65
Cores / Threads | 4 / 4                                  | 2 / 2
Motherboard     | Gigabyte F2A55M-HD2 - $60              | ASUS H81M-E - $60
Graphics        | MSI R9 270 Gaming - $180               | MSI R9 270 Gaming - $180
System Memory   | Corsair 8GB DDR3-1600 (1x8GB) - $73    | Corsair 8GB DDR3-1600 (1x8GB) - $73
Hard Drive      | Western Digital 1TB Caviar Green - $60 | Western Digital 1TB Caviar Green - $60
Power Supply    | Cooler Master GX 450W - $50            | Cooler Master GX 450W - $50
Case            | Cooler Master N200 MicroATX - $50      | Cooler Master N200 MicroATX - $50
Price           | $560                                   | $540

(Editor's note: If you don't already have a copy of Windows, and don't plan on using Linux or SteamOS, you'll need an OEM copy of Windows 8.1 - currently selling for $98.)

These are low prices for a gaming computer, and feature some parts which many of you might not know a lot about. Let's take a deeper look at the two different platforms which we built upon.

The Platforms

IMG_9973.JPG

First up is the AMD Athlon X4 760K. While you may not have known the Athlon brand was still being used on current parts, it represents an interesting part of the market. On the FM2 socket, the 760K is essentially a high-end Richland APU with the graphics portion of the chip disabled.

What this means is that if you are going to pair your processor with a discrete GPU anyway, you can skip paying extra for the integrated GPU.

As for the motherboard, we went with an ultra-inexpensive A55 option from Gigabyte, the GA-F2A55M-HD2. This board features the A55 chipset, which launched with the Llano APUs in 2011. Because of this older chipset, the board does not offer USB 3.0 or SATA 6G capability, but since we are only concerned with gaming performance here, it makes a great bare-bones option.

Continue reading our build guide for a gaming PC under $550!!!

Author:
Manufacturer: Various

1920x1080, 2560x1440, 3840x2160

Join us on March 11th at 9pm ET / 6pm PT for a LIVE Titanfall Game Stream!  You can find us at http://www.pcper.com/live.  You can subscribe to our mailing list to be alerted whenever we have a live event!!

We canceled the event due to the instability of Titanfall servers.  We'll reschedule soon!!

With the release of Respawn's Titanfall upon us, many potential PC gamers are going to be looking for suggestions on compiling a list of parts targeted at a perfect Titanfall experience.  The good news is that even with a fairly low investment in PC hardware, gamers will find that the PC version of this title is definitely the premier way to play, as the compute power of the Xbox One just can't compete.

titanfallsystem.jpg
 

In this story we'll present three different build suggestions, each addressing a different target resolution along with better image quality settings than the Xbox One can offer.  We have options for 1080p (the best that the Xbox One can offer), 2560x1440, and even 3840x2160, better known as 4K.  In truth, the graphics horsepower required by Titanfall isn't overly extreme, and thus an entire PC build coming in under $800, including a full copy of Windows 8.1, is easy to accomplish.

Target 1: 1920x1080

First up is old reliable, the 1920x1080 resolution that most gamers still have on their primary gaming display.  That could be a home theater style PC hooked up to a TV or monitors in sizes up to 27-in.  Here is our build suggestion, followed by our explanations.

              | Titanfall 1080p Build
Processor     | Intel Core i3-4330 - $137
Motherboard   | MSI H87-G43 - $96
Memory        | Corsair Vengeance LP 8GB 1600 MHz (2 x 4GB) - $89
Graphics Card | EVGA GeForce GTX 750 Ti - $179
Storage       | Western Digital Blue 1TB - $59
Case          | Corsair 200R ATX Mid Tower Case - $72
Power Supply  | Corsair CX 500 watt - $49
OS            | Windows 8.1 OEM - $96
Total Price   | $781 - Amazon Full Cart

Our first build comes in at $781 and includes some incredibly competent gaming hardware for that price.  The Intel Core i3-4330 is a dual-core, HyperThreaded processor that provides more than enough capability to push Titanfall and all other major PC games on the market.  The MSI H87 motherboard lacks some of the advanced features of the Z87 platform but does the job at a lower cost.  8GB of Corsair memory, though not running at a high clock speed, provides more than enough capacity for all the programs and applications you could want to run.

Continue reading our article on building a gaming PC for Titanfall!!

Manufacturer: PC Perspective
Tagged: Mantle, interview, amd

What Mantle signifies about GPU architectures

Mantle is a very interesting concept. From the various keynote speeches, it sounds like the API is being designed to address the current state (and trajectory) of graphics processors. GPUs are generalized and highly parallel computation devices which are assisted by a little bit of specialized silicon, when appropriate. The vendors have even settled on standards, such as IEEE-754 floating point decimal numbers, which means that the driver has much less reason to shield developers from the underlying architectures.

Still, Mantle is currently a private technology for an unknown number of developers. Without a public SDK, or anything beyond the half-dozen keynotes, we can only speculate on its specific attributes. I, for one, have technical questions and hunches which linger unanswered or unconfirmed, probably until the API is suitable for public development.

Or, until we just... ask AMD.

amd-mantle-interview-01.jpg

Our response came from Guennadi Riguer, the chief architect for Mantle. In it, he discusses the API's usage as a computation language, the future of the rendering pipeline, and whether there will be a day where Crossfire-like benefits can occur by leaving an older Mantle-capable GPU in your system when purchasing a new, also Mantle-supporting one.

Q: Mantle's shading language is said to be compatible with HLSL. How will optimizations made for DirectX, such as tweaks during shader compilation, carry over to Mantle? How much tuning will (and will not) be shared between the two APIs?

[Guennadi] The current Mantle solution relies on the same shader generation path that DirectX games use and includes an open-source component for translating DirectX shaders to Mantle's accepted intermediate language (IL). This enables developers to quickly develop a Mantle code path without any changes to their shaders. This was one of the strongest requests we got from our ISV partners when we were developing Mantle.

AMD-mantle-dx-hlsl-GSA_screen_shot.jpg

Follow-Up: What does this mean, specifically, in terms of driver optimizations? Would AMD, or anyone else who supports Mantle, be able to re-use the effort they spent on tuning their shader compilers (and so forth) for DirectX?

[Guennadi] With the current shader compilation strategy in Mantle, the developers can directly leverage DirectX shader optimization efforts in Mantle. They would use the same front-end HLSL compiler for DX and Mantle, and inside of the DX and Mantle drivers we share the shader compiler that generates the shader code our hardware understands.

Read on to see the rest of the interview!

Subject: General Tech
Manufacturer: PC Perspective

A Hard Decision

Welcome to our second annual (only chumps say first annual... crap) Best Hardware of the Year awards. This is where we argue the order of candidates in several categories on the podcast and, some time later, compile the results into an article. Most of the categories select the best hardware of their grouping, but some look at the more general trends of our industry.

As an aside, Google Monocle will win Best Hardware Ever 2014, 2015, and 2017. It will fail to be the best of all time for 2016, however.

If you would like to see the discussion as it unfolded, then you should definitely watch Episode 282, recorded January 2nd, 2014. You do not even need to navigate away, because we left it tantalizingly embedded below this paragraph. You know you want to enrich the next two hours of your life. Click it. Click it a few times if you have click-to-enable plugins active in your browser. You can stop clicking when you see the polygons dance. You will know it when you see it.

The categories were arranged as follows:

  • Best Graphics Card of 2013
  • Best CPU of 2013
  • Best Storage of 2013
  • Best Case of 2013
  • Best Motherboard of 2013
  • Best Price Drop of 2013
  • Best Mobile Device of 2013
  • Best Trend of 2013
  • Worst Trend of 2013

Each of the winners will be given our "Editor's Choice" award regardless of the actual badge it earned in any review we conducted. This is because the product is the choice of our editors for this year even if it did not receive an "Editor's Choice" rating; it may not even have been reviewed by us at all.

Also, the criteria for winning each category are left as vague as possible for maximum interpretation.

Continue reading our selection for Best Hardware of 2013!!

Manufacturer: StarTech

Introduction and Design

PB063557.jpg

We’re always on the hunt for good docking stations, and sometimes it can be difficult to locate one when you aren’t afforded the luxury of a dedicated docking port. Fortunately, with the advent of USB 3.0 and the greatly improved bandwidth that comes along with it, the options have become considerably more robust.

Today, we’ll take a look at StarTech’s USB3SDOCKHDV, more specifically labeled the Universal USB 3.0 Laptop Docking Station - Dual Video HDMI DVI VGA with Audio and Ethernet (whew). This docking station carries an MSRP of $155, though its street price is lower; it is currently available at resellers such as Amazon for around $125. That still places it well above other StarTech options, such as the $100 USBVGADOCK2, which offers just one video output (VGA), 10/100 Ethernet, and four USB 2.0 ports.

The big selling points of the USB3SDOCKHDV are its addition of three USB 3.0 ports and Gigabit Ethernet—but most enticingly, its purported ability to provide three total screens simultaneously (including the connected laptop’s LCD) by way of dual HD video output. This video output can be achieved by way of either HDMI + DVI-D or HDMI + VGA combinations (but not by VGA + DVI-D). We’ll be interested to see how well this functionality works, as well as what sort of toll it takes on the CPU of the connected machine.
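The dual-output rule above is easy to get wrong when cabling up, so here it is expressed as a tiny check. This is just our restatement of the spec sheet's constraint, not anything from StarTech's software:

```python
# The USB3SDOCKHDV's dual-video rule as described in its spec sheet:
# HDMI + DVI-D and HDMI + VGA are valid pairs; VGA + DVI-D is not.

ALLOWED_PAIRS = {frozenset({"HDMI", "DVI-D"}), frozenset({"HDMI", "VGA"})}

def dual_output_supported(port_a, port_b):
    """Return True if the two ports can drive displays simultaneously."""
    return frozenset({port_a, port_b}) in ALLOWED_PAIRS

print(dual_output_supported("HDMI", "DVI-D"))  # True
print(dual_output_supported("VGA", "DVI-D"))   # False
```

In short: any two-display setup on this dock must include the HDMI output.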

Continue reading our review of the StarTech USB3SDOCKHDV USB 3.0 Docking Station!!!

Author:
Manufacturer: Valve

A not-so-simple set of instructions

On Friday evening, Valve released the first beta of SteamOS, a Linux-based operating system built specifically for PC gaming.  We have spent quite a lot of time discussing and debating the merits of SteamOS, but this weekend we wanted to do an installation of the new OS on a system and see how it all worked.

Our full video tutorial of installing and configuring SteamOS

First up was selecting the hardware for the build.  As is usually the case, we had a nearly-complete system sitting around that needed some tweaks.  Here is a quick list of the hardware we used, with a discussion about WHY just below.

Gaming Build

  • Processor: Intel Core i5-4670K - $222
  • Motherboard: EVGA Z87 Stinger Mini ITX Motherboard - $257
  • Memory: Corsair Vengeance LP 8GB 1866 MHz (2 x 4GB) - $109
  • Graphics Card: NVIDIA GeForce GTX TITAN 6GB - $999, or EVGA GeForce GTX 770 2GB SuperClocked - $349
  • Storage: Samsung 840 EVO Series 250GB SSD - $168
  • Case: EVGA Hadron Mini ITX Case - $189
  • Power Supply: Included with case
  • Optical Drive: Slot-loading DVD Burner - $36
  • OS: FREE!!
  • Peak Compute: 4,494 GFLOPS (TITAN), 3,213 GFLOPS (GTX 770)
  • Total Price: $1947 (GTX TITAN) / $1297 (GTX 770)
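The Peak Compute figures in the build list come from the standard back-of-the-envelope formula: CUDA cores × 2 FLOPs per core per clock (one fused multiply-add) × clock speed. Core counts and base clocks below are NVIDIA's published specifications for these cards:

```python
# Peak single-precision throughput: cores * 2 FLOPs/clock (FMA) * GHz.

def peak_gflops(cuda_cores, clock_ghz):
    return cuda_cores * 2 * clock_ghz

titan = peak_gflops(2688, 0.836)    # GTX TITAN at its 836 MHz base clock
gtx770 = peak_gflops(1536, 1.046)   # GTX 770 at its 1046 MHz base clock
print(round(titan), round(gtx770))  # ~4494 and ~3213 GFLOPS
```

These are theoretical peaks at base clock; GPU Boost and real shader workloads will move actual throughput in either direction.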

We definitely weren't targeting a low cost build with this system, but I think we did create a very powerful system to test SteamOS on.  First up was the case, the new EVGA Hadron Mini ITX chassis.  It's small, which is great for integration into your living room, yet can still hold a full power, full-size graphics card.

evga_hadron_hero.jpg

The motherboard we used was the EVGA Z87 Stinger Mini ITX - an offering that Morry just recently reviewed and recommended.  Supporting the latest Intel Haswell processors, the Stinger includes great overclocking options and a great feature set that won't leave enthusiasts longing for a larger motherboard.

Continue reading our installation and configuration guide for SteamOS!!

Author:
Subject: General Tech
Manufacturer: Various

PC Component Selections

It's that time of year again!  The time when those of us lucky enough to have the means get to share the best in technology with our friends and family.  You are already the family IT manager, so why not help spread the holiday cheer by picking up some items for them, and hey...maybe for you too.  :)

This year we are going to break up the guide into categories.  We'll have a page dedicated to PC components, one for mobile devices like notebooks and tablets, and one for PC accessories.  Then, after those specific categories, we'll have an open-ended collection of pages where each PC Perspective team member can throw in some wildcards.

Our Amazon code is: pcper04-20

Enjoy!!

 

Intel Core i7-4770K Haswell Processor

4770k.jpg

The Intel Core i7-4770K is likely the best deal in computing performance today, able to power just about any configuration of PC you can think of without breaking much of a sweat.  You want to game?  This part has you covered.  You want to encode some video?  The four cores and included HyperThreading support provide just about as much power as you could need.  Yes, there are faster processors in the form of the Ivy Bridge-E and even 10+ core Xeon parts, but those are significantly more expensive.  For a modest price of $299 you can get what is generally considered the "best" processor on the market.

Corsair Carbide Series Air 540 Case

corsair540air.jpg

Cases are generally considered a PC component that comes down to the preference of the buyer, but there are still fundamentals that separate good, solid cases from the rest.  The new Corsair Carbide Air 540 is unique in a lot of ways.  The square-ish shape allows for a division of your power supply, hard drives and SSDs from the other motherboard-attached components.  Even though the case is a bit shorter than others on the market, there is plenty of working room inside thanks to the Corsair dual-chamber setup, and it even includes a pair of high-performance Corsair AF140L fans for intake and exhaust.  The side panel window is HUGE, allowing you to show off your goods, and nice touches like the rubber-grommeted cable routing cutouts and dust filters make this one of the best mid-range cases available.

Continue reading our selections for this year's PC Perspective Holiday Gift Guide!!

Author:
Manufacturer: Sony

Does downloading make a difference?

This is PART 2 of our testing on the PlayStation 4 storage systems, with the stock hard drive, an SSHD hybrid and an SSD.  Previously, we compared performance based on Blu-ray installations; today we add downloaded titles from PSN to the mix.  Be sure you read PART 1, PlayStation 4 (PS4) HDD, SSHD and SSD Performance Testing.

I posted a story earlier this week that looked at the performance of the new PS4 when used with three different 2.5-in storage options: the stock 500GB hard drive, a 1TB hybrid SSHD and a 240GB SSD.  The results were fairly interesting (and got a good bit of attention) but some readers wanted more data.  In particular, many asked how things might change if you went the full digital route and purchased games straight from Sony's PlayStation Network.  I will also compare boot times for each of the tested storage devices.

You should definitely check out the previous article if you missed it. It not only goes through the performance comparison but also details how to change the hard drive on the PS4 from the physical procedure to the software steps necessary. The article also details the options we selected for our benchmarking.

psn1.jpg

Today I purchased a copy of Assassin's Creed IV from the PSN store (you're welcome Ubisoft) and got to testing.  The process was the same: start the game then load the first save spot.  Again, each test was run three times and the averages were reported. The PS4 was restarted between each run.
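The methodology above (three runs per configuration, averaged, with a restart between runs) is simple enough to show in miniature. The times in this sketch are placeholders, not our measured data:

```python
# Each storage device's load time is the mean of three timed runs,
# mirroring the test methodology described above.

def average_load_time(runs_seconds):
    """Average a list of per-run load times, in seconds."""
    return sum(runs_seconds) / len(runs_seconds)

# Hypothetical runs for one device; the PS4 was restarted between each.
print(average_load_time([62.1, 60.8, 61.5]))
```

Averaging over restarts helps smooth out any caching or background-activity variance between individual runs.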

load3.png

The top section of results is the same that was presented earlier - average load times for AC IV when the game is installed from the Blu-ray.  The second set is new and includes average load times for AC IV after the installation from the PlayStation Network; no disc was in the drive during testing.

Continue reading our story on the performance testing of HDD, SSD and SSHD with downloaded and Blu-ray installed games on PS4!!

Author:
Manufacturer: Sony

Load time improvements

This is PART 1 of our testing on the PlayStation 4 storage systems, with the stock hard drive, an SSHD hybrid and an SSD.  In PART 2 we take a look at the changes introduced with PSN downloaded games versus Blu-ray installed games as well as show boot time differences.  Be sure you read PART 2, PlayStation 4 (PS4) Blu-ray and Download Storage Performance, Boot Times.

On Friday Sony released the PlayStation 4 onto the world.  The first new console launch in 7 years, the PS4 has a lot to live up to, but our story today isn't going to attempt to weigh the value of the hardware or software ecosystem.  Instead, after our PS4 teardown video from last week, we got quite a few requests for information on storage performance with the PS4 and what replacement hardware might offer gamers.

Hard Drive Replacement Process

Changing the hard drive in your PlayStation 4 is quite simple, a continuation of Sony's policy with the PS3.

01_0.jpg

Removal starts with the one semi-transparent panel on the top of the unit, to the left of the light bar.  Obviously, make sure your PS4 is completely turned off and unplugged.

02_0.jpg

Simply slide it to the outside of the chassis and wiggle it up to release.  There are no screws or anything to deal with yet.

03_0.jpg

Once inside you'll find a screw with the PS4 shape logos on it; that is the screw you need to remove to pull out the hard drive cage.

Continue reading our analysis of PS4 HDD, SSHD and SSD Performance!!

Manufacturer: NVIDIA

It impresses.

ShadowPlay is NVIDIA's latest addition to their GeForce Experience platform. This feature allows their GPUs, starting with Kepler, to record game footage locally or (in a later update) stream it online through Twitch.tv. It requires Kepler GPUs because it is accelerated by that hardware. The goal is to constantly record game footage without any noticeable impact to performance; that way, the player can keep it running forever and have the opportunity to save moments after they happen.

Also, it is free.

shadowplay-vs.jpg

I know that I have several gaming memories which come unannounced and leave undocumented. A solution like this is very exciting to me. Of course, a feature on paper is not the same as functional software in the real world. Thankfully, at least in my limited usage, ShadowPlay mostly lives up to its claims. I do not feel its impact on gaming performance. I am comfortable leaving it on at all times. There are issues, however, that I will get to soon.

This first impression is based on my main system running the 331.65 (Beta) GeForce drivers recommended for ShadowPlay.

  • Intel Core i7-3770, 3.4 GHz
  • NVIDIA GeForce GTX 670
  • 16 GB DDR3 RAM
  • Windows 7 Professional
  • 1920 x 1080 @ 120Hz
  • 3 TB USB 3.0 HDD (~50 MB/s file clone)

The two games tested are Starcraft II: Heart of the Swarm and Battlefield 3.

Read on to see my thoughts on ShadowPlay, the new Experience on the block.