PC Component Selections
It's that time of year again, when those of us lucky enough to have the means get to share the best in technology with our friends and family. You are already the family IT manager, so why not help spread the holiday cheer by picking up some items for them, and hey...maybe for you. :)
This year we are going to break the guide up into categories. We'll have a page dedicated to PC components, one for mobile devices like notebooks and tablets, and one for PC accessories. Then, after those specific categories, we'll have an open-ended collection of pages where each PC Perspective team member can throw in some wildcards.
Our Amazon code is: pcper04-20
Intel Core i7-4770K Haswell Processor
The Intel Core i7-4770K is likely the best deal in computing performance today, able to power just about any configuration of PC you can think of without breaking much of a sweat. You want to game? This part has you covered. You want to encode some video? The four cores and included HyperThreading support provide just about as much power as you could need. Yes, there are faster processors in the form of the Ivy Bridge-E and even 10+ core Xeon processors, but those are significantly more expensive. For a modest price of $299 you can get what is generally considered the "best" processor on the market.
Corsair Carbide Series Air 540 Case
Cases are generally considered a PC component that comes down to buyer preference, but there are still fundamentals that make a case good and solid. The new Corsair Carbide Air 540 is unique in a lot of ways. The square-ish shape allows for a division of your power supply, hard drives and SSDs from the other motherboard-attached components. Even though the case is a bit shorter than others on the market, there is plenty of working room inside thanks to the Corsair dual-chamber setup, and it even includes a pair of high-performance Corsair AF140L fans for intake and exhaust. The side panel window is HUGE, allowing you to show off your goods, and nice touches like the rubber-grommeted cable routing cutouts and dust filters make this one of the best mid-range cases available.
Does downloading make a difference?
I posted a story earlier this week that looked at the performance of the new PS4 when used with three different 2.5-in storage options: the stock 500GB hard drive, a 1TB hybrid SSHD and a 240GB SSD. The results were fairly interesting (and got a good bit of attention) but some readers wanted more data. In particular, many asked how things might change if you went the full digital route and purchased games straight from Sony's PlayStation Network. I will also compare boot times for each of the tested storage devices.
You should definitely check out the previous article if you missed it. It not only goes through the performance comparison but also details how to change the hard drive on the PS4 from the physical procedure to the software steps necessary. The article also details the options we selected for our benchmarking.
- HGST 500GB 5400 RPM HDD - $50 - $0.10/GB
- Seagate 1TB Hybrid SSHD - $122 - $0.12/GB
- Corsair 240GB Force GS SSD - $189 - $0.78/GB
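As a quick sanity check, the cost-per-gigabyte figures above follow directly from the listed prices and capacities (the SSD actually works out to $0.7875/GB, which the list truncates to $0.78). This is just an illustrative sketch, not part of the original testing:

```python
# Verify the cost-per-gigabyte figures quoted above.
# Prices (USD) and capacities (GB) are the article's own numbers.
drives = {
    "HGST 500GB HDD": (50, 500),
    "Seagate 1TB SSHD": (122, 1000),
    "Corsair 240GB SSD": (189, 240),
}

cost_per_gb = {name: price / gb for name, (price, gb) in drives.items()}

for name, cost in cost_per_gb.items():
    print(f"{name}: ${cost:.2f}/GB")
```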
Today I purchased a copy of Assassin's Creed IV from the PSN store (you're welcome Ubisoft) and got to testing. The process was the same: start the game then load the first save spot. Again, each test was run three times and the averages were reported. The PS4 was restarted between each run.
The top section of results is the same that was presented earlier - average load times for AC IV when the game is installed from the Blu-ray. The second set is new and includes average load times for AC IV after the installation from the PlayStation Network; no disc was in the drive during testing.
Load time improvements
On Friday Sony released the PlayStation 4 to the world. As Sony's first new console launch in 7 years, the PS4 has a lot to live up to, but our story today isn't going to attempt to weigh the value of the hardware or software ecosystem. Instead, after our PS4 teardown video from last week, we got quite a few requests for information on storage performance with the PS4 and what replacement hardware might offer gamers.
Hard Drive Replacement Process
Changing the hard drive in your PlayStation 4 is quite simple, a continuation of Sony's policy with the PS3.
Installation starts with the semi-transparent panel on the top of the unit, to the left of the light bar. Obviously, make sure your PS4 is completely turned off and unplugged.
Simply slide it to the outside of the chassis and wiggle it up to release. There are no screws or anything to deal with yet.
Once inside you'll find a screw with the PS4 shape logos on it; that is the screw you need to remove to pull out the hard drive cage.
ShadowPlay is NVIDIA's latest addition to their GeForce Experience platform. This feature allows their GPUs, starting with Kepler, to record game footage either locally or stream it online through Twitch.tv (in a later update). It requires Kepler GPUs because it is accelerated by that hardware. The goal is to constantly record game footage without any noticeable impact to performance; that way, the player can keep it running forever and have the opportunity to save moments after they happen.
Also, it is free.
I know that I have several gaming memories which come unannounced and leave undocumented. A solution like this is very exciting to me. Of course, a feature on paper is not the same as functional software in the real world. Thankfully, at least in my limited usage, ShadowPlay mostly lives up to its claims. I do not feel its impact on gaming performance. I am comfortable leaving it on at all times. There are issues, however, that I will get to soon.
This first impression is based on my main system running the 331.65 (Beta) GeForce drivers recommended for ShadowPlay.
- Intel Core i7-3770, 3.4 GHz
- NVIDIA GeForce GTX 670
- 16 GB DDR3 RAM
- Windows 7 Professional
- 1920 x 1080 @ 120Hz
- 3 TB USB3.0 HDD (~50MB/s file clone)
The two games tested are Starcraft II: Heart of the Swarm and Battlefield 3.
A new generation of Software Rendering Engines.
We have been busy with side projects, here at PC Perspective, over the last year. Ryan has nearly broken his back rating the frames. Ken, along with running the video equipment and "getting an education", developed a hardware switching device for Wirecast and XSplit.
My project, "Perpetual Motion Engine", has been researching and developing a GPU-accelerated software rendering engine. Now, to be clear, this is just in very early development for the moment. The point is not to draw beautiful scenes. Not yet. The point is to show what OpenGL and DirectX do and what limits are removed when you do the math directly.
Errata: BioShock uses a modified Unreal Engine 2.5, not 3.
In the above video:
- I show the problems with graphics APIs such as DirectX and OpenGL.
- I talk about what those APIs attempt to solve, finding color values for your monitor.
- I discuss the advantages of boiling graphics problems down to general mathematics.
- Finally, I prove the advantages of boiling graphics problems down to general mathematics.
I would recommend watching the video, first, before moving forward with the rest of the editorial. A few parts need to be seen for better understanding.
If Microsoft was left to their own devices...
Microsoft's Financial Analyst Meeting 2013 set the stage, literally, for Steve Ballmer's last annual keynote to investors. The speech promoted Microsoft, its potential, and its unique position in the industry. He proclaimed, firmly, their desire to be a devices and services company.
The explanation, however, does not befit either industry.
Ballmer noted, early in the keynote, how Bing is the only notable competitor to Google Search. He wanted to make it clear, to investors, that Microsoft needs to remain in the search business to challenge Google. The implication is that Microsoft can fill the cracks where Google does not, or even cannot, and establish a business from that foothold. I agree. Proprietary products (which are not inherently bad by the way), as Google Search is, require one or more rivals to fill the overlooked or under-served niches. A legitimate business can be established from that basis.
It is the following, similar, statement which troubles me.
Ballmer later mentioned, in the same vein, how Microsoft is among the few making fundamental operating system investments. Like search, the implication is that operating systems are proprietary products which must compete against one another. This, albeit subtly, does not match their vision as a devices and services company. The point of a proprietary platform is to own the ecosystem, from end to end, and to derive your value from that control. The product is not a device; the product is not a service; the product is a platform. This makes sense to them because, from birth, they were a company which sold platforms.
A platform as a product is not a device, nor is it a service.
Over the past few weeks, I have been developing a device that enables external control of Wirecast and XSplit. Here's a video of the device in action:
But now, let's get into a little bit of background information:
While the TriCaster from NewTek has made great strides in decreasing the cost of video switching hardware, and can be credited with some of the rapid expansion of live streaming on the Internet, it still requires an initial investment of about $20,000 at the entry level. Even though this is down from around 5x or 10x the cost just a few years ago for professional-grade hardware, it still presents a significant startup cost.
This brings us to my day job. For the past 4 years I have worked here at PC Perspective. My job began as an intern helping to develop video content, but quickly expanded from there. Several years ago, we decided to make the jump to live content, and started investing in the required infrastructure. Since we obviously didn't need to worry about the availability of PC hardware, we decided to go the software video switching route, as opposed to dedicated hardware like the TriCaster. At the time, we started experimenting with Wirecast and bought a few Blackmagic Intensity Pro HDMI capture cards for our Canon Vixia HV30 cameras. Overall, building a 6-core computer (Core i7-980X in those days) with 3 capture cards resulted in an investment of about $2500.
The advantages of the software route were not just a much cheaper initial investment - we had an operation running for about 1/10th of the cost of a TriCaster - but ultimately a more expandable setup. If we had gone with a TriCaster we would have a fixed number of inputs, but in this configuration we could add more inputs on the fly as long as we had available I/O on our computer.
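To put that "about 1/10th" figure in perspective using the article's own numbers, the exact ratio is closer to 1/8. A quick illustrative sketch:

```python
# The article's figures: ~$20,000 entry-level TriCaster vs. ~$2,500 software build.
tricaster_entry = 20_000
software_build = 2_500  # 6-core Core i7-980X system + 3 capture cards

# 2500 / 20000 = 0.125, i.e. 1/8th - roughly the "1/10th" quoted above
ratio = software_build / tricaster_entry
```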
Introduction and externals
Razer maintains a distinct sense of style across their product line. Over the past decade and a half, Razer has carved a spot in the peripherals market catering to competitive gamers as well as developing wholly novel products for the gaming market. Razer has a catalog including standard peripherals and more arcane things such as mice with telephone-style keypads geared toward MMORPG players as well as motion sensing controllers employing magnetic fields to detect controller position.
The Razer BlackWidow Ultimate Stealth 2013 Edition comes out of the box ready for use without additional software provided or assembly required. The keyboard uses a standard layout with five macro keys attached in a column on the left of the board. Rather than dedicated media buttons, media and keyboard specific functions are accessed by pressing a combination of a function key located to the right of right alt and the function keys on the top row.
The headphone and microphone jacks are present on the side of the keyboard.
NVIDIA Finally Gets Serious with Tegra
Tegra has had an interesting run of things. The original Tegra was utilized only by Microsoft with the Zune HD. Tegra 2 saw better adoption, but did not produce the design wins to propel NVIDIA to a leadership position in cell phones and tablets. Tegra 3 found a spot in Microsoft’s Surface, but that has turned out to be a far more bitter experience than expected. Tegra 4 so far has been integrated into a handful of products and is being featured in NVIDIA’s upcoming Shield product. It also hit some production snags that made it later to market than expected.
I think the primary issue with the first three generations of products is pretty simple. There was a distinct lack of differentiation from the other ARM-based products around. Yes, NVIDIA brought their graphics prowess to the market, but never in a form that distanced itself adequately from the competition. Tegra 2 boasted GeForce based graphics, but we did not find out until later that it consisted of basically four pixel shaders and four vertex shaders that had more in common with the GeForce 7800/7900 series than with any of the modern unified architectures of the time. Tegra 3 boasted a big graphical boost, but it was in the form of doubling the pixel shader units and leaving the vertex units alone.
While NVIDIA had very strong developer relations and a leg up on the competition in terms of software support, it was never enough to propel Tegra beyond a handful of devices. NVIDIA is trying to rectify that with Tegra 4 and the 72 shader units that it contains (still divided between pixel and vertex units). Tegra 4 is not perfect in that it is late to market and the GPU is not OpenGL ES 3.0 compliant. ARM, Imagination Technologies, and Qualcomm are offering new graphics processing units that are not only OpenGL ES 3.0 compliant, but also offer OpenCL 1.1 support. Tegra 4 does not support OpenCL. In fact, it does not support NVIDIA’s in-house CUDA. Ouch.
Jumping into a new market is not an easy thing, and invariably mistakes will be made. NVIDIA worked hard to make a solid foundation with their products, and certainly they had to learn to walk before they could run. Unfortunately, running effectively entails having design wins due to outstanding features, performance, and power consumption. NVIDIA was really only average in all of those areas. NVIDIA is hoping to change that. Their first salvo into offering a product that offers features and support that is a step above the competition is what we are talking about today.
A quick look at a great accessory
Though we are a PC hardware and technology website by day, we are also video creators by night (and sometimes day as well). If you don't believe me, check out our PC Perspective video tag or even our very own YouTube channel. See?!?
While we do have a big fancy studio setup for in-house production, sometimes on the road you just need something quick and easy but also high quality for recording. While our collection of DSLR cameras does an amazing job with video quality, the audio from the in-camera microphones has always sucked, and lugging around wireless mic packs seemed unnecessary much of the time.
Enter the RODE VideoMic.
This $170 directional shotgun microphone comes from one of the most recognized and respected companies in prosumer audio. In the short video below I show you what you get in the box (not much) and how much you can improve your audio with this simple add-on.
Overall, I have to say I was very impressed with the RODE VideoMic and anyone looking to improve the quality of their videos with an audio upgrade should give this option a try!
What do they want Origin to be?
GamesIndustry International conducted an interview with EA's Executive Vice President, Andrew Wilson, during this year's Electronic Entertainment Expo (E3 2013). Wilson was on the team which originally designed Origin before marketing decided to write off all DOS-era nostalgia they once held with PC gamers through recycling an old web address.
The service, itself, has also changed since the original project.
"Over the years ... there've been some permutations of that vision that have manifested as part of Origin," Wilson said. "I think what we've done is taken a step back and said 'Wow, we've actually done some really cool things with Origin.' It is by no means perfect, but we've done some pretty cool things. As you say, the plumbing is there. What can we do now to really think about Origin in the next generation?"
Fans of Sim City, who faithfully pre-ordered, will likely argue that Origin does not have enough sewage treatment at the end of their plumbing and the out-flow defecated all over their experience. A good service can be built atop the foundations of Origin; but, I have little confidence in their ability to realize that potential.
Wilson, on the other hand, believes they now "get it".
One assertion deals with customers who purchase more than one game. He argues that multiple update and online services are required and that is a barrier for users who desire a second, third, or hundredth purchase thereafter. The belief is that Origin can create a single experience for users and remove that barrier inhibiting a user's purchases. In practice, Origin ends up being a bigger hurdle than a single game's service. It washes a bad faith over their entire library and fails to justify itself: games, such as Sim City, update on their own, and old titles still have their online services taken offline.
What it comes down to is a lack of focus. Wilson believes development of Origin was too focused on the transaction, and that led to bad faith, presumably because customers would smell the disingenuous salesman. Good Old Games (GOG), on the other hand, successfully focused on the transaction. The difference? GOG gets out of your way immediately after the transaction, leaving you with just the game plus its bonus pack-ins you ordered, not DRM and a redundant social network.
Steam is heavily focused as a service and that is where EA desires Origin to be. The problem? Valve has set a high bar for EA to contend with. Steam has built customer faith consistently, albeit not perfectly, over its life with its user-centric vision. Not only would EA need to be substantially better than Steam, it is fighting with a severe handicap from their history of shutting down gaming servers and threatening to delicense merchandise if their customers upset them.
A successful Origin will need to carefully consider what it wants to be and strive to win at that goal. While possible, they are still content to handicap themselves and, then, not own the results of their decisions.
Apple has seen a healthy boost in computer sales and adoption since the transition to Intel-based platforms in 2006, but the MacBook line has far and away been the biggest beneficiary. Apple has come a long way, both from an engineering standpoint and a consumer satisfaction standpoint, since the long-retired iBook and PowerBook lines. This is especially evident when you look at their current product lineup, and products like the 11” MacBook Air.
Even though it may not be the most popular opinion around here, I have been a Mac user since 2005 with the original Mac Mini, and I have used a MacBook as my primary computer since 2008. I switched to the 11” MacBook Air when it came out in 2011, and experienced the growing pains of using a low power platform as my main computer.
While I still have a desktop for the occasional video that I edit at home, or game I manage to find time to play, the majority of my day involves being portable, both in class and at the office, and I quickly grew to appreciate the 11” form factor, as well as the portability it offers. However, I was quite dissatisfied with the performance and battery life that my aging ultraportable offered. Desperate for improvements, I decided to see what two generations' worth of Intel engineering afforded, and picked up the new Haswell-based 11” MacBook Air.
Since the redesign of the MacBook Air in 2010, the overall look and feel has stayed virtually the same. While the Mini DisplayPort connector on the side became a Thunderbolt connector in 2011, things are still pretty much the same.
In this way, the 2013 MacBook Air should provide no surprises. The one visual difference I can notice involves upgrading the microphone on the left side to a stereo array, causing there to be two grilles this time, instead of one. However, the faults I found in the past with the MacBook Air have nothing to do with the aesthetics or build quality of the device, so I am not too disappointed by the design stagnation.
From an industrial design perspective, everything about this notebook feels familiar to me, which is a positive. I still believe that Apple’s trackpad implementation is the best I've used, and the backlit chiclet keyboard they have been using for years is a good compromise between thickness and key travel.
OpenCL Support in a Meaningful Way
Adobe has had OpenCL support since last year. You would never have benefited from its inclusion unless you ran one of two AMD mobility chips under Mac OS X Lion, but it was there. Creative Cloud, predictably, furthers this trend with additional GPGPU support for applications like Photoshop and Premiere Pro.
This leads to some interesting points:
- How OpenCL is changing the landscape between Intel and AMD
- What GPU support is curiously absent from Adobe CC for one reason or another
- Which GPUs are supported despite not... existing, officially.
This should be very big news for our readers who do production work whether professional or for a hobby. If not, how about a little information about certain GPUs that are designed to compete with the GeForce 700-series?
A new era for computing? Or, just a bit of catching up?
Early Tuesday, at 2am for viewers in eastern North America, Intel delivered their Computex 2013 keynote to officially kick off Haswell. Unlike ASUS the night prior, Intel did not announce a barrage of new products; the purpose was to promote future technologies and the new products of their OEM and ODM partners. In all, there was a pretty wide variety of discussed topics.
Intel carried on with the computational era analogy: the 80's were dominated by mainframes; the 90's were predominantly client-server; and the 2000's brought the internet to the forefront. While true, they did not explicitly mention how each era never actually died but rather bled through: we still use mainframes, especially with cloud infrastructure; we still use client-server; and just about no one would argue that the internet has been displaced, despite its struggle against semi-native apps.
Intel believes that we are currently in the two-in-one era, by which they probably mean "multiple-in-one" due to devices such as the ASUS Transformer Book Trio. They created a tagline, almost a mantra, illustrating their vision:
"It's a laptop when you need it; it's a tablet when you want it."
But before elaborating, they wanted to discuss their position in the mobile market. They believe they are becoming a major player with key design wins and by outperforming some incumbent systems on a chip (SoCs). The upcoming Silvermont architecture aims to fill in the gaps below Haswell, driving smartphones and tablets and stretching upward to include entry-level notebooks and all-in-one PCs. The architecture promises to scale from three-fold more performance than the previous generation to a fifth of the power at equivalent performance.
Ryan discussed Silvermont last month; be sure to give his thoughts a browse for more depth.
Our first thoughts and impressions
Since first hearing about the Kickstarter project that raised nearly 2.5 million dollars from over 9,500 contributors, I have eagerly been awaiting the arrival of my Oculus Rift development kit. Not because I plan on quitting the hardware review business to start working on a new 3D, VR-ready gaming project but just because as a technology enthusiast I need to see the new, fun gadgets and what they might mean for the future of gaming.
I have read other users' accounts of their time with the Oculus Rift, including a great write-up in Q&A form from Ben Kuchera over at the Penny Arcade Report, but I needed my own hands-on time with the consumer-oriented VR (virtual reality) product. Having tried it for very short periods of time at both Quakecon 2012 and CES 2013 (less than 5 minutes), I wanted to see how it performed and, more importantly, how my body reacted to it.
I don't consider myself a person that gets motion sick. Really, I don't. I fly all the time, sit in the back of busses, ride roller coasters, watch 3D movies and play fast-paced PC games on large screens. The only instances where I tend to get any kind of unease with motion are on what I call "roundy-round" rides, the kind that simply go in circles over and over. Think of something like The Scrambler, or the Teacups at Disney World. How would I react to time with the Oculus Rift? That was my biggest fear...
For now I don't want to get into the politics of the Rift, how John Carmack was initially a huge proponent of the project then backed off on how close we might be to the higher-quality consumer version of the device. We'll cover those aspects in a future story. For now I only had time for some first impressions.
Watch the video above for a walk through of the development kit as well as some of the demos, as best can be demonstrated in a 2D plane!
Gaming on your Couch
Sometimes really unique products come across our doorstep and we just love to tell our readers about things that might normally fall outside the PC hardware field. The COUCHMASTER, essentially a piece of furniture made for gaming, is one of those items.
The COUCHMASTER, produced by a German company called Nerdytec, is a device built to help gamers use a mouse and keyboard while sitting on a couch and gaming in large screen environments. It has a pair of foam-stuffed side blocks that hold up a wood-constructed center panel that puts your mouse and keyboard at a comfortable angle.
Cable routing is made simple with removable Velcro panels under the keyboard and mouse, and some versions of the COUCHMASTER include a 4-port USB hub for connecting input devices, audio headsets, etc. The only things that didn't work in our testing were external hard drives - just not enough power coming from the USB 3.0 connection through the included extension cable.
I played the entirety of BioShock Infinite with the COUCHMASTER and, other than getting some odd looks from my wife, couldn't think of a more impressive and comfortable way to play PC games from a distance and without a standard desk setup.
I would love to see some changes like the addition of recessed drink holders on the sides, but otherwise, the only drawback to Nerdytec's COUCHMASTER is the price; it starts at $170 or so USD.
Check out the full video review posted below!!
UPDATE: The COUCHMASTER is now for sale in the US!
The Ice Storm Test
Love it or hate it, 3DMark has a unique place in the world of PC gaming and enthusiasts. Since 3DMark99 was released...in 1998...with a target on DirectX 6, Futuremark has been developing benchmarks on a regular basis in time with major API changes and also major hardware changes. The most recent release, 3DMark11, has been out since late 2010 and has been a regular part of our many graphics card reviews on PC Perspective.
Today Futuremark is not only releasing a new version of the benchmark but is also taking a fundamentally different approach to performance testing and platforms. The new 3DMark, called simply "3DMark", will target not only high-end gaming PCs but also integrated graphics platforms and even tablets and smartphones.
We interviewed the President of Futuremark, Oliver Baltuch, over the weekend and asked some questions about this new direction for 3DMark, how mobile devices are going to affect benchmarks going forward, the new results patterns, stuttering and more. Check out the video below!
Make no bones about it, this is a synthetic benchmark, and if you have had issues with that in the past because it is not a "real world" gaming test, you will continue to have those complaints. Personally, I find the information that 3DMark provides very informative, though it definitely shouldn't be depended on as the ONLY graphics performance metric.
CPU, Motherboard, GPU
If you want to take yourself seriously in this business, you HAVE to have an award show. PC Perspective is no different, and this week we held the first annual "Best Hardware of the Year" edition of the PC Perspective Podcast, in which we discussed, debated and selected the best hardware components and trends of the past 12 months. Sometimes we even went back into 2011 and sometimes we were talking about the future...don't worry, it will all make sense.
If you want to get the full experience of HOW we selected these products you should definitely check out episode #232 of the PC Perspective Podcast or just watch the video embedded right below. Watch as our editors throw each other under the bus as we collectively declare winners and runners up. We did not have nearly enough Christmas cheer to come to solid conclusions in every category, but we did our best. Next year, next year...
The categories we will award "Best Of" accolades in include CPU, Motherboard, GPU, Storage, Case, Price Drop and Upcoming Technology. We left some things out, like power supplies, coolers, etc., simply due to time, but perhaps if there is demand we can address them for 2013. Each of the winners will be given our "Editor's Choice" award regardless of what award it may or may not have received from us beforehand, or even if it was reviewed officially at all.
It is also important to note that these awards are not simply for the best performing or the best price/performance products in the category. As ambiguous as it sounds, we wanted to try to find the "best" in whatever way that means: cost, performance, marketability, effect on the ecosystem, etc. For a full breakdown of our thought process, the best place to start is the video link above.
Since the Apple transition to Intel processors and mostly off-the-shelf PC hardware in 2006, people have been attempting to run OS X on home-built computers originally destined for Windows. While running a different operating system on similar hardware may seem like a trivial thing, my historical experience with building a so-called “Hackintosh” has been arduous at times. However, since it has been a few years since my last attempt, I decided to give installing OS X on modern PC hardware another try.
Otellini will never live that one down...
One of the big stepping stones for OS X on PC-based motherboards was the widespread adoption of EFI instead of the standard BIOS environment. Official Intel Macs have always used EFI, which until a few years ago meant emulating the EFI environment on third-party motherboards to build a Hackintosh. That has changed recently, and since the release of Sandy Bridge we have seen full EFI support across all motherboard vendors.
The premier source for information about Hackintosh builds is the tonymacx86 site and forums. The forums on tonymacx86 are an extremely useful resource for learning about the current state of the Hackintosh scene and the experiences of people with similar hardware to what we will be using.
Tony publishes a yearly Buyer’s Guide article with components of all price ranges that will work with OS X with minimal hassle. He provides many different options in different price ranges in the 2012 guide, including H77, Z77, and even X79 based parts.
While it is technically possible to use AMD processors and graphics cards in a Hackintosh build, Apple officially supports Intel CPUs and NVIDIA Kepler GPUs, so they require much less work to ensure the operating system can fully utilize these components.
Windows Media Center Add-ons and Plugins – Page 1
Missed any installments of our Cutting the Cord Series? Catch up on them here:
- Cutting the Cord Part 1: The Assessment
- Cutting the Cord Part 2: Building your HTPC – The Hardware
- Cutting the Cord Part 3: Building your HTPC – OS Install and Tuning
- Cutting the Cord Part 4: Building your HTPC – Installing and Configuring Windows Media Center
- Cutting the Cord Part 5: Wrap up - Media Center Add-ons and Options
Now that we have our Windows Media Center up and running, we can investigate a few additional add-ons and plugins that can further improve upon the experience you can get from your Media Center. In addition to discussing some great add-ons, I’m going to discuss how well our HTPC build has done with our power efficiency goals, so without further ado let’s jump right into it!
My Experience: The add-ons and plug-ins that I’m going to walk through are by no means all that’s out there. There are tons of add-ons that will add anything from local weather to full overlays for your movie collection. One thing to keep in mind is that any add-on or plugin can completely bork up your Media Center. Always test the add-on on another box first, or even better, do a full image/backup of your Media Center before you try any new add-on or plugin. You do have a full image of your brand new Media Center build on another machine that you can re-image your HTPC with, right? (Check out Clonezilla or Acronis True Image if not…)
Windows Media Center Add-ons and Plugins
Windows Media Center is excellent right out of the box, but there are a few add-ons and plugins I like to add to our Media Center to give us some additional functionality and increased usability. By a wide margin, the one we use the most is Netflix.
Back when Netflix was a scrappy newcomer, trying to get subscribers, they were putting their client on every device and platform that would talk to them. They worked out a deal with Microsoft to have the Netflix client pre-installed right into Windows Media Center menu.
My Experience: The built-in application was apparently a joint project between Microsoft and Netflix, which may seem great but has actually turned out to be a quagmire of finger pointing. The application has not been updated since it was originally released, and both companies have washed their hands of it, each pointing to the other as responsible for it. The UI badly needs a facelift, in particular the way you navigate through titles that have multiple seasons. While all seasons of a title will show up as a single entry in your Instant Queue, there is no way to easily jump from season to season, and the only way to navigate episodes is to pull up an episode list that starts at Season 1, Episode 1, every time you open it. While this may not seem like a big deal, if you watch a show with a lot of episodes (like Cheers with 11 seasons and 275 episodes) you have to scroll past every single prior episode to get to the next one you want to watch. Clicking the down arrow on your remote over 200 times to get to the next episode not only gets old real fast, but eats batteries like mad.
Episode list problems aside, we still use Netflix on a daily basis, and it’s relatively easy to set up. First, scroll up to the “Movies” line and select the Netflix tile.
You’ll be greeted with a full Netflix splash screen. Put a check in the “I have read and understand the Terms of Service and Privacy Statement” checkbox which will then activate the “Install” button. Click on Install and off we go.