High Bandwidth Memory
UPDATE: I have embedded an excerpt from our PC Perspective Podcast that discusses the HBM technology that you might want to check out in addition to the story below.
The chances are good that if you have been reading PC Perspective or almost any other website that focuses on GPU technologies for the past year, you have read the acronym HBM. You might have even seen its full name: high bandwidth memory. HBM is a new technology that aims to turn the way a processor (GPU, CPU, APU, etc.) accesses memory upside down, almost literally. AMD has already publicly stated that its next generation flagship Radeon GPU will use HBM as part of its design, but it wasn’t until today that we could talk about what HBM actually offers to a high performance processor like Fiji. At its core, HBM drastically changes how the memory interface works, how much power it requires, and what metrics we will use to compare competing memory architectures. AMD and its partners started working on HBM with the industry more than 7 years ago, and with the first retail product nearly ready to ship, it’s time to learn about HBM.
We got some time with AMD’s Joe Macri, Corporate Vice President and Product CTO, to talk about AMD’s move to HBM and how it will shift the direction of AMD products going forward.
The first step in understanding HBM is to understand why it’s needed in the first place. Current GPUs, including the AMD Radeon R9 290X and the NVIDIA GeForce GTX 980, utilize a memory technology known as GDDR5. This architecture has scaled well over the past several GPU generations, but we are starting to enter the world of diminishing returns. Balancing memory performance and power consumption is always a tough battle; just ask ARM about it. On the desktop component side we have much larger power envelopes to work inside, but the power curve that GDDR5 is on will soon hit a wall if you plot it far enough into the future. The result would be either drastically higher-power graphics cards or stalling performance improvements in the graphics market – something we have not really seen in its history.
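To see why "wide and slow" beats "narrow and fast," a rough peak-bandwidth calculation helps. The sketch below uses approximate, illustrative numbers (a 512-bit GDDR5 bus at 5 Gbps per pin, and a 1024-bit HBM stack at 1 Gbps per pin); actual products vary.

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width (bits) times per-pin data rate."""
    return bus_width_bits * data_rate_gbps / 8

# GDDR5 as on an R9 290X-class card: 512-bit bus at 5 Gbps per pin (approximate)
gddr5 = bandwidth_gbs(512, 5.0)       # 320 GB/s

# One HBM stack: a very wide 1024-bit bus at a modest 1 Gbps per pin
hbm_stack = bandwidth_gbs(1024, 1.0)  # 128 GB/s per stack

print(gddr5, hbm_stack, 4 * hbm_stack)  # 320.0 128.0 512.0
```

The point of the exercise: by making the bus enormously wide, HBM can run each pin far slower (and at lower voltage) while still exceeding GDDR5 in total bandwidth, which is where the power savings come from.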
While it’s clearly possible that current, and maybe even next-generation, GPU designs could still depend on GDDR5 as the memory interface, the move to a different solution is needed for the future; AMD is just making the jump earlier than the rest of the industry.
Introduction and First Impressions
The ASUS ROG Gladius mouse features sleek styling and customizable lighting effects, but the biggest draw is the underlying technology. With socketed Omron switches designed to be easily swapped and an adjustable 6400 DPI optical sensor, this gaming mouse offers a lot on paper. So how does it feel? Let's find out.
There are a few aspects to the way a mouse feels, including the shape, surface material, and overall weight. Beyond the physical properties there is the speed and accuracy of the sensor (which also affects hand movement) and of course the mouse buttons and scroll wheel. Really, there's a lot going on with a modern gaming mouse - a far cry from the "X-Y position indicator" that the inventors had nicknamed "mouse" in the 1960s.
One of the hallmarks of the ASUS ROG (Republic of Gamers) lineup is the sheer amount of additional features the products tend to have. I use an ROG motherboard in my personal system, and even my micro-ATX board is stuffed with additional functionality (and the box is loaded with accessories). So it came as no surprise to me when I opened the Gladius mouse and began to look it over. Sure, the box contents aren't as numerous as one of the Maximus motherboards, but there's still quite a bit more than I've encountered with a mouse before.
Or: How to fall asleep at work.
I will be the first to admit that just a couple of weeks ago I had zero need for a gaming chair. But when 4GamerGear.com offered to send us one of the AKracing AK-6014 ergonomic executive units, I agreed. I went out of town for a week and returned to find the chair assembled and already in use by Ken, our go-to video editor and engineer. Of course I had to steal it from him to take part in the "review" process, and the results were great! As I sit here now in my Ikea office chair, writing up this post, while Ken sits just feet away tapping away on some edits, I can't help but want to use my authority to take it back.
The price is steep, but the added comfort you get from a chair like the AKracing model we tested is substantial. Every part of the design, based on a racing seat for a car, is built to keep you in place. But instead of preventing the lateral movements caused by taking corners, this one is more about keeping your butt in place and your back straight to encourage good posture. The arm rests are height-adjustable (as is the seat itself, of course) and the back reclines for different desk and resting positions. You can lay it PAST flat for naps, if you're into that kind of thing.
You can find these chairs for sale on Amazon in different color combinations with a current price of $349. It's expensive, I can't deny that. But it looks and feels way cooler than what you are sitting in right now. And aren't you worth it?
No Longer the Media Center of Attention
Gabe Aul, of Microsoft's Windows Insiders program, has confirmed on Twitter that Windows 10 will drop support for Windows Media Center due to a decline in usage. This is not surprising news as Microsoft has been deprecating the Media Center application for a while now. In Windows 8.x, the application required both the “Pro” SKU of the operating system, and then users needed to install an optional add-on above and beyond that. The Media Center Pack cost $10 over the price of Windows 8.x Pro unless you claimed a free license in the promotional period surrounding Windows 8's launch.
While Media Center has been officially abandoned, its influence on the industry (and vice versa) is an interesting story. For a time, it looked like Microsoft had bigger plans that were killed by outside factors, and other companies seem to be eyeing the money that Microsoft left on the table.
There will be some speculation here.
We could go back to the days of WebTV, but we won't. All you need to know is that Microsoft lusted over the living room for years. Windows owned the office, and PC gaming was taking off with strong titles (and technologies) from Blizzard, Epic, id, Valve, and others. DirectX was beloved by developers, which led to the original Xbox. The console did not get a lot of traction, but Microsoft treated it as a first-generation product that was trying to acquire a foothold late in a console generation. Financially, the first Xbox would cost Microsoft almost four billion dollars more than it made.
At the same time, Microsoft was preparing Windows to enter the living room. Windows was the company's powerhouse, and it acquired significant market share wherever it went, due to its ease of development and its never-ending supply of OEMs, even if the interface itself was subpar. The first attempt at bringing Windows to the living room was Windows XP Media Center Edition. This spin-off of Windows XP could only be acquired by OEMs to integrate into home theater PCs (HTPCs). The vision was interesting: using OEM competition to rapidly prototype what users actually want in a PC attached to a TV.
This leads us to Windows Vista, which is where Media Center came together while the OS fell apart.
Who Should Care? Thankfully, Many People
The Khronos Group has made three announcements today: Vulkan (their competitor to DirectX 12), OpenCL 2.1, and SPIR-V. Because there is actually significant overlap, we will discuss them in a single post rather than splitting them up. Each has a role in the overall goal to access and utilize graphics and compute devices.
Before we get into what everything is and does, let's give you a little tease to keep you reading. First, Khronos designs their technologies to be self-reliant. As such, while there will be some minimum hardware requirements, the OS pretty much just needs to have a driver model. Vulkan will not be limited to Windows 10 and similar operating systems. If a graphics vendor wants to go through the trouble, which is a gigantic if, Vulkan can be shimmed into Windows 8.x, Windows 7, possibly Windows Vista despite its quirks, and maybe even Windows XP. The words “and beyond” came up after Windows XP, but don't hold your breath for Windows ME or anything. Again, the further back in Windows versions you get, the larger the “if” becomes but at least the API will not have any “artificial limitations”.
Outside of Windows, the Khronos Group is the dominant API curator. Expect Vulkan on Linux, Mac, mobile operating systems, embedded operating systems, and probably a few toasters somewhere.
On that topic: there will not be a “Vulkan ES”. Vulkan is Vulkan, and it will run on desktop, mobile, VR, consoles that are open enough, and even cars and robotics. From a hardware side, the API requires a minimum of OpenGL ES 3.1 support. This is fairly high-end for mobile GPUs, but it is the first mobile spec to require compute shaders, which are an essential component of Vulkan. The presenter did not state a minimum hardware requirement for desktop GPUs, but he treated it like a non-issue. Graphics vendors will need to be the ones making the announcements in the end, though.
The ASUS STRIX TACTIC PRO is a premium mechanical gaming keyboard featuring Cherry MX Brown switches and some serious style.
Keyboards are a very personal thing, and as this is one of the three primary interfaces with the system itself (along with the mouse and display), feel will help decide the experience. Without a doubt, mechanical keyboards have become very popular with enthusiasts, but as more manufacturers have started offering them - and the market has begun to saturate - it becomes much more difficult to pick a starting point if you're new to the game. To further complicate a buying decision, there are different types of key switches used in these keyboards, and each variety has its own properties and unique feel.
And on the subject of key switches, this particular keyboard is built with the Brown variety of the Cherry MX switches, and ASUS offers the STRIX TACTIC PRO with Cherry MX Black, Blue, and Red switches as well. Our own Scott Michaud covered the topic of key switches in great detail last year, and that article is a great starting point that explains the different types of switches available and how they differ.
The Cherry MX Brown switch in action
I'll go into the feel of the keyboard on the next page, but quickly I'll say that MX Brown switches have a good feel without being too "clicky", but they are certainly more stiff feeling than a typical membrane keyboard. While it's impossible to really describe how the keyboard will feel to a particular user, we can certainly cover the features and performance of this keyboard to help with a purchasing decision in this crowded market. At $150 the STRIX TACTIC PRO carries a premium price, but as you'll see this is also a premium product.
Introducing Windows 10 (Again)
I did not exactly make too many unsafe predictions, but let's recap the Windows 10 Consumer announcement anyway. The briefing was a bit on the slow side, at least if you are used to E3 keynotes, but it contained a fair amount of useful information. Some of the things discussed are future-oriented, but some will arrive soon. So let's get right into it.
Price and Upgrade Options
Microsoft has not announced an official price for Windows 10, if the intent is to install it on a new PC. If you are attempting to upgrade a machine that currently runs Windows 7 or Windows 8.1, then that will be a free upgrade if done within the first year. Windows Phone 8.1 users are also eligible for a no-cost upgrade to Windows 10 if done in the first year.
To quote Terry Myerson of Microsoft: “Once a device is upgraded to Windows 10, we will be keeping it current for the supported lifetime of the device.” This was not elaborated on, but it seems like a strange statement given what we have traditionally expected from Windows. One possible explanation is that Microsoft intends for Windows to become a subscription service going forward, which would be the most obvious extension of “Windows as a Service”. On the other hand, they could be going for the per-device revenue option, with Bing, the Windows Store, and other initiatives providing the long tail. If so, I am a bit confused about what constitutes a new device for systems that are regularly upgraded, like the ones our readers are typically interested in. All of that will eventually be made clear, but not yet.
A New Build for Windows 10
Late in the keynote, Microsoft announced the availability of new preview builds for Windows 10. This time, users of Windows Phone 8.1 will also be able to see the work in progress. PC “Insiders” will get access to their build “in the next week” and phones will get access “in February”. Ars Technica seems to believe that this is scheduled for Sunday, February 1st, which is a really weird time to release a build, but their source might be right.
We don't know exactly what will be in it, though. In my predictions, I guessed that a DirectX 12 SDK might be available (or at least some demos) in the next build. That was not mentioned, and it probably would have been if it were true. I expect the next possibility (if we're not surprised in the next one-to-ten days when the build drops) is the Game Developers Conference (GDC 2015), which starts on March 2nd.
The New Web Browser: Project Spartan
My guess was that Spartan would be based on DirectX 12. Joe Belfiore said that it is using a new, standards-compliant rendering engine and basically nothing more. The event focused on specific features. The first is note taking, which basically turns the web browser into a telestrator that can also accept keyboard comment blocks. The second is a reading mode that alters content into a Microsoft Word-like column. The third is “reading lists”, which is basically a “read it later” feature that does offline caching. The fourth is Adobe PDF support, which works with the other features of Spartan such as note taking and reading lists.
Which Transitions Into Cortana
The fifth feature of Spartan is Cortana integration, which will provide auto-suggestions based on the information that the assistant software has. The example they provided was auto-suggesting the website for the presenter's wife's flight. Surprisingly, when you talk to Cortana inside Spartan, she does not say “There's two of us in here now, remember?” You know, in an attempt to let you know she's a service that's integrated into the browser.
Otherwise, it's an interesting demo. I might even end up using it when it comes out, but these sorts of things do not really interest me too much. We have been at the point where, for my usage, the operating system is really not in the way anymore. It feels like there is very little friction between me and getting what I want done, done. Of course, people felt that way about rotary phones until touch-tone came out, and I keep an open mind to better methods. It's just hard to get me excited about voice-activated digital assistants.
As I stated before, DirectX 12 was mentioned but a release date was not confirmed. What they did mention was a bit of relative performance. DirectX 12 supposedly uses about half of the power consumption of DirectX 11, which is particularly great for mobile applications. It can also handle scenes with many more objects. A Futuremark demo was displayed, with the DirectX 11 version running alongside a DirectX 12 version. The models seem fairly simple, but the DirectX 12 version appears to be running at over 100 FPS when the DirectX 11 version outright fails.
Other gaming features were mentioned. First, Windows 10 will allow shadow recording of the last 30 seconds of footage from any game. You might think that NVIDIA would be upset about that, and they might be, but that is significantly less time than ShadowPlay or other recording methods offer. Second, the Xbox One will be able to stream gameplay to any PC in your house. I expect this is the opposite of what people hoped for, most wishing instead for high-quality PC footage to be easily streamed to TVs with a simple interface. It will probably serve a purpose for some use case, though.
Well that was a pretty long event, clocking in at almost two-and-a-half hours. The end had a surprise announcement of an augmented reality (not virtual reality) headset, called the “HoloLens”, which is developed by the Kinect team. I am deliberately not elaborating on it because I was not at the event and I have not tried it. I will say that the most interesting part about it, for me, is the Skype integration, because that probably hints at Microsoft's intentions with the product.
For the rest of us, it touched on a number of interesting features but, like the Enterprise event, did not really dive in. It would have been nice to get some technical details about DirectX 12, but that obviously does not cater to the intended audience. Unless an upcoming build soft-launches a DirectX 12 preview (or Spartan) so that we can do our own discovery, we will probably need to wait until GDC and/or BUILD to find out more.
Until then, you could watch the on-demand version at Microsoft's website.
Meet the M320
Logitech is a brand synonymous with mice, joysticks, and other peripherals, having provided handy ways to interact with your computer for over 20 years. Anyone who has used a computer for any amount of time knows Logitech and has used a variety of their products. Their peripheral lineup has come a long way from those beginnings, with washable keyboards, webcams, and mice with over two dozen programmable buttons.
In this case we are looking at the M320 Wireless Mouse, with three buttons and a scroll wheel, a rubberized grip shaped for the right hand, and an offset optical sensor with 1000 DPI resolution.
The Logitech M320 comes in a user-friendly clamshell package with a cut-out flap on the back, which is actually effective at opening the packaging without the need for a utility knife or a couple of stitches on your hand. Perhaps even more impressive is the fact that it ships with a battery included; not the rechargeable kind, but certainly a nice touch for those of us who remember receiving toys that were unusable until someone made a trip to the store to pick up the required mix of AAA's, D's, or 9V's. The documentation claims the battery will last for two years, and while there was obviously no way to put that to the test, the automatic sleep mode and physical power switch should ensure that your battery life will not be inconveniently short.
Finding Your Clique
One of the difficulties with purchasing a mechanical keyboard is that they are quite expensive and vary greatly in subtle, but important ways. First and foremost, we have the different types of keyswitches. These are the components that are responsible for making each button behave, and thus varying them will lead to variations in how those buttons react and feel.
Until recently, the Cherry MX line of switches were the basis of just about every major gaming mechanical keyboard, although we will discuss recent competitors later on. Its manufacturer, Cherry Corp / ZF Electronics, maintained a strict color code to denote the physical properties of each switch. These attributes range from the stiffness of the spring to the bumps and clicks felt (or heard) as the key travels toward its bottom and returns back up again.
| Actuation Force | Linear | Tactile | Clicky |
| --- | --- | --- | --- |
| 45 cN | Cherry MX Red | Cherry MX Brown | Cherry MX Blue, Cherry MX White (old B) |
| 55 cN | | Cherry MX Clear | |
| 60 cN | Cherry MX Black | | |
| 80 cN | Cherry MX Linear Grey (SB) | Cherry MX Tactile Grey (SB) | Cherry MX Green (SB), Cherry MX White (old A), Cherry MX White (2007+) |
| 90 cN | | | IBM Model M (not mechanical) |
| 105 cN | | | Cherry MX Click Grey (SB) |
| 150+ cN | Cherry MX Super Black | | |
(SB) Denotes switches with stronger springs that are primarily for, or only for, Spacebars. The Click Grey is intended for spacebars on Cherry MX White, Green, and Blue keyboards. The MX Green is intended for spacebars on Cherry MX Blue keyboards (but a few rare keyboards use these for regular keys). The MX Linear Grey is intended for spacebars on Cherry MX Black keyboards.
The four main Cherry MX switches are: Blue, Brown, Black, and Red. Other switches are available, such as the Cherry MX Green, Clear, three types of Grey, and so forth. You can separate (I believe) all of these switches into three categories: Linear, Tactile, and Clicky. From there, the only difference is the force curve, usually from the strength of the spring but also possibly from the slider features (you'll see what I mean in the diagrams below).
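A small lookup built from the force table above makes the three-category idea concrete. The switch selection and force values here are taken from that table; the helper function and its name are just for illustration.

```python
# Each switch maps to (actuation force in cN, feel category), per the table above.
CHERRY_MX = {
    "Red":   (45, "linear"),
    "Brown": (45, "tactile"),
    "Blue":  (45, "clicky"),
    "Clear": (55, "tactile"),
    "Black": (60, "linear"),
    "Green": (80, "clicky"),
}

def switches_by_feel(feel):
    """Return switch names of a given category, lightest spring first."""
    return sorted((name for name, (force, cat) in CHERRY_MX.items() if cat == feel),
                  key=lambda name: CHERRY_MX[name][0])

print(switches_by_feel("linear"))  # ['Red', 'Black']
```

Organized this way, the buying decision collapses to two questions: which feel category do you want, and how stiff a spring can you tolerate?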
If you’re a fan of digital video and music, you’ve likely heard the name “Plex” floating around. Plex (not to be confused with EVE Online’s in-game subscription commodity) is free media center software that lets users manage and stream a wide array of videos, audio files, and pictures to virtually any computer and a growing number of mobile devices and electronics. As a Plex user from the very beginning, I’ve seen the software change and evolve over the years into the versatile and powerful service it is today.
My goal with this article is twofold. First, as an avid Plex user, I’d like to introduce the software to users who have yet to hear about or try it. Second, for those already using or experimenting with Plex, I hope that I can provide some “best practices” when it comes to configuring your servers, managing your media, or just using the software in general.
Before we dive into the technical aspects of Plex, let’s look at a brief overview of the software’s history and the main components that comprise the Plex ecosystem today.
Although now widely supported on a range of platforms, Plex was born in early 2008 as an OS X fork of the Xbox Media Center project (XBMC). Lovingly named “OSXBMC” (get it?) by its creators, the software was initially a simple media player for Mac, with roughly the same capabilities as the XBMC project from which it was derived. (Note: XBMC changed its name to “Kodi” in August, although you’ll still find plenty of people referring to the software by its original name).
A few months into the project, the OSXBMC team decided to change the name to “Plex” and things really started to take off for the nascent media software. Unlike the XBMC/Kodi community, which focused its efforts primarily on the playback client, the Plex team decided to bifurcate the project with two distinct components: a dedicated media server and a dedicated playback client.
The dedicated media server made Plex unique among its media center peers. Once properly set up, it gave users with very little technical knowledge the ability to maintain a server that was capable of delivering their movies, TV shows, music, and pictures on demand throughout the house and, later, the world. We'll take a more detailed look at each of the Plex components next.
The “brains” behind the entire Plex ecosystem is Plex Media Server (PMS). This software, available for Windows, Linux, and OS X, manages your media database, metadata, and any necessary transcoding, which is one of its best features. Although far from error-free, the PMS encoding engine can convert virtually any video codec and container on the fly to a format requested by a client device. Want to play a high-bitrate 1080p MKV file with a 7.1 DTS-HD MA soundtrack on your Roku? No problem; Plex will seamlessly transcode that high quality source file to the proper format for Roku, as well as your iPad, or your Galaxy S5, and many other devices, all without having to store multiple copies of your video files.
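The server's core decision, deciding whether a client can play a file as-is or needs it transcoded, can be sketched in a few lines. To be clear, this is not Plex's actual logic or API; the function and profile names are hypothetical, and real servers also weigh bitrate, resolution, and subtitle support.

```python
# A sketch (NOT Plex's real implementation) of the direct-play vs. transcode
# decision: if the client supports the file's container and codecs, send the
# file untouched; otherwise pick the client's preferred target format.

def plan_playback(media, client):
    """Return 'direct play' or a dict describing the transcode target."""
    if (media["container"] in client["containers"]
            and media["vcodec"] in client["vcodecs"]
            and media["acodec"] in client["acodecs"]):
        return "direct play"
    return {"vcodec": client["vcodecs"][0],
            "acodec": client["acodecs"][0],
            "container": client["containers"][0]}

# The article's example: a DTS-HD MKV headed for a Roku-like client
mkv_dts = {"container": "mkv", "vcodec": "h264", "acodec": "dts-hd"}
roku = {"containers": ["mp4"], "vcodecs": ["h264"], "acodecs": ["aac", "ac3"]}
print(plan_playback(mkv_dts, roku))
# {'vcodec': 'h264', 'acodec': 'aac', 'container': 'mp4'}
```

The payoff of doing this server-side is exactly what the paragraph describes: one high-quality source file on disk, with per-device formats generated on demand instead of stored.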
It's that time of year again! The time when those of us lucky enough to have the means get to share the best in technology with our friends and family. You are already the family IT manager, so why not help spread the holiday cheer by picking up some items for them, and hey...maybe for you. :)
This year we are going to break up the guide into categories. We'll have a page dedicated to PC components, one for mobile devices like notebooks and tablets, and one for PC accessories. Then, after those specific categories, we'll have an open-ended collection of pages where each PC Perspective team member can throw in some wildcards.
We thank you for your support of PC Perspective through all of 2014. The links included below embed our affiliate code to Amazon.com (when applicable), and if you are doing other shopping for the holidays this year we would appreciate it if you used the button above before perusing Amazon.com. In case you want to know the affiliate code directly, it is: pcper0a4-20.
Intel Core i7-4790K Haswell Processor
Last year our pick for the best high-performance processor was the Core i7-4770K, and it sold for $379. This year we have a part running 500 MHz faster, though at higher power, for $80 less. If you are still waiting for a time to upgrade your processor (and hey, games will need more cores sooner rather than later!), the Core i7-4790K looks like a great option and now looks like a great time.
NVIDIA GeForce GTX 980 4GB
Likely the most controversial selection in our gift guide, the GeForce GTX 980 is an interesting product. It's expensive compared to other options from AMD like the Radeon R9 290X or even the R9 290, but it is also a better performing part; just not by much. Our selection of the GTX 980 stems from other things: G-Sync support, game bundles with Far Cry 4 and The Crew available, GeForce Experience, driver stability and frequency, etc. The GTX 970 is another good choice along these lines, but as you'll see below...AMD has a strong contender as well.
It could be a good... start.
So this is what happens when you install pre-release software on a production machine.
Sure, I only trusted it as far as a second SSD with Windows 7 installed, but it would be fair to say that I immersed myself in the experience. It was also not the first time that I evaluated upcoming Microsoft OSes on my main machine, having done the same for Windows Vista and Windows 7 while both were in development. Windows 8 was the odd one out; it was relegated to my laptop. In this case, I was in the market for a new SSD and was thus willing to give it a chance, versus installing Windows 7 again.
So far, my experience has been roughly positive. The first two builds have been glitchy. In the first three days, I have rebooted my computer more times than I have all year (which is about 1-2 times per month). It could be the Windows Key + Arrow Key combinations dropping randomly, Razer Synapse deciding to go on strike a couple of times until I reinstall it, the four-or-so reboots required to install a new build, and so forth. You then also have the occasional issue of a Windows service (or DWM.exe) deciding that it would max out a core or two.
But it is pre-release software! That is all stuff to ignore. The only reason I am even mentioning it is so people do not follow in my footsteps and install it on their production machines, unless they are willing to have pockets of downtime here or there. Even then, the latest build, 9879, has been fairly stable. It has been installed all day and has not given me a single issue. This is good, because it is the last build we will get until 2015.
What we will not ignore is the features. For the first two builds, it was annoying to use with multiple monitors. Supposedly to make it easier to align items, mouse cursors would remain locked inside each monitor's boundary until you provide enough velocity to have it escape to the next one. This was the case with Windows 8.1 as well, but you were given registry entries to disable the feature. Those keys did not work with Windows 10. But, with Build 9879, that seems to have been disabled unless you are currently dragging a window. In this case, a quick movement would pull windows between monitors, while a slow movement would perform a Snap.
This is me getting ready to snap a window on the edge between two monitors with just my mouse.
In a single build, they turned this feature from something I wanted to disable, to something that actually performs better (in my opinion) than Windows 7. It feels great.
Now on to a not-so-pleasant experience: updating builds.
Simply put, you can click "Check Now" and "Download Update" all that you want, but it will just sit there doing nothing until it feels like it. During the update from 9860 to 9879, I was waiting with the PC Settings app open for three hours. At some point, I got suspicious and decided to monitor network traffic: nothing. So I did the close app, open app, re-check dance a few times, and eventually gave up. About a half of an hour after I closed PC Settings the last time, my network traffic spiked to the maximum that my internet allows, which task manager said was going to a Windows service.
Shortly after, I was given the option to install the update. After finishing what I was doing, I clicked the install button and... it didn't seem to do anything. After about a half of an hour, it prompted me to restart my computer with a full screen message that you cannot click past to save your open windows - it is do it now or postpone it for one or more hours; there is no in-between. About another twenty minutes (and four-or-five reboots) after I chose to reboot, I was back up and running.
Is that okay? Sure. When you update, you clearly need to do stuff and that could take your computer several minutes. It would be unrealistic to complain about a 20-minute install. The only real problem is that it waits for extended periods of time doing nothing (measured, literally nothing) until it decides that the time is right, and that time is NOW! It may have been three hours after you originally cared, but the time is NOW!
Come on Microsoft, let us know what is going on behind the scenes, and give us reliable options to pause or suspend the process before the big commitment moments.
So that is where I am, one highly positive experience and one slightly annoying one. Despite my concerns about Windows Store (which I have discussed at length in the past and are still valid) this operating system seems to be on a great path. It is a work in progress. I will keep you up to date, as my machine is kept up to date.
One Small Step
While most articles surrounding the iPhone 6 and iPhone 6 Plus thus far have focused on user experience and the larger screen sizes, performance, and in particular the effect of Apple's transition to the 20nm process node for the A8 SoC, has been our main question regarding these new phones. Naturally, I decided to put my personal iPhone 6 through our usual round of benchmarks.
First, let's start with 3DMark.
Comparing the 3DMark scores of the new Apple A8 to even the last generation A7 shows a smaller improvement than we are used to seeing generation-to-generation with Apple's custom ARM implementations. And when you compare the A8 to something like the NVIDIA Tegra K1, which utilizes desktop-class GPU cores, the K1's overall score blows Apple out of the water. Even taking a look at the CPU-bound Physics score, the K1 is still the winner.
A 78% performance advantage in overall score when compared to the A8 shows just how much of a powerhouse NVIDIA has with the K1. (Though clearly power envelopes are another matter entirely.)
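For readers who want to sanity-check figures like that 78%, the relative-advantage arithmetic is simple. The scores below are illustrative placeholders, not our measured results.

```python
def advantage_pct(a, b):
    """How much faster score a is than score b, as a percentage."""
    return (a - b) / b * 100

# Hypothetical 3DMark-style overall scores (not actual measurements):
k1, a8 = 31_000, 17_400
print(round(advantage_pct(k1, a8)))  # 78
```

Note the asymmetry: a 78% lead for the K1 over the A8 is the same gap as the A8 being about 44% slower than the K1, which is why the baseline always matters when quoting percentages.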
If we look at more CPU benchmarks, like the browser-based Google Octane and SunSpider tests, the A8 starts to shine more.
While the A8 edges out the A7 to be the best performing device and is 54% faster than the K1 in SunSpider, the A8 and K1 are neck and neck in the Google Octane benchmark.
Moving back to a graphics-heavy benchmark, GFXBench's Manhattan test, the Tegra K1 has a 75% performance advantage over the A8, though the A8 is itself 36% faster than the previous A7 silicon.
These early results are certainly a disappointment compared to the usual generation-to-generation performance increase we see with Apple SoCs.
However, the other aspect to look at is power efficiency. With normal use I have noticed a substantial increase in battery life on my iPhone 6 over the last-generation iPhone 5S. While this may be partly due to a small (about 1 Wh) increase in battery capacity, I think more can be credited to this being an overall more efficient device. Choices like sticking to a highly optimized dual-core CPU design and quad-core GPU, as well as the move to the 20nm process node, all contribute to increased battery life while still surpassing the performance of the last-generation Apple A7.
In that way, the A8 moves the bar forward for Apple and is a solid first attempt at using 20nm silicon technology at TSMC. There is strong potential that with further refined parts (like the expected A8X for the iPad revisions), Apple will be able to further surpass 28nm silicon in both performance and efficiency.
Optical + Accelerometer
When I met with Logitech while setting up for our Hardware Workshop at Quakecon this year, they wanted to show me a new mouse they were coming out with. Of course I was interested, but to be honest, mice have seemingly gotten to a point where I can very rarely tell them apart in terms of performance. Logitech promised me this one would be different. The catch? The G402 Hyperion Fury includes not just an optical sensor but an accelerometer and gyro combo.
Pretty much all mice today use optical sensors to generate movement data. The sensor essentially takes hundreds or thousands of photos of the surface of your desk or mouse pad and compares them to each other to measure how far and how fast you have moved your mouse. Your PC then reads that data from the mouse at the USB polling rate, up to 1000 Hz with this mouse, and translates it into cursor movement on your desktop and in games.
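The frame-comparison idea can be sketched in a few lines. To be clear, this is an illustrative toy, not Logitech's actual sensor firmware: it slides one tiny "surface" frame over the previous one and picks the integer offset with the lowest average pixel difference, which is the displacement between captures.

```python
# Toy illustration of optical-sensor tracking (NOT real mouse firmware):
# estimate displacement by finding the shift that best aligns two
# consecutive grayscale "surface" frames.

def estimate_shift(prev, curr, max_shift=2):
    """Return the (dx, dy) integer shift that best aligns curr to prev."""
    h, w = len(prev), len(prev[0])
    best = (0, 0)
    best_err = float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err, count = 0, 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w:
                        err += (prev[y][x] - curr[sy][sx]) ** 2
                        count += 1
            err /= count  # normalize by overlap so larger shifts aren't penalized
            if err < best_err:
                best_err, best = err, (dx, dy)
    return best

# A toy 4x4 surface frame, then the same frame with contents moved one pixel right.
frame_a = [[10, 50, 90, 20],
           [30, 80, 40, 60],
           [70, 20, 50, 90],
           [40, 60, 10, 30]]
frame_b = [[row[-1]] + row[:-1] for row in frame_a]

print(estimate_shift(frame_a, frame_b))  # -> (1, 0): moved one pixel along x
```

A real sensor does this in dedicated silicon thousands of times per second, which is exactly why it runs out of headroom when the frames stop overlapping at very high speeds.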
There is an issue though - at very high speeds of mouse movement, the optical sensor can fail. It essentially loses track of where it is on the surface and can no longer provide accurate data back to the system. At this point, depending on the design of the mouse and driver, the mouse may stop sending data altogether or attempt to "guess" for a short period of time. Clearly that's not ideal, and it means that gamers (or any users, for that matter) are getting inaccurate measurements. Boo.
To be quite honest though, that doesn't happen with modern mice at standard speeds, or even standard "fast" gaming motions. According to Logitech, the optical sensor will start to lose tracking somewhere in the 150-180 IPS (inches per second) range. That's quite a lot. At the low end, 150 IPS works out to roughly 3.8 meters per second, or 8.5 miles per hour.
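As a quick sanity check of those figures, the unit conversion is straightforward:

```python
# Converting the sensor's quoted tracking limit (low end of 150-180 IPS)
# from inches per second to metric and mph.
ips = 150  # inches per second

meters_per_second = ips * 0.0254          # 1 inch = 0.0254 m
miles_per_hour = ips * 3600 / (12 * 5280)  # 12 in/ft, 5280 ft/mile

print(f"{meters_per_second:.1f} m/s")  # -> 3.8 m/s
print(f"{miles_per_hour:.1f} mph")     # -> 8.5 mph
```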
Introduction, Hardware, and Subjective Feel
This review comes before the end of the pre-order period. I targeted that deadline because the pre-order perks are quite significant. First, either version of the mouse is listed at about $50 off its MSRP (half price for the plastic version). EVGA also throws in a mouse pad for registering your purchase. The plastic mouse is $49.99 during its pre-order period ($99.99 MSRP) and its carbon fiber alternative is $79.99 ($129.99 MSRP). EVGA has supplied us with the plastic version for review.
Being left-handed really puts a damper on my choice of gaming mice. If the peripheral has thumb buttons, it needs to either be symmetric (because a right hand's thumb buttons would otherwise sit under my pinky or ring finger) or be an ergonomic, curved mouse that comes in a special mirrored version for lefties. The latter is an obvious risk for a manufacturer, especially when the market of left-handed gamers is further split by those who taught themselves to use right-handed mice.
Upgrades from Anker
Last year we started to accumulate a large number of mobile devices around the office, including smartphones, tablets, and even convertibles like the ASUS T100, all of which charge over USB connections. While it's not a hassle when you are charging one or two units at a time, having 6+ on our desks on any given day started to become a problem for our less numerous wall outlets. Our solution last year was Anker's E150 25 watt wall charger, which we covered in a short video overview.
It was great but had limitations, including different charging rates depending on the port you connected to, a limited total output of 5 amps across all five ports, and fixed outputs per port. Today we are taking a look at a pair of new Anker devices that implement smart ports, called PowerIQ, which let the battery and wall charger send as much power to the charging device as it requests, regardless of which physical port it is attached to.
We'll start with the updated Anker 40 watt 5-port wall charger and then move on to discuss the 3-port mobile battery charger, both of which share the PowerIQ feature.
Anker 40 watt 5-Port Wall Charger
The new Anker 5-port wall charger is actually smaller than the previous generation but offers superior specifications across the board. This unit can push out 40 watts total combined through all five USB ports: 5 volts at as much as 8 amps. We are told all 8 amps can in fact go through a single USB charging port if a device were to request that much, though nothing in our offices seems to draw more than 2.3A.
Any USB port can be used for any device on this new model; it doesn't matter where it plugs in. This greatly simplifies things from a user experience point of view, as you no longer have to hold the unit up to your face to read the tiny text that adorned the E150. With 8 amps spread across all five ports you should have more than enough power to charge all your devices at full speed. If you happen to have five iPads charging at the same time, that would exceed 8A, and each device's charge rate would be a bit lower.
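Anker hasn't published how PowerIQ actually arbitrates the shared budget, but one plausible policy, shown purely as an assumption here, is scaling every port's draw down proportionally whenever total demand exceeds the 8A limit:

```python
# Hypothetical sketch of splitting a shared current budget across ports.
# PowerIQ's real negotiation logic is not public; proportional scaling is
# just one simple policy that matches the behavior described above.

def allocate_current(requests, budget=8.0):
    """Scale per-port current requests (in amps) to fit a shared budget."""
    total = sum(requests)
    if total <= budget:
        return requests  # everything charges at full speed
    scale = budget / total
    return [round(r * scale, 2) for r in requests]

# Five iPads asking ~2.1A each request 10.5A total, over the 8A cap,
# so each one gets throttled to 1.6A.
print(allocate_current([2.1] * 5))  # -> [1.6, 1.6, 1.6, 1.6, 1.6]
```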
AM1 Walks New Ground
After Josh's initial review of the AMD AM1 Platform and the Athlon 5350, we received a few requests to look at gaming performance with a discrete GPU installed. Even though this platform isn't being aimed at gamers looking to play demanding titles, we started to investigate this setup anyway.
While Josh liked the ASUS AM1I-A Mini ITX motherboard he used in his review, with only a x1 PCI-E slot it would be less than ideal for this situation.
Luckily we had the Gigabyte AM1M-S2H Micro ATX motherboard, which features a full length PCI-E x16 slot, as well as 2 x1 slots.
Don't be fooled by the shape of the slot though: the AM1 platform still only offers 4 lanes of PCI-Express 2.0, which means the graphics card will not be running at full bandwidth. However, having the physical x16 slot makes it a lot easier to connect a discrete GPU without having to worry about the ribbon-cable risers that miners use.
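To put the bandwidth gap in numbers: PCI-Express 2.0 signals at 5 GT/s per lane with 8b/10b encoding, which works out to about 500 MB/s per lane in each direction.

```python
# Rough PCI-Express 2.0 bandwidth math for the x4 AM1 link vs a full x16 slot.
# 5 GT/s per lane, 8b/10b encoding (80% efficiency), 8 bits per byte.
GEN2_MBS_PER_LANE = 5000 * 8 / 10 / 8  # = 500 MB/s per lane per direction

x4_bandwidth = 4 * GEN2_MBS_PER_LANE    # what the AM1 slot actually provides
x16_bandwidth = 16 * GEN2_MBS_PER_LANE  # what a full Gen2 x16 link would offer

print(f"x4:  {x4_bandwidth:.0f} MB/s")   # -> 2000 MB/s
print(f"x16: {x16_bandwidth:.0f} MB/s")  # -> 8000 MB/s
```

A quarter of the link bandwidth sounds dramatic, but mid-range cards rarely saturate even a Gen2 x4 link in games, which is why this experiment is worth running at all.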
Athlon and Pentium Live On
Over the past year or so, we have taken a look at a few budget gaming builds here at PC Perspective. One of our objectives with these build guides was to show people that PC gaming can be cost competitive with console gaming, and at a much higher quality.
However, we haven't stopped pursuing our goal of the perfect inexpensive gaming PC, which is still capable of maxing out image quality settings on today's top games at 1080p.
Today we take a look at two new systems, featuring some parts which have been suggested to us after our previous articles.
| | AMD System | Intel System |
| --- | --- | --- |
| Processor | AMD Athlon X4 760K - $85 | Intel Pentium G3220 - $65 |
| Cores / Threads | 4 / 4 | 2 / 2 |
| Motherboard | Gigabyte F2A55M-HD2 - $60 | ASUS H81M-E - $60 |
| Graphics | MSI R9 270 Gaming - $180 | MSI R9 270 Gaming - $180 |
| System Memory | Corsair 8GB DDR3-1600 (1x8GB) - $73 | Corsair 8GB DDR3-1600 (1x8GB) - $73 |
| Hard Drive | Western Digital 1TB Caviar Green - $60 | Western Digital 1TB Caviar Green - $60 |
| Power Supply | Cooler Master GX 450W - $50 | Cooler Master GX 450W - $50 |
| Case | Cooler Master N200 MicroATX - $50 | Cooler Master N200 MicroATX - $50 |
(Editor's note: If you don't already have a copy of Windows, and don't plan on using Linux or SteamOS, you'll need an OEM copy of Windows 8.1 - currently selling for $98.)
These are low prices for a gaming computer, and feature some parts which many of you might not know a lot about. Let's take a deeper look at the two different platforms which we built upon.
First up is the AMD Athlon X4 760K. While you may not have known the Athlon brand was still being used on current parts, these chips represent an interesting part of the market. On the FM2 socket, the 760K is essentially a high-end Richland APU with the graphics portion of the chip disabled.
What this means is that if you are going to pair your processor with a discrete GPU anyway, you can skip paying extra for the integrated GPU.
As for the motherboard, we went for an ultra inexpensive A55 option from Gigabyte, the GA-F2A55M-HD2. This board features the A55 chipset which launched with the Llano APUs in 2011. Because of this older chipset, the board does not feature USB 3.0 or SATA 6G capability, but since we are only concerned about gaming performance here, it makes a great bare bones option.
1920x1080, 2560x1440, 3840x2160
Join us on March 11th at 9pm ET / 6pm PT for a LIVE Titanfall Game Stream! You can find us at http://www.pcper.com/live. You can subscribe to our mailing list to be alerted whenever we have a live event!!
We canceled the event due to the instability of Titanfall servers. We'll reschedule soon!!
With the release of Respawn's Titanfall upon us, many potential PC gamers are going to be looking for suggestions on compiling a list of parts targeted at a perfect Titanfall experience. The good news is that even with a fairly low investment in PC hardware, gamers will find the PC version of this title is definitely the premier way to play, as the compute power of the Xbox One just can't compete.
In this story we'll present three different build suggestions, each addressing a different target resolution while also offering better image quality settings than the Xbox One can deliver. We have options for 1080p (the best the Xbox One can offer), 2560x1440, and even 3840x2160, better known as 4K. In truth, the graphics horsepower required by Titanfall isn't overly extreme, and an entire PC build coming in under $800, including a full copy of Windows 8.1, is easy to accomplish.
Target 1: 1920x1080
First up is old reliable, the 1920x1080 resolution that most gamers still have on their primary gaming display. That could be a home theater style PC hooked up to a TV or monitors in sizes up to 27-in. Here is our build suggestion, followed by our explanations.
| Titanfall 1080p Build | |
| --- | --- |
| Processor | Intel Core i3-4330 - $137 |
| Motherboard | MSI H87-G43 - $96 |
| Memory | Corsair Vengeance LP 8GB 1600 MHz (2 x 4GB) - $89 |
| Graphics Card | EVGA GeForce GTX 750 Ti - $179 |
| Storage | Western Digital Blue 1TB - $59 |
| Case | Corsair 200R ATX Mid Tower Case - $72 |
| Power Supply | Corsair CX 500 watt - $49 |
| OS | Windows 8.1 OEM - $96 |
| Total Price | $781 - Amazon Full Cart |
Our first build comes in at $781 and includes some incredibly competent gaming hardware for the price. The Intel Core i3-4330 is a dual-core, HyperThreaded processor that provides more than enough capability to push Titanfall and all other major PC games on the market. The MSI H87 motherboard lacks some of the advanced features of the Z87 platform but does the job at a lower cost. The 8GB of Corsair memory, though not running at a high clock speed, provides more than enough capacity for any applications you could want to run.
What Mantle signifies about GPU architectures
Mantle is a very interesting concept. From the various keynote speeches, it sounds like the API is being designed to address the current state (and trajectory) of graphics processors. GPUs are generalized, highly parallel computation devices assisted by a little bit of specialized silicon where appropriate. The vendors have even settled on standards, such as IEEE 754 floating point arithmetic, which means the driver has much less reason to shield developers from the underlying architectures.
Still, Mantle is currently a private technology for an unknown number of developers. Without a public SDK, or anything beyond the half-dozen keynotes, we can only speculate on its specific attributes. I, for one, have technical questions and hunches which linger unanswered or unconfirmed, probably until the API is suitable for public development.
Or, until we just... ask AMD.
Our response came from Guennadi Riguer, the chief architect for Mantle. In it, he discusses the API's usage as a computation language, the future of the rendering pipeline, and whether there will be a day where Crossfire-like benefits can occur by leaving an older Mantle-capable GPU in your system when purchasing a new, also Mantle-supporting one.
Q: Mantle's shading language is said to be compatible with HLSL. How will optimizations made for DirectX, such as tweaks during shader compilation, carry over to Mantle? How much tuning will (and will not) be shared between the two APIs?
[Guennadi] The current Mantle solution relies on the same shader generation path that DirectX games use, and includes an open-source component for translating DirectX shaders to the Mantle-accepted intermediate language (IL). This enables developers to quickly develop a Mantle code path without any changes to their shaders. This was one of the strongest requests we got from our ISV partners when we were developing Mantle.
Follow-Up: What does this mean, specifically, in terms of driver optimizations? Would AMD, or anyone else who supports Mantle, be able to re-use the effort they spent on tuning their shader compilers (and so forth) for DirectX?
[Guennadi] With the current shader compilation strategy in Mantle, the developers can directly leverage DirectX shader optimization efforts in Mantle. They would use the same front-end HLSL compiler for DX and Mantle, and inside of the DX and Mantle drivers we share the shader compiler that generates the shader code our hardware understands.