Introduction and First Impressions
When I reviewed the first LIVA mini-PC from ECS one year ago I was impressed by the concept of a full Windows computer in an enclosure about the size of a can of cola, which included everything you needed to get started out of the box. The problem with that first LIVA was that it was a little underpowered for the current generation of operating systems, and with the introduction of the LIVA X the performance improved only slightly; though it was a much more polished product overall. So how does the latest LIVA - the X2 - stack up? We'll find that out here.
The first thing you're bound to notice with the X2 is the markedly different style compared to the first two. Where last year’s LIVA X had a sleek, lower-profile appearance, with the LIVA X2 we have something completely different, which I won’t judge one way or the other as this is a matter of personal taste. I do miss the angular black plastic housing from last year’s version, but the fit and finish of the X2 is very nice regardless of what you think of the rounded body and white and chrome plastic finish. (ECS also offers a LIVA “Core” barebone kit that follows the aesthetic of the LIVA X.)
So what’s new beyond the appearance? After only the most minor tweak to the SoC between the first LIVA and its followup, the LIVA X (moving a single SKU up from an Intel Bay Trail-M Celeron N2807 to the N2808), this new X2 has a completely different Intel solution under the hood with its Braswell SoC - the Intel Celeron N3050 processor, a dual-core part with 2 MB of cache and a 2.16 GHz top speed. Considering that even the <$150 Intel Compute Stick offers a quad-core CPU (the Z3735F, a Bay Trail SoC) I was a little skeptical of the dual-core option here, but we’ll just have to see how it performs.
Three generations of LIVA
Four High Powered Mini ITX Systems
Thanks to Sebastian for helping me out with some of the editorial for this piece and to Ken for doing the installation and testing on the system builds! -Ryan
While some might wonder where the new Radeon R9 Nano fits in a market that offers the AMD Fury X for the same price, the Nano is a product that defines a new category in the PC enthusiast community. It is a full-scale GPU on an impossibly small 6-inch PCB, containing the same core as the larger liquid-cooled Fury X, but requiring 100 watts less power than Fury X and cooled by a single-fan dual-slot air cooler.
The R9 Nano design screams compatibility. It can fit into virtually any enclosure (including many of the smallest mini-ITX designs), as long as the case supports a dual-slot (full height) GPU. The total board length of 6 inches is shorter than a mini-ITX motherboard, which is 6.7 inches square! Truly, the Nano has the potential to change everything when it comes to selecting a small form-factor (SFF) enclosure.
Typically, a gaming-friendly enclosure would need at minimum a ~270 mm GPU clearance, as a standard 10.5-inch reference GPU translates into 266.7 mm in length. Even very small mini-ITX enclosures have had to position components specifically to allow for these longer cards – if they wanted to be marketed as compatible with a full-size GPU solution, of course. Now with the R9 Nano, smaller and more powerful than any previous ITX-specific graphics card to date, one of the first questions we had was a pretty basic one: what enclosure should we put this R9 Nano into?
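The clearance arithmetic above is simple enough to sketch in a few lines; the clearance figures below are illustrative examples, not a database of real cases:

```python
# Illustrative sketch of the GPU clearance math discussed above.
MM_PER_INCH = 25.4

def fits(card_length_in, clearance_mm):
    """True if a card of the given length (inches) fits a case clearance (mm)."""
    return card_length_in * MM_PER_INCH <= clearance_mm

reference_card = 10.5  # standard reference GPU, ~266.7 mm
nano = 6.0             # R9 Nano, shorter than a 6.7-inch mini-ITX board

print(round(reference_card * MM_PER_INCH, 1))  # ~266.7 mm
print(fits(reference_card, 270))  # a typical "full-size GPU" enclosure
print(fits(nano, 170))            # a much tighter cube-style case
```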
With no shortage of enclosures at our disposal to try out a build with this new card, we quickly discovered that many of them shared a design choice: room for a full-length GPU. So, what’s the advantage of the Nano’s incredibly compact size? It must be pointed out that the larger (and faster) Fury X has the same MSRP, and at 7.5 inches the Fury X will fit comfortably in cases that have spacing for the necessary radiator.
Finding a Case for Nano
While even some of the tiniest mini-ITX enclosures (EVGA Hadron, NCASE M1, etc.) offer support for a 10.5-in GPU, there are several compact mini-ITX cases that don’t support a full-length graphics card due to their small footprint. While by no means a complete list, here are some of the options out there (note: there are many more mini-ITX cases that don’t support a full-height or dual-slot expansion card at all, such as slim HTPC enclosures):
- Cooler Master Elite 110: $47.99, Amazon.com
- Lian Li PC-O5: $377, Amazon.com
- Lian Li PC-Q01: $59.99, Newegg.com
- Lian Li PC-Q03: $74.99, Newegg.com
- Lian Li PC-Q07: $71.98, Amazon.com
- Lian Li PC-Q30: $139.99, Newegg.com
- Lian Li PC-Q33: $134.99, Newegg.com
- Rosewill Legacy V3 Plus-B: $59.99, Newegg.com
The list is dominated by Lian Li, who offers a number of cube-like mini-ITX enclosures that would ordinarily be out of the question for a gaming rig, unless one of the few ITX-specific cards were chosen for the build. Many other fine enclosure makers (Antec, BitFenix, Corsair, Fractal Design, SilverStone, etc.) offer mini-ITX enclosures that support full-length GPUs, as this has pretty much become a requirement for an enthusiast PC case.
Introduction and Test Hardware
The PC gaming world has become divided into two distinct types of games: those that were designed and programmed specifically for the PC, and console ports. Unfortunately for PC gamers it seems that far too many titles are simply ported over (or at least optimized for consoles first) these days, and while PC users can usually enjoy higher detail levels and unlocked frame rates there is now the issue of processor core-count to consider. This may seem artificial, but in recent months quite a few games have been released that require at least a quad-core CPU to even run (without modifying the game).
One possible explanation for this is current console hardware: the PS4 and Xbox One are based on multi-core AMD APUs (the 8-core AMD "Jaguar"). While a quad-core (or higher) processor might not be technically required to run current games on PCs, its presence in the consoles might help explain a quad-core CPU as a minimum spec. This trend could simply be the result of current x86 console hardware, as development of console versions of games is often prioritized (and porting has become common for development of PC versions of games). So it is that popular dual-core processors like the $69 Intel Pentium Anniversary Edition (G3258) are suddenly less viable for a future-proofed gaming build. While hacking these games might make dual-core CPUs work (and might be the only way to get such a game to even load, as the CPU is checked at launch), this is obviously far from ideal.
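For illustration only, a launch-time gate of the kind described above might look something like this minimal sketch; the function name and threshold here are hypothetical, not taken from any actual game:

```python
import os

MIN_LOGICAL_CPUS = 4  # hypothetical quad-core minimum spec

def meets_cpu_requirement(minimum=MIN_LOGICAL_CPUS):
    """Naive launch-time check against the reported logical CPU count.

    Note that os.cpu_count() counts logical processors, so a dual-core
    chip with Hyper-Threading reports 4 and would slip past a check this
    naive; real game launchers differ in how strictly they probe the CPU.
    """
    return (os.cpu_count() or 1) >= minimum

print("minimum spec met" if meets_cpu_requirement() else "CPU below minimum spec")
```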
Is this much CPU really necessary?
Rather than rail against this quad-core trend and question its necessity, I decided instead to see just how much of a difference the processor alone might make with some game benchmarks. This quickly escalated into more and more system configurations as I accumulated parts, eventually arriving at 36 different configurations at various price points. Yeah, I said 36. (Remember that Budget Gaming Shootout article from last year? It's bigger than that!) Some of the charts that follow are really long (you've been warned), and there’s a lot of information to parse here. I wanted this to be as fair as possible, so there is a theme to the component selection. I started with three processors each (low, mid, and high price) from AMD and Intel, and then three graphics cards (again, low, mid, and high price) from AMD and NVIDIA.
Here’s the component rundown with current pricing*:
- AMD Athlon X4 860K - $74.99
- AMD FX 8350 - $165.93
- AMD FX 9590 (with AIO cooler) - $259.99
- Intel Core i3-4130 - $118
- Intel Core i5-4440 - $184.29
- Intel Core i7-4790K - $338.99
Graphics cards tested:
- AMD Radeon R7 260X (ASUS 2GB OC) - $137.24
- AMD Radeon R9 280 (Sapphire Dual-X) - $169.99
- AMD Radeon R9 290X (MSI Lightning) - $399
- NVIDIA GeForce GTX 750 Ti (OEM) - $149.99
- NVIDIA GeForce GTX 770 (OEM) - $235
- NVIDIA GeForce GTX 980 (ASUS STRIX) - $519
*These prices were current as of 6/29/15, and of course fluctuate.
Introduction and First Impressions
The Zotac ZBOX CI321 nano is a mini PC kit in the vein of the Intel NUC, and this version features a completely fanless design with built-in wireless for silent integration into just about any location. So is it fast enough to be an HTPC or desktop productivity machine? We will find out here.
I have reviewed a couple of mini-PCs in the past few months, most recently the ECS LIVA X back in January. Though the LIVA X was not really fast enough to be used as a primary device it was small and inexpensive enough to be a viable product depending on a user’s needs. One attractive aspect of the LIVA designs, and any of the low-power computers introduced recently, is the passive nature of such systems. This has unfortunately resulted in the integration of some pretty low-performance CPUs to stay within thermal (and cost) limits, but this is beginning to change. The ZBOX nano we’re looking at today carries on the recent trend of incorporating slightly higher performance parts as its Intel Celeron processor (the 2961Y) is based on Haswell, and not the Atom cores at the heart of so many of these small systems.
Another parallel to the Intel NUC is the requirement to bring your own memory and storage, and the ZBOX CI321 nano accepts a pair of DDR3 SoDIMMs and 2.5” storage drives. The Intel Celeron 2961Y processor supports up to 1600 MHz dual-channel DDR3L which allows for much higher memory bandwidth than many other mini-PCs, and the storage controller supports SATA 6.0 Gbps which allows for higher performance than the eMMC storage found in a lot of mini-PCs, depending on the drive you choose to install. Of course your mileage will vary depending on the components selected to complete the build, but it shouldn’t be difficult to build a reasonably fast system.
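The memory bandwidth advantage is easy to quantify: DDR3 moves 64 bits (8 bytes) per transfer per channel, so theoretical peak throughput is simply transfer rate times bus width times channel count. A quick sketch of that arithmetic:

```python
def peak_dram_bandwidth_gbs(transfers_mt_s, channels, bus_bytes=8):
    """Theoretical peak bandwidth in GB/s for a 64-bit-per-channel DDR bus."""
    return transfers_mt_s * bus_bytes * channels / 1000

# Dual-channel DDR3L-1600, as supported by the Celeron 2961Y:
print(peak_dram_bandwidth_gbs(1600, 2))  # 25.6 GB/s
# Versus a single-channel configuration:
print(peak_dram_bandwidth_gbs(1600, 1))  # 12.8 GB/s
```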
Digging into a specific market
A little while ago, I decided to think about processor design as a game. You are given a budget of complexity, which is determined by your process node, power, heat, die size, and so forth, and the objective is to lay out features in the way that suits your goal and workload best. While not the topic of today's post, GPUs are a great example of what I mean. They make the assumption that in a batch of work, nearby tasks are very similar, such as the math behind two neighboring pixels on the screen. This assumption allows GPU manufacturers to save complexity by chaining dozens of cores together into not-quite-independent work groups. The circuit fits the work better, and thus it lets more get done in the same complexity budget.
Carrizo is aiming at a 63 million unit per year market segment.
This article is about Carrizo, though. This is AMD's sixth-generation APU, counting from Llano's release in June 2011. For this launch, Carrizo is targeting the 15W and 35W power envelopes for $400-$700 USD notebook devices. AMD needed to increase efficiency on the same 28nm process that we have seen in their product stack since Kabini and Temash were released in May of 2013. They tasked their engineers to optimize their APU's design for these constraints, which led to dense architectures and clever features on the same budget of complexity, rather than smaller transistors or a bigger die.
15W was their primary target, and they claim to have exceeded their own expectations.
Backing up for a second. Beep. Beep. Beep. Beep.
When I met with AMD last month, I brought up the Bulldozer architecture with many individuals. I suspected that it was a quite clever design that didn't reach its potential because of external factors. As I said at the start of this editorial, processor design is a game and, if you can save complexity by knowing your workload, you can do more with less.
Bulldozer looked like it wanted to take a shortcut by cutting elements that its designers believed would be redundant going forward. First and foremost, two cores share a single floating point unit (FPU). While you need some floating point capacity, upcoming workloads could use the GPU for a massive increase in performance, which is right there on the same die. As such, the complexity that is dedicated to every second FPU can be cut and used for something else. You can see this trend throughout various elements of the architecture.
A substantial upgrade for Thunderbolt
Today at Computex, Intel took the wraps off of the latest iteration of Thunderbolt, a technology that I am guessing many of you thought was dead in the water. It turns out that's not the case, and this new set of features that Thunderbolt 3 offers may in fact push it over the crest and give it the momentum needed to become a useable and widespread standard.
First, Thunderbolt 3 starts with a new piece of silicon, code named Alpine Ridge. Not only does Alpine Ridge increase the available Thunderbolt bandwidth to 40 Gbps but it also adds a native USB 3.1 host controller on the chip itself. And, as mobile users will be glad to see, Intel is going to start utilizing the new USB Type-C (USB-C) connector as the standard port rather than mini DisplayPort.
This new connector type, which was already a favorite among PC Perspective staff because of its size and reversibility, will now be the way connectivity and speed increase this generation with Thunderbolt. This slide does a good job of summarizing the key takeaways from the TB3 announcement: 40 Gbps, support for two 4K 60 Hz displays, 100 watt (bi-directional) charging capability, 15 watt device power, and support for four protocols: Thunderbolt, DisplayPort, USB, and PCI Express.
Protocol support is important and Thunderbolt 3 over USB-C will be able to connect directly to a DisplayPort monitor, to an external USB 3.1 storage drive, an old thumb drive or a new Thunderbolt 3 docking station. This is truly unrivaled flexibility from a single connector. The USB 3.1 controller is backward compatible as well: feel free to connect any USB device to it that you can adapt to the Type-C connection.
From a raw performance perspective Thunderbolt 3 offers a total of 40 Gbps of bi-directional bandwidth, twice that of Thunderbolt 2 and 4x what we get with USB 3.1. That offers users the ability to combine many different devices, multiple displays and network connections and have plenty of headroom.
With Thunderbolt 3 you get twice as much raw video bandwidth, two DP 1.2 streams, allowing you to run not just a single 4K display at 60 Hz but two of them, all over a single TB3 cable. If you want to connect a 5K display though, you will be limited to just one of them.
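Some back-of-the-envelope math illustrates why two 4K60 streams fit while a second 5K display does not. This sketch counts raw pixel data only and ignores blanking and encoding overhead (at the protocol level, a 5K60 display consumes both DP 1.2 streams, but the raw numbers tell the same story):

```python
def raw_video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel-data rate in Gbps (ignores blanking/encoding overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

LINK_GBPS = 40  # total Thunderbolt 3 bandwidth

four_k = raw_video_gbps(3840, 2160, 60)  # ~11.9 Gbps per 4K60 stream
five_k = raw_video_gbps(5120, 2880, 60)  # ~21.2 Gbps per 5K60 stream

print(2 * four_k < LINK_GBPS)  # True: two 4K60 displays fit with headroom
print(2 * five_k < LINK_GBPS)  # False: a second 5K display would not
```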
For mobile users, which I think is the area where Thunderbolt 3 will be the most effective, the addition of USB 3.1 allows for charging capability up to 100 watts. This is in addition to the 15 watts of power that Thunderbolt provides to devices directly - think external storage, small hubs/docks, etc.
Announced last June at the Google I/O event, Android TV is a platform developed by Google, running Android 5.0 and higher, that aims to create an interactive experience for the TV. This platform can be built into a TV directly as well as into set-top style boxes, like the NVIDIA SHIELD we are looking at today. The idea is to bring the breadth of apps and content to the TV through the Android operating system in a way that is both convenient and intuitive.
NVIDIA announced SHIELD back in March at GDC as the first product to use the company’s latest Tegra processor, the X1. This SoC combines an 8-core big.LITTLE ARM processor design with a 256-core implementation of the NVIDIA Maxwell GPU architecture, providing GPU performance previously unseen in an Android device. I have already spent some time with the NVIDIA SHIELD at various events and the promise was clearly there to make it a leading option for Android TV adoption, but obviously there were questions to be answered.
Today’s article will focus on my early impressions with the NVIDIA SHIELD, having used it both in the office and at home for a handful of days. As you’ll see during the discussion there are still some things to be ironed out, some functionality that needs to be added before SHIELD and Android TV can really be called a must-buy product. But I do think it will get there.
And though this review will focus on the NVIDIA SHIELD, it’s impossible not to tie the success of SHIELD to the success of Google’s Android TV. The dominant use case for SHIELD is as a media playback device, with the gaming functionality as a really cool side project for enthusiasts and gamers looking for another outlet. For SHIELD to succeed, Google needs to prove that Android TV can improve on other integrated smart TV platforms as well as other set-top box platforms like Boxee, Roku and even the upcoming Apple TV refresh.
But first, let’s get an overview of the NVIDIA SHIELD device, pricing and specifications, before diving into my experiences with the platform as a whole.
Some familiar scenery
If you thought that Intel was going to slow down on its iteration in SFF (small form factor) system design, you were sadly mistaken. It was February when Intel first sent us a NUC based on Broadwell, an iterative upgrade over a couple of generations for this very small 4" x 4" platform, one that proved to be interesting from a technology standpoint but didn't shift expectations of the puck-sized PC business.
Today we are looking at yet another NUC, also using a Broadwell processor, though this time the CPU is running quite a bit faster, with Intel Iris 6100 graphics and a noticeably higher TDP. The Core i7-5557U is still a dual-core / HyperThreaded processor but it increases base and Turbo clocks by wide margins, offering as much as 35% better CPU performance and mainstream gaming performance boosts in the same range. This doesn't mean the NUC 5i7RYH will overtake your custom built desktop but it does make it a lot more palatable for everyday PC users.
Oh, and we have an NVMe PCI Express SSD inside this beast as well. (Waaaaaa??)
No Longer the Media Center of Attention
Gabe Aul, of Microsoft's Windows Insiders program, has confirmed on Twitter that Windows 10 will drop support for Windows Media Center due to a decline in usage. This is not surprising news as Microsoft has been deprecating the Media Center application for a while now. In Windows 8.x, the application required both the “Pro” SKU of the operating system, and then users needed to install an optional add-on above and beyond that. The Media Center Pack cost $10 over the price of Windows 8.x Pro unless you claimed a free license in the promotional period surrounding Windows 8's launch.
While Media Center has been officially abandoned, its influence on the industry (and vice versa) is an interesting story. For a time, it looked like Microsoft had bigger plans that were killed by outside factors, and other companies seem to be eyeing the money that Microsoft left on the table.
There will be some speculation here.
We could go back to the days of WebTV, but we won't. All you need to know is that Microsoft lusted over the living room for years. Windows owned the office and PC gaming was taking off with strong titles (and technologies) from Blizzard, Epic, id, Valve, and others. DirectX was beloved by developers, which led to the original Xbox. Their console did not get a lot of traction, but it was respected as a first-generation product trying to acquire a foothold late in a console generation. Financially, the first Xbox would cost Microsoft almost four billion dollars more than it made.
At the same time, Microsoft was preparing Windows to enter the living room. This was the company's powerhouse and it acquired significant marketshare wherever it went, due to its ease of development and its never-ending supply of OEMs, even if the interface itself was subpar. Their first attempt at bringing Windows to the living room was Windows XP Media Center Edition. This spin-off of Windows XP could only be acquired by OEMs to integrate into home theater PCs (HTPCs). The vision was interesting, using OEM competition to rapidly prototype what users actually want in a PC attached to a TV.
This leads us to Windows Vista, which is where Media Center came together while the OS fell apart.
When I first was handed the Intel Compute Stick product at CES back in January, my mind began to race with a lot of questions. The first set were centered around the capabilities of the device itself: where could it be used, how much performance could Intel pack into it and just how many users would be interested in a product like this? Another set of questions was much more philosophical in nature: why was Intel going in this direction, does this mean an end for the emphasis on high performance componentry from Intel and who comes up with these darned part numbers?
I have since settled my mind on the issues surrounding Intel’s purpose with the Compute Stick and began to dive into the product itself. On the surface the Intel Compute Stick is a product entering late into a potentially crowded market. We already have devices like the Roku, Google Chromecast, the Apple TV, and even the Amazon Fire TV Stick. All of those devices share some of the targets and goals of the Compute Stick, but the one area where Intel’s product really stands out is flexibility. The Roku has the most pre-built applications and “channels” for a streaming media box. The Chromecast is dirt cheap at just $30 or so. Even Amazon’s Fire TV Stick is clearly the best choice for streaming Amazon’s own multimedia services. But the Intel Compute Stick can do all of those things – in addition to operating as a standalone PC with Windows or Linux. Anything you can do I can do better…
But it’s not a product without a few flaws, most of which revolve around the status of current operating system designs for TVs and larger displays. Performance obviously isn’t peeling the paint off any walls, as you would expect. But I still think that, at $150 with a full copy of Windows 8.1 with Bing, the Intel Compute Stick is going to find more fans than you might have first expected.
Intel Pushes Broadwell to the Next Unit of Computing
Intel continues to invest a significant amount of money into this small form factor product dubbed the Next Unit of Computing, or NUC. When it was initially released in December of 2012, the NUC was built as an evolutionary step of the desktop PC, part of a move for Intel to find new and unique form factors that its processors can exist in. With a 4" x 4" motherboard design the NUC is certainly a differentiating design and several of Intel's partners have adopted it for products of their own, with Gigabyte's BRIX line being the most relevant.
But Intel's development team continues to push the NUC platform forward and today we are evaluating the most recent iteration. The Intel NUC5i5RYK is based on the latest 14nm Broadwell processor and offers improved CPU performance, a higher speed GPU and lower power consumption. All of this is packed into a smaller package than any previous NUC on the market and the result is both impressive and totally expected.
A Walk Around the NUC
To most people the latest Intel NUC will look very similar to the previous models based on Ivy Bridge and Haswell. You'd be right of course - the fundamental design is unchanged. But Intel continues to push forward in small ways, nipping and tucking away. But the NUC is still just a box. An incredibly small one with a lot of hardware crammed into it, but a box nonetheless.
While I can appreciate the details, including the black and silver colors and rounded edges, I think that Intel needs to find a way to add some more excitement to the NUC product line going forward. Admittedly, it is hard to innovate in that direction with a focus on size and compression.
SFF PCs get an upgrade
Ultra compact computers, otherwise known as small form factor PCs, are a rapidly growing market as consumers realize that, for nearly all purposes other than gaming and video editing, Ultrabook-class hardware is "fast enough". I know that some of our readers will debate that point, and we welcome the discussion, but as CPU architectures continue to improve in both performance and efficiency, more performance can be packed into smaller spaces. The Gigabyte BRIX platform is the exact result you would expect from that combination.
Previously, we have seen several other Gigabyte BRIX devices including our first desktop interaction with Iris Pro graphics, the BRIX Pro. Unfortunately though, that unit was plagued by noise issues - the small fan spun pretty fast to cool a 65 watt processor. For a small computer that would likely sit on top of your desk, that's a significant drawback.
Intel Ivy Bridge NUC, Gigabyte BRIX S Broadwell, Gigabyte BRIX Pro Haswell
This time around, Gigabyte is using the new Broadwell-U architecture in the Core i7-5500U and its significantly lower, 15 watt TDP. That does come with some specification concessions though, including a dual-core CPU instead of a quad-core CPU and a peak Turbo clock rate that is 900 MHz lower. Comparing the Broadwell BRIX S to the more relevant previous generation based on Haswell, we get essentially the same clock speed, a similar TDP, but also an improved core architecture.
Today we are going to look at the new Gigabyte BRIX S featuring the Core i7-5500U and an NFC chip for some interesting interactions. The "S" designates that this model could support a full size 2.5-in hard drive in addition to the mSATA port.
Introduction, Specs, and First Impressions
In our review of the original LIVA mini-PC we found it to be an interesting product, but it was difficult to identify a specific use-case for it; a common problem with the mini-PC market. Could the tiny Windows-capable machine be a real desktop replacement? That first LIVA just wasn't there yet. The Intel Bay Trail-M SoC was outmatched when playing 1080p Flash video content and system performance was a little sluggish overall in Windows 8.1, which wasn't aided by the limitation of 2GB RAM. (Performance was better overall with Ubuntu.) The price made it tempting but it was too underpowered as one's only PC - though a capable machine for many tasks.
Fast forward to today, when the updated version has arrived on my desk. The updated LIVA has a cool new name - the “X” - and the mini computer's case has more style than before (very important!). Perhaps more importantly, the X boasts upgraded internals as well. Could this new LIVA be the one to replace a desktop for productivity and multimedia? Is this the moment we see the mini-PC come into its own? There’s only one way to find out. But first, I have to take it out of the box.
Chipset: Intel® Bay Trail-M/Bay Trail-I SOC
Memory: DDR3L 2GB/4GB
Expansion Slot: 1 x mSATA for SSD
Storage: eMMC 64GB/32GB
Audio: HD Audio Subsystem by Realtek ALC283
LAN: Realtek RTL8111G Gigabit Fast Ethernet Controller
USB: 1 x USB3.0 Port, 2 x USB2.0 Ports
Video Output: 1 x HDMI Port, 1 x VGA Port
Wireless: WiFi 802.11 b/g/n & Bluetooth 4.0
PCB Size: 115 x 75 mm
Dimension: 135 x 83 x 40 mm
VESA Support: 75mm / 100mm
Adapter Input: AC 100-240V, Output: DC 12V / 3A
OS Support: Linux based OS, Windows 7 (via mSATA SSD) Windows 8/8.1
Thanks to ECS for providing the LIVA X for review!
Packaging and Contents
The LIVA X arrives in a smaller box than its predecessor, and one with a satin finish because it's extra fancy.
Big Power, Small Size
Though the mindset that a small PC is a slow PC is fading, there are still quite a few readers out there that believe the size of your components will indicate how well they perform. That couldn't be further from the case, and this week we decided to build a small, but not tiny, PC to showcase that small can be beautiful too!
Below you will find a complete list of the parts and components used in our build - but let me say right off the bat, to help alleviate as much vitriol in the comments as possible: there are quite a few ways you could build this system to get a lower price, higher performance, a quieter design, etc. Our selections were based on a balance of these factors, with a nod towards expansion in a few cases.
Take a look:
MicroATX Gaming Build
- Processor: Intel Core i7-4790K - $334
- CPU Cooler: Corsair Hydro Series H80i - $87
- Motherboard: Gigabyte Z97MX-Gaming 5 - $127
- Memory: G.Skill Ripjaws X 8GB DDR3-2133 - $88
- Graphics Card: EVGA GeForce GTX 970 FTW - $399
- Storage: Samsung 250GB 850 EVO - $139
- Storage: Western Digital 2TB Green - $79
- Case: Corsair Carbide Series Air 240 - $89
- Power Supply: Seasonic Platinum 860 watt PSU - $174
- OS: Windows 8.1 x64 - $92
- Total Price: $1602 (Amazon Full Cart)
The starting point for this system is the Intel Core i7-4790K, the top-end Haswell processor for the Z97 chipset. In fact, the Core i7-4790K is a Devil's Canyon part, created by Intel to appease the enthusiast looking for an overclockable, high-clocked quad-core part. This CPU will only lag behind the likes of the Haswell-E LGA2011 processors, but at just $340 or so, it is significantly less expensive. Cooling the 4790K is Corsair's Hydro Series H80i double-thickness self-contained water cooler.
For the motherboard I selected the Gigabyte Z97MX-Gaming 5, a MicroATX motherboard that combines performance and features in a mATX form factor, perfect for our build. This board includes support for SLI and CrossFire, has audio OP-AMP support, USB ports dedicated for DACs, M.2 storage support, Killer networking and more.
Introduction and Specifications
Several weeks ago, during an episode of the PC Perspective Podcast, we talked about a new all-in-one machine from MSI with a focus on gaming. Featuring a quad-core Intel Haswell processor and a GeForce GTX 980M GPU, the MSI AG270 2QE takes the best available hardware for mobile gaming and stuffs it into a machine with an integrated 1080p touch screen. The result is likely to be the most potent gaming AIO that you will find available; it should be more than capable of tackling modern games at the integrated panel's 1920x1080 resolution.
A gaming all-in-one is an interesting idea - a cross between the typical gaming desktop and a gaming laptop, an AIO splits the difference in a couple of interesting ways. It's more portable than a desktop and monitor combination for sure, but definitely heavier and bulkier than MSI's own GT72 for example. The AG270 offers a much larger screen (at 1080p) than any gaming notebook on its own, which improves the overall gaming experience without the need for additional hardware. While not ideal, it is totally feasible to take the AG270 with you to a neighbor's house for some LAN party action.
So what do you get with the MSI AG270 2QE, and more specifically, with the 037US kit we are reviewing today? Let's find out.
Oftentimes, one suggestion of what to do with older PC components is to dedicate them to a Home Theater PC. While in concept this might seem like a great idea (you can do a lot of things with full control over the box hooked up to your TV), I think it's a flawed concept.
With an HTPC, some of the most desired traits include low power consumption and quiet operation, all while maintaining a high enough performance level that you can do things like transcode video quickly. Older components that you have outgrown don't tend to be nearly as efficient as newer ones. To have a good HTPC experience, you really want to pick components from the ground up, which is why I was excited to take a look at the Steiger Dynamics Maven Core HTPC.
As it was shipped to us, our Maven Core is equipped with an Intel Core i5-4690K and an NVIDIA GTX 980. By utilizing two of the most power efficient architectures available, Intel's Haswell and NVIDIA's Maxwell, the Maven should be able to sip power while maintaining low temperature and noise. While a GTX 980 might be overkill for just HTPC applications, it opens up a lot of possibilities for couch-style PC gaming with things like Steam Big Picture mode.
From the outside, the hand-brushed aluminum Steiger Dynamics system takes the form of traditional high-end home theater gear. At 6.85 inches tall, or almost 4U if you are comfortable with that measurement system, the Maven Core is a large device, but it does not stand out in a collection of AV equipment. Additionally, when you consider that the standard Blu-ray drive and available Ceton InfiniTV Quad PCIe CableCARD tuner give this system the capability of replacing both a cable set-top box and a dedicated Blu-ray player altogether, the size becomes easier to accept.
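For anyone unfamiliar with rack units, the conversion behind the "almost 4U" figure is simple: one rack unit is defined as 1.75 inches, so the Maven Core's height works out to just under 4U. A quick sketch:

```python
# Rack-unit conversion: 1U is defined as 1.75 inches.
# The Maven Core's 6.85 in height works out to just under 4U.
RACK_UNIT_INCHES = 1.75

height_in = 6.85
rack_units = height_in / RACK_UNIT_INCHES
print(f"{rack_units:.2f}U")  # 3.91U
```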
Digging deeper into the hardware specs of the Maven Core we find some familiar components. The Intel Core i5-4690K sits in an ASUS Z97-A motherboard along with 8GB of Corsair DDR3-1866 memory. For storage we have a 250GB Samsung 840 EVO SSD paired with a Western Digital 3TB Hard Drive for mass storage of your media.
Cooling for the CPU is provided by a Corsair H90 with a single Phanteks fan to help keep the noise down. Steiger Dynamics shipped our system with a Seasonic Platinum-series 650W power supply, including their custom cabling option. For $100, they will ship your system with custom, individually sleeved power supply and SATA drive cables. The sleeving and cable management are impressive, but $100 is a difficult upsell on a PC whose insides you are likely never going to see.
As we mentioned earlier, this machine also shipped with a Ceton InfiniTV 4 PCIe CableCARD tuner. While CableCARD is a much maligned technology that never really took off, when you get it working it can be impressive. Our impressions of the InfiniTV can be found later in this review.
When Intel revealed their miniature PC platform in 2012, the new “Next Unit of Computing” (NUC) was a tiny motherboard with a custom case, and admittedly very little compute power. Well, maybe not so much with the admittedly: “The Intel NUC is an ultra-compact form factor PC measuring 4-inch by 4-inch. Anything your tower PC can do, the Intel NUC can do and in 4 inches of real estate.” That was taken from Intel’s NUC introduction, and though their assertion was perhaps a bit premature, technology does continue its rapid advance in the small form-factor space. We aren’t there yet by any means, but the fact that a mini-ITX computer can be built with the power of an ATX rig (limited to single-GPU, of course) suggests that it could happen for a mini-PC in the not so distant future.
With NUC the focus was clearly on efficiency over performance, and with very low power and noise there were practical applications for such a device to offset the marginal "desktop" performance. The viability of a NUC would definitely depend on the user and their particular needs, of course. If you could find a place for such a device (such as a living room) it may have been worth the cost, as the first of the NUC kits were fairly expensive (around $300 and up) and did not include storage or memory. These days a mini PC can be found starting as low as $100 or so, but most still do not include any memory or storage. They are tiny barebones PC kits after all, so adding components is to be expected...right?
It’s been a couple of years now, and the platform continues to evolve - and shrink to some startlingly small sizes. Of the Intel-powered micro PC kits on today’s market the LIVA from ECS manages to push the boundaries of this category in both directions. In addition to boasting a ridiculously small size - actually the smallest in the world according to ECS - the LIVA is also very affordable. It carries a list price of just $179 (though it can be found for less), and that includes onboard memory and storage. And this is truly a Windows PC platform, with full Windows 8.1 driver support from ECS (previous versions are not supported).
If there is one message that I get from NVIDIA's GeForce GTX 900M-series announcement, it is that laptop gaming is a first-class citizen in their product stack. Before even mentioning the products, the company provided relative performance differences between high-end desktops and laptops. Most of the rest of the slide deck is showing feature-parity with the desktop GTX 900-series, and a discussion about battery life.
First, the parts. Two products have been announced: the GeForce GTX 980M and the GeForce GTX 970M. Both are based on the 28nm Maxwell architecture. In terms of shading performance, the GTX 980M has a theoretical maximum of 3.189 TFLOPs, and the GTX 970M is calculated at 2.365 TFLOPs (at base clock). On the desktop, this is very close to the GeForce GTX 770 and the GeForce GTX 760 Ti, respectively. This metric is most useful when you're compute-bound, at high resolution with complex shaders.
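Those theoretical figures follow directly from CUDA core count and clock speed: each core retires one fused multiply-add (two FLOPs) per cycle. The core counts and base clocks below are NVIDIA's published specs rather than numbers from this article, but plugging them in reproduces the quoted TFLOPs values:

```python
# Theoretical single-precision throughput: cores * clock * 2 FLOPs per cycle
# (one fused multiply-add per CUDA core per clock).
# Core counts and base clocks are NVIDIA's published specs, not stated above.
def tflops(cuda_cores: int, base_clock_mhz: int) -> float:
    return cuda_cores * base_clock_mhz * 1e6 * 2 / 1e12

print(f"GTX 980M: {tflops(1536, 1038):.3f} TFLOPs")  # 3.189
print(f"GTX 970M: {tflops(1280, 924):.3f} TFLOPs")   # 2.365
```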
The full specifications are:
|   ||GTX 980M||GTX 970M|
|Memory||Up to 4GB||Up to 3GB|
|Memory Rate||2500 MHz||2500 MHz|
As for the features, they should be familiar to those who paid attention to both the desktop 900-series and laptop 800M-series product launches. From desktop Maxwell, the 900M-series is getting VXGI, Dynamic Super Resolution, and Multi-Frame Sampled AA (MFAA). From the latest generation of Kepler laptops, the new GPUs are getting an updated BatteryBoost technology. From the rest of the GeForce ecosystem, they will also get GeForce Experience, ShadowPlay, and so forth.
For VXGI, DSR, and MFAA, please see Ryan's discussion for the desktop Maxwell launch. Information about these features is basically identical to what was given in September.
BatteryBoost, on the other hand, is a bit different. NVIDIA claims that the biggest change is simply raw performance and efficiency, giving you more headroom before throttling. Perhaps more interesting, though, is that GeForce Experience will allow separate one-click optimizations for plugged-in and on-battery use.
The power efficiency demonstrated with the Maxwell GPU in Ryan's original GeForce GTX 980 and GTX 970 review is even more beneficial for the notebook market, where thermal designs are physically constrained. Longer battery life, as well as thinner and lighter gaming notebooks, will see tremendous advantages from a GPU that can run at near peak performance on the maximum power output of an integrated battery. In NVIDIA's presentation, they mention that while notebooks on AC power can use as much as 230 watts, batteries tend to peak around 100 watts. A full-speed, desktop-class GTX 980 has a TDP of just 165 watts, compared to the 250 watts of a Radeon R9 290X, and that efficiency translates into notebook GPU performance that more closely mirrors its desktop brethren.
Of course, you probably will not buy a laptop GPU on its own; rather, you will be buying devices which integrate them. Five designs across four manufacturers have been revealed so far (see image above). Three contain the GeForce GTX 980M, one has a GTX 970M, and the other has a pair of GTX 970Ms. Prices and availability are not yet announced.
The Road to 1080p
The stars of the show: a group of affordable GPU options
When preparing to build or upgrade a PC on any kind of a budget, how can you make sure you're extracting the highest performance per dollar from the parts you choose? Even if you do your homework, comparing every combination of components is impossible. As system builders, we always end up looking at various benchmarks here and there and then ultimately making assumptions. It's the nature of choosing products in an industry that's completely congested at every price point.
Another problem is that lower-priced graphics cards are usually benchmarked on high-end test platforms with Core i7 processors, which is necessary to eliminate CPU bottlenecks when testing GPUs. So it seemed like it might be valuable (and might help narrow down buying choices) to take a closer look at gaming performance from complete systems built with only budget parts, and see what these different combinations are capable of.
With this in mind I set out to see just how much it might take to reach acceptable gaming performance at 1080p (acceptable being 30 FPS+). I wanted to see where the real-world gaming bottlenecks might occur, and get a feel for the relationship between CPU and GPU performance. After all, if there was no difference in gaming performance between, say, a $40 and an $80 processor, why spend twice as much money? The same goes for graphics. We’re looking for “good enough” here, not “future-proof”.
The components in all their shiny boxy-ness (not everything made the final cut)
If money were no object we'd all have the most amazing high-end parts, and play every game at ultra settings with hundreds of frames per second (well, except at 4K). Of course most of us have limits, but the time and skill required to assemble a system with as little cash as possible can result in something that's actually a lot more rewarding (and impressive) than just throwing a bunch of money at top-shelf components.
The theme of this article is "good enough": as in, don't spend more than you have to. I don't mean that to sound like a bad thing. And if along the way you discover a bargain, or a part that overperforms for its price, even better!
Yet Another AM1 Story?
We’ve been talking about the AMD AM1 platform since its introduction, and it makes a compelling case for a low cost gaming PC. With the “high-end” CPU in the lineup (the Athlon 5350) just $60 and motherboards in the $35 range, it makes sense to start here. (I actually began this project with the Sempron 3820 as well, but it wasn’t anywhere near enough for 1080p gaming, so those test results were quickly discarded.) But while the 5350 is an APU, its integrated graphics didn't make the main test lineup. (OK, I eventually did try it on its own, but it just can't handle 1080p.)
But this isn’t just a story about AM1 after all. Jumping right in here, let's look at the result of my research (and mounting credit card debt). All prices were accurate as I wrote this, but are naturally prone to fluctuate:
|Memory||4GB Samsung OEM PC3-12800 DDR3-1600 (~$40 Value)|
|Storage||Western Digital Blue 1TB Hard Drive - $59.99|
|Power Supply||EVGA 430 Watt 80 PLUS PSU - $39.99|
|OS||Windows 8.1 64-bit - $99|
So there it is. I'm sure it won't please everyone, but there is enough variety in this list to support no less than 16 different combinations, and you'd better believe I ran each test on every one of those 16 system builds!
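The 16 builds come from pairing every tested CPU with every tested GPU. As a quick sketch of that enumeration (the component names here are placeholders, since the excerpted table above doesn't list the CPU and GPU options):

```python
from itertools import product

# Hypothetical placeholder names: the article tests a mix of CPUs and GPUs
# yielding 16 distinct builds (e.g. 4 CPUs x 4 GPUs); the actual parts
# are not all listed in the table excerpt above.
cpus = ["CPU-A", "CPU-B", "CPU-C", "CPU-D"]
gpus = ["GPU-1", "GPU-2", "GPU-3", "GPU-4"]

# Every CPU/GPU pairing is one test system.
builds = list(product(cpus, gpus))
print(len(builds))  # 16
```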
Athlon and Pentium Live On
Over the past year or so, we have taken a look at a few budget gaming builds here at PC Perspective. One of our objectives with these build guides was to show people that PC gaming can be cost competitive with console gaming, and at a much higher quality.
However, we haven't stopped pursuing our goal of the perfect inexpensive gaming PC: one still capable of maxing out image quality settings in today's top games at 1080p.
Today we take a look at two new systems, featuring some parts that readers suggested after our previous articles.
|AMD System||Intel System|
|Processor||AMD Athlon X4 760K - $85||Intel Pentium G3220 - $65|
|Cores / Threads||4 / 4||2 / 2|
|Motherboard||Gigabyte F2A55M-HD2 - $60||ASUS H81M-E - $60|
|Graphics||MSI R9 270 Gaming - $180||MSI R9 270 Gaming - $180|
|System Memory||Corsair 8GB DDR3-1600 (1x8GB) - $73||Corsair 8GB DDR3-1600 (1x8GB) - $73|
|Hard Drive||Western Digital 1TB Caviar Green - $60||Western Digital 1TB Caviar Green - $60|
|Power Supply||Cooler Master GX 450W - $50||Cooler Master GX 450W - $50|
|Case||Cooler Master N200 MicroATX - $50||Cooler Master N200 MicroATX - $50|
(Editor's note: If you don't already have a copy of Windows, and don't plan on using Linux or SteamOS, you'll need an OEM copy of Windows 8.1 - currently selling for $98.)
These are low prices for a gaming computer, and feature some parts which many of you might not know a lot about. Let's take a deeper look at the two different platforms which we built upon.
First up is the AMD Athlon X4 760K. While you may not have known the Athlon brand was still in use on current parts, these chips represent an interesting part of the market. On the FM2 socket, the 760K is essentially a high-end Richland APU with the graphics portion of the chip disabled.
What this means is that if you are going to pair your processor with a discrete GPU anyway, you can skip paying extra for the integrated GPU.
As for the motherboard, we went with an ultra-inexpensive A55 option from Gigabyte, the GA-F2A55M-HD2. This board features the A55 chipset that launched with the Llano APUs in 2011. Because of this older chipset, the board lacks USB 3.0 and SATA 6G capability, but since we are only concerned with gaming performance here, it makes a great barebones option.