500GB on the go
Corsair seems to have its fingers in just about everything these days, so why not mobile storage? The Voyager Air is a multi-function device that Corsair describes as a "portable wireless drive, home network drive, USB drive, and wireless hub." This battery-powered device is meant to act as a mobile hard drive for users who need more storage on the go, whether on PCs, Macs, iOS, or Android devices.
The Voyager Air can also act as a basic home NAS device with a Gigabit Ethernet connection on board for all the computers on your local network. And if you happen to have DLNA ready Blu-ray players or TVs nearby, they can access the video and audio stored on the Voyager Air as well.
Available in either red or black, in 500GB and 1TB capacities, the Voyager Air is slim and sleek, meant to be seen rather than hidden in a closet.
The front holds the power switch and WiFi on/off switch, as well as back-lit icons that indicate power, battery life, and connection status.
The Densest 2.5 Hours Imaginable
Introduction and Technical Specifications
Courtesy of Cooler Master
The HAF XB mid tower case is the newest member of the Cooler Master HAF line. Touted as a LAN box, this cube-shaped case has both looks and features that will appeal to any enthusiast. We decided to put the HAF XB on our test bench to validate these claims. At a base price of $99.99, the HAF XB is a bargain for the features you get.
Courtesy of Cooler Master
Courtesy of Cooler Master
Cooler Master designed the HAF XB with a scratch-resistant, flat-black coating applied to all surfaces. Both side panels have integrated hand-holds for easy lifting and transport to your event, and the front and top panels contain non-impeding mesh grills allowing for optimal airflow across your vital system components. Integrated into the case's front panel are power and reset buttons, power indicator LEDs, audio input and output ports, USB 3.0 device ports, two 5.25" device bays, and two hot-swappable hard drive bays.
Frame Pacing for CrossFire
When the Radeon HD 7990 launched in April of this year, we had some not-so-great things to say about it. The HD 7990 depends on CrossFire technology to function, and over months of testing with our Frame Rating tools we had found quite a few problems with AMD's CrossFire; as a result, the HD 7990 "had a hard time justifying its $1000 price tag." Right at launch, AMD gave us a taste of a new driver that they were hoping would fix the frame pacing and frame time variance issues seen in CrossFire, and it looked positive. The problem was that the driver wouldn't be available until summer.
As I said then: "But until that driver is perfected, is bug free and is presented to buyers as a made-for-primetime solution, I just cannot recommend an investment this large on the Radeon HD 7990."
Today could be a very big day for AMD: the release of the promised driver update that enables frame pacing on AMD 7000-series CrossFire configurations, including the Radeon HD 7990 graphics card with its pair of Tahiti GPUs.
It's not perfect yet, and there are some things to keep an eye on. For example, this fix does not address Eyefinity configurations, which include multi-panel setups and the new 4K 60 Hz displays that require a tiled display configuration. We also found some issues with CrossFire setups of more than two GPUs, which we'll address on a later page.
New Driver Details
Starting with 13.8 and moving forward, AMD plans to have the frame pacing fix integrated into all future drivers. The software team has implemented a software-based frame pacing algorithm that monitors the time it takes each GPU to render a frame and how long each frame is displayed on the screen, then inserts delays into the present calls when necessary to prevent very tightly timed frame renders. This balances, or "paces," the frame output to the screen without lowering the overall frame rate. The driver monitors this constantly in real time, making minor adjustments on a regular basis to keep the GPUs in check.
As you would expect, this algorithm is completely game engine independent and the games should be completely oblivious to all that is going on (other than the feedback from present calls, etc).
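Conceptually, the approach is straightforward. As a rough sketch (an illustration of the idea only, not AMD's actual driver code), the pacing step can be modeled in a few lines of Python:

```python
# Hypothetical sketch of software-based frame pacing; this is NOT AMD's
# driver code, just a model of the idea described above. Given the
# timestamps (in ms) at which alternating GPUs finish rendering frames,
# delay each present so the frame-to-frame interval tracks a running
# average instead of alternating between very short and very long gaps.

def pace_presents(finish_times, window=5):
    """Return paced present timestamps for a list of GPU finish times."""
    paced = [finish_times[0]]
    intervals = []
    for t in finish_times[1:]:
        intervals.append(t - paced[-1])
        # Target interval: running average of the most recent frame times.
        recent = intervals[-window:]
        target = sum(recent) / len(recent)
        # Present no earlier than the paced target, but never before
        # the GPU has actually finished the frame.
        paced.append(max(t, paced[-1] + target))
    return paced

# Two GPUs in alternate-frame rendering often finish in tight pairs,
# producing the 2 ms / 31 ms alternation simulated here:
raw = [0.0, 2.0, 33.0, 35.0, 66.0, 68.0, 99.0]
paced = pace_presents(raw)
```

With the simulated input above, the raw present-to-present intervals alternate between 2 ms and 31 ms; after pacing, the intervals cluster much closer to the average frame time, and no frame is presented before its GPU has finished rendering it.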
This fix is generic, meaning it is not tied to any specific game and doesn't require profiles the way CrossFire sometimes does. The current implementation works with DX10 and DX11 titles only; DX9 support will be added later in another release. AMD claims this was simply a development time issue: since most modern GPU-bound titles are DX10/11 based, they focused on that area first. In phase 2 of the frame pacing implementation AMD will add DX9 and OpenGL support. AMD wouldn't give me a timeline for that, though, so we'll have to see how much internal pressure AMD keeps up to get the job done.
It has come to my attention that you are planning on producing and selling a device to be called “NVIDIA SHIELD.” It should be noted that even though it shares the same name, this device has no matching attributes of the super-hero comic-based security agency. Please adjust.
When SHIELD was previewed to the world at CES in January of this year, there were a hundred questions about the device. What would it cost? Would the build quality stand up to expectations? Would the Android operating system hold up as a dedicated gaming platform? After months of waiting, a SHIELD unit finally arrived in our offices in early July, giving us plenty of time (I thought) to really get a feel for the device and its strengths and weaknesses. As it turned out, though, it still seemed like an inadequate amount of time to really gauge this product. But I am going to take a stab at it, feature by feature.
NVIDIA SHIELD aims to be a mobile gaming platform based on Android with a flip-out touchscreen, a high-quality console-style integrated controller, and added features like PC game streaming and Miracast support.
Initial Unboxing and Overview of Product Video
At the heart of NVIDIA SHIELD is the brand new Tegra 4 SoC, NVIDIA’s latest entry into the world of mobile processors. Tegra 4 is a quad-core, ARM Cortex-A15 based SoC that includes a fifth A15 core built on a low-power-optimized process to run background and idle tasks using less power. This is very similar to what NVIDIA did with Tegra 3’s 4+1 technology, and to how ARM is tackling the problem with its big.LITTLE philosophy.
Introduction and Specifications
Last week, Samsung flew a select group of press out to Seoul, Korea. The event was the 2013 Samsung Global SSD Summit. Here we saw the launch of a new consumer SSD, the 840 EVO:
This new SSD aims to replace the older 840 (non-Pro) model with one that is considerably more competitive. Let's jump right into the specs:
NVIDIA Finally Gets Serious with Tegra
Tegra has had an interesting run of things. The original Tegra 1 was utilized only by Microsoft with Zune. Tegra 2 had a better adoption, but did not produce the design wins to propel NVIDIA to a leadership position in cell phones and tablets. Tegra 3 found a spot in Microsoft’s Surface, but that has turned out to be a far more bitter experience than expected. Tegra 4 so far has been integrated into a handful of products and is being featured in NVIDIA’s upcoming Shield product. It also hit some production snags that made it later to market than expected.
I think the primary issue with the first three generations of products is pretty simple: there was a distinct lack of differentiation from the other ARM-based products around. Yes, NVIDIA brought their graphics prowess to the market, but never in a form that distanced itself adequately from the competition. Tegra 2 boasted GeForce-based graphics, but we did not find out until later that it basically consisted of four pixel shaders and four vertex shaders, with more in common with the GeForce 7800/7900 series than with any of the modern unified architectures of the time. Tegra 3 boasted a big graphical boost, but it came in the form of doubling the pixel shader units while leaving the vertex units alone.
While NVIDIA had very strong developer relations and a leg up on the competition in terms of software support, it was never enough to propel Tegra beyond a handful of devices. NVIDIA is trying to rectify that with Tegra 4 and the 72 shader units that it contains (still divided between pixel and vertex units). Tegra 4 is not perfect in that it is late to market and the GPU is not OpenGL ES 3.0 compliant. ARM, Imagination Technologies, and Qualcomm are offering new graphics processing units that are not only OpenGL ES 3.0 compliant, but also offer OpenCL 1.1 support. Tegra 4 does not support OpenCL. In fact, it does not support NVIDIA’s in-house CUDA. Ouch.
Jumping into a new market is not an easy thing, and invariably mistakes will be made. NVIDIA worked hard to make a solid foundation with their products, and certainly they had to learn to walk before they could run. Unfortunately, running effectively entails having design wins due to outstanding features, performance, and power consumption. NVIDIA was really only average in all of those areas. NVIDIA is hoping to change that. Their first salvo into offering a product that offers features and support that is a step above the competition is what we are talking about today.
Specifications and Overview
Talk to most PC enthusiasts today, be they gamers or developers, and ask them what technology they are most interested in for the next year or so, and you will most likely hear about 4K somewhere in the discussion. While the world of consumer electronics and HDTV has been stuck in the rut of 1080p for quite some time now, computers, smartphones, and tablets are racing toward higher resolutions and higher pixel densities. 4K is a developing standard that pushes screen resolutions to roughly 4K x 2K pixels, and if you set aside the debate over competing options (3840x2160 versus 4096x2160 are the most prominent), this move is all good news for the industry.
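For a sense of scale, the consumer 3840x2160 variant carries exactly four times the pixels of a 1080p screen, while the 4096x2160 cinema variant adds a bit more width:

```python
# Pixel counts for the two competing 4K resolutions versus 1080p
uhd = 3840 * 2160       # consumer "UHD" variant: 8,294,400 pixels
dci = 4096 * 2160       # DCI cinema variant: 8,847,360 pixels
full_hd = 1920 * 1080   # 1080p: 2,073,600 pixels

ratio = uhd // full_hd  # UHD packs exactly 4x the pixels of 1080p
```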
I first dove into the area of 4K displays when I purchased the SEIKI SE50UY04 50-in 4K TV in April for $1300 when it popped up online. The TV showed up days later, we did an unboxing and preview of the experience, and I was blown away by the quality difference from moving to a 3840x2160 screen, even with its caveats. It was a 30 Hz panel, half the refresh rate of a typical LCD computer display today; it had limited functionality; and it honestly wasn't the best quality TV I had ever used. But it was 4K, it was inexpensive, and it was available.
It was hard to beat at the time but the biggest drawback was the lack of 60 Hz support, the ability for the screen to truly push 60 frames per second to the panel. This caused some less than desirable results with Windows usage and even in gaming where visual tearing was more prominent when Vsync was disabled. But a strength of this design was that it only required a single HDMI connection and would work with basically any current graphics systems. I did some Frame Rating game performance testing at 4K and found that GPU horsepower was definitely a limiting factor.
Today I follow up our initial unboxing and preview of the ASUS PQ321Q 4K monitor with a more thorough review and summary of our usage results. There is quite a bit that differs between our experience with the SEIKI and the ASUS panels and it is more than just the screen sizes.
Introduction and Design
With the release of Haswell upon us, we’re being treated to a significant refresh of some already-impressive notebooks. Chief among the benefits is the much-championed battery life improvement, and while better power efficiency is obviously valuable where portability is a primary focus, beefier models can also benefit by way of increased versatility. Sure, gaming notebooks are normally tethered to an AC adapter, but when it’s time to unplug for some more menial tasks, it’s good to know that you won’t be out of juice in a couple of hours.
Of course, an abundance of gaming muscle never hurts, either. As the test platform for one of our recent mobile GPU analyses, MSI’s 15.6” GT60 gaming notebook is, for lack of a better description, one hell of a beast. Following up on Ryan’s extensive GPU testing, we’ll now take a more balanced and comprehensive look at the GT60 itself. Is it worth the daunting $1,999 MSRP? Does the jump to Haswell provide ample and economical benefits? And really, how much of a difference does it make in terms of battery life?
Our GT60 test machine featured the following configuration:
In case it wasn’t already apparent, this device makes no compromises. Sporting a desktop-grade GPU and a quad-core Haswell CPU, it looks poised to be the most powerful notebook we’ve tested to date. Other configurations exist as well, spanning various CPU, GPU, and storage options. However, all available GT60 configurations feature a 1080p anti-glare screen, discrete graphics (GTX 670M and up), Killer Gigabit LAN, and a case built from metal and heavy-duty plastic. They also come preconfigured with Windows 8, so the only way to get Windows 7 with your GT60 is to purchase it through a reseller that performs customizations.
Overclocked GTX 770 from Galaxy
When NVIDIA launched the GeForce GTX 770 at the very end of May, we started to get in some retail samples from companies like Galaxy. While our initial review looked at the reference models, other add-in card vendors are putting their own unique touch on the latest GK104 offering and Galaxy was kind enough to send us their GeForce GTX 770 2GB GC model that uses a unique, more efficient cooler design and also runs at overclocked frequencies.
If you haven't yet read up on the GTX 770 GPU, you should probably stop by my first review of the GTX 770 to see what information you are missing out on. Essentially, the GTX 770 is a full-spec GK104 Kepler GPU running at higher clocks (both core and memory speeds) compared to the original GTX 680. The new reference clocks for the GTX 770 were 1046 MHz base clock, 1085 MHz Boost clock and a nice increase to 7.0 GHz memory speeds.
Galaxy GeForce GTX 770 2GB GC Specs
The Galaxy GC model is overclocked with a new base clock setting of 1111 MHz and a higher Boost clock of 1163 MHz; both are about 6-7% higher than the reference clocks. Galaxy has left the memory speeds alone, though, keeping them running at an effective 7.0 GHz.
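Those percentages are easy to verify from the clocks themselves:

```python
# Galaxy GC clocks versus GTX 770 reference clocks (MHz)
ref_base, ref_boost = 1046, 1085
gc_base, gc_boost = 1111, 1163

base_oc = (gc_base - ref_base) / ref_base * 100      # ~6.2% over reference
boost_oc = (gc_boost - ref_boost) / ref_boost * 100  # ~7.2% over reference
```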
Some more 4K love!
This morning Fedex dropped off a new product at our offices, one that I was very eagerly awaiting: the ASUS PQ321Q 31.5-in 4K 60 Hz monitor!
While we are far from ready to post a full review of the display and have lots more game testing to get to, we did host a live stream for the unboxing and initial testing of the PQ321Q that I think is worth sharing.
In this video we do a walk-around of the $3500 4K display, hook it up to both NVIDIA and AMD test beds at 60 Hz, and then proceed to install 3-Way SLI Titans to see how it games! Enjoy this quick preview before our full review of the ASUS PQ321Q.
A 27-in Table PC
While foraging through the land that is Las Vegas during the 2013 Consumer Electronics Show, we ran into Lenovo, and they showed us a unique PC design they were calling the "Table PC". The Lenovo IdeaCentre Horizon is a 27-in All-in-One design that is finally available on the market and brings some very interesting design decisions and use cases.
At its heart, the IdeaCentre Horizon is a 27-in 1920x1080 display with an AIO PC design that includes some pretty standard Intel-based Ultrabook-style hardware: an Intel Core i5-3337U dual-core processor, a discrete NVIDIA GeForce GT 620M graphics processor, a 1TB 5400 RPM HDD, and 8GB of DDR3-1600 memory.
But this computer is about much more than the hardware it is built around. Able to switch between a standard AIO configuration and a fold-down, multi-user mode with custom software for interaction, the Horizon attempts to bring life to low-cost computers built for more than one user at a time.
From a physical perspective, the IdeaCentre Horizon has the normal and expected design cues. There is an HD webcam up top for Skype calls, touch-based buttons for volume and brightness, indicator lights for drive usage, power states, etc.
The 1920x1080 10-point touch screen on the Horizon was nice, but not great. For a 27-in display that you are going to be interfacing with very closely, the pixel density is definitely lower than that of the 1080p 21-in touch screen AIO floating around our office. There were some minor glare issues as well, even with Lenovo's "anti-glare coating," while using the Horizon in the fully laid down, flat position.
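That density difference is easy to put a number on. Since both panels run 1920x1080, pixels per inch falls as the diagonal grows; a quick calculation using the quoted panel sizes:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

horizon_ppi = ppi(1920, 1080, 27)     # the 27-in Horizon: ~82 PPI
office_aio_ppi = ppi(1920, 1080, 21)  # the 21-in office AIO: ~105 PPI
```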
Introduction and Technical Specifications
Courtesy of GIGABYTE
The GIGABYTE Z87X-OC Force is the flagship board in GIGABYTE's LGA1150 line. GIGABYTE used the previous generation's flagship, the Z77X-UP7, as a template and improved the design to take the new build to even greater performance heights. In addition to the fan-cooled and optionally water-cooled heat pipe, the board includes the latest iteration of GIGABYTE's power system, dubbed Ultra Durable 5 Plus. For a flagship board, performance and craftsmanship come at a premium; at $419.99, the Z87X-OC Force is a sound investment.
Courtesy of GIGABYTE
Courtesy of GIGABYTE
GIGABYTE designed the Z87X-OC Force with an impressive 16-phase digital power delivery system, powered by International Rectifier (IR) manufactured PowIRstage™ ICs and PWM controllers. GIGABYTE integrated a plethora of features into the Z87X-OC Force including: 10 SATA 6Gb/s ports; dual Intel GigE NICs; five PCI-Express x16 slots for up to quad-card support; two PCI-Express x1 slots; onboard power, reset, BIOS reset, pre-power (OC Ignition), base clock up/down, CPU ratio up/down, OC Tag, OC Gear, and OC Turbo buttons; dual BIOS, BIOS select, PCIe port, and LN2 switches; 2-digit diagnostic LED display; integrated voltage measurement points; and USB 2.0 and 3.0 port support.
Courtesy of GIGABYTE
Another Wrench – GeForce GTX 760M Results
Just recently, I evaluated some of the current processor-integrated graphics options using our new Frame Rating performance metric. The results were very interesting, proving Intel has done some great work with its new HD 5000 graphics option for Ultrabooks. You might have noticed that the MSI GE40 didn’t just come with integrated HD 4600 graphics; it also included a discrete NVIDIA GeForce GTX 760M on board. While that previous article focused on the integrated graphics of Haswell, Trinity, and Richland, I did find some noteworthy results with the GTX 760M that I wanted to investigate and present.
The MSI GE40 is a new Haswell-based notebook that includes the Core i7-4702MQ quad-core processor and Intel HD 4600 graphics. Along with it MSI has included the Kepler architecture GeForce GTX 760M discrete GPU.
This GPU offers 768 CUDA cores running at a 657 MHz base clock but can stretch higher with GPU Boost technology. It is configured with 2GB of GDDR5 memory running at 2.0 GHz.
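For context, here is a back-of-the-envelope bandwidth figure. Assuming the quoted 2.0 GHz is the GDDR5 data clock (transfers on both clock edges, so 4.0 GT/s effective) and a 128-bit memory bus for the GTX 760M (both figures taken from public specs, not this article), the peak theoretical bandwidth works out as:

```python
# Peak theoretical memory bandwidth for the GTX 760M
# (data rate and bus width are the assumptions noted above)
data_rate_gtps = 2.0 * 2   # GDDR5 moves data on both clock edges: 4.0 GT/s
bus_width_bits = 128       # assumed GTX 760M memory bus width
bandwidth_gbs = data_rate_gtps * bus_width_bits / 8  # bits -> bytes: 64.0 GB/s
```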
If you didn’t read the previous integrated graphics article, linked above, some of the data presented here will be missing its context, so you might want to get a baseline of information by going through that first. Also, remember that we are using our Frame Rating performance evaluation system for this testing – a key differentiator from most other mobile GPU testing. In fact, it is that difference that allowed us to spot an interesting issue with the configuration we are showing you today.
If you are not familiar with the Frame Rating methodology, and how we had to change some things for mobile GPU testing, I would really encourage you to read this page of the previous mobility Frame Rating article for the scoop. The data presented below depends on that background knowledge!
Okay, you’ve been warned – on to the results.
A quick look at a great accessory
Though we are a PC hardware and technology website by day, we are also video creators by night (and sometimes day as well). If you don't believe me, check out our PC Perspective video tag or even our very own YouTube channel. See?!?
While we do have a big fancy studio setup for in-house production, sometimes on the road you just need something quick and easy but also high quality for recording. While our collection of DSLR cameras does an amazing job with video quality, the audio from the in-camera microphones has always sucked, and lugging around wireless mic packs seemed unnecessary much of the time.
Enter the RODE VideoMic.
This $170 directional shotgun microphone comes from one of the most recognized and respected companies in pro-sumer audio. In the short video below I show you what you get in the box (not much) and how much you can improve your audio with this simple add-on.
Overall, I have to say I was very impressed with the RODE VideoMic and anyone looking to improve the quality of their videos with an audio upgrade should give this option a try!
Battle of the IGPs
Our long journey with Frame Rating, a new capture-based analysis tool to measure graphics performance of PCs and GPUs, began almost two years ago as a way to properly evaluate the real-world experiences for gamers. What started as a project attempting to learn about multi-GPU complications has really become a new standard in graphics evaluation and I truly believe it will play a crucial role going forward in GPU and game testing.
Today we use these Frame Rating methods and tools, which are elaborately detailed in our Frame Rating Dissected article, and apply them to a completely new market: notebooks. Even though Frame Rating was meant for high performance discrete desktop GPUs, the theory and science behind the entire process is completely applicable to notebook graphics and even on the integrated graphics solutions on Haswell processors and Richland APUs. It also is able to measure performance of discrete/integrated graphics combos from NVIDIA and AMD in a unique way that has already found some interesting results.
Battle of the IGPs
Even though neither side wants us to call it this, we are testing integrated graphics today. With the release of Intel’s Haswell processor (the Core i7/i5/i3 4000) the company has upgraded the graphics noticeably on several of their mobile and desktop products. In my first review of the Core i7-4770K, a desktop LGA1150 part, the integrated graphics now known as the HD 4600 were only slightly faster than the graphics of the previous generation Ivy Bridge and Sandy Bridge. Even though we had all the technical details of the HD 5000 and Iris / Iris Pro graphics options, no desktop parts actually utilize them so we had to wait for some more hardware to show up.
When Apple held a press conference and announced new MacBook Air machines that used Intel’s Haswell architecture, I knew I could count on Ken to go and pick one up for himself. Of course, before I let him start using it for his own purposes, I made him sit through a few agonizing days of benchmarking and testing in both Windows and Mac OS X environments. Ken has already posted a review of the MacBook Air 11-in model ‘from a Windows perspective’ and in that we teased that we had done quite a bit more evaluation of the graphics performance to be shown later. Now is later.
So the first combatant in our integrated graphics showdown with Frame Rating is the 11-in MacBook Air: a small but powerful Ultrabook that sports more than 11 hours of battery life (in OS X at least) and includes the new HD 5000 integrated graphics. The HD 5000 is the GT3 variation of the new Intel processor graphics, which doubles the number of compute units compared to the GT2. The GT2 is the architecture behind the HD 4600 graphics found in nearly all of the desktop processors and many of the notebook versions, so I am very curious to see how this comparison turns out.
Introduction and Technical Specifications
Courtesy of MSI
The Z87 MPOWER board is one of the first boards released as part of MSI's Z87 update of the much vaunted MPOWER line. The board supports the next generation of Intel processors based on the LGA1150 socket (Haswell) and is packed full of features and overclocking goodness. In line with the MPOWER line's established theme, the Z87 MPOWER maintains the black and yellow stylings of the last generation, with the MSI brand logo proudly displayed on the chipset heat sink. At a retail price of $229.99, the Z87 MPOWER would be a nice value addition to any enthusiast or gaming system.
Courtesy of MSI
Courtesy of MSI
In keeping with its enthusiast appeal, MSI incorporated a full 16-phase digital power delivery system into the Z87 MPower, ensuring CPU stability under the most intense system loads. MSI integrated the following features into the Z87 MPower: eight SATA 6Gb/s ports; an mSATA 6Gb/s port; a Killer E2205 GigE NIC; Atheros 802.11n WiFi and Bluetooth adapter support; three PCI-Express x16 slots for up to tri-card support; four PCI-Express x1 slots; Lucidlogix Virtu® MVP 2.0 support; onboard power, reset, BIOS reset, base clock control, OC Genie, and Go2BIOS buttons; multi-BIOS and OC Genie mode switches; 2-digit diagnostic LED display; and USB 2.0 and 3.0 port support.
Courtesy of MSI
The GPU Midrange Gets a Kick
I like budget video cards. They hold a soft spot in my heart. I think the primary reason for this is that I too was once a poor college student and could not afford the really expensive cards. Ok, so this was maybe a few more years ago than I like to admit. Back when the Matrox Millennium was very expensive, I ended up getting the STB Lightspeed 128 instead. Instead of the 12 MB Voodoo 2 I went for the 8 MB version. I was never terribly fond of paying top dollar for a little extra performance. I am still not fond of it either.
The sub-$200 range is a bit of a sweet spot that is very tightly packed with products. These products typically perform in the range of a high end card from 3 years ago, yet still encompass the latest features of the top end products from their respective companies. These products can be overclocked by end users to attain performance approaching cards in the $200 to $250 range. Mind, there are some specific limitations to the amount of performance one can actually achieve with these cards. Still, what a user actually gets is very fair when considering the price.
Today I cover several flavors of cards from three different manufacturers that are based on the AMD HD 7790 and the NVIDIA GTX 650 Ti BOOST chips. These range in price from $129 to $179. The features on these cards are amazingly varied, and there are no “sticker edition” parts to be seen here. Each card is unique in its design and the cooling strategies are also quite distinct. Users should not expect to drive monitors above 1920x1200, much less triple monitors in Surround and Eyefinity.
Now let us quickly go over the respective chips that these cards are based on.
You need some extra space? We got that.
Corsair likes to make cases; I think we know that much for sure by now. I mean, they have designed something on the order of 13 different chassis, not including color variations. The very first case Corsair shipped was the Obsidian 800D, and at the time it was massive, just downright big.
Today though we are doing a preview of an even bigger case: the Obsidian 900D. Seriously, the Obsidian 350D can nearly fit inside the primary bay of this case...
Measuring 27 inches tall, 25 inches deep and more than 10 inches wide, the 900D will dominate just about any landscape you put it on, including a desk as you can tell in the video above.
Its height and depth give the 900D space for more processing and cooling components than you will likely ever really need, unless you're one of those few crazies out there. I mean, just looking in the window below you'll see we have a standard ATX motherboard installed and it is just dwarfed.