Manufacturer: Cyonic

Introduction and Features

Introduction

2-Banner.jpg

Introducing Cyonic, a new player in the global PC power supply arena. Founded in 2013, Cyonic aims to become a global brand of high-performance computer parts and accessories. The business is starting off by selling power supplies, and Cyonic will soon have three product lines: the AU Series (fixed cables), the AUx Series (all modular cables), and the Arise Series (sold exclusively in Japan). The AU and AUx Series will both contain three models, with output capacities of 450W, 550W, and 650W. In this review we will take a detailed look at the AU-450x we received for evaluation.

3-AU-450x-side.jpg

The new Cyonic AUx Series power supplies feature fully modular cables, quiet operation and high efficiency. And they are housed in a compact chassis that measures only 140mm (5.5”) deep. Cyonic suggests the AUx Series power supplies are “the ideal choice for office use, casual gaming, and Home Theater PCs”. Getting into the PC power supply market might seem a rather daunting task, but Cyonic has partnered with Seasonic as their OEM, which is certainly a good start.

Cyonic AUx PSU Key Features:

•    450W, 550W and 650W models
•    Fully modular cable design
•    Compact ATX chassis: only 140mm (5.5”) deep
•    80 PLUS Gold certified, at least 90% efficiency at 50% load
•    120mm Fluid Dynamic Bearing fan for long life and quiet operation
•    Intelligent Fan Manager for very quiet operation
•    High quality components including 105°C Japanese electrolytic capacitors
•    Compatibility with Intel's 4th Generation Core processors
•    Safety Protections: OPP, OVP, UVP and SCP
•    Conforms to ATX12V v2.31 and EPS12V v2.92 standards
•    5-Year warranty

Please continue reading our review of the Cyonic AU-450x PSU!!!

Subject: General Tech
Manufacturer: Logitech

Introduction, Packaging, and A Closer Look

Introduction

We haven’t had a chance to sit down with any racing wheels for quite some time here at PC Perspective. We have an old Genius wheel on a shelf in the back of our closet here at the office. Ryan played around with that a few years back, and that was the extent of the racing wheel usage here at home base. Josh, on the other hand, frequents driving sims with a Thrustmaster F430. I hadn’t ventured into racing sims, though I do dabble with the real thing a bit.

Given our previous Logitech coverage, and especially following our recent Q&A covering the new Logitech G line, it only made sense for us to take a look at the new Logitech G29 Driving Force Racing Wheel.

DSC03140.jpg

Today we are covering the G29, which is a PS3/PS4/PC specific model from Logitech. There is an Xbox/PC variant coming soon in the form of the G920, which will have a different (and reduced) button layout and no LED RPM/shift display.

Read on for our review of the Logitech G29 Driving Force Racing Wheel!

Author:
Manufacturer: Wasabi Mango

Overview

A few years ago, we took our first look at the inexpensive 27" 1440p monitors which were starting to flood the market via eBay sellers located in Korea. These monitors proved to be immensely popular and were largely credited with moving a large number of gamers past 1080p.

However, in the past few months we have seen a new trend from some of these same Korean monitor manufacturers. Just like the Seiki Pro SM40UNP 40" 4K display that we took a look at a few weeks ago, the new trend is large 4K monitors.

Built around a 42-inch LG AH-IPS panel, the Wasabi Mango UHD420 is an impressive display. The inclusion of HDMI 2.0 and DisplayPort 1.2 allows you to achieve 4K at a full 60Hz with 4:4:4 chroma. At a cost of just under $800 on Amazon, this is an incredibly appealing value.
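
For a sense of why HDMI 2.0 (or DisplayPort 1.2) is required here, a rough back-of-the-envelope bandwidth estimate is sketched below; the timing numbers are standard figures for 2160p60, not anything specific to this display.

// Rough estimate of the link bandwidth needed for 4K60 with full 4:4:4 chroma.
// The 594 MHz pixel clock is the standard CTA-861 timing for 2160p60 (including blanking).
#include <cstdio>

int main() {
    const double pixel_clock_hz  = 594e6;  // 3840x2160 @ 60 Hz, with blanking
    const int    channels        = 3;      // R, G, B (or Y, Cb, Cr at 4:4:4)
    const double bits_per_symbol = 10.0;   // TMDS 8b/10b encoding of each 8-bit sample

    const double gbps = pixel_clock_hz * channels * bits_per_symbol / 1e9;
    std::printf("Required TMDS bandwidth: %.1f Gbps\n", gbps);  // ~17.8 Gbps
    // HDMI 1.4 tops out around 10.2 Gbps, so 4K60 4:4:4 needs HDMI 2.0 (18 Gbps) or DP 1.2.
    return 0;
}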

IMG_2939.JPG

Whether the UHD420 is a TV or a monitor is actually quite the tossup. The lack of a tuner might initially lead you to believe it's not a TV. The inclusion of a DisplayPort connector and a USB 3.0 hub might make you believe it's a monitor, but it's bundled with a remote control (entirely in Korean). In reality, this display could be used for either purpose (unless you rely on OTA tuning), and it starts to blur the lines between a "dumb" TV and a monitor. You'll also find VESA 400x400mm mounting holes on this display for easy wall mounting.

Continue reading our overview of the Wasabi Mango UHD420 4K HDMI 2.0 FreeSync Display!!

Subject: Mobile
Manufacturer: MSI

Introduction and First Impressions

The MSI GT72 Dominator Pro G gaming laptop is a beast of a portable, with a GeForce GTX 980M graphics card and a 5th-Gen Intel Core i7 processor within its massive frame. And this iteration of the GT72 features NVIDIA's G-SYNC technology, which should help provide smooth gameplay on its 75 Hz IPS display.

gt72g_01.jpg

The gaming laptop market is filled with options at just about any price you can imagine (as long as your imagination starts at around $1000), and there are seemingly limitless combinations of specs and minute configuration differences even within a particular brand’s offering. A few names stand out in this market, and MSI has created a product meant to stand tall against the likes of Alienware and ASUS ROG. And it doesn’t just stand tall, it stands wide - and deep for that matter. Running about the size of home plate on a regulation baseball diamond (well, approximately anyway), this is nearly 8 ½ lbs of PC gaming goodness.

Not everyone needs a 17-inch notebook, but there’s something awesome about these giant things when you see them in person. The design of this GT72 series is reminiscent of an exotic sports car (gaming laptops in general seem to have fully embraced the sports car theme), and if you’re considering completely replacing a desktop for gaming and all of your other computing, the extra space it takes up is more than worth it if you value a large display and a full keyboard. Doubtless there are some who would simply be augmenting a desktop experience with a supremely powerful notebook like this, but for most people laptops like this are a major investment that generally replaces the need for a dedicated PC tower.

g72g_logo.jpg

What about the cost? It certainly isn’t “cheap” considering the top-of-the-line specs, and price is clearly the biggest barrier to entry with a product like this - far beyond the gargantuan size. Right off the bat I’ll bring up this laptop’s $2099 retail price - and not because I think it’s high. It’s actually very competitive as equipped. And in addition to competitive pricing MSI is also ahead of the curve a bit with its adoption of the 5th-Gen Core i7 Broadwell mobile processors, while most gaming laptops are still on Haswell. Broadwell’s improved efficiency should help with battery life a bit, but your time away from a power plug is always going to be limited with gaming laptops!

Continue reading our review of the MSI GT72 Dominator Pro G G-Sync Notebook!!

Manufacturer: PC Perspective

... But Is the Timing Right?

Windows 10 is about to launch and, with it, DirectX 12. Apart from the massive increase in draw calls, Explicit Multiadapter, both Linked and Unlinked, has been the cause of a few pockets of excitement here and there. I am a bit concerned, though. People seem to find this a new, novel concept that gives game developers tools they've never had before. It really isn't. Depending on what you want to do with secondary GPUs, game developers could have used them for years. Years!

Before we talk about the cross-platform examples, we should talk about Mantle. It is the closest analog to DirectX 12 and Vulkan that we have. It served as the base specification for Vulkan that the Khronos Group modified with SPIR-V instead of HLSL and so forth. Some claim that it was also the foundation of DirectX 12, which would not surprise me given what I've seen online and in the SDK. Allow me to show you how the API works.

amd-2015-mantle-execution-model.png

Mantle is an interface that mixes Graphics, Compute, and DMA (memory access) into queues of commands. This is easily done in parallel, as each thread can create commands on its own, which is great for multi-core processors. Each queue, which is a list of commands headed to the GPU, can be handled independently, too. An interesting side-effect is that, since each device uses standard data structures, such as IEEE 754 floating-point numbers, no-one cares where these queues go as long as the work is done quickly enough.

Since each queue is independent, an application can choose to manage many of them. None of these lists really need to know what is happening to any other. As such, they can be pointed to multiple, even wildly different graphics devices. Different model GPUs with different capabilities can work together, as long as they support the core of Mantle.
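
To make the queue model a little more concrete, here is a minimal C++ sketch of the idea. The Device, Queue, and CommandList types are hypothetical stand-ins for what Mantle, DirectX 12, or Vulkan actually expose (e.g. ID3D12CommandQueue or VkQueue); this illustrates the execution model and is not real API code.

// Hypothetical sketch of the explicit multi-queue model described above.
#include <thread>

struct CommandList { /* recorded GPU commands */ };
struct Queue       { void submit(const CommandList&) { /* hand work to this GPU */ } };
struct Device      { Queue graphics, compute, dma; };

CommandList record_scene_pass()   { return {}; }  // recorded on one CPU thread
CommandList record_physics_pass() { return {}; }  // recorded on another CPU thread

int main() {
    Device primary;    // e.g. the main discrete GPU
    Device secondary;  // could be a different model, or even a different vendor

    CommandList scene, physics;
    // Each thread records its own command list independently -- no shared lock required.
    std::thread t1([&] { scene   = record_scene_pass(); });
    std::thread t2([&] { physics = record_physics_pass(); });
    t1.join(); t2.join();

    // Queues are independent of one another, so the lists can be pointed at different devices.
    primary.graphics.submit(scene);
    secondary.compute.submit(physics);
    return 0;
}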

microsoft-dx12-build15-ue4frame.png

DirectX 12 and Vulkan took this metaphor so their respective developers could use this functionality across vendors. Mantle did not invent the concept, however. What Mantle did is expose this architecture to graphics, which can make use of all the fixed-function hardware that is unique to GPUs. Prior to AMD's usage, this was how GPU compute architectures were designed. Game developers could have spun up an OpenCL workload to process physics, audio, pathfinding, visibility, or even lighting and post-processing effects... on a secondary GPU, even from a completely different vendor.
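
The OpenCL route mentioned above has been available for years. A minimal host-side sketch of picking a second GPU and giving it compute work looks roughly like the following; error handling is omitted, a second GPU is assumed to exist on the first platform, and on a mixed-vendor system you would enumerate multiple platforms rather than just one.

// Minimal OpenCL host sketch: use a secondary GPU purely as a compute helper.
#include <CL/cl.h>

int main() {
    cl_platform_id platform;
    clGetPlatformIDs(1, &platform, nullptr);

    cl_device_id gpus[2];
    cl_uint count = 0;
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 2, gpus, &count);

    // Run physics, audio, pathfinding, or post-processing on the second GPU if present.
    cl_device_id helper = (count > 1) ? gpus[1] : gpus[0];
    cl_context ctx = clCreateContext(nullptr, 1, &helper, nullptr, nullptr, nullptr);
    cl_command_queue queue = clCreateCommandQueue(ctx, helper, 0, nullptr);

    // ... build a program, create a kernel and buffers, then:
    // clEnqueueNDRangeKernel(queue, kernel, 1, nullptr, &global_size, nullptr, 0, nullptr, nullptr);

    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}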

Vista's multi-GPU bug might get in the way, but it was possible in 7 and, I believe, XP too.

Read on to see a couple reasons why we are only getting this now...

Subject: Displays
Manufacturer: BenQ

Overdrive initialized

We have been tracking the differences between AMD’s FreeSync and Nvidia’s G-Sync for some time now. The launch of FreeSync-capable displays started out a bit shaky, as some features we took for granted went missing. The first round of FreeSync displays we reviewed came with non-functional overdrive when the display / GPU pipeline was operating in FreeSync mode.

ghost1.jpg

Comparison of overdrive response in first round FreeSync displays. Images should look like the ROG Swift (left), which was correctly applying overdrive.

While AMD apparently fixed a portion of this problem in a subsequent driver update, getting overdrive to function in these early displays would require a firmware update. Unlike what you may be used to with a motherboard or SSD firmware, displays are not typically end-user upgradeable. This meant that even if manufacturers produced a fix, owners would have to send in their display to be updated (and be without it for several weeks).

The only manufacturer to step forward and retroactively support overdrive in their first gen FreeSync panel was BenQ. In a statement issued via TFTCentral:

BenQ have confirmed that the FreeSync/AMA issue which affected their XL2730Z display has now been fixed. This issue caused the overdrive (AMA) feature to not function when the screen was connected to a FreeSync capable system. As a result, users could not make use of the AMA feature and benefit from the improved response times that the 'normal' AMA mode offered, as compared with AMA Off. See our review for more information.

A driver update from AMD is already available and should be downloaded from their website. In addition BenQ will be releasing a firmware update for the monitor itself to fix this issue. Current stocks in distribution are being recalled and updated with retailers so future purchases should already carry this new firmware. This is expected to apply for stock purchased AFTER 1st July, as V002 firmware screens should be shipped by BenQ to distributors in late June.

For those who already have an XL2730Z if you want to, you can return it to BenQ for them to carry out the firmware update for you. This only applies if the user is experiencing issues with the performance of the screen. There is no simple way for the end user to update the firmware themselves and it is not encouraged. Users should contact BenQ support through their relevant country website for more information on how to return their screen for the update.

The catch with the above is that the statement came from BenQ PR for Europe, and neither we nor TFTCentral have been able to confirm any equivalent upgrade process in place for the USA. We did note in various online reviews that those receiving their BenQ XL2730Z in the last week of June confirmed having the new V002 firmware.

If you have one of these panels, verifying your firmware is simple. Hold down the menu button while powering up the display (you will have to hold the power button for a few seconds before you hear a beep).

service-menu.jpg

The display will power up and appear as normal, except that now pressing the menu button again will bring up the above service menu. Those with the update will have “V002” as the starting text of the ‘F/W Version’ result.

ASUS-overdrive.jpg

Overdrive functioning on the ASUS MG279Q IPS FreeSync display, showing an odd simultaneous ‘negative ghost’ outline of a slightly ghosted image.

We have been eager to retest the BenQ since hearing of this updated firmware revision. While we have seen overdrive functioning in the recent ASUS MG279Q, it was not a perfect implementation, and we were curious to know if BenQ’s implementation fared any better.

BenQ.jpg

Continue reading for the results of our testing!

Introduction and Technical Specifications

Introduction

02-prod.jpg

Courtesy of Bitspower

The Bitspower AIX99R5E Nickel Plated water block set is a set of full cover blocks for cooling the X99 chipset and CPU VRMs on the ASUS Rampage V Extreme motherboard. The block set is split into two parts, a block for the CPU VRM circuits and a block / cover plate for the chipset area. Each block has two G1/4" threaded ports for coolant inlet/outlet.

03-flyapart-install.jpg

Courtesy of Bitspower

04-prod-install.jpg

Courtesy of Bitspower

The blocks are held to the board with screws inserted from the underside of the board, using the pre-existing cooler mounting holes. Thermal tape is used between the VRM chips and the block. The chipset block uses thermal paste to interface with the board's chipset surface. The chipset block covers the left side of the board with fingers that sit in between the PCIe ports.

Technical Specifications (taken from the Bitspower website)

Dimension N+S (L x W x H): 212mm x 161mm x 26mm
Dimension MOS (L x W x H): 104.7mm x 23.5mm x 26mm
Thread: G1/4" x 2
Included: 1. Backup O-Ring x 1 Set
          2. Thermal Pad
          3. Mounting Screws/Accessories

Continue reading our review of the Bitspower AIX99R5E Nickel Plated Full Cover Block!

Author:
Subject: Mobile
Manufacturer: Acer

Overview

Gaming laptops are something that most people are quick to reject as out of their price range. There is a lot of sense in this train of thought. We know that laptop components are inherently lower performing than their desktop counterparts, and significantly more expensive. So for many gamers, the idea of spending more money on less powerful components seems like a bad trade-off, even with the added benefit of portability.

However, we also seem to be in a bit of a plateau as far as generation-to-generation performance gain with desktop components. Midrange processors from a few generations ago are still more than capable of playing the vast majority of games, and even lower-end modern GPUs are able to game at 1080p.

So maybe it's time to take another look at the sub-$1000 gaming notebook options, and that's exactly what we are doing today with the Acer Aspire V 15 Nitro Black Edition.

IMG_2396.JPG

The Aspire V Nitro is equipped with fairly modest components compared to what most people think of as a gaming laptop. Where machines such as the MSI GT70 Dominator or ASUS G751 seem to take the kitchen-sink approach towards mobile gaming machines, the Aspire V is a more carefully balanced option.

Acer Aspire V 15 Nitro Black Edition
Processor: Intel Core i7-4720HQ 2.6 GHz
Screen: 15.6" 1920x1080
Memory: 8GB DDR3
Graphics Card: NVIDIA GTX 960M 4GB
Storage: 1TB Hard Drive
Dimensions (W x D x H): 15.34" x 10.14" x 0.86" - 0.94"
Weight: 5.29 lbs.

Anchored by an Intel Core i7-4720HQ and a GTX 960M, the Aspire V Nitro isn't trying to reach the top of the mobile performance stack. A 15.6" display, 8GB of RAM, and a single 1TB spindle drive are all logical choices for a machine aimed towards gaming on a budget.

While it's difficult for us to recommend that you buy any machine without an SSD these days, a 1TB drive is great for game storage on a machine like this. There are also other configuration options which add SATA M.2 SSDs alongside the 1TB drive, and we managed to open up our sample and put an SSD in ourselves with little pain.

Click here to read more about the Acer Aspire V 15 Nitro Black Edition!

Manufacturer: Corsair

Introduction and Features

Introduction

Corsair offers a large selection of PC power supplies and today we will be taking a detailed look at their new RM850i PSU. The RMi Series includes four models: the RM650i, RM750i, RM850i, and RM1000i. As you can see in the diagram below, the RMi Series is located squarely in the middle of Corsair’s power supply lineup. Corsair currently offers thirty-four different models ranging from the 350 watt VS350 all the way up to the king-of-the-hill 1,500 watt AX1500i.

2-Compare-Series.jpg

Corsair PSU Comparison

The new Corsair RMi Series power supplies are equipped with fully modular cables and optimized for very quiet operation and high efficiency. RMi Series power supplies incorporate Zero RPM Fan Mode, which means the fan does not spin until the power supply is under a moderate to heavy load. The cooling fan is custom-designed for use in PSUs to deliver low noise and high static pressure. All of the RMi Series power supplies are 80 Plus Gold certified for high efficiency.

The Corsair RMi Series is built with high-quality components, including all Japanese made electrolytic capacitors, and Corsair guarantees these PSUs to deliver clean, stable, continuous power, even at ambient temperatures up to 50°C. Each RMi Series power supply also supports Corsair’s Link software to monitor various power supply parameters and enable/disable OCP on the +12V outputs.

3-RM850i-side.jpg

RMi vs. RM Series Advantages

Corsair has incorporated a number of enhancements, which differentiate the new RMi Series from the original RM Series. Here is an overview:

•    RMi Series comes with a 7-year warranty, instead of the RM’s 5-years
•    ALL Japanese made capacitors ensure long life and best in class performance
•    RMi Series is rated for full output at 50°C, instead of 40°C
•    Fluid Dynamic Bearing fan delivers longer life than RM’s rifle-bearing fan
•    Additional Corsair Link control capabilities and features

And for all these additional features, you only pay a ~$10 USD price premium!

Corsair RM850i PSU Features summary:

•    850W continuous DC output (up to 50°C)
•    7-Year Warranty and Comprehensive Customer Support
•    80 PLUS Gold certified, at least 90% efficiency at 50% load
•    Corsair Link ready for real-time monitoring and control
•    Ability to switch between single and multiple +12V rails
•    Fully modular cables for easy installation
•    Zero RPM Fan Mode for silent operation up to 40% load
•    Quiet fluid dynamic fan bearing for long life and quiet operation
•    High quality components including all Japanese electrolytic capacitors
•    Active Power Factor correction (0.99) with Universal AC input
•    Safety Protections: OCP, OVP, UVP, SCP, OTP, and OPP
•    MSRP for the RM850i: $159.99 USD

Please continue reading our review of the Corsair RM850i PSU!!!

Author:
Manufacturer: Sapphire

Fiji brings the (non-X) Fury

Last month was a big one for AMD. At E3 the company hosted its own press conference to announce the Radeon R9 300-series of graphics cards as well as the new family of products based on the Fiji GPU. It started with the Fury X, a flagship $650 graphics card with an integrated water cooler that was well received. It wasn't perfect by any means, but it was a necessary move for AMD to compete with NVIDIA on the high end of the discrete graphics market.

02.jpg

At the event AMD also talked about the Radeon R9 Fury (without the X) as the version of Fiji that would be taken by board partners to add custom coolers and even PCB designs. (They also talked about the R9 Nano and a dual-GPU version of Fiji, but nothing new is available on those products yet.) The Fury, priced $100 lower than the Fury X at $549, is going back to a more classic GPU design. There is no "reference" product though, so cooler and PCB designs are going to vary from card to card. We already have two different cards in our hands that differ dramatically from one another.

The Fury cuts down the Fiji GPU a bit with fewer stream processors and texture units, but keeps most other specs the same. This includes the 4GB of HBM (high bandwidth memory), the 64 ROP count and even the TDP / board power. Performance is great, and it makes for an interesting comparison with the GeForce GTX 980 cards on the market. Let's dive into this review!

Continue reading our review of the Sapphire Radeon R9 Fury 4GB with CrossFire Results!

Author:
Manufacturer: Various

SLI and CrossFire

Last week I sat down with a set of three AMD Radeon R9 Fury X cards, our sampled review card as well as two retail cards purchased from Newegg, to see how the reports of pump whine noise from the cards were shaping up. I'm not going to dive into that debate again here, as I think we covered it pretty well in that story as well as on our various podcasts, but rest assured we are continuing to look into the revisions of the Fury X to see if AMD and Cooler Master were actually able to fix the issue.

group1.jpg

What we have to cover today is something very different, and likely much more interesting for a wider range of users. When you have three AMD Fury X cards in your hands, you of course have to do some multi-GPU testing with them. With our set I was able to run both 2-Way and 3-Way CrossFire with the new AMD flagship card and compare them directly to the comparable NVIDIA offering, the GeForce GTX 980 Ti.

There isn't much else I need to do to build up this story, is there? If you are curious how well the new AMD Fury X scales in CrossFire with two and even three GPUs, this is where you'll find your answers.

Continue reading our results testing the AMD Fury X and GeForce GTX 980 Ti in 3-Way GPU configurations!!

Subject: Storage
Manufacturer: OCZ

Introduction, Specifications and Packaging

Introduction:

Since their acquisition by Toshiba in early 2014, OCZ has gradually transitioned their line of SSD products to include parts provided by their parent company. Existing products were switched over to Toshiba flash memory, and that transition went fairly smoothly, save the recent launch of their Vector 180 (which had a couple of issues noted in our review). After that release, we waited for the next release from OCZ, hoping for something fresh, and that appears to have just happened:

150707-180041.jpg

OCZ sent us a round of samples for their new OCZ Trion 100 SSD. This SSD was first teased at Computex 2015. This new model would not only use Toshiba sourced flash memory, it would also replace the OCZ / Indilinx Barefoot controller with Toshiba's own. Then named 'Alishan', this is now officially called the 'Toshiba Controller TC58'. As we found out during Computex, this controller employs Toshiba's proprietary Quadruple Swing-By Code (QSBC) error correction technology:

QSBC.png

Error correction tech gets very wordy, windy, and technical very quickly, so I'll do my best to simplify things. Error correction is basically some extra information interleaved within the data stored on a given medium. Pretty much everything uses it in some form or another. Those 700MB CD-Rs you used to burn could physically hold over 1GB of data, but all of that extra 'unavailable' space was error correction necessary to deal with the possible scratches and dust over time. Hard drives do the same sort of thing, with recent changes to how the data is interleaved. Early flash memory employed the same sort of simple error correction techniques initially, but advances in understanding of flash memory error modes have led to advances in flash-specific error correction techniques. More advanced algorithms require more advanced math that may not easily lend itself to hardware acceleration. Referencing the above graphic, BCH is simple to perform when needed, while LDPC is known to be more CPU (read: SSD controller CPU) intensive. Toshiba's proprietary QSBC tech claims to be 8x more capable of correcting errors, but we don't know what, if any, performance penalty exists on account of it.
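
If the general idea of error correction is unfamiliar, the toy example below may help. It is a classic Hamming(7,4) code, where 3 parity bits protect 4 data bits so that any single flipped bit can be located and repaired. This is only an illustration of the principle; the BCH, LDPC, and QSBC codes used in real SSDs are vastly stronger and more complex.

// Toy single-error-correcting code (Hamming(7,4)): 3 parity bits protect 4 data bits.
#include <cstdint>
#include <cstdio>

// Encode 4 data bits into a 7-bit codeword (positions 1..7, parity bits at 1, 2, 4).
uint8_t encode(uint8_t d) {
    uint8_t d1 = d & 1, d2 = (d >> 1) & 1, d3 = (d >> 2) & 1, d4 = (d >> 3) & 1;
    uint8_t p1 = d1 ^ d2 ^ d4, p2 = d1 ^ d3 ^ d4, p3 = d2 ^ d3 ^ d4;
    return p1 | (p2 << 1) | (d1 << 2) | (p3 << 3) | (d2 << 4) | (d3 << 5) | (d4 << 6);
}

// Recompute the parity checks; the resulting syndrome is the position of a flipped bit.
uint8_t correct(uint8_t cw) {
    auto bit = [&](int pos) { return (cw >> (pos - 1)) & 1; };   // 1-based bit positions
    int s = (bit(1) ^ bit(3) ^ bit(5) ^ bit(7))                  // check covering 1,3,5,7
          | (bit(2) ^ bit(3) ^ bit(6) ^ bit(7)) << 1             // check covering 2,3,6,7
          | (bit(4) ^ bit(5) ^ bit(6) ^ bit(7)) << 2;            // check covering 4,5,6,7
    if (s) cw ^= 1 << (s - 1);                                   // repair the flipped bit
    return cw;
}

int main() {
    uint8_t cw = encode(0b1011);
    uint8_t damaged = cw ^ (1 << 4);  // simulate one bit flipping in storage
    std::printf("%s\n", correct(damaged) == cw ? "corrected" : "failed");
    return 0;
}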

We will revisit this topic a bit later in the review, but for now let's focus on the other things we know about the Trion 100. The easiest way to explain it is that this is essentially Toshiba's answer to the Samsung EVO series of SSDs. This Toshiba flash is configured in a similar fashion, meaning the bulk of it operates in TLC mode, while a portion is segmented off and operates as a faster SLC-mode cache. Writes first go to the SLC area and are purged to TLC in the background during idle time. Continuous writes exceeding the SLC cache size will drop to the write speed of the TLC flash.
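
A simple two-speed model captures that caching behavior. The figures below are illustrative placeholders to show the shape of the effect, not measured Trion 100 specifications.

// Two-speed model of an SLC write cache sitting in front of TLC flash.
#include <algorithm>
#include <cstdio>

int main() {
    // All numbers are made-up placeholders, not Trion 100 figures.
    const double slc_cache_gb  = 3.0;    // hypothetical SLC-mode cache size
    const double slc_speed_mbs = 500.0;  // hypothetical write speed while the cache absorbs data
    const double tlc_speed_mbs = 150.0;  // hypothetical write speed once the cache is full

    const double transfer_gb = 10.0;     // one long, continuous write with no idle time
    const double in_cache    = std::min(transfer_gb, slc_cache_gb);
    const double to_tlc      = transfer_gb - in_cache;

    const double seconds = in_cache * 1024.0 / slc_speed_mbs + to_tlc * 1024.0 / tlc_speed_mbs;
    std::printf("Average write speed: %.0f MB/s\n", transfer_gb * 1024.0 / seconds);
    // Short, bursty writes fit entirely in the cache and see the fast speed;
    // sustained transfers converge toward the slower TLC figure.
    return 0;
}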

Read on for the full review!

Author:
Subject: General Tech
Manufacturer: Logitech

Logitech Focuses in on Gaming

Logitech has been around seemingly forever. The Swiss-based company is ubiquitous in the peripherals market, providing products ranging from keyboards and mice to speakers and headsets. There is not much that the company does not offer when it comes to PC peripherals. Their 3-button mice back in the day were considered cutting edge, and they also happened to be semi-programmable. Since that time we have seen them go from ball mice, to optical mice, to the latest laser based products that offer a tremendous amount of precision.

g230_01.jpg

Gaming has become one of the bigger movers for Logitech, and they have revamped their entire lineup as well as added a few new products to hopefully cash in on the popularity of modern gaming. To further address this market, Logitech has designed and marketed a new batch of gaming headsets. These promise to be moderately priced but high-quality products that bear the Logitech name. They range from the very basic up to the top 7.1 wireless products. Originally these covered a pretty significant price range, but lately the discounts have been extremely deep. The lowest end gaming headset is at $40 US while the 7.1 wireless model comes in around $90 US.

I am looking at two models today that span the range from the lowest end to the second highest. The first headset is the G230 analog set. The second is the G35, a wired 7.1 USB headset with Dolby Headphone technology. I have never been a fan of wireless headphones, but the G35 should be a fairly good approximation of the performance of that part.

g35_01.jpg

My goal is to look at these two wired units and see what Logitech can offer at these two very affordable price points.

Click here to read the entire Logitech G230 and G35 review!

Subject: Systems
Manufacturer: PC Perspective
Tagged: quad-core, gpu, gaming, cpu

Introduction and Test Hardware

logos.jpg

The PC gaming world has become divided by two distinct types of games: those that were designed and programmed specifically for the PC, and console ports. Unfortunately for PC gamers it seems that far too many titles are simply ported over (or at least optimized for consoles first) these days, and while PC users can usually enjoy higher detail levels and unlocked frame rates there is now the issue of processor core-count to consider. This may seem artificial, but in recent months quite a few games have been released that require at least a quad-core CPU to even run (without modifying the game).

One possible explanation for this is current console hardware: PS4 and Xbox One systems are based on multi-core AMD APUs (the 8-core AMD "Jaguar"). While a quad-core (or higher) processor might not be technically required to run current games on PCs, the fact that these exist on consoles might help to explain a quad-core CPU as a minimum spec. This trend could simply be the result of current x86 console hardware, as development of console versions of games is often prioritized (and porting has become common for development of PC versions of games). So it is that popular dual-core processors like the $69 Intel Pentium Anniversary Edition (G3258) are suddenly less viable for a future-proofed gaming build. While hacking these games might make dual-core CPUs work, and might be the only way to get such a game to even load as the CPU is checked at launch, this is obviously far from ideal.

4790K_box.jpg

Is this much CPU really necessary?

Rather than rail against this quad-core trend and question its necessity, I decided instead to see just how much of a difference the processor alone might make with some game benchmarks. This quickly escalated into more and more system configurations as I accumulated parts, eventually arriving at 36 different configurations at various price points. Yeah, I said 36. (Remember that Budget Gaming Shootout article from last year? It's bigger than that!) Some of the charts that follow are really long (you've been warned), and there’s a lot of information to parse here. I wanted this to be as fair as possible, so there is a theme to the component selection. I started with three processors each (low, mid, and high price) from AMD and Intel, and then three graphics cards (again, low, mid, and high price) from AMD and NVIDIA.

Here’s the component rundown with current pricing*:

Processors tested:

Graphics cards tested:

  • AMD Radeon R7 260X (ASUS 2GB OC) - $137.24
  • AMD Radeon R9 280 (Sapphire Dual-X) - $169.99
  • AMD Radeon R9 290X (MSI Lightning) - $399
  • NVIDIA GeForce GTX 750 Ti (OEM) - $149.99
  • NVIDIA GeForce GTX 770 (OEM) - $235
  • NVIDIA GeForce GTX 980 (ASUS STRIX) - $519

*These prices were current as of 6/29/15, and of course fluctuate.

Continue reading our Quad-Core Gaming Roundup: How Much CPU Do You Really Need?

Subject: Storage
Manufacturer: Samsung

Introduction, Specifications and Packaging

Introduction:

Where are all the 2TB SSDs? It's a question we've been hearing since SSDs started to go mainstream seven years ago. While we have seen a few come along on the enterprise side as far back as 2011, those were prohibitively large, expensive, and out of reach of most consumers. Part of the problem initially was one of packaging. Flash dies simply were not of sufficient capacity (and could not be stacked in sufficient quantities) to reach 2TB in a consumer-friendly form factor. We have been getting close lately, with many consumer focused 2.5" SATA products reaching 1TB, but things stagnated there for a bit. Samsung launched their 850 EVO and Pro in capacities up to 1TB, with plenty of additional space inside the 2.5" housing, so it stood to reason that the packaging limit was no longer an issue. So why the wait?

The first answer is one of market demand. When SSDs were pushing $1/GB, the thought of a 2TB SSD was great right up to the point where you did the math and realized it would cost more than a typical enthusiast-grade PC. That was just a tough pill to swallow, and market projections showed it would take more work to produce and market the additional SKU than it would make back in profits.

The second answer is one of horsepower. No, this isn't so much a car analogy as it is simple physics. 1TB SSDs had previously been pushing the limits of controller capabilities: flash and RAM addressing, Flash Translation Layer lookups, garbage collection, and other duties. This means that doubling a given model's capacity is not as simple as doubling the amount of flash attached to the controller - that controller must be able to effectively handle twice the load.
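
A rough sizing sketch shows the scale of that load. The assumptions below (4 KiB mapping granularity, 4-byte entries) are common rules of thumb for flash translation layers, not Samsung's actual controller design.

// Rough flash-translation-layer sizing: why doubling capacity roughly doubles controller load.
#include <cstdio>

int main() {
    const double capacity_bytes = 2e12;    // a "2TB" drive
    const double page_bytes     = 4096.0;  // assumed mapping granularity
    const double entry_bytes    = 4.0;     // assumed 32-bit pointer per mapped page

    const double entries   = capacity_bytes / page_bytes;
    const double table_gib = entries * entry_bytes / (1024.0 * 1024.0 * 1024.0);
    std::printf("~%.0f million map entries, ~%.1f GiB of mapping data\n",
                entries / 1e6, table_gib);
    // Roughly twice the DRAM footprint and lookup work of a 1TB drive.
    return 0;
}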

With all of that said, it looks like we can finally stop asking for those 2TB consumer SSDs, because Samsung has decided to be the first to push into this space:

150705-191310.jpg

Today we will take a look at the freshly launched 2TB version of the Samsung 850 EVO and 850 Pro. We will put these through the same tests performed on the smaller capacity models. Our hope is to verify that the necessary changes Samsung made to the controller are sufficient to keep performance scaling or at least on-par with the 1TB and smaller models of the same product lines.

Read on for the full review!

Introduction and Technical Specifications

Introduction

In our previous article here, we demonstrated how to mod the EVGA GTX 970 SC ACX 2.0 video card to get higher performance and significantly lower running temps. Now we decided to take two of these custom modded EVGA GTX 970 cards to see how well they perform in an SLI configuration. ASUS was kind enough to supply us with one of their newly introduced ROG Enthusiast SLI Bridges for our experiments.

ASUS ROG Enthusiast SLI Bridge

02-rog-3way-adapter.jpg

Courtesy of ASUS

03-rog-adapter-profile.jpg

Courtesy of ASUS

For the purposes of running the two EVGA GTX 970 SC ACX 2.0 video cards in SLI, we chose to use the 3-way variant of ASUS' ROG Enthusiast SLI Bridge so that we could run the tests with full 16x bandwidth across both cards (with the cards in PCIe 3.0 x16 slots 1 and 3 in our test board). This customized SLI adapter features a powered red-colored ROG logo embedded in its brushed aluminum upper surface. The adapter supports 2-way and 3-way SLI in a variety of board configurations.

04-all-adapters.jpg

Courtesy of ASUS

ASUS offers their ROG Enthusiast SLI Bridge in three sizes covering various 2-way, 3-way, and 4-way SLI configurations. All bridges feature the top brushed-aluminum cap with embedded glowing ROG logo.

Continue reading our article on Modding the EVGA GTX 970 SC Graphics Card!

05-sli-config.jpg

Courtesy of ASUS

The smallest bridge supports 2-way SLI configurations with either a two or three slot separation. The middle sized bridge supports up to a 3-way SLI configuration with a two slot separation required between each card. The largest bridge supports up to a 4-way SLI configuration, also requiring a two slot separation between each card used.

Technical Specifications (taken from the ASUS website)

Dimensions: 2-WAY: 97 x 43 x 21 mm (L x W x H)
            3-WAY: 108 x 53 x 21 mm (L x W x H)
            4-WAY: 140 x 53 x 21 mm (L x W x H)
Weight: 2-WAY: 70 g
        3-WAY: 91 g
        4-WAY: 123 g
Compatible GPU set-ups: 2-WAY: 2-WAY-S & 2-WAY-M
                        3-WAY: 2-WAY-L & 3-WAY
                        4-WAY: 4-WAY
Contents: 2-WAY: 1 x optional power cable & 2 PCBs included for varying configurations
          3-WAY: 1 x optional power cable
          4-WAY: 1 x optional power cable

Continue reading our story!!

Tick Tock Tick Tock Tick Tock Tock

A few websites have been re-reporting on a leak from BenchLife.info about Kaby Lake, which is supposedly a second 14nm redesign (“Tock”) to be injected between Skylake and Cannonlake.

UPDATE (July 2nd, 3:20pm ET): It has been pointed out that many hoaxes have come out of the same source, and that I should be more clear in my disclaimer. This is an unconfirmed, relatively easy-to-fake leak that does not have a second, independent source. I reported on it because (apart from being interesting enough) some details were listed on the images but not highlighted in the leak, such as "GT0" and a lack of Iris Pro on -K parts. That suggests the leaker got the images from somewhere but didn't notice those details, which implies either that the original source was hoaxed by an anonymous party who seeded the hoax to a single media outlet, or that it was an actual leak.

Either way, enjoy my analysis but realize that this is a single, unconfirmed source who allegedly published hoaxes in the past.

intel-2015-kaby-lake-leak-01.png

Image Credit: BenchLife.info

If true, this would be a major shift in both Intel's current roadmap as well as how they justify their research strategies. It also includes a rough stack of product categories, from 4.5W up to 91W TDPs, including their planned integrated graphics configurations. This leads to a pair of interesting stories:

How Kaby Lake could affect Intel's processors going forward. Since 2006, Intel has only budgeted a single CPU architecture redesign for any given fabrication process node. Taking two attempts on the 14nm process buys time for 10nm to become viable, but it could also give them more time to build up a better library of circuit elements, allowing them to assemble better processors in the future.

What type of user will be given Iris Pro? Also, will graphics-free options be available in the sub-Enthusiast class? Among Intel's processors, the high-end mainstream parts tend to have GT2-class graphics, such as the Intel HD 4600. Enthusiast architectures, such as Haswell-E, cannot be used without discrete graphics -- the extra space is used for more cores, I/O lanes, or other features. As we will discuss later, Broadwell took a step toward changing the availability of Iris Pro in the high-end mainstream, but it doesn't seem like Kaby Lake will make any more progress. Also, if I am interpreting the table correctly, Kaby Lake might bring iGPU-less CPUs to LGA 1151.

Keeping Your Core Regular

To the first point, Intel has been on a steady tick-tock cycle since the Pentium 4 architecture reached the 65nm process node, which was a “tick”. The “tock” came from the Conroe/Merom architecture that was branded “Core 2”. This new architecture was a severe departure from the high clock, relatively low IPC design that Netburst was built around, which instantaneously changed the processor landscape from a dominant AMD to an Intel runaway lead.

intel-tick-tock.png

After 65nm and Core 2 started the cycle, every new architecture alternated between shrinking the existing architecture to smaller transistors (tick) and creating a new design on the same fabrication process (tock). Even though Intel has been steadily increasing their R&D budget over time, which is now in the range of $10 to $12 billion USD each year, creating smaller, more intricate designs with new process nodes has been getting harder. For comparison, AMD's total revenue (not just profits) for 2014 was $5.51 billion USD.

Read on to see more about what Kaby Lake could mean for Intel and us.

Author:
Manufacturer: AMD

Retail cards still suffer from the issue

In our review of AMD's latest flagship graphics card, the Radeon R9 Fury X, I noticed and commented on the unique sound that the card was producing during our testing. A high pitched whine, emanating from the pump of the self-contained water cooler designed by Cooler Master, was obvious from the moment our test system was powered on and remained constant during use. I talked with a couple of other reviewers about the issue before the launch of the card and it seemed that I wasn't alone. Looking around other reviews of the Fury X, most make mention of this squeal specifically.

09.jpg

Noise from graphics cards comes in many forms. There is the most obvious and common noise from on-board fans and the air they move. Less frequently, but distinctly, the sound of inductor coil whine comes up. Fan noise spikes when the GPU gets hot, causing the fans to spin faster and move more air across the heatsink, which keeps everything running cool. Coil whine changes pitch based on the frame rate (and the frequency of power delivery on the card) and can be alleviated by using higher quality components on the board itself.

But the sound of our Fury X was unique: it was caused by the pump itself and it was constant. The noise it produced did not change as the load on the GPU varied. It was also 'pitchy' - a whine that seemed to pierce through other sounds in the office. A close analog might be the sound of an older CRT TV or monitor that is left powered on without input.

In our review process, AMD told us the issue was fixed. In an email sent to the media just prior to the Fury X launch, an AMD rep stated:

In regards to the “pump whine”, AMD received feedback that during open bench testing some cards emit a mild “whining” noise.  This is normal for most high speed liquid cooling pumps; Usually the end user cannot hear the noise as the pumps are installed in the chassis, and the radiator fan is louder than the pump.  Since the AMD Radeon™ R9 Fury X radiator fan is near silent, this pump noise is more noticeable.  
 
The issue is limited to a very small batch of initial production samples and we have worked with the manufacturer to improve the acoustic profile of the pump.  This problem has been resolved and a fix added to production parts and is not an issue.

I would disagree that this is "normal" but even so, taking AMD at its word, I wrote that we heard the noise but also that AMD had claimed to have addressed it. Other reviewers noted the same comment from AMD, saying the issue was fixed. But very quickly after launch some users were posting videos on YouTube and on forums with the same (or worse) sounds and noise. We had already started bringing in a pair of additional Fury X retail cards from Newegg in order to do some performance testing, so it seemed like a logical next step for us to test these retail cards in terms of pump noise as well.

First, let's get the bad news out of the way: both of the retail AMD Radeon R9 Fury X cards that arrived in our offices exhibit 'worse' noise, in the form of both whining and buzzing, compared to our review sample. In this write up, I'll attempt to showcase the noise profile of the three Fury X cards in our possession, as well as how they compare to the Radeon R9 295X2 (another water cooled card) and the GeForce GTX 980 Ti reference design - added for comparison.

Continue reading our look into the pump noise of the AMD Fury X Graphics Card!

Introduction, Specifications, and Packaging

Lexar is Micron’s brand covering SD cards, microSD cards, USB flash drives, and card readers. Their card readers are known for pushing the upper end of the various speed grades, typically allowing transfers (for capable SD cards) much faster than a typical built-in laptop or PC SD card reader can manage. Today we will take a look at the Lexar ‘Professional Workflow’ line of flash memory connectivity options.

150121-164114.jpg

This is essentially a four-bay hub device that can accept various card readers or other types of devices (a USB flash storage device as opposed to just a reader, for example). The available readers range from SD to CF to Professional Grade CFast cards capable of over 500 MB/sec.

We will be looking at the following items today:

  • Professional Workflow HR2
    • Four-bay Thunderbolt™ 2/USB 3.0 reader and storage drive hub
  • Professional Workflow UR1
    • Three-slot microSDHC™/microSDXC™ UHS-I USB 3.0 reader
  • Professional Workflow SR1
    • SDHC™/SDXC™ UHS-I USB 3.0 reader
  • Professional Workflow CFR1
    • CompactFlash® USB 3.0 reader
  • Professional Workflow DD256
    • 256GB USB 3.0 Storage Drive

Note that since these items were sampled to us, Lexar has begun shipping a newer version of the SR1. The SR2 is a SDHC™/SDXC™ UHS-II USB 3.0 reader. Since we had no UHS-II SD cards available to test, this difference would not impact any of our testing speed results. There is also an HR1 model which has only USB 3.0 support and no Thunderbolt, coming in at a significantly lower cost when compared with the HR2 (more on that later).

Continue reading for our review of all of the above!

Subject: Displays
Manufacturer: ASUS

Introduction, Specifications, and Packaging

AMD fans have been patiently waiting for a proper FreeSync display to be released. The first round of displays using the Adaptive Sync variable refresh rate technology arrived with an ineffective or otherwise disabled overdrive feature, resulting in less than optimal pixel response times and overall visual quality, especially when operating in variable refresh rate modes. Meanwhile, G-Sync users had properly functioning overdrive, as well as a recently introduced 1440P IPS panel from Acer. The FreeSync camp was overdue for an IPS 1440P display superior to that first round of releases, hopefully with those overdrive issues corrected. Well, it appears that ASUS, the makers of the ROG Swift, have just rectified that situation with a panel we can finally recommend to AMD users:

DSC02594.jpg

Before we get into the full review, here is a sampling of our recent display reviews from both sides of the camp:

  • ASUS PG278Q 27in TN 1440P 144Hz G-Sync
  • Acer XB270H 27in TN 1080P 144Hz G-Sync
  • Acer XB280HK 28in TN 4K 60Hz G-Sync
  • Acer XB270HU 27in IPS 1440P 144Hz G-Sync
  • LG 34UM67 34in IPS 25x18 21:9 48-75Hz FreeSync
  • BenQ XL2730Z 27in TN 1440P 40-144Hz FreeSync
  • Acer XG270HU 27in TN 1440P 40-144Hz FreeSync
  • ASUS MG279Q 27in IPS 1440P 144Hz FreeSync(35-90Hz) < You are here

The reason for there being no minimum rating on the G-Sync panels above is explained in our article 'Dissecting G-Sync and FreeSync - How the Technologies Differ', though the short version is that G-Sync can effectively remain in VRR down to <1 FPS regardless of the hardware minimum of the display panel itself.

Continue reading as we take a look at this new ASUS MG279Q 27" 144Hz 1440P IPS FreeSync display!