NVIDIA's Tegra X1
NVIDIA seems to like being on a one-year cycle with its latest Tegra products. Many years ago we were introduced to the Tegra 2, the year after that the Tegra 3, and the year after that the Tegra 4. Well, NVIDIA did spice up its naming scheme to get away from the numbers (not to mention the potential stigma of how many of those products actually made an impact in the industry). Last year's entry was the Tegra K1, based on the Kepler graphics technology. These products were interesting due to the use of the very latest, cutting-edge graphics technology in a mobile/low-power format. The 64-bit Tegra K1 variant used two “Denver” cores that were actually designed by NVIDIA.
While technically interesting, the Tegra K1 series has made about the same impact as the previous versions. The Nexus 9 was the biggest win for NVIDIA with these parts, and we have heard of a smattering of automotive companies using the Tegra K1 in those applications. NVIDIA uses the Tegra K1 in its latest Shield tablet, but does not typically release data on the number of products sold. The Tegra K1 looks to be the most successful product since the original Tegra 2, but the question of how well these chips actually sold looms over the entire brand.
So why the history lesson? Well, we have to see where NVIDIA has been to get a good idea of where they are heading next. Today, NVIDIA is introducing the latest Tegra product, and it is going in a slightly different direction than what many had expected.
The reference board with 4 GB of LPDDR4.
Introduction and Features
be quiet! claims to be Germany’s number one brand in PC power supplies and they are continuing to expand sales into North American markets. In this review we will be taking a detailed look at the new be quiet! Straight Power 10 800W power supply with Cable Management. There are four power supplies in the Straight Power 10 CM Series, which include 500W, 600W, 700W, and 800W models.
be quiet! designed the Straight Power 10 CM Series to provide high efficiency with minimal noise for systems that demand whisper-quiet operation without compromising on power quality. In addition to the Straight Power 10 Series, be quiet! offers a full range of power supplies in ATX, SFX, and TFX form factors.
(Courtesy of be quiet!)
All of the new Straight Power 10 Cable Management Series power supplies are semi-modular (all cables are modular except for the fixed 24-pin ATX cable). Along with 80 Plus Gold certified high efficiency, the Straight Power 10 800W modular power supply has been designed for quiet operation. It uses be quiet!’s latest SilentWings3 135mm fan for virtually silent operation (at low to mid power levels). The fan speed starts out very slow and remains slow and quiet through mid power levels.
be quiet! Straight Power 10 800W CM PSU Key Features:
• 800W continuous DC output (ATX12V v2.4, EPS 2.92 compliant)
• Virtually inaudible SilentWings3 135mm cooling fan
• 80 PLUS Gold certified efficiency (up to 93%)
• Premium 105°C rated parts enhance stability and reliability
• Powerful GPU support with four PCI-E connectors
• User-friendly cable management reduces clutter and improves airflow
• NVIDIA SLI Ready and AMD CrossFire X certified
• ErP 2014 ready and meets Energy Star 6.0 guidelines
• Zero load design supports Intel’s Deep Power Down C6 & C7 modes
• Fully Intel Haswell compatible
• Active Power Factor correction (0.99) with Universal AC input
• German product conception, design and quality control
• Safety Protections : OCP, OVP, UVP, SCP, OTP, and OPP
• 5-Year warranty
• MSRP for the Straight Power 10 800W CM PSU: $169.00 USD
Big Power, Small Size
Though the mindset that a small PC is a slow PC is fading, there are still quite a few readers out there who believe the size of your components indicates how well they perform. That couldn't be further from the truth, and this week we decided to build a small, but not tiny, PC to showcase that small can be beautiful too!
Below you will find a complete list of parts and components used in our build - but let me say right off the bat, to help alleviate as much vitriol in the comments as possible, there are quite a few ways you could build this system to get a lower price, higher performance, a quieter design, etc. Our selections were based on a balance of price and performance, with a nod towards expansion in a few cases.
Take a look:
MicroATX Gaming Build

| Component | Part and Price |
|---|---|
| Processor | Intel Core i7-4790K - $334 |
| CPU Cooler | Corsair Hydro Series H80i - $87 |
| Motherboard | Gigabyte Z97MX-Gaming 5 - $127 |
| Memory | G.Skill Ripjaws X 8GB DDR3-2133 - $88 |
| Graphics Card | EVGA GeForce GTX 970 FTW - $399 |
| Storage | Samsung 250GB 850 EVO - $139 |
| Storage | Western Digital 2TB Green - $79 |
| Case | Corsair Carbide Series Air 240 - $89 |
| Power Supply | Seasonic Platinum 860 watt PSU - $174 |
| OS | Windows 8.1 x64 - $92 |
| Total Price | $1602 - Amazon Full Cart |
The starting point for this system is the Intel Core i7-4790K, the top-end Haswell processor for the Z97 chipset. In fact, the Core i7-4790K is a Devil's Canyon part, created by Intel to appease enthusiasts looking for a highly clocked, overclockable quad-core part. This CPU will only lag behind the likes of the Haswell-E LGA2011 processors, but at just $340 or so it is significantly less expensive. Cooling the 4790K is Corsair's Hydro Series H80i double-thickness self-contained water cooler.
For the motherboard I selected the Gigabyte Z97MX-Gaming 5, a MicroATX motherboard that combines performance and features in a mATX form factor, perfect for our build. This board includes support for SLI and CrossFire, has audio OP-AMP support, USB ports dedicated for DACs, M.2 storage support, Killer networking and more.
NVIDIA's G-Sync technology and the monitors that integrate it continue to be one of the hottest discussion topics surrounding PC technology and PC gaming. We at PC Perspective have dived into the world of variable refresh rate displays in great detail, discussing the technological reasons for its existence, talking with co-creator Tom Petersen in studio, doing the first triple-panel Surround G-Sync testing, and reviewing several different G-Sync monitors available on the market. We were even the first to find the reason behind the reported flickering at 0 FPS on G-Sync monitors.
A lot has happened in the world of displays in the year or more since NVIDIA first announced G-Sync technology, including a proliferation of low cost 4K panels as well as discussion of FreeSync, AMD's standards-based alternative to G-Sync. We are still waiting for our first hands-on time (other than a static demo) with monitors supporting FreeSync / Adaptive-Sync, and it is quite likely that will occur at CES this January. If it doesn't, AMD is going to have some serious explaining to do...
But today we are looking at the new Acer XB270H, a 1920x1080 27-in monitor with G-Sync support and a 144 Hz refresh rate; a unique combination. In fact, there is no other 27-in 144 Hz 1080p monitor on the market that we are aware of after a quick search of Newegg.com and Amazon.com. But does this monitor offer the same kind of experience as the ASUS ROG Swift PG278Q or even the Acer XB280HK 4K G-Sync panels?
Drobo is frequently referred to as ‘the Apple of external storage products’. They got this name because their products go for the simplest possible out-of-the-box experience. Despite that simplicity, the BeyondRAID concept these units employ remains extremely robust and highly resistant to data loss, even in the most extreme cases of multiple drive failures. I reviewed the DroboPro 8-bay unit over 5 years ago and was so impressed by it that I continue to use one to this day (and it has never lost data, despite occasional hard drive failures).
Over those past 5 years since our review of the DroboPro, Drobo (then known as Data Robotics) has also had a bit of an Apple story. Their original CEO started the company but was ousted by the board in late 2009. He then started Connected Data in 2011, quickly growing to the point where they merged with Drobo in 2013. This was not just a merger of companies, it was a merger of their respective products. The original Transporter was only a single drive unit, where Drobo’s tech supercharged that personal cloud capability to scale all the way up to corporate environments.
Many would say that during the period their original CEO was absent, Drobo's products turned more towards profitability, perhaps too soon for the company, as the products released in that span were less than stellar. We actually got a few of those Drobos in for review, but their performance was so inconsistent that we spent more time trying to figure out what was causing the issues than completing a review we could stand behind. With their founder back in the CEO chair, Drobo's path turned back to its roots - making a good, fast, and low cost product for their customers. This was what they wanted to accomplish back in 2009, but in many ways the available tech was not up to speed yet. USB 2.0 was the fastest widely available standard, aside from iSCSI over Gigabit (but that was pricey to implement and appeared only in the DroboPro). Nowadays things are very different. USB 3.0 controllers are vastly more compatible and faster than they used to be, as are SATA controller hardware and ARM microcontrollers. These developments would ultimately enable Drobo to introduce what they wanted to in the first place:
This is the third generation 4-Bay Drobo. The 4-Bay model is what started it all for them, but was a bit underpowered and limited to USB 2.0 speeds. The second gen unit launched mid 2008, adding FireWire as a faster connection option, but it was still slower than most would have liked given its $500 price tag. This third generation unit promises to change all of that.
USB is once again the only connectivity option, but this time it’s USB 3.0. There have previously been other Drobos with this as an option (Drobo S, S gen 2, 5D, Mini), but many of those units saw compatibility issues with some USB 3.0 host controllers - frustrating incompatibilities we experienced first hand. Drobo is putting that behind them with a revised chipset, and today we will put it all to the test.
Introduction, Specs, and First Impressions
BitFenix has been making enclosures for the PC market since 2010 (with the massive Colossus E-ATX case), and came to prominence a couple of years later with the introduction of the Prodigy enclosure. While the company has expanded to produce power supplies and peripherals they are still primarily a case manufacturer, as evidenced by the now 31 different models on their product page. Not content to iterate on their existing designs, BitFenix has consistently introduced new chassis ideas for different form-factors and needs.
We reviewed the Colossus Micro-ATX case back in March, and it is again an enclosure built for the venerable micro-ATX form-factor that we’re looking at here. Quite the opposite of the Colossus Micro-ATX's squat design, the Pandora is smooth and very slim.
In the world of computer cases there are many variations, but they are mostly boxes with splashes of style and the occasional window. Companies like In Win are at the opposite end of the spectrum, but the design choices for a case with commitment to artistic intent often entail a considerable price tag, and In Win consistently prices itself out of the mainstream market. So what about the middle ground? Enter the BitFenix Pandora. It boasts eye-catching looks, a slim design that seems even more so given the curved panels, and even has a color LCD screen that can be programmed with the image file of your choice!
The Pandora features a programmable color LCD display, to which I affixed this incredible logo
I don’t want to dissolve into meaningless superlatives, but the Pandora is a striking design. When it was shown at Computex earlier in 2014 it was listed as a mini-ITX enclosure, and while it definitely supports mini-ITX motherboards, it is the final product’s micro-ATX support that we focus on in this review. And while it would have been large as a mini-ITX enclosure, the Pandora is fairly small as a micro-ATX case, most notably due to that slim profile. This comes at a price, as there won’t be as much room for storage with such a narrow width (and those looking for any optical drive support must look elsewhere). And speaking of price, while the "core" version of the case starts at around $110, this version with the programmable display is currently selling for just under $160. Steep, but not outrageous either.
Meet the M320
Logitech is a brand synonymous with mice, joysticks, and other peripherals, having provided handy ways to interact with your computer for over 20 years. Anyone who has used a computer for any amount of time knows Logitech and has used a variety of their products. Their peripheral lineup has come a long way from its beginnings, with washable keyboards, webcams, and mice with over two dozen programmable buttons.
In this case we are looking at the M320 Wireless Mouse with three buttons and scroll wheel, a rubberized grip shaped for the right hand and an offset optical sensor with 1000 dpi resolution.
The Logitech M320 comes in a user-friendly clamshell package with a cut-out flap on the back, which is actually effective in opening the packaging without the need for a utility knife or a couple of stitches on your hand. Perhaps even more impressive is the fact that it ships with a battery included; not the rechargeable kind, but certainly a nice touch for those of us who remember receiving toys that were unusable until someone made a trip to the store to pick up the required mix of AAA's, D's, or 9V's. The documentation claims the battery will last for two years, and while there was obviously no way to put that to the test, the automatic sleep mode and physical power switch will ensure that your battery life will not be inconveniently short.
Introduction and Technical Specifications
Courtesy of GIGABYTE
The X99 Gaming G1 is GIGABYTE's flagship product in their gaming line of Intel X99 chipset-based motherboards. The board supports all Intel LGA2011-3 processors paired with DDR4 memory in up to a quad-channel configuration. The X99 Gaming G1 prominently features GIGABYTE's new Gaming-line branding, adding sleek looks to its feature-packed design. At an MSRP of $349.99, the board comes at a premium price to match its premium status.
The X99 Gaming G1 board was designed to take any abuse thrown its way, packing an 8-phase digital power system. GIGABYTE designed the board's power delivery system using top-rated components, including International Rectifier Gen 4 digital PWM controllers and Gen 3 PowIRstage ICs, Cooper Bussmann server-grade chokes, and long-life Durable Black solid capacitors. For the integrated sound solution, GIGABYTE paired the X99 Gaming G1 board with the Creative Sound Core3D™ quad-core audio processor, high-end audio capacitors, and a removable OP-AMP for a superior and customizable integrated audio experience.
Introduction and Specifications
Several weeks ago, during an episode of the PC Perspective Podcast, we talked about a new all-in-one machine from MSI with a focus on gaming. Featuring a quad-core Intel Haswell processor and a GeForce GTX 980M GPU, the MSI AG270 2QE takes the best available hardware for mobile gaming and stuffs it into a machine with an integrated 1080p touch screen. The result is likely the most potent gaming AIO you will find available; it should be more than capable of tackling modern games at the integrated panel's 1920x1080 resolution.
A gaming all-in-one is an interesting idea - a cross between the typical gaming desktop and a gaming laptop, an AIO splits the difference in a couple of interesting ways. It's more portable than a desktop and monitor combination for sure, but definitely heavier and bulkier than MSI's own GT72 for example. The AG270 offers a much larger screen (at 1080p) than any gaming notebook on its own, which improves the overall gaming experience without the need for additional hardware. While not ideal, it is totally feasible to take the AG270 with you to a neighbor's house for some LAN party action.
So what do you get with the MSI AG270 2QE, and more specifically, with the 037US kit we are reviewing today? Let's find out.
Introduction and Internals
We've seen USB 3.0 in devices for a few years now, but it has only more recently started taking off since controllers, drivers, and Operating Systems have incorporated support for the USB Attached SCSI Protocol. UASP takes care of one of the big disadvantages seen when linking high speed storage devices. USB adds a relatively long and multi-step path for each and every transaction, and the initial spec did not allow for any sort of parallel queuing. The 'Bulk-Only Transport' method was actually carried forward all the way from USB 1.0, and it simply didn't scale well for very low latency devices. The end result was that a USB 3.0 connected SSD performed at a fraction of its capability. UASP fixes that by effectively layering the SCSI protocol over the USB 3.0 link. Perhaps its biggest contributor to the speed boost is SCSI's ability to queue commands. We saw big speed improvements with the Corsair Flash Voyager GTX and other newer UASP enabled flash drives, but it's time we look at some ways to link external SATA devices using this faster protocol. Our first piece will focus on a product from Inateck - their FE2005 2.5" SATA enclosure:
This is a very simple enclosure, with a sliding design and a flip open door at the front.
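The effect of command queuing can be sketched with a toy latency model; the timings and queue depth below are illustrative assumptions, not measurements of this enclosure:

```python
# Toy model (not a benchmark): why SCSI command queuing under UASP beats the
# one-at-a-time Bulk-Only Transport path for a low-latency device.
# All numbers below are illustrative assumptions.
LINK_LATENCY_MS = 0.1   # assumed USB round-trip overhead per transaction
IO_TIME_MS = 0.05       # assumed device-side time per small read (fast SSD)
N_IOS = 1000            # number of requests in the workload
QUEUE_DEPTH = 32        # commands UASP can keep in flight

# BOT: strictly serialized, so every IO pays the full link round trip.
bot_ms = N_IOS * (LINK_LATENCY_MS + IO_TIME_MS)

# UASP: link latency overlaps across queued commands; the device stays busy.
uasp_ms = (N_IOS / QUEUE_DEPTH) * LINK_LATENCY_MS + N_IOS * IO_TIME_MS

print(f"BOT:  {bot_ms:.1f} ms total")
print(f"UASP: {uasp_ms:.1f} ms total")
```

With these assumptions the serialized path takes 150 ms versus roughly 53 ms with queuing; the absolute numbers are fiction, but the shape of the gap matches the speed improvements we saw with UASP-enabled flash drives.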
Introduction and Features
SilverStone has a long-standing reputation for providing a full line of high quality enclosures, power supplies, cooling components, and accessories for PC enthusiasts. Today we are going to mix it up a bit and focus our attention on smaller rather than larger. Not everyone needs or wants a 1,000W plus PC power supply. And if you have a small form factor case or are struggling to find room inside a cramped ATX enclosure, SilverStone’s new SX600-G SFX power supply may just be the solution you are looking for.
The new SX600-G SFX power supply was designed for small form factor cases but comes with an ATX adapter plate so it can be used in a standard ATX enclosure as well. In addition to its small size, the SX600-G features high efficiency (80 Plus Gold certified), all modular flat ribbon-style cables, and provides up to 600W of continuous DC output; pretty impressive for such a small package. Also new is the ability to operate in semi-fanless mode (cooling fan turns off at low power).
The SX600-G SFX 600W PSU next to a standard ATX 600W PSU
The last time we looked at a SilverStone SFX power supply was in 2012 when we reviewed the updated ST45SF-G, which was rated at 450W. SFX power supplies continue to occupy a niche market and address a slightly different set of needs than the standard ATX units we typically use and review at PCPerspective.
Here is what SilverStone has to say about their new SX600-G SFX PSU: “After releasing the breakthrough SFX power supply in 2012 with the ST45SF-G, SilverStone has pushed the technical envelope even further with yet another industry defining design in the SX600-G. This small form factor PSU has the exact same dimensions as its predecessor but its power density has increased from 567W per liter (in the ST45SF-G) to 756W per liter. The result is a standard sized SFX PSU with an incredible 600W of continuous power, a level that is capable of supporting any single graphics card system with ease.
Besides the power increase, the SX600-G comes standard with flexible, flat modular cables similar to those in the PP05-E short cable kit for vastly improved cable management in smaller cases. It also has added semi-fanless capability that was first introduced to SFX PSUs by the ST30SF so its quiet running fan can remain turned off during ideal low load or idle conditions for complete silence. As before, an ATX adapter bracket is included to enable users to install this PSU into any small or even larger cases that do not have SFX mounting holes. For the most ardent SFF enthusiasts, the SX600-G is truly a dream come true that combines the convenience of SFX size and all the top of the line features one can expect from high-end ATX PSUs into one product.”
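SilverStone's quoted W-per-liter figures check out against the standard SFX form factor dimensions (125 x 100 x 63.5 mm); the dimensions are our assumption from the SFX spec, since the press material quotes only the densities:

```python
# Sketch: reproducing SilverStone's quoted power densities from the SFX
# form factor volume. Dimensions are the standard SFX spec (assumed here,
# not stated in the quote above).
WIDTH_MM, DEPTH_MM, HEIGHT_MM = 125, 100, 63.5
volume_liters = (WIDTH_MM * DEPTH_MM * HEIGHT_MM) / 1_000_000  # mm^3 -> L

for name, watts in (("ST45SF-G", 450), ("SX600-G", 600)):
    print(f"{name}: {watts / volume_liters:.0f} W per liter")
```

Both results land on the quoted 567 and 756 W per liter, so the "power density" claim is straightforward arithmetic on the same chassis volume.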
SilverStone SX600-G SFX Power Supply Key Features:
• Supports standard SFX form factor (and ATX via included adapter)
• 600W Continuous DC output (up to 40°C)
• High efficiency with 80 Plus Gold certification
• 100% Modular cables, flat ribbon-style
• Intelligent semi-fanless operation
• Strict ±3% voltage regulation with low AC ripple and noise
• Class leading single +12V rail with 50A (600W)
• Two PCI-E 6+2 pin connectors
• Protections: OCP, OVP, OPP, and SCP
• Universal AC input and Active PFC
• MSRP $129.99 USD
Finding Your Clique
One of the difficulties with purchasing a mechanical keyboard is that they are quite expensive and vary greatly in subtle but important ways. First and foremost, we have the different types of keyswitches. These are the components responsible for how each key actuates, and thus varying them will lead to variations in how those keys react and feel.
Until recently, the Cherry MX line of switches was the basis of just about every major gaming mechanical keyboard, although we will discuss recent competitors later on. Its manufacturer, Cherry Corp / ZF Electronics, maintained a strict color code to denote the physical properties of each switch. These attributes range from the stiffness of the spring to the bumps and clicks felt (or heard) as the key travels toward its bottom and returns back up again.
| Actuation Force | Switches |
|---|---|
| 45 cN | Cherry MX Red, Cherry MX Brown, Cherry MX Blue, Cherry MX White (old B) |
| 55 cN | Cherry MX Clear |
| 60 cN | Cherry MX Black |
| 80 cN | Cherry MX Linear Grey (SB), Cherry MX Tactile Grey (SB), Cherry MX Green (SB), Cherry MX White (old A), Cherry MX White (2007+) |
| 90 cN | IBM Model M (not mechanical) |
| 105 cN | Cherry MX Click Grey (SB) |
| 150+ cN | Cherry MX Super Black |
(SB) Denotes switches with stronger springs that are primarily for, or only for, Spacebars. The Click Grey is intended for spacebars on Cherry MX White, Green, and Blue keyboards. The MX Green is intended for spacebars on Cherry MX Blue keyboards (but a few rare keyboards use these for regular keys). The MX Linear Grey is intended for spacebars on Cherry MX Black keyboards.
The four main Cherry MX switches are: Blue, Brown, Black, and Red. Other switches are available, such as the Cherry MX Green, Clear, three types of Grey, and so forth. You can separate (I believe) all of these switches into three categories: Linear, Tactile, and Clicky. From there, the only difference is the force curve, usually from the strength of the spring but also possibly from the slider features (you'll see what I mean in the diagrams below).
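The three-way split described above can be expressed as a simple lookup; the category assignments follow the common Cherry MX convention and are illustrative rather than exhaustive:

```python
# Sketch: the linear/tactile/clicky categorization of common Cherry MX
# switches. Assignments follow the widely used convention; treat this as
# an illustration of the three categories, not a complete catalog.
SWITCH_CATEGORY = {
    "Red": "linear",    "Black": "linear",  "Linear Grey": "linear",
    "Brown": "tactile", "Clear": "tactile", "Tactile Grey": "tactile",
    "Blue": "clicky",   "Green": "clicky",  "Click Grey": "clicky",
}

def category(switch: str) -> str:
    """Return the feel category for a Cherry MX switch color."""
    return SWITCH_CATEGORY[switch]

print(category("Brown"))  # tactile
```

Within each category, the remaining differences come down to the force curve, which is exactly why the table above groups switches by actuation force.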
Introduction and Design
MSI’s unapologetically large GT70 “Dominator Pro” series of machines knows its audience well: for every gripe about the notebooks’ hulking size, the community returns a snicker and a shrug, rarely valuing portability as highly as the critics hired to judge by it. These machines are built for power, first and foremost. Featherweight construction and manageable dimensions matter to those regularly tossing machines into their bags; MSI’s desktop replacements, by contrast, embrace the meaning of their classification: the flexibility of merely moving around the house with one’s gaming rig is reason enough to consider investing in one.
So its priorities are arguably well in line. But if you want to keep on dominating, regular updates are a necessity, too. And with the GT72 2QE, MSI takes it all up yet another notch: our review unit (GT72 2QE-208US) packs four SSDs in a RAID-0 array (as opposed to the GT70’s three), plus a completely redesigned case which manages to address some of our biggest complaints. Oh yeah, and an NVIDIA GTX 980M GPU with 8 GB GDDR5 RAM—the fastest mobile GPU ever. (You can find much more information and analysis on this GPU specifically in Ryan’s ever-comprehensive review.)
Of course, these state-of-the-art innards come at no small price: $2,999 as configured (around a $2,900 street price), or a few hundred bucks less with storage or RAM sacrifices—a reasonable trade-off considering the marginal benefits one gains from a quad-SSD array or 32 GB of RAM.
A Step Up for FM2+
I have been impressed by the Asus ROG boards for quite a few years now. I believe my first encounter was with the Crosshair IV Formula, followed by the CH IV Extreme with that crazy Lucidlogix controller. These were really outstanding boards at the time, even if one was completely overkill (and not terribly useful for multi-GPU via Lucidlogix). Build quality, component selections, stability, and top notch features have defined these ROG products. The Intel side is just as good, if not better, in that they have a wider selection of boards under the ROG flag.
Asus had a large hole in its offerings that was not addressed until recently: the AMD APUs on FM1, FM2, and FM2+ had no ROG member of their own. This was fixed in late summer of this year, when Asus released the interestingly named Crossblade Ranger FM2+ motherboard for the AMD APU market.
FM2+ motherboards are, as a rule, fairly inexpensive products. The FM2+ infrastructure does not have to support processors with the 220 watt TDPs that AM3+ does; instead, all of the FM2+ based products are 100 watts TDP and below. There are many examples of barebones FM2+ motherboards for $80 and less. We have a smattering of higher end motherboards from the likes of Gigabyte and MSI, but these hit max prices of $110 to $120 US. Asus is offering users in the FM2+ market something a little different from the rest. Users who purchase an AMD APU and the Crossblade Ranger will get much the same overall experience as top-end Intel-based ROG customers, but for a much lower price.
The bundle is functional, but not overly impressive.
If you’re a fan of digital video and music, you’ve likely heard the name “Plex” floating around. Plex (not to be confused with EVE Online’s in-game subscription commodity) is free media center software that lets users manage and stream a wide array of videos, audio files, and pictures to virtually any computer and a growing number of mobile devices and electronics. As a Plex user from the very beginning, I’ve seen the software change and evolve over the years into the versatile and powerful service it is today.
My goal with this article is twofold. First, as an avid Plex user, I’d like to introduce the software to users who have yet to hear about or try it. Second, for those already using or experimenting with Plex, I hope that I can provide some “best practices” when it comes to configuring your servers, managing your media, or just using the software in general.
Before we dive into the technical aspects of Plex, let’s look at a brief overview of the software’s history and the main components that comprise the Plex ecosystem today.
Although now widely supported on a range of platforms, Plex was born in early 2008 as an OS X fork of the Xbox Media Center project (XBMC). Lovingly named “OSXBMC” (get it?) by its creators, the software was initially a simple media player for Mac, with roughly the same capabilities as the XBMC project from which it was derived. (Note: XBMC changed its name to “Kodi” in August, although you’ll still find plenty of people referring to the software by its original name).
A few months into the project, the OSXBMC team decided to change the name to “Plex” and things really started to take off for the nascent media software. Unlike the XBMC/Kodi community, which focused its efforts primarily on the playback client, the Plex team decided to bifurcate the project with two distinct components: a dedicated media server and a dedicated playback client.
The dedicated media server made Plex unique among its media center peers. Once properly set up, it gave users with very little technical knowledge the ability to maintain a server that was capable of delivering their movies, TV shows, music, and pictures on demand throughout the house and, later, the world. We'll take a more detailed look at each of the Plex components next.
The “brains” behind the entire Plex ecosystem is Plex Media Server (PMS). This software, available for Windows, Linux, and OS X, manages your media database, metadata, and any necessary transcoding, which is one of its best features. Although far from error-free, the PMS encoding engine can convert virtually any video codec and container on the fly to a format requested by a client device. Want to play a high-bitrate 1080p MKV file with a 7.1 DTS-HD MA soundtrack on your Roku? No problem; Plex will seamlessly transcode that high quality source file to the proper format for Roku, as well as your iPad, or your Galaxy S5, and many other devices, all without having to store multiple copies of your video files.
There are smart people at AMD. A quick look at the company's products, including the APU lineup as well as its discrete GPUs, clearly indicates deep talent in engineering, design, marketing, and business. The company isn't perfect of course, and very few companies can claim to be, but AMD's strengths are there and easily discernible to those of us looking in from the outside.
Because AMD has smart people working hard to improve the company, they are also aware of its shortcomings. For many years now, the thorn of GPU software has been sticking in AMD's side, tarnishing the name of Radeon and the products it releases. Even though the Catalyst graphics driver has improved substantially year after year, the truth is that NVIDIA's driver team has been keeping ahead of AMD consistently in basically all regards: features, driver installation, driver stability, performance improvements over time.
If knowing is half the battle, acting on that knowledge is at least another 49%. AMD is hoping to address driver concerns now and into the future with the release of the Catalyst Omega driver. This driver sets itself apart from previous releases in several different ways, starting with a host of new features, some incremental performance improvements and a drastically amped up testing and validation process.
AMD considers this a "special edition" driver and plans to repeat the release on a yearly basis. That note in itself is an interesting point - is that often enough to really change the experience and perception of the Catalyst driver program going forward? Though AMD does include some specific numbers of tested cases for its validation of the Omega driver (441,000+ automated test runs, 11,000+ manual test runs), we don't have side by side data from NVIDIA to compare it to. If AMD is only doing a roundup of testing like this once a year, but NVIDIA does it more often, then AMD might soon find itself back in the same position it has been in.
UPDATE: There has been some confusion based on this story that I want to correct. AMD informed us that it is still planning on releasing other drivers throughout the year that will address performance updates for specific games and bug fixes for applications and titles released between today and the pending update for the next "special edition." AMD is NOT saying that they will only have a driver drop once a year.
But before we worry about what's going to happen in the future, let's look into what AMD has changed and added to the new Catalyst Omega driver released today.
Introduction, Specifications and Packaging
Mid last year, Samsung introduced the 840 EVO. This was their evolutionary step from the 840 Pro, which had launched a year prior. While the Pro was a performance MLC SSD, the EVO was TLC, and for most typical use it proved just as speedy. The reason for this was Samsung’s inclusion of a small SLC cache on each TLC die. Dubbed TurboWrite, this write-back cache gave the EVO the best write performance of any TLC-based SSD on the market. Samsung had also introduced a DRAM-cache-based RAPID mode, included with their Magician value-added software. The EVO was among the top-selling SSDs since its launch, despite a small hiccup quickly corrected by Samsung.
Fast forward to June of this year, when we saw the 850 Pro. Having tested the waters with 24-layer 3D VNAND, Samsung revised that design, increasing the layer count to 32 and reducing the die capacity from 128Gbit to 86Gbit. The smaller die capacity enables a 50% performance gain, stacked on top of the 100% write speed gain accomplished by the reduced cross talk of the 3D VNAND architecture. These changes did great things for the performance of the 850 Pro, especially in the lower capacities. While competing 120/128GB SSDs were typically limited to 150 MB/sec write speeds, the 128GB 850 Pro cruises along at over 3x that speed, nearly saturating the SATA interface. The performance might have been great, but so was the cost - 850 Pros have stuck around $0.70/GB since their launch, forcing budget-conscious upgraders to seek competing solutions. What we needed was an 850 EVO, and now I can happily say here it is:
As the 840 EVO was a pretty big deal, I believe the 850 EVO has an equal chance of success, so instead of going for a capacity roundup, this first piece will cover the 120GB and 500GB capacities. A surprising number of our readers run a pair of smaller capacity 840 EVOs in a RAID, so we will be testing a matched pair of 850 EVOs in RAID-0. To demonstrate the transparent performance boosting of RAPID, I’ll also run both capacities through our full test suite with RAPID mode enabled. There’s a lot of testing to get through, so let’s get cracking!
In the last few years NZXT has emerged as a popular choice for computer builds with stylish cases for a variety of needs. The newest member of the H series, the H440, promises quiet performance and offers a clean look by eliminating optical drive bays entirely from the design. While this might be a deal-breaker for some, the days of the ODD seem to be numbered as more enclosures are making the move away from the 5.25" bay.
Image credit: NZXT
But we aren't looking at just any H440 today, as NZXT has sent along a completely custom version designed in alliance with gaming accessory maker Razer to be "the ultimate gamer's chassis". (This case is currently available direct from NZXT's online store.) In this review we'll look at just what makes this H440 different, and test out a complete build while we're at it. Performance will be as big a metric as appearance here since the H440 is after all an enclosure designed for silence, with noise dampening an integral part of NZXT's construction of the case.
Green with Envy?
From the outset you'll notice the Razer branding extends beyond just special paint and trim, as custom lighting is installed right out of the box to give this incarnation of the H440 a little more gaming personality (though this lighting can be switched off, if desired). Not only do the front and side logos and power button light up green, but the bottom of the case features effects lighting to cast an eerie green glow on your desktop or floor.
Image credit: NZXT
Introduction and Technical Specifications
Courtesy of ASUS
The ASUS Maximus VII Impact motherboard is among ASUS' ROG (Republic of Gamers) board offerings in their Intel Z97 Express product line. The board builds on the strengths of its predecessor with a similar layout and add-in card design implementation. ASUS augmented the new version of the board with an updated chipset as well as additional support for the latest storage and audio technologies. The Maximus VII Impact carries a premium price of $239.99 for its small stature, but it comes packed with enough features and power to more than justify the cost.
Courtesy of ASUS
Courtesy of ASUS
Courtesy of ASUS
ASUS did not pull any punches in designing the Maximus VII Impact board, integrating a similar 8-phase digital power system as found on the Maximus VII Formula ATX board. The power system combines 60A-rated BlackWing chokes, NexFET MOSFETs with a 90% efficiency rating, and 10k Japanese-sourced Black Metallic capacitors onto an upright board to minimize the footprint of those components. Additionally, ASUS integrated their updated SupremeFX Impact II audio system for superior audio fidelity using the included SupremeFX Impact II add-in card.
We’ve been tracking NVIDIA’s G-Sync for quite a while now. The comments section on Ryan’s initial article erupted with questions, and many of those were answered in a follow-on interview with NVIDIA’s Tom Petersen. The idea was radical – do away with the traditional fixed refresh rate and only send a new frame to the display when it has just completed rendering by the GPU. There are many benefits here, but the short version is that you get the low-latency benefit of V-SYNC OFF gaming combined with the image quality (lack of tearing) that you would see if V-SYNC was ON. Despite the many benefits, there are some potential disadvantages that come from attempting to drive an LCD panel at varying periods of time, as opposed to the fixed intervals that have been the norm for over a decade.
As the first round of samples came to us for review, the current leader appeared to be the ASUS ROG Swift. A G-Sync 144 Hz display at 1440P was sure to appeal to gamers who wanted faster response than the 4K 60 Hz G-Sync alternative was capable of. Due to what seemed to be large consumer demand, it has taken some time to get these panels into the hands of consumers. As our Storage Editor, I decided it was time to upgrade my home system, placed a pre-order, and waited with anticipation of finally being able to shift from my trusty Dell 3007WFP-HC to a large panel that can handle >2x the FPS.
Fast forward to last week. My pair of ROG Swifts arrived, and some other folks I knew had also received theirs. Before I could set mine up and get some quality gaming time in, my bro FifthDread and his wife both noted a very obvious flicker on their Swifts within the first few minutes of hooking them up. They reported the flicker during game loading screens and mid-game during background content loading occurring in some RTS titles. Prior to hearing from them, the most I had seen were some conflicting and contradictory reports on various forums (not limited to the Swift, though that is the earliest panel and would therefore see the majority of early reports), but now we had something more solid to go on. That night I fired up my own Swift and immediately got to doing what I do best – trying to break things. We have reproduced the issue and intend to demonstrate it in a measurable way, mostly to put some actual data out there to go along with those trying to describe something that is borderline perceptible for mere fractions of a second.
First a bit of misnomer correction / foundation laying:
- The ‘Screen refresh rate’ option you see in Windows Display Properties is actually a carryover from the CRT days. In terms of an LCD, it is the maximum rate at which a frame is output to the display. It is not representative of the frequency at which the LCD panel itself is refreshed by the display logic.
- LCD panel pixels are periodically updated by a scan, typically from top to bottom. Newer / higher quality panels repeat this process at a rate higher than 60 Hz in order to reduce the ‘rolling shutter’ effect seen when panning scenes or windows across the screen.
- In order to engineer faster responding pixels, manufacturers must deal with the side effect of faster pixel decay between refreshes. This is balanced by increasing the frequency of scanning out to the panel.
- The effect we are going to cover here has nothing to do with motion blur, LightBoost, or backlight PWM, nor with LightBoost combined with G-Sync (not currently a thing; even though Blur Busters has theorized on how it could work, their method would not work with how G-Sync is actually implemented today).
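The decay-versus-scan-rate tradeoff above can be sketched numerically. Below is a toy model, not a characterization of any real panel: it assumes a TN pixel is driven to a target level at each scan and then relaxes exponentially toward white, with an illustrative drive level and time constant chosen purely for demonstration.

```python
import math

def pixel_brightness(t_since_refresh, target=0.5, tau=0.050):
    """Brightness of a pixel t seconds after its last refresh.

    The pixel is driven to `target` (0..1) at each scan and then
    bleeds toward white (1.0) with time constant `tau`. Both values
    here are illustrative assumptions, not measured panel data.
    """
    return 1.0 - (1.0 - target) * math.exp(-t_since_refresh / tau)

def mean_brightness(refresh_interval, target=0.5, tau=0.050, samples=1000):
    """Average perceived brightness over one refresh interval."""
    total = 0.0
    for i in range(samples):
        t = refresh_interval * i / samples
        total += pixel_brightness(t, target, tau)
    return total / samples

# At 144 Hz (~6.9 ms between scans) the pixel barely drifts before the
# next scan pulls it back down; at a ~33 ms interval it drifts much
# further toward white, so the average brightness rises.
bright_144 = mean_brightness(1 / 144)
bright_30 = mean_brightness(1 / 30)
```

Even this crude model shows why a panel redrawn at longer intervals appears slightly brighter on average, and why the pull-down at each scan becomes visible as flicker when the interval suddenly grows.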
With all of that out of the way, let’s tackle what folks out there may be seeing on their own variable refresh rate displays. Based on our testing so far, the flicker only presented at times when a game enters a 'stalled' state. These are periods where you would see a split-second freeze in the action, like during a background level load during game play in some titles. It also appears during some game level load screens, but as those are normally static scenes, they would have gone unnoticed on fixed refresh rate panels. Since we were absolutely able to see that something was happening, we wanted to be able to catch it in the act and measure it, so we rooted around the lab and put together some gear to do so. It’s not a perfect solution by any means, but we only needed to observe differences between the smooth gaming and the ‘stalled state’ where the flicker was readily observable. Once the solder dust settled, we fired up a game that we knew could instantaneously swing from a high FPS (144) to a stalled state (0 FPS) and back again. As it turns out, EVE Online does this exact thing while taking an in-game screen shot, so we used that for our initial testing. Here’s what the brightness of a small segment of the ROG Swift does during this very event:
Measured panel section brightness over time during a 'stall' event. Click to enlarge.
The relatively small ripple to the left and right of center demonstrates the panel output at just under 144 FPS. Panel redraw is in sync with the frames coming from the GPU at this rate. The center section, however, represents what takes place when the input from the GPU suddenly drops to zero. In the above case, the game briefly stalled, then resumed a few frames at 144, then stalled again for a much longer period of time. Completely stopping the panel refresh would result in all TN pixels bleeding towards white, so G-Sync has a built-in failsafe to prevent this by forcing a redraw every ~33 msec. What you are seeing are the pixels intermittently bleeding towards white and periodically being pulled back down to the appropriate brightness by a scan. The low latency panel used in the ROG Swift does this all of the time, but it is less noticeable at 144 FPS, as you can see on the left and right edges of the graph. An additional thing that’s happening here is an apparent rise in average brightness during the event. We are still researching the cause of this on our end, but this brightness increase certainly helps to draw attention to the flicker event, making it even more perceptible to those who might have not otherwise noticed it.
Some of you might be wondering why this same effect is not seen when a game drops to 30 FPS (or even lower) during the course of normal game play. While the original G-Sync upgrade kit implementation simply waited until 33 msec had passed before forcing an additional redraw, this introduced judder from 25-30 FPS. Based on our observations and testing, it appears that NVIDIA has corrected this in the retail G-Sync panels with an algorithm that intelligently re-scans at even multiples of the input frame rate in order to keep the redraw rate relatively high, thereby keeping flicker imperceptible – even at very low continuous frame rates.
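The multiple-scan behavior described above can be sketched in a few lines. To be clear, this is our speculation about the logic, not NVIDIA's actual code: the `max_hz` and `min_hz` values are assumptions standing in for the panel's maximum refresh rate and the ~30 Hz (~33 msec) failsafe floor.

```python
def effective_scan_rate(input_fps, max_hz=144.0, min_hz=30.0):
    """Pick a panel redraw rate for a given game frame rate.

    A hypothetical sketch of the retail G-Sync behavior described
    above: when frames arrive slower than the panel's minimum rate,
    redraw at an integer multiple of the input frame rate, so each
    new frame still lands on a scan boundary (avoiding judder) while
    the redraw rate stays high enough to hide pixel decay.
    """
    if input_fps <= 0:
        # Stalled input: fall back to the failsafe redraw (~33 msec).
        return min_hz
    if input_fps >= min_hz:
        # The panel can follow the GPU directly.
        return min(input_fps, max_hz)
    # Scale the scan rate by an integer multiple until it clears
    # the minimum, e.g. 25 FPS input -> 50 Hz redraw.
    multiplier = 2
    while input_fps * multiplier < min_hz:
        multiplier += 1
    return min(input_fps * multiplier, max_hz)
```

Under this scheme a continuous 25 FPS game redraws at 50 Hz and a 10 FPS slideshow at 30 Hz, which is consistent with the flicker only appearing on hard stalls to 0 FPS rather than during ordinary low-frame-rate play.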
A few final points before we go:
- This is not limited to the ROG Swift. All variable refresh panels we have tested (including 4K) see this effect to a greater or lesser degree than reported here. Again, this only occurs when games instantaneously drop to 0 FPS, and not when those games dip into low frame rates in a continuous fashion.
- The effect is less perceptible (both visually and with recorded data) at lower maximum refresh rate settings.
- The effect is not present at fixed refresh rates (G-Sync disabled or with non G-Sync panels).
This post was primarily meant as a status update and to serve as something for G-Sync users to point to when attempting to explain the flicker they are perceiving. We will continue researching, collecting data, and coordinating with NVIDIA on this issue, and will report back once we have more to discuss.
During the research and drafting of this piece, we reached out to and worked with NVIDIA to discuss this issue. Here is their statement:
"All LCD pixel values relax after refreshing. As a result, the brightness value that is set during the LCD’s scanline update slowly relaxes until the next refresh.
This means all LCDs have some slight variation in brightness. In this case, lower frequency refreshes will appear slightly brighter than high frequency refreshes by 1 – 2%.
When games are running normally (i.e., not waiting at a load screen, nor a screen capture) - users will never see this slight variation in brightness value. In the rare cases where frame rates can plummet to very low levels, there is a very slight brightness variation (barely perceptible to the human eye), which disappears when normal operation resumes."
So there you have it. It's basically down to the physics of how an LCD panel works at varying refresh rates. While I agree that it is a rare occurrence, there are some games that present this scenario more frequently (and noticeably) than others. If you've noticed this effect in some games more than others, let us know in the comments section below.
(Editor's Note: We are continuing to work with NVIDIA on this issue and hope to find a way to alleviate the flickering with either a hardware or software change in the future.)