Drobo is frequently referred to as ‘the Apple of external storage products’, a name earned because its products aim for the simplest possible out-of-the-box experience. Despite that simplicity, the BeyondRAID concept these units employ remains extremely robust and highly resistant to data loss, even in extreme cases of multiple drive failures. I reviewed the DroboPro 8-bay unit over 5 years ago and was so impressed by it that I continue to use one to this day (and it has never lost data, despite occasional hard drive failures).
In the 5 years since our review of the DroboPro, Drobo (then known as Data Robotics) has also had a bit of an Apple story. Their original CEO started the company but was ousted by the board in late 2009. He then started Connected Data in 2011, quickly growing it to the point where it merged with Drobo in 2013. This was not just a merger of companies; it was a merger of their respective products. The original Transporter was only a single-drive unit, and Drobo’s tech supercharged that personal cloud capability to scale all the way up to corporate environments.
Many would say that during the period when their original CEO was absent, Drobo’s products turned towards profitability, perhaps too soon for the company, as the products released during that time were less than stellar. We actually got a few of those Drobos in for review, but their performance was so inconsistent that we spent more time trying to figure out what was causing the issues than completing a review we could stand behind. With their founder back in the CEO chair, Drobo’s path turned back to its roots: making a good, fast, and low-cost product for their customers. This was what they wanted to accomplish back in 2009, but in many ways the available tech was not up to speed yet. USB 2.0 was the fastest widely available standard, aside from iSCSI over Gigabit Ethernet (which was pricey to implement and appeared in the DroboPro). Nowadays things are very different. USB 3.0 controllers are vastly more compatible and faster than they used to be, as are SATA controller hardware and ARM microcontrollers. These developments would ultimately enable Drobo to introduce what they wanted to in the first place:
This is the third-generation 4-Bay Drobo. The 4-Bay model is what started it all for them, but it was a bit underpowered and limited to USB 2.0 speeds. The second-gen unit launched in mid-2008, adding FireWire as a faster connection option, but it was still slower than most would have liked given its $500 price tag. This third-generation unit promises to change all of that.
USB is once again the only connectivity option, but this time it’s USB 3.0. Other Drobos have previously offered this connection (Drobo S, S gen 2, 5D, Mini), but many of those units saw compatibility issues with some USB 3.0 host controllers, frustrating incompatibilities that we experienced first hand. Drobo is putting that behind them with a revised chipset, and today we will put it all to the test.
Introduction, Specs, and First Impressions
BitFenix has been making enclosures for the PC market since 2010 (starting with the massive Colossus E-ATX case), and came to prominence a couple of years later with the introduction of the Prodigy enclosure. While the company has expanded into power supplies and peripherals, it is still primarily a case manufacturer, as evidenced by the 31 different models now on its product page. Not content to iterate on existing designs, BitFenix has consistently introduced new chassis ideas for different form factors and needs.
We reviewed the Colossus Micro-ATX case back in March, and here again we are looking at an enclosure built for the venerable micro-ATX form factor. Quite the opposite of the Colossus Micro-ATX's squat design, though, the Pandora is smooth and very slim.
In the world of computer cases there are many variations, but most are boxes with splashes of style and the occasional window. Companies like In Win sit at the opposite end of the spectrum, but a case with a real commitment to artistic intent often carries a considerable price tag, and In Win consistently prices itself out of the mainstream market. So what about the middle ground? Enter the BitFenix Pandora. It boasts eye-catching looks, a slim design that seems even slimmer thanks to the curved panels, and even a color LCD screen that can be programmed with the image file of your choice!
The Pandora features a programmable color LCD display, to which I affixed this incredible logo
I don’t want to dissolve into meaningless superlatives, but the Pandora is a striking design. When it was shown at Computex earlier in 2014 it was listed as a mini-ITX enclosure, and while it definitely supports mini-ITX motherboards, it is the final product’s micro-ATX support that we focus on in this review. And while it would have been large as a mini-ITX enclosure, the Pandora is fairly small as a micro-ATX case, most notably due to that slim profile. This comes at a price, as there won’t be as much room for storage with such a narrow width (and those looking for any optical drive support must look elsewhere). And speaking of price, while the "core" version of the case starts at around $110, this version with the programmable display is currently selling for just under $160. Steep, but not outrageous either.
Meet the M320
Logitech is a brand synonymous with mice, joysticks, and other peripherals, having provided handy ways to interact with your computer for over 20 years. Anyone who has used a computer for any amount of time knows Logitech and has used a variety of their products. Their peripheral lineup has come a long way from its beginnings, now including washable keyboards, webcams, and mice with over two dozen programmable buttons.
In this case we are looking at the M320 Wireless Mouse, with three buttons and a scroll wheel, a rubberized grip shaped for the right hand, and an offset optical sensor with 1000 dpi resolution.
The Logitech M320 comes in a user-friendly clamshell package with a cut-out flap on the back, which is actually effective at opening the packaging without the need for a utility knife (or a couple of stitches on your hand). Perhaps even more impressive is the fact that it ships with a battery included; not the rechargeable kind, but certainly a nice touch for those of us who remember receiving toys that were unusable until someone made a trip to the store to pick up the required mix of AAA's, D's, or 9V's. The documentation claims the battery will last for two years, and while there was obviously no way to put that to the test, the automatic sleep mode and physical power switch should ensure that your battery life will not be inconveniently short.
Introduction and Technical Specifications
Courtesy of GIGABYTE
The X99 Gaming G1 is GIGABYTE's flagship product in their gaming line of Intel X99 chipset-based motherboards. The board supports all Intel LGA2011-v3 processors paired with DDR4 memory in up to a quad-channel configuration. The X99 Gaming G1 prominently features GIGABYTE's new Gaming-line branding, adding sleek looks to its feature-packed design. At an MSRP of $349.99, the board comes at a premium price to match its premium status.
The X99 Gaming G1 board was designed to take any abuse thrown its way, packing an 8-phase digital power system. GIGABYTE designed the board's power delivery using top-rated components, including International Rectifier Gen 4 digital PWM controllers and Gen 3 PowIRstage controllers, Cooper Bussmann server-level chokes, and long-life Durable Black solid capacitors. For the integrated sound solution, GIGABYTE paired the X99 Gaming G1 with the Creative Sound Core3D™ quad-core audio processor, high-end audio capacitors, and a removable OP-AMP for a superior and customizable integrated audio experience.
Introduction and Specifications
Several weeks ago, during an episode of the PC Perspective Podcast, we talked about a new all-in-one machine from MSI with a focus on gaming. Featuring a quad-core Intel Haswell processor and a GeForce GTX 980M GPU, the MSI AG270 2QE takes the best available hardware for mobile gaming and stuffs it into a machine with an integrated 1080p touch screen. The result is likely the most potent gaming AIO you will find available; it should be more than capable of tackling modern games at the integrated panel's 1920x1080 resolution.
A gaming all-in-one is an interesting idea. A cross between the typical gaming desktop and a gaming laptop, an AIO splits the difference in a couple of interesting ways. It is more portable than a desktop and monitor combination for sure, but definitely heavier and bulkier than MSI's own GT72, for example. The AG270 offers a much larger screen (at 1080p) than any gaming notebook, which improves the overall gaming experience without the need for additional hardware. While not ideal, it is totally feasible to take the AG270 with you to a neighbor's house for some LAN party action.
So what do you get with the MSI AG270 2QE, and more specifically, with the 037US kit we are reviewing today? Let's find out.
Introduction and Internals
We've seen USB 3.0 in devices for a few years now, but it has only recently started taking off, as controllers, drivers, and operating systems have incorporated support for the USB Attached SCSI Protocol (UASP). UASP takes care of one of the big disadvantages of linking high-speed storage devices over USB: the bus adds a relatively long, multi-step path for each and every transaction, and the initial spec did not allow for any sort of parallel queuing. The 'Bulk-Only Transport' method was carried forward all the way from USB 1.0, and it simply didn't scale well for very low latency devices. The end result was that a USB 3.0 connected SSD performed at a fraction of its capability. UASP fixes that by effectively layering the SCSI protocol over the USB 3.0 link; perhaps its biggest contribution to the speed boost is SCSI's ability to queue commands. We saw big speed improvements with the Corsair Flash Voyager GTX and other newer UASP-enabled flash drives, but it's time we look at some ways to link external SATA devices using this faster protocol. Our first piece will focus on a product from Inateck - their FE2005 2.5" SATA enclosure:
This is a very simple enclosure, with a sliding design and a flip open door at the front.
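As an aside for Linux users who want to verify that an enclosure like this is actually negotiating UASP rather than falling back to Bulk-Only Transport: UASP devices bind to the 'uas' kernel driver, while BOT devices use 'usb-storage'. Below is a minimal sketch of checking this via sysfs (our illustration, assuming a typical modern Linux layout; nothing Inateck-specific):

```python
import os

def usb_storage_drivers():
    """List USB interfaces bound to 'uas' (UASP) vs 'usb-storage' (BOT)."""
    results = {}
    for driver in ("uas", "usb-storage"):
        path = "/sys/bus/usb/drivers/" + driver
        if not os.path.isdir(path):
            continue  # driver module not loaded on this system
        # Bound interfaces appear as entries named like '2-3:1.0';
        # skip the driver's control files (bind, unbind, module, ...).
        results[driver] = [entry for entry in os.listdir(path) if ":" in entry]
    return results

if __name__ == "__main__":
    for driver, interfaces in usb_storage_drivers().items():
        print(driver, "->", interfaces or "no devices bound")
```

If the enclosure shows up under 'usb-storage' instead of 'uas', you are getting BOT speeds regardless of what the marketing claims.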
Introduction and Features
SilverStone has a long-standing reputation for providing a full line of high quality enclosures, power supplies, cooling components, and accessories for PC enthusiasts. Today we are going to mix it up a bit and focus our attention on smaller rather than larger. Not everyone needs or wants a 1,000W plus PC power supply. And if you have a small form factor case or are struggling to find room inside a cramped ATX enclosure, SilverStone’s new SX600-G SFX power supply may just be the solution you are looking for.
The new SX600-G SFX power supply was designed for small form factor cases but comes with an ATX adapter plate so it can be used in a standard ATX enclosure as well. In addition to its small size, the SX600-G features high efficiency (80 Plus Gold certified), all modular flat ribbon-style cables, and provides up to 600W of continuous DC output; pretty impressive for such a small package. Also new is the ability to operate in semi-fanless mode (cooling fan turns off at low power).
SX600-G SFX 600W PSU next to a standard ATX 600W PSU
The last time we looked at a SilverStone SFX power supply was in 2012, when we reviewed the updated ST45SF-G, which was rated at 450W. SFX power supplies continue to occupy a niche market and address a slightly different set of needs than the standard ATX units we typically use and review at PC Perspective.
Here is what SilverStone has to say about their new SX600-G SFX PSU: “After releasing the breakthrough SFX power supply in 2012 with the ST45SF-G, SilverStone has pushed the technical envelope even further with yet another industry defining design in the SX600-G. This small form factor PSU has the exact same dimensions as its predecessor but its power density has increased from 567W per liter (in the ST45SF-G) to 756W per liter. The result is a standard sized SFX PSU with an incredible 600W of continuous power, a level that is capable of supporting any single graphics card system with ease.
Besides the power increase, the SX600-G comes standard with flexible, flat modular cables similar to those in the PP05-E short cable kit for vastly improved cable management in smaller cases. It also has added semi-fanless capability that was first introduced to SFX PSUs by the ST30SF so its quiet running fan can remain turned off during ideal low load or idle conditions for complete silence. As before, an ATX adapter bracket is included to enable users to install this PSU into any small or even larger cases that do not have SFX mounting holes. For the most ardent SFF enthusiasts, the SX600-G is truly a dream come true that combines the convenience of SFX size and all the top of the line features one can expect from high-end ATX PSUs into one product.”
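As a sanity check on those power density claims: the SFX form factor measures 125 x 63.5 x 100 mm, or about 0.794 liters, and SilverStone's quoted figures fall right out of the math (our arithmetic below, not theirs):

```python
# Standard SFX form factor dimensions: 125 x 63.5 x 100 mm
volume_liters = (125 * 63.5 * 100) / 1e6  # mm^3 -> liters = 0.79375 L

for model, watts in (("ST45SF-G", 450), ("SX600-G", 600)):
    print(model, round(watts / volume_liters), "W per liter")
# ST45SF-G 567 W per liter
# SX600-G 756 W per liter
```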
SilverStone SX600-G SFX Power Supply Key Features:
• Supports standard SFX form factor (and ATX via included adapter)
• 600W Continuous DC output (up to 40°C)
• High efficiency with 80 Plus Gold certification
• 100% Modular cables, flat ribbon-style
• Intelligent semi-fanless operation
• Strict ±3% voltage regulation with low AC ripple and noise
• Class leading single +12V rail with 50A (600W)
• Two PCI-E 6+2 pin connectors
• Protections: OCP, OVP, OPP, and SCP
• Universal AC input and Active PFC
• MSRP $129.99 USD
Finding Your Clique
One of the difficulties with purchasing a mechanical keyboard is that they are quite expensive and vary greatly in subtle but important ways. First and foremost, we have the different types of keyswitches. These are the components responsible for how each button behaves, and thus varying them will lead to variations in how those buttons react and feel.
Until recently, the Cherry MX line of switches was the basis of just about every major gaming mechanical keyboard, although we will discuss recent competitors later on. Its manufacturer, Cherry Corp / ZF Electronics, maintains a strict color code to denote the physical properties of each switch. These attributes range from the stiffness of the spring to the bumps and clicks felt (or heard) as the key travels toward its bottom and returns back up again.
| Actuation Force | Linear | Tactile | Clicky |
| --- | --- | --- | --- |
| 45 cN | Cherry MX Red | Cherry MX Brown | Cherry MX Blue, Cherry MX White (old B) |
| 55 cN | | Cherry MX Clear | |
| 60 cN | Cherry MX Black | | |
| 80 cN | Cherry MX Linear Grey (SB) | Cherry MX Tactile Grey (SB) | Cherry MX Green (SB), Cherry MX White (old A), Cherry MX White (2007+) |
| 90 cN | | | IBM Model M (not mechanical) |
| 105 cN | | | Cherry MX Click Grey (SB) |
| 150+ cN | Cherry MX Super Black | | |
(SB) Denotes switches with stronger springs that are primarily for, or only for, Spacebars. The Click Grey is intended for spacebars on Cherry MX White, Green, and Blue keyboards. The MX Green is intended for spacebars on Cherry MX Blue keyboards (but a few rare keyboards use these for regular keys). The MX Linear Grey is intended for spacebars on Cherry MX Black keyboards.
The four main Cherry MX switches are: Blue, Brown, Black, and Red. Other switches are available, such as the Cherry MX Green, Clear, three types of Grey, and so forth. You can separate (I believe) all of these switches into three categories: Linear, Tactile, and Clicky. From there, the only difference is the force curve, usually from the strength of the spring but also possibly from the slider features (you'll see what I mean in the diagrams below).
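For readers who think better in code, the chart above boils down to a simple lookup of feel and spring weight per switch. Here is a small sketch covering a handful of the common switches (forces taken from the chart above; the structure and helper function are our own illustration):

```python
# (feel, actuation force in cN) for some common Cherry MX switches from the chart
SWITCHES = {
    "MX Red":   ("linear", 45),
    "MX Brown": ("tactile", 45),
    "MX Blue":  ("clicky", 45),
    "MX Clear": ("tactile", 55),
    "MX Black": ("linear", 60),
    "MX Green": ("clicky", 80),  # spacebar-weight take on the Blue
}

def by_feel(category):
    """Return switches with the given feel, lightest spring first."""
    matches = [(name, force) for name, (feel, force) in SWITCHES.items()
               if feel == category]
    return sorted(matches, key=lambda pair: pair[1])

print(by_feel("clicky"))  # [('MX Blue', 45), ('MX Green', 80)]
```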
Introduction and Design
MSI’s unapologetically large GT70 “Dominator Pro” series of machines knows its audience well: for every gripe about the notebooks’ hulking sizes, a snicker and a shrug are returned by the community, who rarely value such items as portability as highly as the critics who are hired to judge based on them. These machines are built for power, first and foremost. While featherweight construction and manageable dimensions matter to those regularly tossing machines into their bags, by contrast, MSI’s desktop replacements recognize the meaning of their classification: the flexibility of merely moving around the house with one’s gaming rig is reason enough to consider investing in one.
So its priorities are arguably well in line. But if you want to keep on dominating, regular updates are a necessity, too. And with the GT72 2QE, MSI takes it all up yet another notch: our review unit (GT72 2QE-208US) packs four SSDs in a RAID-0 array (as opposed to the GT70’s three), plus a completely redesigned case which manages to address some of our biggest complaints. Oh yeah, and an NVIDIA GTX 980M GPU with 8 GB GDDR5 RAM—the fastest mobile GPU ever. (You can find much more information and analysis on this GPU specifically in Ryan’s ever-comprehensive review.)
Of course, these state-of-the-art innards come at no small price: $2,999 as configured (around a $2,900 street price), or a few hundred bucks less with storage or RAM sacrifices—a reasonable trade-off considering the marginal benefits one gains from a quad-SSD array or 32 GB of RAM.
A Step Up for FM2+
I have been impressed by the Asus ROG boards for quite a few years now. I believe my first encounter was with the Crosshair IV Formula, followed by the CH IV Extreme with that crazy LucidLogix controller. These were really outstanding boards at the time, even if one was completely overkill (and not terribly useful for multi-GPU via LucidLogix). Build quality, component selection, stability, and top-notch features have defined these ROG products. The Intel side is just as good, if not better, in that there is a wider selection of boards under the ROG flag.
Asus has had a fairly large hole in their offerings that was not addressed until recently: the latest AMD APUs on FM1, FM2, and FM2+ did not have their own ROG entry. This was fixed in late summer of this year, when Asus released the interestingly named Crossblade Ranger FM2+ motherboard for the AMD APU market.
FM2+ motherboards are, as a rule, fairly inexpensive products. The FM2+ infrastructure does not have to support processors with the 220 watt TDPs that AM3+ does; instead, all of the FM2+ based products are 100 watt TDP and below. There are many examples of barebones FM2+ motherboards at $80 and less. We have a smattering of higher-end motherboards from the likes of Gigabyte and MSI, but these top out at $110 to $120 US. Asus is offering users in the FM2+ market something a little different from the rest: users who purchase an AMD APU and the Crossblade Ranger will get much the same overall experience as top-end Intel-based ROG customers, but for a much lower price.
The bundle is functional, but not overly impressive.
If you’re a fan of digital video and music, you’ve likely heard the name “Plex” floating around. Plex (not to be confused with EVE Online’s in-game subscription commodity) is free media center software that lets users manage and stream a wide array of videos, audio files, and pictures to virtually any computer and a growing number of mobile devices and electronics. As a Plex user from the very beginning, I’ve seen the software change and evolve over the years into the versatile and powerful service it is today.
My goal with this article is twofold. First, as an avid Plex user, I’d like to introduce the software to users who have yet to hear about or try it. Second, for those already using or experimenting with Plex, I hope that I can provide some “best practices” when it comes to configuring your servers, managing your media, or just using the software in general.
Before we dive into the technical aspects of Plex, let’s look at a brief overview of the software’s history and the main components that comprise the Plex ecosystem today.
Although now widely supported on a range of platforms, Plex was born in early 2008 as an OS X fork of the Xbox Media Center project (XBMC). Lovingly named “OSXBMC” (get it?) by its creators, the software was initially a simple media player for Mac, with roughly the same capabilities as the XBMC project from which it was derived. (Note: XBMC changed its name to “Kodi” in August, although you’ll still find plenty of people referring to the software by its original name).
A few months into the project, the OSXBMC team decided to change the name to “Plex” and things really started to take off for the nascent media software. Unlike the XBMC/Kodi community, which focused its efforts primarily on the playback client, the Plex team decided to bifurcate the project into two distinct components: a dedicated media server and a dedicated playback client.
The dedicated media server made Plex unique among its media center peers. Once properly set up, it gave users with very little technical knowledge the ability to maintain a server that was capable of delivering their movies, TV shows, music, and pictures on demand throughout the house and, later, the world. We'll take a more detailed look at each of the Plex components next.
The “brains” behind the entire Plex ecosystem is Plex Media Server (PMS). This software, available for Windows, Linux, and OS X, manages your media database and metadata, and handles any necessary transcoding, which is one of its best features. Although far from error-free, the PMS encoding engine can convert virtually any video codec and container on the fly to a format requested by a client device. Want to play a high-bitrate 1080p MKV file with a 7.1 DTS-HD MA soundtrack on your Roku? No problem; Plex will seamlessly transcode that high quality source file to the proper format for your Roku, as well as your iPad, your Galaxy S5, and many other devices, all without having to store multiple copies of your video files.
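Conceptually, the server compares each source file against the capabilities a client reports, then either 'direct plays' the original or picks a transcode target. The sketch below is our simplified illustration of that decision, not Plex's actual code, and the client profile contents are made up:

```python
# Simplified direct-play vs. transcode decision, loosely modeled on what a media
# server like Plex does per stream. This profile is illustrative only.
ROKU_PROFILE = {
    "containers": ["mp4", "mkv"],
    "video": ["h264"],
    "audio": ["aac", "ac3"],
}

def plan_playback(source, profile):
    """Direct play if the client supports the source as-is, else pick a target."""
    if (source["container"] in profile["containers"]
            and source["video"] in profile["video"]
            and source["audio"] in profile["audio"]):
        return "direct play"
    # Otherwise transcode on the fly to the client's first-choice formats.
    return {"container": profile["containers"][0],
            "video": profile["video"][0],
            "audio": profile["audio"][0]}

bluray_rip = {"container": "mkv", "video": "h264", "audio": "dts-hd ma"}
print(plan_playback(bluray_rip, ROKU_PROFILE))  # unsupported audio -> transcode
```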
There are smart people at AMD. A quick look at the company's products, including the APU lineup as well as the discrete GPU fields, clearly indicates a deep bench of talent in engineering, design, marketing, and business. The company is not perfect of course, and very few companies can claim to be, but AMD's strengths are there and easily discernible to those of us looking in from the outside.
Because AMD has smart people working hard to improve the company, they are also aware of its shortcomings. For many years now, the thorn of GPU software has been sticking in AMD's side, tarnishing the name of Radeon and the products it releases. Even though the Catalyst graphics driver has improved substantially year after year, the truth is that NVIDIA's driver team has consistently kept ahead of AMD in basically all regards: features, driver installation, driver stability, and performance improvements over time.
If knowing is half the battle, acting on that knowledge is at least another 49%. AMD is hoping to address driver concerns now and into the future with the release of the Catalyst Omega driver. This driver sets itself apart from previous releases in several different ways, starting with a host of new features, some incremental performance improvements and a drastically amped up testing and validation process.
AMD considers this a "special edition" driver and plans to repeat it on a yearly basis. That in itself is an interesting point - is that often enough to really change the experience and perception of the Catalyst driver program going forward? Though AMD does include some specific numbers for its validation testing of the Omega driver (441,000+ automated test runs, 11,000+ manual test runs), we don't have side-by-side data from NVIDIA to compare it to. If AMD only does a testing roundup like this once a year, but NVIDIA does it more often, then AMD might soon find itself back in the same position it has been in.
UPDATE: There has been some confusion based on this story that I want to correct. AMD informed us that it is still planning on releasing other drivers throughout the year that will address performance updates for specific games and bug fixes for applications and titles released between today and the pending update for the next "special edition." AMD is NOT saying that they will only have a driver drop once a year.
But before we worry about what's going to happen in the future, let's look into what AMD has changed and added to the new Catalyst Omega driver released today.
Introduction, Specifications and Packaging
In the middle of last year, Samsung introduced the 840 EVO, their evolutionary step from the 840 Pro, which had launched a year prior. While the Pro was a performance MLC SSD, the EVO was TLC, and for most typical workloads it proved just as speedy. The reason for this was Samsung’s inclusion of a small SLC cache on each TLC die. Dubbed TurboWrite, this write-back cache gave the EVO the best write performance of any TLC-based SSD on the market. Samsung had also introduced a DRAM-cache-based RAPID mode, included with their Magician value-added software. The EVO has been among the top-selling SSDs since its launch, despite a small hiccup quickly corrected by Samsung.
Fast forward to June of this year, when we saw the 850 Pro. Having tested the waters with 24-layer 3D VNAND, Samsung revised the design, increasing the layer count to 32 and reducing the die capacity from 128Gbit to 86Gbit. The smaller die capacity enables a 50% performance gain, stacked on top of the 100% write speed gain accomplished by the reduced crosstalk of the 3D VNAND architecture. These changes did great things for the performance of the 850 Pro, especially at the lower capacities. While competing 120/128GB SSDs were typically limited to 150 MB/sec write speeds, the 128GB 850 Pro cruises along at over 3x that speed, nearly saturating the SATA interface. The performance might have been great, but so was the cost - 850 Pros have stuck around $0.70/GB since their launch, forcing budget-conscious upgraders to seek competing solutions. What we needed was an 850 EVO, and now I can happily say here it is:
As the 840 EVO was a pretty big deal, I believe the 850 EVO has an equal chance of success, so instead of going for a capacity roundup, this first piece will cover the 120GB and 500GB capacities. A surprising number of our readers run a pair of smaller-capacity 840 EVOs in RAID, so we will be testing a matched pair of 850 EVOs in RAID-0. To demonstrate the transparent performance boosting of RAPID, I’ll also run both capacities through our full test suite with RAPID mode enabled. There is a lot of testing to get through, so let’s get cracking!
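As a quick sanity check of that 50% figure from the 850 Pro discussion above: a fixed drive capacity split across smaller dies simply means more dies writing in parallel (our arithmetic, using Samsung's published die capacities):

```python
import math

capacity_gbit = 128 * 8                      # a 128GB SSD holds 1024 Gbit of flash
dies_128gbit = capacity_gbit / 128           # 8 dies of first-gen 128Gbit VNAND
dies_86gbit = math.ceil(capacity_gbit / 86)  # 12 dies of 32-layer 86Gbit VNAND

print(dies_86gbit / dies_128gbit)  # 1.5 -> 50% more dies writing in parallel
```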
In the last few years NZXT has emerged as a popular choice for computer builds with stylish cases for a variety of needs. The newest member of the H series, the H440, promises quiet performance and offers a clean look by eliminating optical drive bays entirely from the design. While this might be a deal-breaker for some, the days of the ODD seem to be numbered as more enclosures are making the move away from the 5.25" bay.
Image credit: NZXT
But we aren't looking at just any H440 today, as NZXT has sent along a completely custom version designed in alliance with gaming accessory maker Razer to be "the ultimate gamer's chassis". (This case is currently available direct from NZXT's online store.) In this review we'll look at just what makes this H440 different, and test out a complete build while we're at it. Performance will be as big a metric as appearance here since the H440 is after all an enclosure designed for silence, with noise dampening an integral part of NZXT's construction of the case.
Green with Envy?
From the outset you'll notice the Razer branding extends beyond just special paint and trim, as custom lighting is installed right out of the box to give this incarnation of the H440 a little more gaming personality (though this lighting can be switched off, if desired). Not only do the front and side logos and power button light up green, but the bottom of the case features effects lighting to cast an eerie green glow on your desktop or floor.
Image credit: NZXT
Introduction and Technical Specifications
Courtesy of ASUS
The ASUS Maximus VII Impact motherboard is among ASUS' ROG (Republic of Gamers) board offerings in their Intel Z97 Express product line. The board builds on the strengths of its predecessor with a similar layout and add-in card design implementation. ASUS augmented the new version of the board with an updated chipset as well as additional support for the latest hard drive and audio technologies. The Maximus VII Impact carries a premium price of $239.99 for its small stature, but comes packed with enough features and power to more than justify the cost.
ASUS did not pull any punches in designing the Maximus VII Impact board, integrating a similar 8-phase digital power system to that found on the Maximus VII Formula ATX board. The power system combines 60A-rated BlackWing chokes, NexFET MOSFETs with a 90% efficiency rating, and 10k Japanese-sourced Black Metallic capacitors on an upright board to minimize the footprint of those components. Additionally, ASUS integrated their updated SupremeFX Impact II audio system for superior audio fidelity, housed on the included add-in card.
We’ve been tracking NVIDIA’s G-Sync for quite a while now. The comments section on Ryan’s initial article erupted with questions, and many of those were answered in a follow-on interview with NVIDIA’s Tom Petersen. The idea was radical – do away with the traditional fixed refresh rate and only send a new frame to the display once the GPU has finished rendering it. There are many benefits here, but the short version is that you get the low-latency benefit of V-SYNC OFF gaming combined with the image quality (lack of tearing) that you would see with V-SYNC ON. Despite the many benefits, there are some potential disadvantages that come from attempting to drive an LCD panel at varying intervals, as opposed to the fixed intervals that have been the norm for over a decade.
As the first round of samples came to us for review, the early leader appeared to be the ASUS ROG Swift. A G-Sync 144 Hz display at 1440p was sure to appeal to gamers who wanted faster response than the 4K 60 Hz G-Sync alternative could offer. Due to what seemed to be large consumer demand, it took some time to get these panels into the hands of consumers. As our Storage Editor, I decided it was time to upgrade my home system, placed a pre-order, and waited in anticipation of finally shifting from my trusty Dell 3007WFP-HC to a large panel that can handle >2x the FPS.
Fast forward to last week. My pair of ROG Swifts arrived, and some other folks I knew had also received theirs. Before I could set mine up and get some quality gaming time in, my bro FifthDread and his wife both noted a very obvious flicker on their Swifts within the first few minutes of hooking them up. They reported the flicker during game loading screens and mid-game during background content loads in some RTS titles. Prior to hearing from them, the most I had seen were some conflicting and contradictory reports on various forums (not limited to the Swift, though that is the earliest panel and would therefore see the majority of early reports), but now we had something more solid to go on. That night I fired up my own Swift and immediately got to doing what I do best – trying to break things. We have reproduced the issue and intend to demonstrate it in a measurable way, mostly to put some actual data out there to go along with those trying to describe something that is borderline perceptible for mere fractions of a second.
First a bit of misnomer correction / foundation laying:
- The ‘Screen refresh rate’ option you see in Windows Display Properties is actually a carryover from the CRT days. In terms of an LCD, it is the maximum rate at which a frame is output to the display. It is not representative of the frequency at which the LCD panel itself is refreshed by the display logic.
- LCD panel pixels are periodically updated by a scan, typically from top to bottom. Newer / higher quality panels repeat this process at a rate higher than 60 Hz in order to reduce the ‘rolling shutter’ effect seen when panning scenes or windows across the screen.
- In order to engineer faster-responding pixels, manufacturers must deal with the side effect of faster pixel decay between refreshes. This is balanced by increasing the frequency of scanning out to the panel.
- The effect we are going to cover here has nothing to do with motion blur, LightBoost, backlight PWM, or LightBoost combined with G-Sync (not currently a thing; even though Blur Busters has theorized on how it could work, their method would not work with how G-Sync is actually implemented today).
With all of that out of the way, let’s tackle what folks out there may be seeing on their own variable refresh rate displays. Based on our testing so far, the flicker only presents when a game enters a 'stalled' state. These are periods where you would see a split-second freeze in the action, like a background level load occurring mid-game in some titles. It also appears during some game level load screens, but as those are normally static scenes, it would have gone unnoticed on fixed refresh rate panels. Since we were absolutely able to see that something was happening, we wanted to catch it in the act and measure it, so we rooted around the lab and put together some gear to do so. It’s not a perfect solution by any means, but we only needed to observe differences between smooth gaming and the ‘stalled state’ where the flicker was readily observable. Once the solder dust settled, we fired up a game that we knew could instantaneously swing from a high FPS (144) to a stalled state (0 FPS) and back again. As it turns out, EVE Online does this exact thing while taking an in-game screen shot, so we used that for our initial testing. Here’s what the brightness of a small segment of the ROG Swift does during this very event:
Measured panel section brightness over time during a 'stall' event. Click to enlarge.
The relatively small ripple to the left and right of center demonstrates the panel output at just under 144 FPS. Panel redraw is in sync with the frames coming from the GPU at this rate. The center section, however, represents what takes place when the input from the GPU suddenly drops to zero. In the above case, the game briefly stalled, then resumed a few frames at 144, then stalled again for a much longer period of time. Completely stopping the panel refresh would result in all TN pixels bleeding towards white, so G-Sync has a built-in failsafe that prevents this by forcing a redraw every ~33 msec. What you are seeing are the pixels intermittently bleeding towards white and periodically being pulled back down to the appropriate brightness by a scan. The low latency panel used in the ROG Swift does this all of the time, but it is less noticeable at 144 FPS, as you can see on the left and right edges of the graph. An additional thing happening here is an apparent rise in average brightness during the event. We are still researching the cause of this on our end, but this brightness increase certainly helps draw attention to the flicker event, making it even more perceptible to those who might not otherwise have noticed it.
Some of you might be wondering why this same effect is not seen when a game drops to 30 FPS (or even lower) during the course of normal game play. While the original G-Sync upgrade kit implementation simply waited until 33 msec had passed before forcing an additional redraw, that approach introduced judder at 25-30 FPS. Based on our observations and testing, it appears that NVIDIA has corrected this in the retail G-Sync panels with an algorithm that intelligently re-scans at even multiples of the input frame rate in order to keep the redraw rate relatively high, thereby keeping flicker imperceptible – even at very low continuous frame rates.
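Expressed as code, the behavior we believe we are seeing looks roughly like the sketch below. To be clear, this is our reconstruction from measured panel output, not NVIDIA's actual algorithm, and the 30 Hz floor is an assumption derived from the ~33 msec failsafe:

```python
def panel_redraw_rate(input_fps, floor_hz=30):
    """Estimate how often the panel is actually scanned for a given frame rate.

    Our reconstruction of observed retail G-Sync behavior: frames arriving
    slower than the assumed panel minimum get re-scanned at an integer
    multiple of the input rate so the redraw rate stays at or above the floor.
    """
    if input_fps <= 0:
        return floor_hz  # a full stall: only the fixed ~33 msec failsafe redraws
    if input_fps >= floor_hz:
        return input_fps  # one scan per frame, in sync with the GPU
    multiple = 2
    while input_fps * multiple < floor_hz:
        multiple += 1
    return input_fps * multiple

for fps in (144, 45, 24, 10):
    print(fps, "FPS ->", panel_redraw_rate(fps), "Hz panel redraw")
# 144 -> 144, 45 -> 45, 24 -> 48 (2x), 10 -> 30 (3x)
```

This would explain why continuous low frame rates stay flicker-free: there is always an input frame available to re-scan at a multiple. A stall to 0 FPS provides no frames to multiply, leaving only the fixed ~33 msec failsafe redraw, which is exactly where we observed the flicker.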
A few final points before we go:
- This is not limited to the ROG Swift. All variable refresh panels we have tested (including 4K) see this effect to a greater or lesser degree than reported here. Again, this only occurs when games instantaneously drop to 0 FPS, and not when those games dip into low frame rates in a continuous fashion.
- The effect is less perceptible (both visually and with recorded data) at lower maximum refresh rate settings.
- The effect is not present at fixed refresh rates (G-Sync disabled or with non G-Sync panels).
This post was primarily meant as a status update and to serve as something for G-Sync users to point to when attempting to explain the flicker they are perceiving. We will continue researching, collecting data, and coordinating with NVIDIA on this issue, and will report back once we have more to discuss.
During the research and drafting of this piece, we reached out to and worked with NVIDIA to discuss this issue. Here is their statement:
"All LCD pixel values relax after refreshing. As a result, the brightness value that is set during the LCD’s scanline update slowly relaxes until the next refresh.
This means all LCDs have some slight variation in brightness. In this case, lower frequency refreshes will appear slightly brighter than high frequency refreshes by 1 – 2%.
When games are running normally (i.e., not waiting at a load screen, nor a screen capture) - users will never see this slight variation in brightness value. In the rare cases where frame rates can plummet to very low levels, there is a very slight brightness variation (barely perceptible to the human eye), which disappears when normal operation resumes."
So there you have it. It's basically down to the physics of how an LCD panel works at varying refresh rates. While I agree that it is a rare occurrence, there are some games that present this scenario more frequently (and noticeably) than others. If you've noticed this effect in some games more than others, let us know in the comments section below.
(Editor's Note: We are continuing to work with NVIDIA on this issue and hope to find a way to alleviate the flickering with either a hardware or software change in the future.)
It has been a couple of months since the release of the GeForce GTX 970 and the GM204 GPU it is based on. After the initial wave of stock on day one, NVIDIA admittedly struggled to keep these products available. Couple that with rampant concerns over coil whine on some non-reference designs, and you can see why we were a bit hesitant to spend our time on retail GTX 970 reviews.
These issues appear to be settled for the most part. Finding GeForce GTX 970 cards is no longer a problem and users with coil whine are getting RMA replacements from NVIDIA's partners. Because of that, we feel much more comfortable reporting our results with the various retail cards that we have in house, and you'll see quite a few reviews coming from PC Perspective in the coming weeks.
But let's start with the MSI GeForce GTX 970 4GB Gaming card. Based on user reviews, this is one of the most popular retail cards. MSI's Gaming series combines a custom cooler that typically runs quieter and more efficiently than the reference design with a price tag that is within arm's reach of the lower-cost options as well.
The MSI GeForce GTX 970 4GB Gaming
MSI continues with its Dragon Army branding, and its associated black/red color scheme, which I think is appealing to a wide range of users. I'm sure NVIDIA would like to see a green or neutral color scheme, but hey, there are only so many colors to go around.
It has become increasingly apparent that flash memory die shrinks have hit a bit of a brick wall in recent years. The issues faced by the standard 2D Planar NAND process were apparent very early on. This was no real secret - here's a slide seen at the 2009 Flash Memory Summit:
Despite this, most flash manufacturers pushed the envelope as far as they could within the limits of 2D process technology, balancing shrinks with reliability and performance. One of the largest flash manufacturers is Intel, having joined forces with Micron in a joint venture dubbed IMFT (IM Flash Technologies). Intel remained in lock-step with Micron all the way up to 20nm, but chose to hold back at the 16nm step, presumably to shift full focus towards alternative flash technologies. This was essentially confirmed late last week with Intel's announcement of a shift to 3D NAND production.
Intel's press briefing seemed to focus more on cost efficiency than performance, and after reviewing the very few specs they released about this new flash, I believe we can do some theorizing as to the potential performance of this new flash memory. From the above illustration, you can see that Intel has chosen to go with the same sort of 3D technology used by Samsung - a 32 layer vertical stack of flash cells. This requires the use of an older / larger process technology, as it is too difficult to etch these holes at a 2x nm size. What keeps the die size reasonable is the fact that you get a 32x increase in bit density. Going off of a rough approximation from the above photo, imagine that 50nm die (8 Gbit), but with 32 vertical NAND layers. That would yield a 256 Gbit (32 GB) die within roughly the same footprint.
Representation of Samsung's 3D VNAND in 128Gbit and 86 Gbit variants.
20nm planar (2D) = yellow square, 16nm planar (2D) = blue square.
Image republished with permission from Schiltron Corporation.
It's likely a safe bet that IMFT flash will be going for a cost/GB far cheaper than the competing Samsung VNAND, and going with a relatively large 256 Gbit (vs. VNAND's 86 Gbit) per-die capacity is a smart move there, but let's not forget that there is a catch - write speed. Most NAND is very fast on reads, but limited on writes. Shifting from 2D to 3D NAND netted Samsung a 2x speed boost per die, and another effective 1.5x speed boost due to their choice to reduce per-die capacity from 128 Gbit to 86 Gbit. This effective speed boost came from the fact that a given VNAND SSD has 50% more dies to reach the same capacity as an SSD using 128 Gbit dies.
Now let's examine how Intel's choice of a 256 Gbit die impacts performance:
- Intel SSD 730 240GB = 16 x 128Gbit 20nm dies
- 270 MB/sec writes and ~17 MB/sec/die
- Crucial MX100 128GB = 8 x 128Gbit 16nm dies
- 150 MB/sec writes and ~19 MB/sec/die
- Samsung 850 Pro 128GB = 12 x 86Gbit VNAND dies
- 470 MB/sec writes and ~40 MB/sec/die
If we do some extrapolation based on the assumption that IMFT's move to 3D will net the same ~2x write speed improvement seen by Samsung, combined with their die capacity choice of 256Gbit, we get this:
- Future IMFT 128GB SSD = 4 x 256Gbit 3D dies
- 40 MB/sec/die x 4 dies = 160 MB/sec
Even rounding up to 40 MB/sec/die, we can see that also doubling the die capacity effectively negates the performance improvement. While the IMFT flash equipped SSD will very likely be a lower cost product, it will (theoretically) see the same write speed limits seen in today's SSDs equipped with IMFT planar NAND. Now let's go one layer deeper on theoretical products and assume that Intel took the 18-channel NVMe controller from their P3700 Series and adapted it for a consumer PCIe SSD using this new 3D NAND. The larger die size limits the minimum capacity you can attain while still fully utilizing the 18-channel controller, so with one die per channel, you end up with this product:
- Theoretical 18-channel IMFT PCIe 3D NAND SSD = 18 x 256Gbit 3D dies
- 40 MB/sec/die x 18 dies = 720 MB/sec
- 18 x 32GB (die capacity) = 576GB total capacity
Overprovisioning decisions aside, the above would be the lowest capacity product that could fully utilize the Intel PCIe controller. While the write performance is on the low side by PCIe SSD standards, the cost of such a product could easily be in the $0.50/GB range, or even less.
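To put the whole extrapolation in one place, here is the arithmetic above as a short script (the ~2x 3D gain and the resulting 40 MB/sec/die figure are assumptions carried over from the Samsung comparison, not measured values):

```python
# Per-die write speeds from the shipping drives listed above
drives = [
    ("Intel SSD 730 240GB", 270, 16),
    ("Crucial MX100 128GB", 150, 8),
    ("Samsung 850 Pro 128GB", 470, 12),
]
for name, mb_per_sec, dies in drives:
    print(name, "~%.0f MB/sec/die" % (mb_per_sec / dies))

# Assume IMFT's 3D transition nets the same ~2x per-die gain Samsung saw,
# landing at roughly 40 MB/sec/die for the new 256Gbit (32GB) dies.
per_die = 40
print("128GB SATA drive, 4 dies:", per_die * 4, "MB/sec writes")
print("18-channel PCIe drive, 18 dies:", per_die * 18, "MB/sec writes,",
      18 * 32, "GB capacity")
```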
In summary, while we don't have any solid performance data, it appears that Intel's new 3D NAND is not likely to lead to a performance breakthrough in SSD speeds, but their choice on a more cost-effective per-die capacity for their new 3D NAND is likely to give them significant margins and the wiggle room to offer SSDs at a far lower cost/GB than we've seen in recent years. This may be the step that was needed to push SSD costs into a range that can truly compete with HDD technology.
It's that time of year again, when those of us lucky enough to have the means get to share the best in technology with our friends and family. You are already the family IT manager, so why not help spread the holiday cheer by picking up some items for them, and hey... maybe for you. :)
This year we are going to break up the guide into categories. We'll have a page dedicated to PC components, one for mobile devices like notebooks and tablets, and one for PC accessories. Then, after those specific categories, we'll have an open-ended collection of pages where each PC Perspective team member can throw in some wildcards.
We thank you for your support of PC Perspective throughout 2014. The links included below embed our affiliate code for Amazon.com (when applicable), and if you are doing other shopping for the holidays this year we would appreciate it if you used the button above before perusing Amazon.com. In case you want the affiliate code directly, it is: pcper04-20.
Intel Core i7-4790K Haswell Processor
Last year our pick for the best high-performance processor was the Core i7-4770K, which sold for $379. This year we have a part running 500 MHz faster, albeit at higher power, for $80 less. If you are still waiting for the right time to upgrade your processor (and hey, games will need more cores sooner rather than later!), the Core i7-4790K looks like a great option, and now looks like a great time.
NVIDIA GeForce GTX 980 4GB
Likely the most controversial selection in our gift guide, the GeForce GTX 980 is an interesting product. It's expensive compared to other options from AMD like the Radeon R9 290X or even the R9 290, but it is also a better performing part, just not by much. The case for the GTX 980 stems from other things: G-Sync support, game bundles with Far Cry 4 and The Crew, GeForce Experience, driver stability and release frequency, etc. The GTX 970 is another good choice along these lines, but as you'll see below... AMD has a strong contender as well.
Introduction: Defining the Quiet Enclosure
The Define R5 is the direct successor to Fractal Design's R4 enclosure, and it arrives with the promise of a completely improved offering in the silent case market. Fractal Design has unveiled the case today, and we have the day-one review ready for you!
We've looked at a couple of budget cases recently from the Swedish enclosure maker, and though still affordable with an MSRP of $109.99 (a windowed version will also be available for $10 more), the Define R5 from Fractal Design looks like a premium part throughout. In keeping with the company's minimalist design aesthetic it features clean styling, and it is a standard mid-tower form factor supporting boards from ATX down to mini-ITX. The R5 also offers considerable cooling flexibility with many mounting options for fans and radiators.
The Silent Treatment
One of two included 1000 RPM hydraulic-bearing GP-14 silent fans
There are always different needs to consider when picking an enclosure, from price to application. And with silent cases there is an obvious need for superior sound-dampening properties, though airflow must be maintained to prevent cooking components as well. With today's review we'll examine the case inside and out and see how a complete build performs in temperature and noise testing.