A Powerful Architecture
AMD likes to toot its own horn. Just take a look at the not-so-subtle marketing buildup to the Radeon R9 295X2 dual-Hawaii graphics card, released today. I had photos of me shipped to…me…overnight. A package was also delivered to my hotel room at GDC containing a pair of small Pringles cans (chips) and a bottle of volcanic water. You may have also seen photos of a mysterious briefcase, its side stickered with the silhouette of a Radeon add-in board.
This tooting is not without some validity, though. The Radeon R9 295X2 is easily the fastest graphics card we have ever tested, and that says a lot given the last 24 months of hardware releases. It’s big, it comes with an integrated water cooler, and it requires some pretty damn specific power supply specifications. But AMD did not compromise on the R9 295X2 and, for that, I am sure many enthusiasts will be elated. Get your wallets ready, though: this puppy will run you $1499.
Both AMD and NVIDIA have a history of producing high quality dual-GPU graphics cards late in the product life cycle. The most recent entry from AMD was the Radeon HD 7990, a pair of Tahiti GPUs on a single PCB with a triple-fan cooler. While a solid performer, the card was released at a time when AMD CrossFire technology was well behind the curve and, as a result, real-world performance suffered considerably. By the time the drivers and ecosystem were fixed, the HD 7990 was more or less on the way out. It was also notorious for some intermittent, but severe, overheating issues, documented by Tom’s Hardware in one of the most harshly titled articles I’ve ever read. (Hey, Game of Thrones started again this week!)
The Hawaii GPU, first revealed back in September and selling today under the guise of the R9 290X and R9 290 products, is even more power hungry than Tahiti. Many in the industry doubted that AMD would ever release a dual-GPU product based on Hawaii, as the power and thermal requirements would be just too high. AMD has worked around many of these issues with a custom water cooler and by placing specific power supply requirements on buyers, all without compromising on performance. This is the real McCoy.
Introduction and Technical Specifications
Courtesy of SilverStone
SilverStone Technology is a well known company in the enthusiast space, offering high quality solutions from cases to case-mounted fan controllers and displays. SilverStone has also gone through several generations of CPU all-in-one liquid cooling solutions, with the Tundra series being the culmination of these past designs. The Tundra Series TD03 liquid cooler is designed to cool CPUs of any make, including the latest processor offerings from both Intel and AMD. The cooler consists of a thick 120mm radiator attached to a uni-body aluminum CPU cooler with integrated copper base plate and pump. To best measure the TD03's performance, we set it against several other high-performance liquid and air-based coolers. The TD03 cooler comes with a retail MSRP of $99.99, putting it at the higher end of the all-in-one cooler price range.
Athlon and Pentium Live On
Over the past year or so, we have taken a look at a few budget gaming builds here at PC Perspective. One of our objectives with these build guides was to show people that PC gaming can be cost competitive with console gaming, and at a much higher quality.
However, we haven't stopped pursuing our goal of the perfect inexpensive gaming PC, which is still capable of maxing out image quality settings on today's top games at 1080p.
Today we take a look at two new systems, featuring some parts which have been suggested to us after our previous articles.
| | AMD System | Intel System |
| --- | --- | --- |
| Processor | AMD Athlon X4 760K - $85 | Intel Pentium G3220 - $65 |
| Cores / Threads | 4 / 4 | 2 / 2 |
| Motherboard | Gigabyte F2A55M-HD2 - $60 | ASUS H81M-E - $60 |
| Graphics | MSI R9 270 Gaming - $180 | MSI R9 270 Gaming - $180 |
| System Memory | Corsair 8GB DDR3-1600 (1x8GB) - $73 | Corsair 8GB DDR3-1600 (1x8GB) - $73 |
| Hard Drive | Western Digital 1TB Caviar Green - $60 | Western Digital 1TB Caviar Green - $60 |
| Power Supply | Cooler Master GX 450W - $50 | Cooler Master GX 450W - $50 |
| Case | Cooler Master N200 MicroATX - $50 | Cooler Master N200 MicroATX - $50 |
(Editor's note: If you don't already have a copy of Windows, and don't plan on using Linux or SteamOS, you'll need an OEM copy of Windows 8.1 - currently selling for $98.)
These are low prices for a gaming computer, and feature some parts which many of you might not know a lot about. Let's take a deeper look at the two different platforms which we built upon.
First up is the AMD Athlon X4 760K. While you may not have known the Athlon brand was still being used on current parts, it occupies an interesting part of the market. On the FM2 socket, the 760K is essentially a high-end Richland APU with the graphics portion of the chip disabled.
What this means is that if you are going to pair your processor with a discrete GPU anyway, you can skip paying extra for the integrated GPU.
As for the motherboard, we went for an ultra-inexpensive A55 option from Gigabyte, the GA-F2A55M-HD2. This board features the A55 chipset, which launched alongside the Llano APUs in 2011. Because of this older chipset, the board does not feature USB 3.0 or SATA 6G capability, but since we are only concerned about gaming performance here, it makes a great bare-bones option.
Taking it all the way to 12!
Microsoft has been developing DirectX for around 20 years now. Back in the 90s, the hardware and software scene for gaming was chaotic, at best. We had wonderful things like “SoundBlaster compatibility” and third-party graphics APIs such as Glide, S3G, PowerSGL, RRedline, and ATICIF. OpenGL was aimed more towards professional applications, and it took John Carmack and id Software, through GLQuake in 1996, to get the ball rolling in that particular direction. There was a distinct need for standards across audio and 3D graphics that would de-fragment the industry for developers. DirectX was introduced with Windows 95, but the popularity of Direct3D did not really take off until DirectX 3.0, released in late 1996.
DirectX has had some notable successes, and some notable letdowns, over the years. DX6 provided a much needed boost in 3D graphics, while DX8 introduced the world to programmable shading. DX9 was the most long-lived version, thanks to it being the basis for the Xbox 360 console with its extended lifespan. DX11 added a bunch of features and made programming much simpler, all while improving performance over DX10. The low points? DX10 was pretty dismal due to the performance penalty on hardware that supported some of the advanced rendering techniques. DirectX 7 was around a little more than a year before giving way to DX8. DX1 and DX2? Yeah, those were very unpopular and problematic, due to the myriad changes in a modern operating system (Win95) as compared to the DOS-based world that game devs were used to.
Some four years ago, going by what NVIDIA has said, initial talks began on the development of DirectX 12. DX11 was released in 2009 and has been an excellent foundation for PC games. It is not perfect, though. There is still a significant hit to potential performance due to a variety of factors, including a fairly inefficient hardware abstraction layer that relies more upon fast single-threaded performance from a CPU than on the power of a modern multi-core/multi-thread unit. This limits how many objects can be represented on screen, along with other operations that would bottleneck even the fastest CPU threads.
BF4 Integrates FCAT Overlay Support
Back in September AMD publicly announced Mantle, a new lower level API meant to offer more performance for gamers and more control for developers fed up with the restrictions of DirectX. Without diving too much into the politics of the release, the fact that Battlefield 4 developer DICE was integrating Mantle into the Frostbite engine for Battlefield 4 was a huge proof point for the technology. Even though the release came a bit later than AMD had promised us, at the end of January 2014, one of the biggest PC games on the market today had integrated a proprietary AMD API.
When I did my first performance preview of BF4 with Mantle on February 1st, the results were mixed, and we had other issues to deal with. First and foremost, our primary graphics testing methodology, called Frame Rating, couldn't be used due to the change of API. Instead we were forced to rely on an in-game frame rate counter built by DICE, which worked fine but didn't give us the fine-grained data we really wanted to put the platform to the test. It worked, but we wanted more. Today we are happy to announce we have full support for our Frame Rating and FCAT testing with BF4 running under Mantle.
A History of Frame Rating
In late 2012 and throughout 2013, testing graphics cards became a much more complicated beast. Terms like frame pacing, stutter, jitter and runts were not in the vocabulary of most enthusiasts but became an important part of the story just about one year ago. Though complicated to fully explain, the basics are pretty simple.
Rather than using software on the machine being tested to measure performance, our Frame Rating system uses a combination of local software and external capture hardware. On the system with the hardware being evaluated, we run a small piece of software called an overlay that draws small colored bars on the left-hand side of the game screen, changing color with each frame rendered by the game. Using a secondary system, we capture the output from the graphics card directly, intercepting it from the display output in real time in uncompressed form. With that video file captured, we then analyze it frame by frame, measuring the length of each of those colored bars: how long each is on the screen and how consistently they are displayed. This allows us to find the average frame rate, but also to see how smoothly frames are presented, whether any frames are dropped, and whether there are jitter or stutter issues.
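The analysis step above can be sketched in a few lines of code. This is a simplified, hypothetical model (not our actual FCAT tooling): it assumes the capture runs at a fixed 60 Hz and that each captured sample has already been reduced to the overlay color visible on that sample, then collapses runs of identical colors into per-frame display times and flags "runts" (frames on screen for almost no time).

```python
# Hypothetical simplification of Frame Rating analysis: the capture rate,
# color encoding, and runt threshold here are illustrative assumptions.
CAPTURE_INTERVAL_MS = 1000 / 60  # capture hardware samples at 60 Hz

def frame_times(overlay_colors, runt_threshold=1):
    """Collapse per-capture overlay color samples into per-rendered-frame
    durations. A frame visible for <= runt_threshold samples counts as a
    runt (it contributed almost nothing to perceived smoothness)."""
    times, runts = [], 0
    count = 1
    for prev, cur in zip(overlay_colors, overlay_colors[1:]):
        if cur == prev:
            count += 1
        else:
            if count <= runt_threshold:
                runts += 1
            times.append(count * CAPTURE_INTERVAL_MS)
            count = 1
    times.append(count * CAPTURE_INTERVAL_MS)  # close out the final frame
    return times, runts

# Each number stands for the overlay color of one rendered frame:
samples = [1, 1, 2, 2, 3, 4, 4, 4, 5, 5]
times, runts = frame_times(samples)
avg_fps = 1000 / (sum(times) / len(times))
```

Note how an average FPS number alone would hide the runt: five frames were "rendered," but one of them was barely shown, which is exactly the kind of detail software FPS counters miss.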
Introduction, Specifications and Packaging
ADATA has been in the storage market for a good while now. I like to think of them as the patient underdog. They don't necessarily come out with the shiny new controller or flash technology. Instead they tend to sit back and wait for a given set of hardware to mature and drop in price a bit. Once that happens, they figure out how to package the matured technology into a device of relatively low cost as compared to the competition. They have done so again today, with their new Premier Pro SP920 lineup:
As hinted at earlier, this line does not use the newest Marvell controller, but as Marvell controllers have been very capable SATA 6Gb/sec units for a long time now, that is not necessarily a bad thing. In addition, Marvell controllers have a track record of gaining significant performance margins as their firmware matures, which makes ADATA's later entrance more of a good thing.
Introduction, Packaging, and Specifications
The BitFenix Colossus has grown into a family of enclosures, from the massive E-ATX original all the way down to their diminutive mini-ITX version. But somewhere in between there lies a case offering some impressive flexibility, while still retaining a small footprint.
As the PC industry has evolved over the last decade, the days of high-performance rigs requiring large towers and full-size ATX and E-ATX motherboards are gone. Of course there is still a market (and need) for full tower systems, and the majority of enthusiast motherboards available are still full ATX. But the evolution in process technology and platforms has allowed for more and more to be done within a smaller footprint, and the micro-ATX form factor has emerged as a solid option for anything from budget systems to extreme multi-GPU gaming powerhouses. Regardless of the path you choose, all of those sweet components need a home, and finding the right computer case has long been a very personal odyssey.
BitFenix entered the PC enclosure market in 2010 with the original Colossus, and since then they have grown into a respected brand with a large and differentiated product offering. From that first massive Colossus to the popular Prodigy mini-ITX, they have created an enclosure for just about any build. And while many cases specialize in one or two particular areas, once in a while you will find an enclosure that just begs for experimentation. The micro-ATX variant of the Colossus from BitFenix is just such a case. Every aspect of this small enclosure has been given a close look by BitFenix, and there are options galore for a variety of builds.
Introduction and Technical Specifications
Courtesy of Cooler Master
Cooler Master is known in the enthusiast community for their innovative designs, with product offerings ranging from cases to desktop and laptop cooling implements. Like many other manufacturers, Cooler Master offers its own line of all-in-one liquid cooling solutions. Unique to their Glacer 240L cooler is the ability to easily add additional cooling blocks into the base loop. The Glacer 240L has a fill port integrated into the base of the radiator for draining and refilling, and uses removable clamps on all connections for easy maintenance and tube reconfiguration. To measure the performance of the Glacer 240L, we set it against several other high-performance liquid and air-based coolers. With a $139.99 MSRP, the Glacer 240L comes at a premium price.
Courtesy of Cooler Master
Courtesy of Cooler Master
Introduction and Features
The highly anticipated 450D mid-tower case is Corsair’s latest addition to their top of the line Obsidian Series and is the first new Obsidian case to be released in 2014. The Obsidian 450D mid-tower enclosure is positioned between Corsair’s 750D full-tower and 350D Micro-ATX enclosures and shares many of the same styling and design concepts as the 350D, 750D, and 900D. The 450D is being introduced with an MSRP of $119.99 USD, which makes it considerably less expensive than Corsair’s classic Obsidian 650D mid-tower enclosure ($199.99 USD). It appears the new 450D may eventually become the successor to the 650D, but we hope the 650D mid-tower case doesn’t go away any time soon, as the two enclosures are still different enough to appeal to different users.
(Courtesy of Corsair)
In addition to PC enclosures, Corsair continues to offer one of the largest selections of memory products, SSDs, power supplies, coolers, gaming peripherals, and PC accessories on the market today!
Here is what Corsair has to say about the Obsidian 450D PC case:
“The 450D Performance Mid-Tower PC case matches the iconic, brushed aluminum design of the Obsidian series with an increased focus on high-airflow, ensuring your system not only looks great, but runs cool.
Behind the 450D’s aluminum mesh intake grill are dual AF140L intake fans to direct airflow straight to a PC’s hottest component, the graphics card. The rear AF120L 120mm fan keeps the airflow moving smoothly and five other optional fan locations give you serious cooling flexibility. The 450D’s fan mounts also accommodate a wide range of water-cooling radiators, with room for up to a 360mm radiator in the roof, a 280mm radiator in the front, and a 240mm radiator in the floor.
The 450D also boasts all of the features that make the Obsidian Series a favorite among enthusiasts around the world. Easily accessible dust filters on the roof, front, and bottom ensure your system will stay looking its best, while modular tool-free 3.5”/2.5” hard disk mounts offer a wide range of storage options, or can be removed entirely to prioritize airflow.”
(Courtesy of Corsair)
Obsidian Series 450D Mid-Tower PC Case Key Points:
• Mid-tower PC case with clean, elegant styling
• Tool-free 2.5”, 3.5” and 5.25” drive installation
• Two AF140L intake fans and one AF120L exhaust fan
• Excellent airflow and low noise levels
• Support for water-cooling in a broad variety of configurations
• Support for 240mm, 280mm, and/or 360mm radiators
• Two dedicated 2.5” SSD drive sleds located behind motherboard
• Included modular (removable) drive cage supports three 2.5”/3.5” drives
• Optional drive cage adds support for three more 2.5”/3.5” drives
• Removable magnetic top filter provides a cleaner look
• Competitive price point
Introduction and Technical Specifications
Courtesy of ASUS
The ASUS ROG Poseidon GTX 780 video card is the latest incarnation of the Republic of Gamers (ROG) Poseidon series. Like the previous Poseidon series products, the Poseidon GTX 780 features a hybrid cooler, capable of air and liquid-based cooling for the GPU and onboard components. The ASUS ROG Poseidon GTX 780 graphics card comes with an MSRP of $599, a premium price for a premium card.
Courtesy of ASUS
In designing the Poseidon GTX 780 graphics card, ASUS packed in many of the premium components you would normally find as add-ons. The card features motherboard-quality power components: a 10-phase digital power regulation system using ASUS DIGI+ VRM technology coupled with Japanese black metallic capacitors. The Poseidon GTX 780 has the following features integrated into its design: a DisplayPort output, an HDMI output, dual DVI ports (DVI-D and DVI-I), an aluminum backplate, integrated G 1/4" threaded liquid ports, dual 90mm cooling fans, 6-pin and 8-pin PCIe-style power connectors, and integrated power connector LEDs and an ROG logo LED.
DX11 could rival Mantle
The big story at GDC last week was Microsoft’s reveal of DirectX 12 and the future of the dominant API for PC gaming. There was plenty of build up to the announcement, with Microsoft’s DirectX team posting teasers and starting up a Twitter account for the occasion. I hosted a live blog from the event which included pictures of the slides. It was our most successful event of this type, with literally thousands of people joining the conversation. Along with the debates over the similarities of AMD’s Mantle API and the timeline for the DX12 release, there are plenty of stories to be told.
After the initial session, I wanted to setup meetings with both AMD and NVIDIA to discuss what had been shown and get some feedback on the planned direction for the GPU giants’ implementations. NVIDIA presented us with a very interesting set of data that both focused on the future with DX12, but also on the now of DirectX 11.
The reason for the topic is easy to decipher: AMD has built up the image of Mantle as the future of PC gaming and, with a full 18 months before Microsoft’s DirectX 12 is released, how developers and gamers respond will have an important impact on the market. NVIDIA doesn’t like to talk about Mantle directly, but it obviously feels the need to address the questions in a roundabout fashion. During our time with NVIDIA’s Tony Tamasi at GDC, the discussion centered as much on OpenGL and DirectX 11 as anything else.
What are APIs and why do you care?
For those that might not really understand what DirectX and OpenGL are, a bit of background first. APIs (application programming interfaces) are responsible for providing an abstraction layer between hardware and software applications. An API can deliver a consistent programming model (though the language can vary) across various hardware vendors' products and even between hardware generations. APIs can expose feature sets of hardware that range widely in complexity, while allowing developers to use the hardware without necessarily knowing it in great detail.
Over the years, APIs have developed and evolved but still retain backwards compatibility. Companies like NVIDIA and AMD can improve DirectX implementations to increase performance or efficiency without adversely (usually at least) affecting other games or applications. And because the games use that same API for programming, changes to how NVIDIA/AMD handle the API integration don’t require game developer intervention.
With the release of AMD Mantle, the idea of a “low level” API has been placed in the minds of gamers and developers. The term “low level” can mean many things, but in general it is associated with an API that is more direct, has a thinner set of abstraction layers, and uses less translation from code to hardware. The goal is to reduce the amount of overhead (performance hit) that APIs naturally impose with these translations. With additional performance available, the CPU cycles can be used by the program (game) or idled to improve battery life. In certain cases, GPU throughput can increase where the API overhead is impeding the video card's progress.
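A toy model makes the overhead argument concrete. This is not any real graphics API, and every number in it is an illustrative assumption: if the API charges a fixed per-draw-call cost for validation and translation, the number of draw calls that fit inside one 60 fps frame budget scales inversely with that cost, which is exactly the headroom a thinner API tries to reclaim.

```python
# Illustrative, hypothetical per-call costs in microseconds. A "thick"
# abstraction spends more CPU time per draw call on validation/translation
# than a "thin" low-level API does.
THICK_OVERHEAD_US = 30.0
THIN_OVERHEAD_US = 5.0
APP_WORK_US = 2.0            # assumed app-side work per draw call

FRAME_BUDGET_US = 16_667     # one frame at 60 fps

def max_draw_calls(per_call_overhead_us):
    """How many draw calls fit in one frame's CPU budget?"""
    return int(FRAME_BUDGET_US // (per_call_overhead_us + APP_WORK_US))

calls_thick = max_draw_calls(THICK_OVERHEAD_US)  # fewer unique objects
calls_thin = max_draw_calls(THIN_OVERHEAD_US)    # several times more
```

Under these made-up numbers the thin API permits several times as many draw calls per frame, which maps directly onto "more objects on screen" in the discussion above.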
Passing additional control to the game developers, away from the API or GPU driver developers, gives those coders additional power and improves the ability for some vendors to differentiate. Interestingly, not all developers want this kind of control as it requires more time, more development work, and small teams that depend on that abstraction to make coding easier will only see limited performance advantages.
This transition to a lower level API is being driven by the widening performance gap between CPUs and GPUs. NVIDIA provided the images below.
On the left we see performance scaling in terms of GFLOPS, and on the right the metric is memory bandwidth. Clearly the performance of NVIDIA's graphics chips (as well as AMD’s) has far outpaced what the best Intel desktop processors have been able to deliver, and that gap means the industry needs to innovate to close it.
Introduction and Features
Cooler Master continues to offer a full line of cases, power supplies, and coolers along with numerous other accessories for PC enthusiasts. Today we will be taking a detailed look at Cooler Master’s new V Series 850W power supply.
(Courtesy of Cooler Master)
It has been quite some time since we last reviewed a Cooler Master power supply, and we were happy to see one show up recently on the test lab doorstep. We were even happier to learn that the three units in the new Cooler Master V Series (the V700, V850, and V1000) are all built by Seasonic, which has a stellar reputation for building some of the best PC power supplies on the planet. All three power supplies in the V Series incorporate fully modular cabling, high efficiency (80Plus Gold certified), high quality Japanese-made capacitors, and a silent 135mm fan with a fluid dynamic bearing, and they come backed by Cooler Master’s 5-year warranty. In addition, the V Series power supplies deliver excellent voltage regulation (particularly on the +12V output) with minimal AC ripple and noise. The Cooler Master V850 power supply is currently selling for $169.99 (newegg.com, March 2014).
Here is what Cooler Master has to say about their new V Series PSUs:
“Developed to be the highest quality, Cooler Master carefully sourced every aspect of the V Series PSU line to produce a high-efficiency and stable power supply. It’s highly efficient even at low loads, extremely stable, and works well beyond the parameters of 80Plus specifications.”
Cooler Master V Series 850W PSU Key Features:
• Fully modular cable design
• Single 850W +12V output that delivers up to 70A (840W)
• 80Plus Gold Certified: up to 93% efficient @ 50% load
• Silent 135mm Fluid Dynamic Bearing (FDB) fan for low noise and longer life
• Six PCI-E 6+2 pin connectors to support high-end GPUs
• 100% high-quality Japanese capacitors ensure performance and reliability
• Reliable 5-Year warranty
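Those efficiency and output figures can be sanity-checked with simple arithmetic. Efficiency is DC watts delivered divided by AC watts drawn, so the calculation below (a rough sketch; actual efficiency varies across the load curve) estimates wall draw and waste heat at the 50% load point where the 93% figure applies, and confirms the 70A rating on the +12V rail.

```python
# Rough sanity check of the V850's spec-sheet numbers.
def ac_draw_watts(dc_load_w, efficiency):
    """AC power pulled from the wall for a given DC load."""
    return dc_load_w / efficiency

RATED_W = 850
load_w = RATED_W * 0.50           # 50% load, where Gold specifies ~93%
ac_w = ac_draw_watts(load_w, 0.93)
waste_heat_w = ac_w - load_w      # dissipated inside the PSU as heat

rail_12v_w = 70 * 12              # 70A on +12V = 840W, matching the spec
```

At half load the unit pulls roughly 457W from the outlet to deliver 425W, leaving about 32W as heat for that silent 135mm fan to deal with.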
Introduction and Background
Back in 2010, Intel threw a bit of a press thing for a short list of analysts and reviewers out at their IMFT flash memory plant at Lehi, Utah. The theme and message of that event was to announce 25nm flash entering mass production. A few years have passed, and 25nm flash is fairly ubiquitous, with 20nm rapidly gaining as IMFT scales production even higher with the smaller process. Last week, Intel threw a similar event, but instead of showing off a die shrink or even announcing a new enthusiast SSD, they chose to take a step back and brief us on the various design, engineering, and validation testing of their flash storage product lines.
At the Lehi event, I did my best to make off with a 25nm wafer.
Many topics were covered at this new event at the Intel campus at Folsom, CA, and over the coming weeks we will be filling you in on many of them as we take the necessary time to digest the fire hose of intel (pun intended) that we received. Today I'm going to lay out one of the more impressive things I saw at the briefings, and that is the process Intel goes through to ensure their products are among the most solid and reliable in the industry.
EVGA GTX 750 Ti ACX FTW
The NVIDIA GeForce GTX 750 Ti has been getting a lot of attention in hardware circles recently, and for good reason. It remains interesting from a technology standpoint as it is the first, and still the only, Maxwell-based GPU available for desktop users. It's a completely new architecture, built with power efficiency (and Tegra) in mind. With it, the GTX 750 Ti was able to push a lot of performance into a very small power envelope while still maintaining some very high clock speeds.
NVIDIA’s flagship mainstream part is also still the leader in performance per dollar in this segment (at least until AMD’s Radeon R7 265 becomes widely available). We have noticed a few cases where the long-standing shortages and price hikes from coin mining have dwindled, which is great news for gamers but may also be bad news for NVIDIA’s GPUs in some areas. Even if the R7 265 becomes available, though, the GTX 750 Ti remains the best card you can buy that doesn’t require a power connection. This puts it in a unique position for power-limited upgrades.
After our initial review of the reference card, and then an interesting look at how the card can be used to upgrade an older or under powered PC, it is time to take a quick look at a set of three different retail cards that have made their way into the PC Perspective offices.
On the chopping block today we’ll look at the EVGA GeForce GTX 750 Ti ACX FTW, the Galaxy GTX 750 Ti GC and the PNY GTX 750 Ti XLR8 OC. All of them are non-reference, all of them are overclocked, but you’ll likely be surprised how they stack up.
Introduction and Technical Specifications
Courtesy of Corsair
Corsair has expanded from a known presence in the memory space to a well-respected entity in the component market, offering everything from cases to all-in-one water coolers. Their newest cooler, the Hydro Series™ H75 Liquid CPU Cooler, features a 120mm x 25mm radiator with dual fans and a copper-based water block. The H75 unit includes mounting support for all current Intel and AMD processor offerings. To gauge the performance of the cooler, we set it against several other high-performance liquid and air-based coolers. With a retail MSRP of $84.99, the Hydro Series™ H75 cooler is priced to be competitive.
Courtesy of Corsair
The Hydro Series™ H75 cooler was designed to be a "one size fits all" type cooler, having no trouble fitting in most cases, including some of the larger mITX-style cases. Corsair used Asetek as an OEM for the H75, which appears to be based on the Asetek 550LC all-in-one cooler. As we've seen previously with Corsair OEM products, Corsair had a hand in tweaking the cooler design to meet their performance and aesthetic expectations. The radiator is an all-aluminum, thin-finned unit designed to effectively dissipate heat from the liquid medium using the two included Corsair-branded fans.
Courtesy of Corsair
The water block is composed of a two-part acrylic top piece, housing the unit's electronics and pump, held to a copper cold plate secured with an inner and outer ring of screws. The inner ring of screws is countersunk to prevent mating-related obstruction, and the cold plate is polished to a mirror-like luster.
Courtesy of Corsair
When the Radeon R9 290 and R9 290X first launched last year, they were plagued by issues of overheating and variable clock speeds. We looked at the situation several times over the course of a couple months and AMD tried to address the problem with newer drivers. These drivers did help stabilize clock speeds (and thus performance) of the reference built R9 290 and R9 290X cards but caused noise levels to increase as well.
The real solution was the release of custom cooled versions of the R9 290 and R9 290X from AMD partners like ASUS, MSI and others. The ASUS R9 290X DirectCU II model for example, ran cooler, quieter and more consistently than any of the numerous reference models we had our hands on.
But what about all those buyers that are still purchasing, or have already purchased, reference style R9 290 and 290X cards? Replacing the cooler on the card is the best choice and thanks to our friends at NZXT we have a unique solution that combines standard self contained water coolers meant for CPUs with a custom built GPU bracket.
Our quick test will utilize one of the reference R9 290 cards AMD sent along at launch and two specific NZXT products. The Kraken X40 is a standard CPU self contained water cooler that sells for $100 on Amazon.com. For our purposes though we are going to team it up with the Kraken G10, a $30 GPU-specific bracket that allows you to use the X40 (and other water coolers) on the Radeon R9 290.
Inside the box of the G10 you'll find an 80mm fan, a back plate, the bracket to attach the cooler to the GPU and all necessary installation hardware. The G10 will support a wide range of GPUs, though they are targeted towards the reference designs of each:
NVIDIA : GTX 780 Ti, 780, 770, 760, Titan, 680, 670, 660Ti, 660, 580, 570, 560Ti, 560, 560SE
AMD : R9 290X, 290, 280X*, 280*, 270X, 270, HD 7970*, 7950*, 7870, 7850, 6970, 6950, 6870, 6850, 6790, 6770, 5870, 5850, 5830
That is a pretty impressive list, but NZXT cautions that custom-designed boards may interfere with mounting.
The installation process begins by removing the original cooler, which in this case just means a lot of small screws. Be careful when removing the screws on the actual heatsink retention bracket: alternate between screws to take it off evenly.
Introduction and Technical Specifications
Courtesy of ASUS
The ASUS Maximus VI Impact is ASUS' newest mini-ITX member of the Republic of Gamer (ROG) family. ASUS integrated design innovations from its Z77-based mITX board and added in some ROG-based innovations to come up with a wholly unique entity. With an MSRP of $229, the Maximus VI Impact comes in at the higher-end of the mITX price range with enough integrated features to more than justify the cost.
Courtesy of ASUS
Similar to other members of the ROG-based Z87 lineup, ASUS designed the Maximus VI Impact board with top-of-the-line power components. The board's digital power system centers on an 8+2 phase power regulation design using 60-amp-rated BlackWing chokes, PowIRstage MOSFETs, and 10k-rated Black Metallic capacitors. To save space on the board, the power components are mounted vertically on a PCB hard-attached to the right of the socket, with the sound components and wireless networking on vertical removable cards to the upper left of the CPU socket and integrated into the board's rear panel.
So Many MHz, So Little Time...
If you've looked at memory for your system lately you've likely noticed a couple of things. First, memory prices have held steady for the past few months, but are still nearly double what they were a little over a year ago. Second, now that DDR3 has been a mature standard for years, there is a vast selection of RAM from many vendors, all with nearly identical specs. The standard has settled at 1600MHz for DDR3, and most desktop memory is programmed for this speed. Granted, many modules run at overclocked speeds, and there are some out there with pretty outlandish numbers, too - and it’s one of those kits that we take a look at today.
Hardly subtle, the Kingston HyperX 'Predator' dual channel kit for review today is clocked at a ridiculous 1066MHz OVER the 1600MHz standard. That's right, this is 2666MHz memory! It seems like such a big jump would have to provide increased system performance across the board, and that's exactly what we're going to find out.
We all want to get the most out of any component, and finding the best option at a given price is part of planning any new build or upgrade. While every core part is sold at a particular speed, and most can be overclocked, there are still some qualifying factors that make selecting the fastest part for your budget a little more complicated. Speed isn't determined by MHz alone – just as with processors, where performance comes down to core count, instructions per clock, and so on, memory performance depends on more than its rated frequency.
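To put those MHz figures in perspective, here is the back-of-the-envelope bandwidth math for DDR3. This is a rough theoretical ceiling only; real-world throughput is lower and also depends on timings and the memory controller.

```python
# Theoretical DDR3 bandwidth: the advertised "MHz" figure is already the
# effective transfer rate (MT/s), and each transfer moves 8 bytes (64-bit bus).

def ddr3_bandwidth_gbs(effective_mhz: float, channels: int = 2) -> float:
    """Peak bandwidth in GB/s for a DDR3 configuration."""
    bus_bytes = 8  # 64-bit channel
    return effective_mhz * 1e6 * bus_bytes * channels / 1e9

print(ddr3_bandwidth_gbs(1600))  # standard DDR3-1600, dual channel -> 25.6 GB/s
print(ddr3_bandwidth_gbs(2666))  # the 2666MHz kit under review -> ~42.7 GB/s
```

Whether that extra headline bandwidth translates into application performance is exactly what the benchmarks will show.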
Maxwell and Kepler and...Fermi?
Covering the landscape of mobile GPUs can be a harrowing experience. Brands, specifications, performance, features and architectures can all vary from product to product, even inside the same family. Rebranding is rampant from both AMD and NVIDIA and, in general, we are met with one of the most confusing segments of the PC hardware market.
Today, with the release of the GeForce GTX 800M series from NVIDIA, we are getting all of the above in one form or another. We will also see performance improvements and the introduction of the new Maxwell architecture (in a few parts at least). Along with the GeForce GTX 800M parts, you will also find the GeForce 840M, 830M and 820M offerings at lower performance, wattage and price levels.
With some new hardware comes a collection of new software for mobile users, including the innovative Battery Boost, which can increase unplugged gaming time by using frame rate limiting and other "magic" bits that NVIDIA isn't talking about yet. ShadowPlay and GameStream also find their way to mobile GeForce users.
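NVIDIA hasn't detailed how Battery Boost works, but the frame-rate-limiting half of the idea is simple enough to sketch. The loop below is purely illustrative of the general technique, not NVIDIA's implementation: cap the frame rate and spend each frame's leftover time budget idle instead of rendering frames the battery can't afford.

```python
import time

def run_frame_limited(render_frame, fps_cap=30, frames=120):
    """Run a render loop capped at fps_cap by sleeping out each frame's
    leftover budget -- less work per second means less power drawn."""
    budget = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # whatever work one frame requires
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # idle instead of racing to the next frame
```

The "magic" presumably lies in everything beyond this: picking the cap dynamically and tuning GPU clocks to match, which is where NVIDIA is staying quiet.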
Let's take a quick look at the new hardware specifications.
| | GTX 880M | GTX 780M | GTX 870M | GTX 770M |
|---|---|---|---|---|
| GPU Code name | Kepler | Kepler | Kepler | Kepler |
| Rated Clock | 954 MHz | 823 MHz | 941 MHz | 811 MHz |
| Memory | Up to 4GB | Up to 4GB | Up to 3GB | Up to 3GB |
| Memory Clock | 5000 MHz | 5000 MHz | 5000 MHz | 4000 MHz |
Both the GTX 880M and the GTX 870M are based on Kepler, keeping the same basic feature set and hardware specifications of their brethren in the GTX 700M line. However, while the GTX 880M has the same CUDA core count as the 780M, the same cannot be said of the GTX 870M. Moving from the GTX 770M to the 870M sees a significant 40% increase in core count as well as a jump in clock speed from 811 MHz (plus Boost) to 941 MHz.
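Combining that 40% core-count increase with the clock bump gives a rough theoretical shader-throughput uplift for the 870M over the 770M. Note the core counts used below (960 vs. 1344) are our assumption for these parts; the table above only lists clocks.

```python
# Rough theoretical shader throughput scales with cores x clock.
# Core counts are an assumption, not from the spec table above.

def throughput_uplift(old_cores, old_mhz, new_cores, new_mhz):
    """Ratio of new theoretical shader throughput to old."""
    return (new_cores * new_mhz) / (old_cores * old_mhz)

uplift = throughput_uplift(960, 811, 1344, 941)
print(f"{uplift:.2f}x")  # roughly 1.62x on paper
```

Real games won't scale linearly with this number, of course, but it explains why the 770M-to-870M jump is the most interesting one in the lineup.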
1920x1080, 2560x1440, 3840x2160
Join us on March 11th at 9pm ET / 6pm PT for a LIVE Titanfall Game Stream! You can find us at http://www.pcper.com/live. You can subscribe to our mailing list to be alerted whenever we have a live event!!
We canceled the event due to the instability of Titanfall servers. We'll reschedule soon!!
With the release of Respawn's Titanfall upon us, many potential PC gamers are going to be looking for suggestions on compiling a list of parts targeted at a perfect Titanfall experience. The good news is, even with a fairly low investment in PC hardware, gamers will find that the PC version of this title is definitely the premier way to play, as the compute power of the Xbox One just can't compete.
In this story we'll present three different build suggestions, each addressing a different target resolution but also better image quality settings than the Xbox One can offer. We have options for 1080p, the best option that the Xbox could offer, 2560x1440 and even 3840x2160, better known as 4K. In truth, the graphics horsepower required by Titanfall isn't overly extreme, and thus an entire PC build coming in under $800, including a full copy of Windows 8.1, is easy to accomplish.
Target 1: 1920x1080
First up is old reliable, the 1920x1080 resolution that most gamers still have on their primary gaming display. That could be a home theater style PC hooked up to a TV or monitors in sizes up to 27-in. Here is our build suggestion, followed by our explanations.
| Titanfall 1080p Build | |
|---|---|
| Processor | Intel Core i3-4330 - $137 |
| Motherboard | MSI H87-G43 - $96 |
| Memory | Corsair Vengeance LP 8GB 1600 MHz (2 x 4GB) - $89 |
| Graphics Card | EVGA GeForce GTX 750 Ti - $179 |
| Storage | Western Digital Blue 1TB - $59 |
| Case | Corsair 200R ATX Mid Tower Case - $72 |
| Power Supply | Corsair CX 500 watt - $49 |
| OS | Windows 8.1 OEM - $96 |
| **Total Price** | $781 - Amazon Full Cart |
Our first build comes in at $781 and includes some incredibly competent gaming hardware for that price. The Intel Core i3-4330 is a dual-core, HyperThreaded processor that provides more than enough capability to push Titanfall and all other major PC games on the market. The MSI H87 motherboard lacks some of the advanced features of the Z87 platform but does the job at a lower cost. 8GB of Corsair memory, though not running at a high clock speed, provides more than enough capacity for all the programs and applications you could want to run.
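If you want to swap parts in and out to hit your own budget, a parts list is easy to total up in a few lines. The prices below are the article's listed line items; live cart prices fluctuate daily, so a sum of the line items may not match a quoted full-cart total to the dollar.

```python
# Titanfall 1080p build, line-item prices as listed above (USD).
titanfall_1080p = {
    "Intel Core i3-4330": 137,
    "MSI H87-G43": 96,
    "Corsair Vengeance LP 8GB (2 x 4GB)": 89,
    "EVGA GeForce GTX 750 Ti": 179,
    "Western Digital Blue 1TB": 59,
    "Corsair 200R": 72,
    "Corsair CX 500W": 49,
    "Windows 8.1 OEM": 96,
}

total = sum(titanfall_1080p.values())
print(f"Line-item total: ${total}")
```

Swap the GTX 750 Ti for a faster card and rerun the sum to see how far past the budget a 1440p-class GPU pushes you.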