Introduction and Technical Specifications
Courtesy of ASUS
The ASUS Z97-Deluxe motherboard is the flagship board in ASUS' channel line of Intel Z97 chipset boards. ASUS revised its entire motherboard line for the Z97 chipset launch, with the Z97-Deluxe featuring the updated channel board aesthetics as well as many design innovations built upon the previous board generation. With an MSRP of $399.99, ASUS priced the Z97-Deluxe at the level of other flagship motherboards, offering a plethora of integrated features and performance that more than justify the price.
ASUS designed the Z97-Deluxe to be a serious competitor in the enthusiast space, integrating a 16+2 phase digital power system to drive the CPU and DRAM, with 10k-rated Japanese capacitors used throughout the board. The board also features an updated heat sink and heat pipe design to optimize passive heat transfer away from the digital power circuitry and the Z97 chipset.
Introduction: Budget Cooling Options and the Seidon 120V
The Seidon 120V is Cooler Master's newest 120mm all-in-one liquid CPU cooler, and its affordable price adds another option for anyone looking for an aftermarket cooler on a budget. But when we start comparing low-cost options, it's valid to wonder just how much better a liquid cooler in this price range might perform over air. To find out, we'll test the Seidon 120V against a popular budget air solution and see how these aftermarket coolers compare against the stock solutions from AMD and Intel.
Image courtesy of Cooler Master
Cooling on a Budget
When you’re pricing out a new computer build these days, it’s pretty easy to put together a solid group of components for $500 or so, and these will get you going on all the latest games at HD resolution. Sounds awesome! Of course, within that tight budget certain things are going to have to wait, and right up there on the list is probably better cooling. It’s easy enough to change out a CPU cooler later, but if the stock cooler is doing the job within the thermal specs of the processor, is it really needed? Clearly, AMD and Intel are not going to ship a cooler with their product that can’t keep it cool enough under stock workloads, but better cooling can allow for overclocking and extend the life of not only the CPU, but the components around it on your motherboard. Aftermarket coolers are often able to cool more efficiently as well, while producing less noise.
The selection of aftermarket coolers available is, well, ridiculous. As easy as it is to get lost looking at, say, every virtually identical stick of DDR3 memory, scrolling through product pages for CPU cooling is on another level entirely. Liquid cooling systems are much easier to navigate, as there are not only fewer of them, but their pricing segmentation allows for easier selection if you’re on a budget. For instance, the Seidon 120V at around $50 was the least expensive AIO option on Amazon when this review was started (actually coming in at $47.99 shipped, though this has been fluctuating quite a bit lately). Finding a suitable budget air cooler was not so easy: it needed to offer performance at least comparable to a liquid cooler, while coming in at or below the $50 mark of the 120V. (This might take a while…)
On the air-cooling side of things, narrowing the selection to $50 or less doesn’t help much, as there are still (roughly) 50 million to choose from in that price range. There are going to be many different preferences and opinions on these, so an easier alternative would be to simply follow the consensus pick, e-tail style. This intensive research project involved visiting Amazon and typing “cpu cooler” into the search box. (OK, that was pretty easy!) The plan was to put whatever came up first under $50 in the cart. Turns out the most popular air cooler is also under $50 (not surprising). This top result was also from Cooler Master: their Hyper 212 EVO, which was selling for under $34 shipped. Done.
Introduction and Features
Corsair announced their latest flagship power supply, the 1,500 watt AX1500i Digital ATX PSU earlier this year at CES. We recently received a retail unit to review and have spent the past week putting it through our suite of tests. The new AX1500i Digital proved to be a very interesting power supply and brings not one, but several outstanding features to the high-end enthusiast market. Over time, we all grow numb to marketing terms like “most technologically advanced”, “state-of-the-art”, “ultra-stable”, “super-high efficiency”, etc., but in the case of the AX1500i Digital PSU, we have seen these claims come to life before our eyes.
Right out of the box, the AX1500i Digital power supply is capable of delivering up to 1,500 watts of continuous DC power (125 Amps on the +12V rail). If that is not impressive enough, the PSU can do it while operating on 115 VAC mains and with an ambient temperature up to 50°C (internal case temperature). This beast was made for multiple power-hungry graphic adapters and overclocked CPUs.
The Corsair AX1500i is one of the very first power supplies to obtain 80 Plus Titanium certification, which requires a PSU to operate with at least 90% efficiency between 10% and 100% load, and with at least 94% efficiency at 50% load.
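To put those percentages in perspective, here is a quick worked example (our own arithmetic, not Corsair's) showing the maximum AC draw the Titanium floor allows for a 1,500 watt unit at each certification point:

```python
# Worked example: maximum AC input the 80 Plus Titanium floor allows
# for a 1,500 W PSU at the certification points quoted above.
# (The 125 A +12V rail alone accounts for 12 V * 125 A = 1,500 W.)
RATED_W = 1500

# (fraction of rated load, minimum required efficiency)
titanium_points = [(0.10, 0.90), (0.50, 0.94), (1.00, 0.90)]

for load_frac, min_eff in titanium_points:
    dc_out = RATED_W * load_frac
    max_ac_in = dc_out / min_eff  # input power = output / efficiency
    print(f"{load_frac:4.0%} load: {dc_out:6.0f} W out, "
          f"<= {max_ac_in:6.0f} W from the wall ({min_eff:.0%} min eff.)")
```

At full load, that means no more than roughly 1,667 watts from the wall, with at most about 167 watts lost as heat.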
The AX1500i is a digital power supply, which offers two distinct advantages. First, the AX1500i incorporates a Digital Signal Processor (DSP) to provide digitally-controlled power. This enables the PSU to deliver extremely tight voltage regulation over a wide range of loads. And second, the AX1500i features Corsair Link, which enables the PSU to be connected to the PC’s motherboard (via USB) for real-time monitoring (efficiency and power usage) and control (over-current protection and fan speed profiles).
Silent operation (zero-rpm fan mode up to ~30% load) might not be at the top of your feature list when shopping for a 1,500 watt PSU, but the AX1500i can do it thanks to high efficiency.
(Courtesy of Corsair)
Corsair AX1500i Digital ATX PSU Key Features:
• Digital Signal Processor (DSP) for extremely clean and efficient power
• Corsair Link Interface for monitoring and adjusting performance
• 1,500 watts continuous power output (50°C)
• Dedicated single +12V rail (125A) with user-configurable virtual rails
• 80 Plus Titanium certified, delivering up to 94% efficiency
• Ultra low noise 140mm double ball bearing fan
• Silent, Fanless mode up to ~30% load (~450W)
• Self-test switch to verify power supply functionality
• Premium quality components
• Fully modular cable system
• Conforms to ATX12V v2.4 and EPS 2.92 standards
• Universal AC input (100-240V) with Active PFC
• Over-current, over-voltage, over-temperature, and short circuit protection
• Dimensions: 150mm (W) x 86mm (H) x 225mm (L)
• 7-Year warranty and legendary Corsair customer service
• MSRP: $449.99 USD (available in Q2 2014)
You need a bit of power for this
PC gamers. We do some dumb shit sometimes. Those on the outside looking in, forced to play on static hardware with fixed image quality and low expandability, turn up their noses and question why we do the things we do. It’s not an unfair reaction, they just don’t know what they are missing out on.
For example, what if you decided to upgrade your graphics hardware to improve performance and allow you to push the image quality in your games to unheard-of levels? Rather than using a graphics configuration with performance found in a modern APU, you could decide to run not one but FOUR discrete GPUs in a single machine. You could water cool them for optimal temperature and sound levels. This allows you to power not 1920x1080 (or 900p), not 2560x1440, but 4K gaming – 3840x2160.
All for the low, low price of $3000. Well, crap, I guess those console gamers have a right to question the sanity of SOME enthusiasts.
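For a sense of the pixel workload behind that price tag, a quick back-of-the-envelope comparison (our own arithmetic, not from AMD) shows why 4K invites this kind of GPU excess:

```python
# Pixel counts for the resolutions mentioned above: 3840x2160 pushes
# exactly four times the pixels of 1080p, which is the intuition
# behind throwing four GPUs at it.
resolutions = {
    "1600x900":  (1600, 900),
    "1920x1080": (1920, 1080),
    "2560x1440": (2560, 1440),
    "3840x2160": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:5.2f} MPix ({pixels / base:.2f}x 1080p)")
```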
After the release of AMD’s latest flagship graphics card, the Radeon R9 295X2 8GB dual-GPU beast, our minds immediately started to wander to what magic could happen (and what might go wrong) if you combined a pair of them in a single system. Sure, two Hawaii GPUs running in tandem produced the “fastest gaming graphics card you can buy,” but surely four GPUs would be even better.
The truth is, though, that isn't always the case. Multi-GPU is hard; just ask AMD or NVIDIA. The software and hardware demands placed on the driver team to coordinate data sharing, timing control, etc. are extremely high even when you are working with just two GPUs in tandem. Moving to three or four GPUs complicates the story even further, and as a result it has been typical for us to note low performance scaling, increased frame time jitter and stutter, and sometimes even complete incompatibility.
During our initial briefing covering the Radeon R9 295X2 with AMD, there was a system photo that showed a pair of the cards inside a MAINGEAR box. As one of AMD’s biggest system builder partners, MAINGEAR, along with AMD, was clearly hinting that these configurations would be made available for those with the financial resources to pay for them. Even though we are talking about a very small subset of the PC gaming enthusiast base, these kinds of halo products are what bring PC gamers together to look and drool.
As it happens I was able to get a second R9 295X2 sample in our offices for a couple of quick days of testing.
Working with Kyle and Brent over at HardOCP, we decided to do some hardware sharing in order to give both outlets the ability to judge and measure Quad CrossFire independently. The results are impressive and awe-inspiring.
With the release of ASUS' next generation desktop board line imminent, we wanted to give you a taste of the new offerings. The new boards support current and next generation Intel Haswell processors, as well as SATA 3 and next generation SATA storage solutions.
ASUS sought to perfect their designs with the new board lines, building off the successes of the Intel Z87-based boards and improving them for better performance and usability. All board lines feature an updated UEFI BIOS with more GUI-driven interfaces and wizards, particularly aimed at those users who don't want to muck around in the Advanced mode settings. Additionally, all board lines feature updated heat sink designs and aesthetics to further differentiate them from their Z87 counterparts. ASUS also enhanced their AI Suite software with 5-Way Optimization functionality, replacing the 4-Way Optimization functionality integrated into the last generation of AI Suite. 5-Way Optimization integrates TPU (Turbo Processing Unit), EPU (Energy Processing Unit), DIGI+ Power Control, Fan Xpert 3, and Turbo App into a single cohesive entity.
ASUS will continue to market the next generation boards across their standard four board lines:
- Channel series motherboards
- Republic of Gamers series motherboards
- TUF series motherboards
- Workstation series motherboards
Channel series banner
The Channel series boards include all board designs aimed at mainstream consumers and system builders.
ROG series banner
The ROG (Republic of Gamers) series boards are geared more towards enthusiast and gamer users, featuring a black and red color scheme for eye-catching appeal. The boards also feature innovative designs and integrated components that differentiate them from the other lines.
AMD Makes some Lemonade...
I guess we could say that AMD has been rather busy lately. It seems that a significant amount of the content on PC Perspective this month revolved around the AMD AM1 platform. Before that we had the Kaveri products and the R7 265. AMD also reported some fairly solid growth over the past year with their graphics and APU lines. Things are not as grim and dire as they once were for the company. This is good news for consumers, as they will continue to be offered competing solutions that vie for that hard-earned dollar.
AMD is continuing their releases for 2014 with the announcement of their latest low-power and mainstream mobile APUs. These are codenamed “Beema” and “Mullins”, and they build on the year-old Kabini chip. This may cause a few people to roll their eyes, as AMD has had some fairly unimpressive refreshes in the past. We saw the rather meager improvements in clockspeed and power consumption with Brazos 2.0 a couple of years back, and it looked like this would be the case again for Beema and Mullins.
I was again expecting said meager improvements in power consumption and clockspeeds that we had received all those years ago with Brazos 2.0. Turns out I was wrong. This is a fairly major refresh which does a few things that I did not think were entirely possible, and I’m a rather optimistic person. So why is this release surprising? Let us take a good look under the hood.
Introduction and Technical Specifications
Courtesy of Noctua
Noctua is an established player in the CPU cooler space, offering high quality solutions that cool your processors without killing your eardrums. Their U-series coolers combine unrivaled cooling performance with an innovative design to ensure quiet operation and motherboard compatibility. The flagship product of this line, the NH-U14S, is composed of a single aluminum-finned radiator with six embedded, nickel-plated copper heat pipes running through a copper base plate. For performance testing of the NH-U14S cooler, we put it up against other high-performance liquid and air-based coolers. With a retail MSRP of $75.99, the NH-U14S CPU cooler has a premium price to match its premium size and cooling potential.
GIGABYTE is doing some interesting things with their next generation line of boards, offering support for current and next generation Intel Haswell processors. Additionally, these next generation boards will offer support for SATA 3 and next generation SATA storage solutions.
The biggest difference between the new lineup and the existing Intel Z87 motherboard lines is how the board series are differentiated. GIGABYTE will have a total of four board series in their new product line:
- Normal series motherboards
- Gaming series motherboards
- Overclocking series motherboards
- Black Edition series motherboards
Normal Series motherboard
Courtesy of GIGABYTE
The Normal Series boards encompass their channel boards for mainstream consumers and system builders.
Gaming Series motherboard
The Gaming Series boards are geared more towards enthusiasts and gamers, featuring a black and red color scheme for eye-catching appeal.
Overclocking Series motherboard
Courtesy of GIGABYTE
The Overclocking Series boards are geared towards hard-core enthusiasts and the LN2 record-setting crowd, with features and designs specifically oriented to eking out that last bit of speed and performance from the board.
Black Edition Series motherboard
Courtesy of GIGABYTE
The Black Edition Series boards build on several products from the other board lines, ensuring the utmost stability and performance through a specialized factory-performed testing regimen.
Introduction and Features
Back in March, EVGA announced the pending arrival of two new power supplies in their popular SuperNOVA line, the 750G2 and 850G2. Both new power supplies are 80Plus Gold certified and feature fully modular cables, high-quality Japanese brand capacitors, a single high-power +12V rail, and a 140mm dual ball bearing cooling fan (with the ability to operate in silent, fan-less mode at low power levels). The 750G2 and 850G2 are also backed by a 10-year warranty (with registration). And last but not least, many PC PSU enthusiasts will be happy to know the new 750G2 and 850G2 are built on the successful Superflower Leadex Gold platform. Superflower is the same OEM that EVGA uses for several of their popular higher output, premium power supplies!
EVGA was founded in 1999 with headquarters in Brea, California. They continue to specialize in producing NVIDIA-based graphics adapters and Intel-based motherboards while expanding their PC power supply product line, which now includes fourteen models ranging from the high-end 1,500W NEX1500 Classified to the budget-minded EVGA 430W power supply.
(Courtesy of EVGA)
In this review we will be taking a detailed look at both the EVGA SuperNOVA 750G2 and 850G2 power supplies.
Here is what EVGA has to say about the new SuperNOVA G2 Gold PSUs: “Unleash the next generation in power with the EVGA SuperNOVA 850G2 and 750G2 Power Supplies. Based on the award winning G2 series Power Supplies from EVGA, these PSUs features 80 PLUS Gold rated efficiency, and clean, continuous power to every component. This provides improved efficiency for longer operation, less power consumption, reduced energy costs and minimal heat dissipation. The new ECO Thermal Control Fan System offers fan modes to provide zero fan noise during low load operations. Backed by a 10 year warranty and Japanese capacitor design, the EVGA SuperNOVA G2 is not only the right choice for your system today, it is also the best choice for your system tomorrow."
EVGA SuperNOVA 850 G2 and 750 G2 Gold PSU Key Features:
• 10-Year Warranty and unparalleled EVGA Customer Support
• 80PLUS Gold certified, with up to 90% efficiency under typical loads
• Tight voltage regulation (2%), stable power with low AC ripple and noise
• Highest quality Japanese brand capacitors ensure long-term reliability
• Fully modular cables to reduce clutter and improve airflow
• Quiet dual-ball bearing fan for exceptional reliability and quiet operation
• ECO Intelligent Thermal Control allows silent, fan-less operation at low power
• NVIDIA SLI and AMD Crossfire Ready
• Intel 4th Generation CPU Ready (Haswell, C6/C7 sleep modes)
• Compliance with ErP Lot 6 2013 Requirement
• Active Power Factor correction (0.99) with Universal AC input
• Heavy-duty Protections: OVP, UVP, OCP, OPP, and SCP
• MSRP for the 750 G2 PSU : $129.99 USD ($119.99 after mail-in rebate)
• MSRP for the 850 G2 PSU : $159.99 USD ($149.99 after mail-in rebate)
AM1 Walks New Ground
After Josh's initial review of the AMD AM1 Platform and the Athlon 5350, we received a few requests to look at gaming performance with a discrete GPU installed. Even though this platform isn't being aimed at gamers looking to play demanding titles, we started to investigate this setup anyway.
While Josh liked the ASUS AM1I-A Mini ITX motherboard he used in his review, its single x1 PCI-E slot would be less than ideal for this situation.
Luckily we had the Gigabyte AM1M-S2H Micro ATX motherboard, which features a full length PCI-E x16 slot as well as two x1 slots.
Don't be misled by the shape of the slot though; the AM1 platform still only offers 4 lanes of PCI-Express 2.0. This, of course, means that the graphics card will not be running at full bandwidth. However, having the physical x16 slot makes it a lot easier to physically connect a discrete GPU, without having to worry about those ribbon cables that miners use.
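To quantify what running at x4 actually costs in theory, here's a rough bandwidth comparison using the standard PCI-Express 2.0 figures (a sketch; real-world throughput runs a bit lower):

```python
# PCI-Express 2.0 signals at 5 GT/s per lane with 8b/10b encoding,
# leaving roughly 500 MB/s of usable bandwidth per lane per direction.
GT_PER_S = 5.0
ENCODING_EFF = 8 / 10  # 8b/10b line code overhead

def pcie2_bandwidth_gbs(lanes):
    # GT/s * encoding efficiency = usable Gb/s; divide by 8 for GB/s
    return GT_PER_S * ENCODING_EFF * lanes / 8

for lanes in (1, 4, 16):
    print(f"x{lanes:<2}: ~{pcie2_bandwidth_gbs(lanes):.1f} GB/s per direction")
# x4 on AM1 -> ~2 GB/s, versus ~8 GB/s for a full x16 link
```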
Competition is a Great Thing
While doing some testing with the AMD Athlon 5350 Kabini APU to determine its flexibility as a low cost gaming platform, we decided to run a handful of tests to measure something else that is getting a lot of attention right now: AMD Mantle and NVIDIA's 337.50 driver.
Earlier this week I posted a story that looked at performance scaling of NVIDIA's new 337.50 beta driver compared to the previous 335.23 WHQL release. The goal was to assess the DX11 efficiency improvements that the company stated it had been working on and implemented into this latest beta driver offering. In the end, we found some instances where games scaled by as much as 35% and 26%, but other cases where there was little to no gain with the new driver. We looked at both single GPU and multi-GPU scenarios, though mostly on high end CPU hardware.
Earlier in April I posted an article looking at Mantle, AMD's lower-level API unique to its own ecosystem, and how it scaled on various pieces of hardware in Battlefield 4. This was the first major game to implement Mantle and it remains the biggest name in the field. While we definitely saw some improvements in gaming experiences with Mantle, there was work to be done when it comes to multi-GPU scaling and frame pacing.
Both parties in this debate were showing promise but obviously both were far from perfect.
While we were benchmarking the new AMD Athlon 5350 Kabini-based APU, an incredibly low cost processor that Josh reviewed in April, it made sense to test out both Mantle and NVIDIA's 337.50 driver in an interesting side-by-side.
Let's see if I can start this story without sounding too much like a broken record when compared to the news post I wrote late last week on the subject of NVIDIA's new 337.50 driver. In March, while attending the Game Developers Conference to learn about the upcoming DirectX 12 API, I sat down with NVIDIA to talk about changes coming to its graphics driver that would affect current users with shipping DX9, DX10, and DX11 games.
As I wrote then:
What NVIDIA did want to focus on with us was the significant improvements that have been made on the efficiency and performance of DirectX 11. When NVIDIA is questioned as to why they didn’t create their Mantle-like API if Microsoft was dragging its feet, they point to the vast improvements possible and made with existing APIs like DX11 and OpenGL. The idea is that rather than spend resources on creating a completely new API that needs to be integrated in a totally unique engine port (see Frostbite, CryEngine, etc.) NVIDIA has instead improved the performance, scaling, and predictability of DirectX 11.
NVIDIA claims that these fixes are not game specific and will improve performance and efficiency for a lot of GeForce users. Even if that is the case, we will only really see these improvements surface in titles that have addressable CPU limits or very low end hardware, similar to how Mantle works today.
In truth, this is something that both NVIDIA and AMD have likely been doing all along, but NVIDIA has renewed purpose with the pressure that AMD's Mantle has placed on them, at least from a marketing and PR point of view. It turns out that the driver that starts to implement all of these efficiency changes is the recent 337.50 release, and on Friday I wrote up a short story that tested a particularly good example of the performance changes, Total War: Rome II, with a promise to follow up this week with additional hardware and games. (As it turns out, results from Rome II are...an interesting story. More on that on the next page.)
Today I will be looking at a seemingly random collection of gaming titles, running on a reconfigured test bed we had in the office, in an attempt to get some idea of the overall robustness of the 337.50 driver and its advantages over the 335.23 release that came before it. Does NVIDIA have solid ground to stand on when it comes to the capabilities of current APIs over what AMD is offering today?
Introduction, Specifications, and Contents
Corsair has added another double-width liquid cooler to their growing lineup of all-in-one solutions with the H105, joining the existing H100i and larger H110 in this category.
Image courtesy of Corsair
Initially, the H105 might leave you scratching your head. It's listed on Corsair’s site with the same $119.99 MSRP as the H100i, and both are 240mm designs featuring the same high performance fans. The similarities end there, however, as the design of the H105 is more akin to Corsair's new 120mm H75 (which we recently reviewed) than to the existing 240mm H100/H100i. With the H75 already a solid price/performance pick in Corsair’s lineup - and the various other options still available - it's reasonable to wonder exactly where the H105 fits in.
While this new cooler uses the same pair of excellent SP120L PWM fans as the earlier H100i (and H80i), it's the radiator they are connected to that should separate the H105 from prior efforts. Corsair has implemented a much thicker 240mm rad with the H105 at 35mm (vs. only 27mm on earlier products), and this added thickness should have a noticeable impact on cooling performance, and possibly fan noise as well.
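As a rough way to see what that extra 8mm buys, here's a simple comparison of the two radiator core volumes (our own arithmetic; fin density, pump performance, and airflow matter just as much):

```python
# Radiator core volume at the two thicknesses mentioned above. More
# core volume generally means more fin surface area for the same fan
# speed -- i.e., more cooling headroom or lower noise.
WIDTH_MM, LENGTH_MM = 120, 240  # nominal 240 mm radiator footprint

for name, thickness_mm in (("earlier 240mm rads", 27), ("H105", 35)):
    volume_cm3 = WIDTH_MM * LENGTH_MM * thickness_mm / 1000
    print(f"{name}: {thickness_mm} mm thick -> ~{volume_cm3:.0f} cm^3 core")

print(f"increase: ~{(35 - 27) / 27:.0%}")  # ~30% more core volume
```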
Ultra-Speed RAM, APU-Style
In our review of the Kingston HyperX Predator 2666MHz kit, we discovered what those knowledgeable about Intel memory scaling already knew: for most applications, and specifically games, there is no significant advantage to increases in memory speed past the current 1600MHz DDR3 standard. But this was only half of the story. What about memory scaling with an AMD processor, and specifically an APU? To find out, we put AMD’s top APU, the A10-7850K, to the test!
Ready for some APU memory testing!
AMD has created a compelling option with their APU lineup, and the inclusion of powerful integrated graphics allows for interesting build options with lower power and space requirements, even making tiny mini-ITX gaming systems realistic. It’s this graphical prowess compared to any other onboard solution that creates an interesting value proposition for any gamer looking at a new low-cost build. The newest Kaveri APUs are getting a lot of attention, and they raise the question: is a discrete graphics card really needed for gaming at reasonable settings?
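Part of why memory speed matters more here than on a typical Intel build: the APU's integrated GPU has no dedicated VRAM and shares system memory bandwidth with the CPU cores. A quick calculation with standard DDR3 figures (our own sketch, not from the review data) shows what's on the table:

```python
# Peak theoretical bandwidth for dual-channel DDR3 at various speeds.
# The Kaveri IGP shares this pool with the CPU cores, while a discrete
# card would have its own (much faster) GDDR5.
CHANNELS = 2
BUS_BYTES = 8  # each DDR3 channel is 64 bits wide

for mts in (1600, 1866, 2133, 2400):
    gb_per_s = mts * 1e6 * BUS_BYTES * CHANNELS / 1e9
    print(f"DDR3-{mts}: {gb_per_s:.1f} GB/s peak")
# DDR3-1600 -> 25.6 GB/s ... DDR3-2400 -> 38.4 GB/s, a 50% jump
```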
AMD Brings Kabini to the Desktop
Perhaps we are performing a study of opposites? Yesterday Ryan posted his R9 295X2 review, which covers the 500 watt, dual GPU monster that will be retailing for $1499. That card is meant only for the extreme enthusiast who has plenty of room in their case, plenty of knowledge about their power supply, and plenty of electricity and air conditioning to keep the monster at bay. The product that I am reviewing could not be any more different: inexpensive, cool running, power efficient, and able to fit pretty much anywhere. These products can almost be viewed as polar opposites.
The interesting thing, of course, is that it shows how flexible AMD’s GCN architecture is. GCN can efficiently and effectively power the highest performing product in AMD’s graphics portfolio as well as their lowest power offerings in the APU market, and performance scales very nearly linearly as more GCN compute cores are added.
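To make that scaling point concrete, peak single-precision throughput for a GCN part is simply shader count × 2 operations per clock (fused multiply-add) × clock speed. A quick sketch using approximate public specifications for parts at both ends of the stack (treat the exact clocks as illustrative):

```python
# Peak GFLOPS = stream processors * 2 FLOPs/clock (FMA) * clock (GHz).
# The same GCN math applies at both ends of AMD's product stack; only
# the unit count and clock change.
def gcn_peak_gflops(stream_processors, clock_ghz):
    return stream_processors * 2 * clock_ghz

parts = [
    ("Athlon 5350 (Kabini IGP)", 128, 0.60),
    ("A10-7850K (Kaveri IGP)",   512, 0.72),
    ("R9 290X (Hawaii)",        2816, 1.00),
]
for name, sps, ghz in parts:
    print(f"{name}: ~{gcn_peak_gflops(sps, ghz):,.0f} GFLOPS")
```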
The products that I am of course referring to are the latest Athlon and Sempron APUs, based on the Kabini architecture, which fuses Jaguar x86 cores with GCN compute cores. These APUs were announced last month, but we did not have the chance at the time to test them. Since then these products have popped up in a couple of places around the world, but this is the first time that reviewers have officially received product from AMD and their partners.
A Powerful Architecture
AMD likes to toot its own horn. Just take a look at the not-so-subtle marketing buildup to the Radeon R9 295X2 dual-Hawaii graphics card, released today. I had photos of me shipped to…me…overnight. My hotel room at GDC was also given a package which included a pair of small Pringles cans (chips) and a bottle of volcanic water. You may have also seen some photos posted of a mysterious briefcase with its side stickered with the silhouette of a Radeon add-in board.
This tooting is not without some validity though. The Radeon R9 295X2 is easily the fastest graphics card we have ever tested and that says a lot based on the last 24 months of hardware releases. It’s big, it comes with an integrated water cooler, and it requires some pretty damn specific power supply specifications. But AMD did not compromise on the R9 295X2 and, for that, I am sure that many enthusiasts will be elated. Get your wallets ready, though, this puppy will run you $1499.
Both AMD and NVIDIA have a history of producing high quality dual-GPU graphics cards late in the product life cycle. The most recent entry from AMD was the Radeon HD 7990, a pair of Tahiti GPUs on a single PCB with a triple fan cooler. While a solid performing card, the product was released at a time when AMD CrossFire technology was well behind the curve and, as a result, real-world performance suffered considerably. By the time the drivers and ecosystem were fixed, the HD 7990 was more or less on the way out. It was also notorious for some intermittent, but severe, overheating issues, documented by Tom’s Hardware in one of the most harshly titled articles I’ve ever read. (Hey, Game of Thrones started again this week!)
The Hawaii GPU, first revealed back in September and selling today under the guise of the R9 290X and R9 290 products, is even more power hungry than Tahiti. Many in the industry doubted that AMD would ever release a dual-GPU product based on Hawaii as the power and thermal requirements would be just too high. AMD has worked around many of these issues with a custom water cooler and placing specific power supply requirements on buyers. Still, all without compromising on performance. This is the real McCoy.
Introduction and Technical Specifications
Courtesy of SilverStone
SilverStone Technology is a well known company in the enthusiast space, offering high quality solutions from cases to case-mounted fan controllers and displays. SilverStone has also gone through several generations of all-in-one CPU liquid cooling solutions, with the Tundra series being the culmination of these past designs. The Tundra Series TD03 liquid cooler is designed to cool CPUs of any make, including the latest processor offerings from both Intel and AMD. The cooler consists of a thick 120mm radiator attached to a uni-body aluminum CPU block with integrated copper base plate and pump. To best measure the TD03's performance, we set it against several other high-performance liquid and air-based coolers. The TD03 cooler comes with a retail MSRP of $99.99, putting it at the higher end of the all-in-one cooler price range.
Athlon and Pentium Live On
Over the past year or so, we have taken a look at a few budget gaming builds here at PC Perspective. One of our objectives with these build guides was to show people that PC gaming can be cost competitive with console gaming, and at a much higher quality.
However, we haven't stopped pursuing our goal of the perfect inexpensive gaming PC: one that is still capable of maxing out image quality settings on today's top games at 1080p.
Today we take a look at two new systems, featuring some parts that were suggested to us in response to our previous articles.
| Component | AMD System | Intel System |
| --- | --- | --- |
| Processor | AMD Athlon X4 760K - $85 | Intel Pentium G3220 - $65 |
| Cores / Threads | 4 / 4 | 2 / 2 |
| Motherboard | Gigabyte F2A55M-HD2 - $60 | ASUS H81M-E - $60 |
| Graphics | MSI R9 270 Gaming - $180 | MSI R9 270 Gaming - $180 |
| System Memory | Corsair 8GB DDR3-1600 (1x8GB) - $73 | Corsair 8GB DDR3-1600 (1x8GB) - $73 |
| Hard Drive | Western Digital 1TB Caviar Green - $60 | Western Digital 1TB Caviar Green - $60 |
| Power Supply | Cooler Master GX 450W - $50 | Cooler Master GX 450W - $50 |
| Case | Cooler Master N200 MicroATX - $50 | Cooler Master N200 MicroATX - $50 |
(Editor's note: If you don't already have a copy of Windows, and don't plan on using Linux or SteamOS, you'll need an OEM copy of Windows 8.1 - currently selling for $98.)
These are low prices for a gaming computer, and the lists feature some parts which many of you might not know a lot about. Let's take a deeper look at the two different platforms which we built upon.
First up is the AMD Athlon X4 760K. While you may not have known the Athlon brand was still being used on current parts, these chips represent an interesting corner of the market. On the FM2 socket, the 760K is essentially a high end Richland APU with the graphics portion of the chip disabled.
What this means is that if you are going to pair your processor with a discrete GPU anyway, you can skip paying extra for the integrated GPU.
As for the motherboard, we went for an ultra-inexpensive A55 option from Gigabyte, the GA-F2A55M-HD2. This board features the A55 chipset which launched with the Llano APUs in 2011. Because of this older chipset, the board does not feature USB 3.0 or SATA 6G capability, but since we are only concerned with gaming performance here, it makes a great bare-bones option.
Taking it all the way to 12!
Microsoft has been developing DirectX for around 20 years now. Back in the 90s, the hardware and software scene for gaming was chaotic, at best. We had wonderful things like “SoundBlaster compatibility” and third-party graphics APIs such as Glide, S3G, PowerSGL, RRedline, and ATICIF. OpenGL was aimed more towards professional applications, and it took John Carmack and id, through GLQuake in 1996, to get the ball rolling in that particular direction. There was a distinct need to provide standards across audio and 3D graphics that would de-fragment the industry for developers. DirectX was introduced with Windows 95, but the popularity of Direct3D did not really take off until DirectX 3.0, released in late 1996.
DirectX has had some notable successes, and some notable letdowns, over the years. DX6 provided a much needed boost in 3D graphics, while DX8 introduced the world to programmable shading. DX9 was the most long-lived version, thanks to it being the basis for the Xbox 360 console with its extended lifespan. DX11 added in a bunch of features and made programming much simpler, all the while improving performance over DX10. The low points? DX10 was pretty dismal due to the performance penalty on hardware that supported some of the advanced rendering techniques. DirectX 7 was around for little more than a year before giving way to DX8. DX1 and DX2? Yeah, those were very unpopular and problematic, due to the myriad changes in a modern operating system (Win95) as compared to the DOS-based world that game devs were used to.
Some four years ago, going by what NVIDIA has said, initial talks began about pursuing the development of DirectX 12. DX11 was released in 2009 and has been an excellent foundation for PC games. It is not perfect, though. There is still a significant impact on potential performance due to a variety of factors, including a fairly inefficient hardware abstraction layer that relies more upon fast single-threaded performance from a CPU than on the power of a modern multi-core/multi-thread unit. This limits how many objects can be represented on screen, as well as other operations that would bottleneck even the fastest CPU threads.
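As a toy model of that bottleneck (our own illustration, with hypothetical per-call costs, not NVIDIA's data): if every draw call costs the driver a fixed slice of single-threaded CPU time, the object count you can sustain at 60 FPS is capped by that one thread, no matter how many cores sit idle:

```python
# Toy model of a single-threaded API bottleneck: with a fixed CPU cost
# per draw call paid on one thread, draw calls per frame are capped at
# frame budget / per-call cost -- extra cores don't help unless the
# API allows command lists to be built in parallel.
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 FPS

def max_draw_calls(cost_per_call_us):
    return int(FRAME_BUDGET_MS * 1000 / cost_per_call_us)

for cost_us in (40, 20, 10):  # hypothetical per-call driver overheads
    print(f"{cost_us} us/call -> ~{max_draw_calls(cost_us):,} draw calls"
          f" per frame at 60 FPS")
```

Halving the per-call overhead, which is essentially what driver-level DX11 optimizations and lower-level APIs aim to do, doubles the object budget without touching the GPU.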
BF4 Integrates FCAT Overlay Support
Back in September AMD publicly announced Mantle, a new lower level API meant to offer more performance for gamers and more control for developers fed up with the restrictions of DirectX. Without diving too much into the politics of the release, the fact that Battlefield 4 developer DICE was integrating Mantle into the Frostbite engine for Battlefield was a huge proof point for the technology. Even though the release was a bit later than AMD had promised us, coming at the end of January 2014, one of the biggest PC games on the market today had integrated a proprietary AMD API.
When I did my first performance preview of BF4 with Mantle on February 1st, the results were mixed, but we had other issues to deal with. First and foremost, our primary graphics testing methodology, called Frame Rating, couldn't be used due to the change of API. Instead we were forced to use an in-game frame rate counter built by DICE, which worked fine but didn't give us the fine-grained data we really wanted to put the platform to the test. It worked, but we wanted more. Today we are happy to announce we have full support for our Frame Rating and FCAT testing with BF4 running under Mantle.
A History of Frame Rating
In late 2012 and throughout 2013, testing graphics cards became a much more complicated beast. Terms like frame pacing, stutter, jitter and runts were not in the vocabulary of most enthusiasts but became an important part of the story just about one year ago. Though complicated to fully explain, the basics are pretty simple.
Rather than using software on the machine being tested to measure performance, our Frame Rating system uses a combination of local software and external capture hardware. On the local system with the hardware being evaluated, we run a small piece of software called an overlay that draws small colored bars on the left hand side of the game screen, changing successively with each frame rendered by the game. Using a secondary system, we capture the output from the graphics card directly, intercepting it from the display output in real-time in an uncompressed form. With that video file captured, we then analyze it frame by frame, measuring the length of each of those colored bars, how long they are on the screen, and how consistently they are displayed. This allows us to find the average frame rate, but also to see how smoothly the frames are presented, whether there are dropped frames, and whether there are jitter or stutter issues.
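For the curious, the analysis step can be approximated in a few lines. This is a minimal sketch of the idea, not our actual tooling: it assumes an uncompressed capture readable by OpenCV, a known overlay column position, and one solid, unique color per rendered frame:

```python
# Minimal sketch of Frame Rating-style analysis (illustrative only).
# Each rendered frame draws a unique solid color in a thin overlay
# column, so the number of captured scanlines a color occupies is
# proportional to how long that frame was physically on screen.
import cv2  # assumed dependency: opencv-python

CAPTURE_FPS = 60.0  # capture card frame rate (assumption)

def frame_on_screen_times(video_path, bar_x=8):
    cap = cv2.VideoCapture(video_path)
    scanlines = {}  # overlay color -> total captured scanlines
    order = []      # colors in first-seen (i.e., render) order
    height = 0
    while True:
        ok, img = cap.read()
        if not ok:
            break
        height = img.shape[0]
        for y in range(height):
            color = tuple(img[y, bar_x])  # sample the overlay column
            if color not in scanlines:
                scanlines[color] = 0
                order.append(color)
            scanlines[color] += 1
    cap.release()
    # scanlines / (scanlines captured per second) = seconds on screen;
    # tiny slivers here are the "runt" frames that inflate FPS counters.
    per_second = height * CAPTURE_FPS
    return [(color, scanlines[color] / per_second) for color in order]
```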