3840x2160 for Cheap!!
It has been just over a year since we first got our hands on a 4K display. At the time, we were using a 50-in Seiki 3840x2160 HDTV that ran at a 30 Hz refresh rate; it was disappointing in terms of its gaming experience, but impressive in image quality and price ($1500 at the time). Of course, we had to benchmark graphics cards at 4K resolutions, and the results proved what we expected - you are going to need some impressive hardware to run at 4K with acceptable frame rates.
Since that story was published, we saw progress in the world of 4K displays with the ASUS PQ321Q, a 4K monitor (not a TV) built to handle 60 Hz refresh rates. The problem, of course, was the requirement for a multi-stream connection that essentially pushes two distinct streams, each at 1920x2160, over a single DisplayPort cable to the monitor. While in theory that wasn't a problem, we saw a lot of configuration and installation headaches as we worked through the growing pains of drivers and firmware. It was also priced at $3200 when we first reviewed it, though that number has fallen to $2400 recently.
Today we are looking at the Samsung U28D590D, the first 4K panel we have seen that supports a 60 Hz refresh rate with a single-stream (single tile) implementation. That means that not only do you get the better experience associated with a 60 Hz refresh rate over 30 Hz, you also gain a much simpler and more compatible installation and setup. No tricky driver issues to be found here! If you have a DisplayPort 1.2-capable graphics card, it's just plug and play.
The Samsung U28D590D uses a 28-in TN panel, which is obviously of a lower quality in terms of colors and viewing angles than the IGZO screen used on the ASUS PQ321Q, but it's not as bad as you might expect based on previous TN panel implementations. We'll talk a bit more about that below. The best part of course is the price - you can find the Samsung 4K panel for as low as $690!
Introduction and Technical Specifications
Courtesy of GIGABYTE
Like many other manufacturers, GIGABYTE is using the launch of the Intel Z97 Express chipset to update its product line's aesthetics and makeup. The Z97X-Gaming G1-WIFI-BK motherboard is part of GIGABYTE's Black Edition product line, differentiated from other GIGABYTE offerings by the enhanced validation criteria each individual Black Edition product is put through before leaving the assembly plant. Each board endures a 168-hour stress test, consisting of both CPU and GPU-based Litecoin mining, to prove its worth as part of the elite Black Edition product line. The Z97X-Gaming G1-WIFI-BK motherboard itself shares components and aesthetics with the Gaming line, featuring that line's black and red coloration and the GIGABYTE Gaming logo (the silver eye) on the Z97 chipset heat sink. With an MSRP of $349, the board is priced at a premium, more than justified by its enhanced feature set and the Black Edition's enhanced validation criteria.
Courtesy of GIGABYTE
GIGABYTE paired the Z97X-Gaming G1-WIFI-BK motherboard with an 8-phase digital power delivery system to ensure clean power to the CPU under all operating conditions. Additionally, the board's VRM heat sinks can be integrated into an existing water loop using the provided G 1/4" threaded ports. Note that the PLX and chipset heat pipe is physically separate from the VRM heat pipe and does not benefit from water cooling the board's VRM cooler. The Z97X-Gaming G1-WIFI-BK motherboard boasts the following integrated feature list: eight SATA 3 ports; one SATA Express 10 Gb/s port; dual Gigabit NICs - an Intel i217V NIC and a Qualcomm® Atheros Killer E2201 NIC; an Intel 802.11ac Wi-Fi and Bluetooth PCIe adapter; four PCI-Express x16 slots; three PCI-Express x1 slots; a 2-digit diagnostic LED display; on-board power, reset, and CMOS clear buttons; Dual-BIOS and active BIOS switches; left and right channel audio gain control switches; a Sound Blaster Core 3D audio solution; a removable audio OP-AMP port; integrated voltage measurement points; a hybrid VRM cooling solution; and USB 2.0 and 3.0 port support.
Courtesy of GIGABYTE
Upgrades from Anker
Last year we started to accumulate a large number of mobile devices around the office, including smartphones, tablets, and even convertibles like the ASUS T100, all of which charge over USB connections. While not a hassle when you are charging one or two units at a time, having 6+ on our desks on any given day started to become a problem for our far less numerous wall outlets. Our solution last year was Anker's E150 25 watt wall charger, which we covered in a short video overview.
It was great but had limitations including different charging rates depending on the port you connected it to, limited output of 5 Amps total for all five ports and fixed outputs per port. Today we are taking a look at a pair of new Anker devices that implement smart ports called PowerIQ that enable the battery and wall charger to send as much power to the charging device as it requests, regardless of what physical port it is attached to.
We'll start with the updated Anker 40 watt 5-port wall charger and then move on to discuss the 3-port mobile battery charger, both of which share the PowerIQ feature.
Anker 40 watt 5-Port Wall Charger
The new Anker 5-port wall charger is actually smaller than the previous generation but offers superior specifications across the board. The unit can push out 40 watts combined through its five USB ports: 5 volts at as much as 8 amps. We are told all 8 amps could in fact flow through a single port if a device requested that much, though nothing in our offices appears to draw more than 2.3 A.
Any USB port can be used for any device on this new model; it doesn't matter where a device plugs in. This greatly simplifies things from a user experience point of view, as you no longer have to hold the unit up to your face to read the tiny text on the E150. With 8 amps spread across the five ports, you should have more than enough power to charge all your devices at full speed. If you did happen to have five iPads charging at the same time, the combined request would exceed 8 A and each device's charge rate would be a bit lower.
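As a rough sketch of how a shared-budget charger like this behaves (our own simplified model, not Anker's actual PowerIQ logic), you can think of the proportional slowdown described above like this:

```python
# Hypothetical model of a shared charging budget: 5 V at up to 8 A total
# across five ports. The per-device current draws below are illustrative
# assumptions, not measured values.

BUS_VOLTAGE = 5.0       # volts, USB charging voltage
TOTAL_AMP_BUDGET = 8.0  # amps shared across all five ports

def charge_rates(device_draws_amps):
    """Return actual per-device current after scaling to the shared budget.

    If the combined request exceeds the budget, every device's rate is
    reduced proportionally (a simplification of real charger behavior).
    """
    requested = sum(device_draws_amps)
    scale = min(1.0, TOTAL_AMP_BUDGET / requested)
    return [round(d * scale, 2) for d in device_draws_amps]

# Five tablets each requesting 2.1 A: 10.5 A requested vs. the 8 A budget,
# so each ends up charging at 1.6 A instead of its full rate.
rates = charge_rates([2.1] * 5)
print(rates)
print(sum(rates) * BUS_VOLTAGE)  # total delivered power stays near 40 W
```

Two light loads, on the other hand, stay under the budget and charge at full speed.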
Introduction, Specifications, and Packaging
Image credit: NCASE
The NCASE M1 Mini-ITX case has been lusted after for about a year now by those of us interested in small form-factor (SFF) computing, ever since it made the news last spring by hitting its initial goal on the crowd-funding site Indiegogo. The final campaign to raise funds ended in August of last year, and, not leaving anything up to chance, the creators of the M1 contracted none other than Lian Li to make their dream a reality. Today, we have the privilege of seeing the finished product!
Making things happen
We’ve all talked about changing some existing product to fix problems or just add features that we’d like to have. But most of us probably wouldn’t take our idea to a public funding site to actually make it happen, and that’s exactly why the story of NCASE and the M1 is unique. The creators were members on hardforums, and the original thread for the M1 is now well over 500 pages long.
The story began with conversation about improving an existing mini-ITX design, with the SilverStone SG05 the original topic. (It's fascinating to watch the design evolve on the thread!) Two forum members joined forces and started creating designs, and ended up with the blueprint for an incredibly small case that still supported large GPUs and 240mm radiators. Then it was on to Indiegogo to see if there was enough interest to get this case built.
Judging by the results, starting with that initial round of prototype funding, there has definitely been interest in this design! Lian Li's prototype case was a success, and the initial production run funding campaign quickly raised more than double its goal… Fast forward to spring 2014: a black M1 case was delivered safely, and I for one can’t wait to get started building up a system with it!
The M1 next to a BitFenix Prodigy: It's tiny!! (Image credit NCASE)
Introduction and Technical Specifications
Courtesy of ASUS
The ASUS Z97-Deluxe motherboard is the flagship board in ASUS' channel line of Intel Z97 chipset boards. ASUS revised its entire motherboard line for the Z97 chipset launch with the Z97-Deluxe featuring the updated channel board aesthetics as well as many design innovations built upon the previous board generation. With an MSRP of $399.99, ASUS priced the Z97-Deluxe at the level of other flagship motherboards with a plethora of integrated features and performance that more than justifies the price.
Courtesy of ASUS
Courtesy of ASUS
ASUS designed the Z97-Deluxe to be a serious competitor in the enthusiast space, integrating a 16+2 phase digital power system to drive the CPU and DRAM, with 10k-rated Japanese capacitors used throughout the board. The board also features an updated heat sink and heat pipe design to optimize passive heat transfer away from the digital VRM components and the Z97 chipset.
Introduction: Budget Cooling Options and the Seidon 120V
The Seidon 120V is Cooler Master's newest 120mm all-in-one liquid CPU cooler, and its affordable price adds another option to anyone looking for an aftermarket cooler on a budget. But when we start comparing low-cost options it's valid to wonder just how much better a liquid cooler in this price range might perform over air. To find out we'll test the Seidon 120V against a popular budget air solution, and see how these aftermarket coolers compare against the stock solutions from AMD and Intel.
Image courtesy of Cooler Master
Cooling on a Budget
When you’re pricing out a new computer build these days it’s pretty easy to put together a solid group of components for $500 or so, and these will get you going on all the latest games at HD resolution. Sounds awesome! Of course, within that tight budget certain things are going to have to wait, and right up there on the list is probably some better cooling. It’s easy enough to change out a CPU cooler later, but if the stock cooler is doing the job within the thermal specs of the processor is it really needed? Clearly, AMD and Intel are not going to ship a cooler with their product that can’t keep it cool enough under stock workloads, but having better cooling can allow for overclocking as well as extend the life of not only the CPU, but the components around it on your motherboard. Aftermarket coolers are often able to cool more efficiently as well, producing less noise.
The selection of aftermarket coolers available is, well, ridiculous. As easy as it is to get lost looking at, say, every virtually identical stick of DDR3 memory, scrolling through product pages for CPU cooling is on another level entirely. Liquid cooling systems are much easier to navigate, as there are not only fewer of them, but the pricing segmentation allows for easier selection if you’re on a budget. For instance, the Seidon 120V at around $50 was the least expensive AIO option on Amazon when this review was started (actually coming in at $47.99 shipped, though this has been fluctuating quite a bit lately). Finding a suitable budget air cooler was not so easy: it needed to be at least comparable in performance to a liquid cooler, while coming in at or below the $50 mark of the 120V. (This might take a while…)
On the air-cooling side of things, narrowing the selection to $50 or less doesn’t help much, as there are still (roughly) 50 million to choose from in that price range. There are going to be many different preferences and opinions on these, so an easier alternative would be to simply follow the consensus pick, e-tail style. This intensive research project involved visiting Amazon and typing “cpu cooler” into the search box. (OK, that was pretty easy!) The plan was to put whatever came up first under $50 in the cart. It turns out the most popular air cooler is also under $50 (not surprising). This top result was also from Cooler Master: their Hyper 212 EVO, which was selling for under $34 shipped. Done.
Introduction and Features
Corsair announced their latest flagship power supply, the 1,500 watt AX1500i Digital ATX PSU earlier this year at CES. We recently received a retail unit to review and have spent the past week putting it through our suite of tests. The new AX1500i Digital proved to be a very interesting power supply and brings not one, but several outstanding features to the high-end enthusiast market. Over time, we all grow numb to marketing terms like “most technologically advanced”, “state-of-the-art”, “ultra-stable”, “super-high efficiency”, etc., but in the case of the AX1500i Digital PSU, we have seen these claims come to life before our eyes.
Right out of the box, the AX1500i Digital power supply is capable of delivering up to 1,500 watts of continuous DC power (125 Amps on the +12V rail). If that is not impressive enough, the PSU can do it while operating on 115 VAC mains and with an ambient temperature up to 50°C (internal case temperature). This beast was made for multiple power-hungry graphic adapters and overclocked CPUs.
The Corsair AX1500i is one of the very first power supplies to obtain 80 Plus Titanium certification, which requires a PSU to operate with at least 90% efficiency between 10% and 100% load, and with at least 94% efficiency at 50% load.
The AX1500i is a digital power supply, which offers two distinct advantages. First, the AX1500i incorporates a Digital Signal Processor (DSP) to provide digitally-controlled power. This enables the PSU to deliver extremely tight voltage regulation over a wide range of loads. And second, the AX1500i features Corsair Link, which enables the PSU to be connected to the PC’s motherboard (via USB) for real-time monitoring (efficiency and power usage) and control (over-current protection and fan speed profiles).
Silent operation (zero-rpm fan mode up to ~30% load) might not be at the top of your feature list when shopping for a 1,500 watt PSU, but the AX1500i can do it thanks to high efficiency.
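To put those certification numbers into context, here is a minimal sketch (our own arithmetic, not anything from Corsair) that converts the 80 Plus Titanium minimums quoted above into wall draw and heat loss for a given DC load, and works out the ~450 W zero-rpm fan threshold:

```python
# Back-of-the-envelope numbers for a 1,500 W 80 Plus Titanium unit.
# Efficiency figures are the certification minimums cited in the text:
# at least 90% from 10% to 100% load, and at least 94% at 50% load.

RATED_OUTPUT_W = 1500

TITANIUM_MINIMUMS = {0.10: 0.90, 0.50: 0.94, 1.00: 0.90}

def wall_draw(dc_load_w, efficiency):
    """AC power pulled from the wall to deliver a given DC load."""
    return dc_load_w / efficiency

# At 50% load (750 W DC) and 94% efficiency, the PSU draws roughly 798 W
# from the wall; only about 48 W is dissipated as heat.
load = 0.50 * RATED_OUTPUT_W
ac = wall_draw(load, TITANIUM_MINIMUMS[0.50])
print(round(ac), round(ac - load))

# The zero-rpm fan mode threshold (~30% of rated load) is about 450 W DC,
# which is why high efficiency makes fanless operation practical.
print(0.30 * RATED_OUTPUT_W)
```

With so little waste heat at moderate loads, the fan simply has nothing to do.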
(Courtesy of Corsair)
Corsair AX1500i Digital ATX PSU Key Features:
• Digital Signal Processor (DSP) for extremely clean and efficient power
• Corsair Link Interface for monitoring and adjusting performance
• 1,500 watts continuous power output (50°C)
• Dedicated single +12V rail (125A) with user-configurable virtual rails
• 80 Plus Titanium certified, delivering up to 94% efficiency
• Ultra low noise 140mm double ball bearing fan
• Silent, Fanless mode up to ~30% load (~450W)
• Self-test switch to verify power supply functionality
• Premium quality components
• Fully modular cable system
• Conforms to ATX12V v2.4 and EPS 2.92 standards
• Universal AC input (100-240V) with Active PFC
• Over-current, over-voltage, over-temperature, and short circuit protection
• Dimensions: 150mm (W) x 86mm (H) x 225mm (L)
• 7-Year warranty and legendary Corsair customer service
• MSRP: $449.99 USD (available in Q2 2014)
You need a bit of power for this
PC gamers. We do some dumb shit sometimes. Those on the outside looking in, forced to play on static hardware with fixed image quality and low expandability, turn up their noses and question why we do the things we do. It’s not an unfair reaction; they just don’t know what they are missing out on.
For example, what if you decided to upgrade your graphics hardware to improve performance and push the image quality of your games to unheard-of levels? Rather than using a graphics configuration with performance found in a modern APU, you could decide to run not one but FOUR discrete GPUs in a single machine. You could water cool them for optimal temperatures and noise levels. That allows you to power not 1920x1080 (or 900p), not 2560x1440, but 4K gaming – 3840x2160.
All for the low, low price of $3000. Well, crap, I guess those console gamers have a right to question the sanity of SOME enthusiasts.
After the release of AMD’s latest flagship graphics card, the Radeon R9 295X2 8GB dual-GPU beast, our minds immediately started to wander to what magic could happen (and what might go wrong) if you combined a pair of them in a single system. Sure, two Hawaii GPUs running in tandem produced the “fastest gaming graphics card you can buy”, but surely four GPUs would be even better.
The truth is, though, that isn’t always the case. Multi-GPU is hard; just ask AMD or NVIDIA. The software and hardware demands placed on the driver teams to coordinate data sharing, timing control, etc. are extremely high even when you are working with just two GPUs in tandem. Moving to three or four GPUs complicates the story even further, and as a result it has been typical for us to note poor performance scaling, increased frame time jitter and stutter, and sometimes even complete incompatibility.
During our initial briefing covering the Radeon R9 295X2 with AMD, there was a system photo that showed a pair of the cards inside a MAINGEAR box. MAINGEAR is one of AMD’s biggest system builder partners, and the two were clearly insinuating that these configurations would be made available for those with the financial resources to pay for them. Even though we are talking about a very small subset of the PC gaming enthusiast base, these kinds of halo products are what bring PC gamers together to look and drool.
As it happens I was able to get a second R9 295X2 sample in our offices for a couple of quick days of testing.
Working with Kyle and Brent over at HardOCP, we decided to do some hardware sharing in order to give both outlets the ability to judge and measure Quad CrossFire independently. The results are impressive and awe inspiring.
With the release of the next generation desktop board line imminent, we wanted to give you a taste of the new ASUS offerings. The next generation boards support current and upcoming Intel Haswell processors as well as SATA 3 and next generation SATA storage solutions.
ASUS sought to perfect their designs with the new board lines, building off the successes of the Intel Z87-based boards and improving them for better performance and usability. All board lines feature an updated UEFI BIOS with more GUI-driven interfaces and wizards, particularly aimed at those users who don't want to muck around in the Advanced mode settings. Additionally, all board lines feature updated heat sink designs and aesthetics to further differentiate them from their Z87 counterparts. ASUS also enhanced their AI Suite software with 5-Way Optimization functionality, instead of the 4-Way Optimization functionality integrated into the last generation of AI Suite. 5-Way Optimization functionality integrates TPU (Turbo Processing Unit), EPU (Energy Processing Unit), DIGI+ Power Control, Fan XPert 3, and Turbo App into a single cohesive entity.
ASUS will continue to market the next generation boards across their standard four board lines:
- Channel series motherboards
- Republic of Gamers series motherboards
- TUF series motherboards
- Workstation series motherboards
Channel series banner
The Channel series boards include all board designs aimed at mainstream consumers and system builders.
ROG series banner
The ROG (Republic of Gamers) series boards are geared more towards enthusiasts and gamers, featuring a black and red color scheme for eye-catching appeal. The boards also feature innovative designs and integrated components that differentiate them from the other lines.
AMD Makes some Lemonade...
I guess we could say that AMD has been rather busy lately. It seems that a significant amount of the content on PC Perspective this month revolved around the AMD AM1 platform. Before that we had the Kaveri products and the R7 265. AMD also reported some fairly solid growth over the past year with their graphics and APU lines. Things are not as grim and dire as they once were for the company. This is good news for consumers as they will continue to be offered competing solutions that will vie for that hard earned dollar.
AMD is continuing their releases for 2014 with the announcement of their latest low-power and mainstream mobile APUs. These are codenamed “Beema” and “Mullins”, but they are based on the year-old Kabini chip. This may cause a few people to roll their eyes, as AMD has had some fairly unimpressive refreshes in the past. We saw rather meager improvements in clockspeed and power consumption with Brazos 2.0 a couple of years back, and it looked like this would be the case again for Beema and Mullins.
I was again expecting said meager improvements in power consumption and clockspeeds that we had received all those years ago with Brazos 2.0. Turns out I was wrong. This is a fairly major refresh which does a few things that I did not think were entirely possible, and I’m a rather optimistic person. So why is this release surprising? Let us take a good look under the hood.
Introduction and Technical Specifications
Courtesy of Noctua
Noctua is an established player in the CPU cooler space, offering high quality solutions that cool your processors without killing your eardrums. Their U-series coolers combine unrivaled cooling performance with an innovative design to ensure quiet operation and motherboard compatibility. The flagship product of this line, the NH-U14S, is composed of a single aluminum-finned radiator with six embedded, nickel-plated copper heat pipes running through a copper base plate. For performance testing of the NH-U14S cooler, we put it up against other high-performance liquid and air-based coolers. With a retail MSRP of $75.99, the NH-U14S CPU cooler has a premium price to match its premium size and cooling potential.
Courtesy of Noctua
GIGABYTE is doing some interesting things with their next generation line of boards, offering support for current and next generation Intel Haswell processors. Additionally, these next generation boards will offer support for SATA 3 and next generation SATA storage solutions.
The biggest difference between their new board lines and existing Intel Z87 motherboard lines is how they are differentiating the different board series. GIGABYTE will have a total of four board series in their new product line:
- Normal series motherboards
- Gaming series motherboards
- Overclocking series motherboards
- Black Edition series motherboards
Normal Series motherboard
Courtesy of GIGABYTE
The Normal Series boards encompass their channel boards for mainstream consumers and system builders.
Gaming Series motherboard
The Gaming Series boards are geared more towards enthusiasts and gamers, featuring a black and red color scheme for eye-catching appeal.
Overclocking Series motherboard
Courtesy of GIGABYTE
The Overclocking Series boards are geared towards hard-core enthusiasts and the LN2 record-setting crowd, with features and designs specifically oriented towards eking out that last bit of speed and performance from the board.
Black Edition Series motherboard
Courtesy of GIGABYTE
The Black Edition Series boards build on several products from the other board lines, ensuring the utmost stability and performance through a specialized factory-performed testing regimen.
Introduction and Features
Back in March, EVGA announced the pending arrival of two new power supplies in their popular SuperNOVA line, the 750G2 and 850G2. Both new power supplies are 80Plus Gold certified and feature all modular cables, high-quality Japanese brand capacitors, a single high-power +12V rail, and a 140mm dual ball bearing cooling fan (with the ability to operate in silent, fan-less mode at low power levels). The 750G2 and 850G2 are also backed by a 10-year warranty (with registration). And last but not least, many PC PSU enthusiasts will be happy to know the new 750G2 and 850G2 are built on the successful Superflower Leadex Gold platform. Superflower is the same OEM that EVGA uses for several of their popular higher output, premium power supplies!
EVGA was founded in 1999 with headquarters in Brea, California. They continue to specialize in producing NVIDIA-based graphics adapters and Intel-based motherboards while expanding their PC power supply product line, which now includes fourteen models ranging from the high-end 1,500W NEX1500 Classified to the budget-minded EVGA 430W power supply.
(Courtesy of EVGA)
In this review we will be taking a detailed look at both the EVGA SuperNOVA 750G2 and 850G2 power supplies.
Here is what EVGA has to say about the new SuperNOVA G2 Gold PSUs: “Unleash the next generation in power with the EVGA SuperNOVA 850G2 and 750G2 Power Supplies. Based on the award winning G2 series Power Supplies from EVGA, these PSUs features 80 PLUS Gold rated efficiency, and clean, continuous power to every component. This provides improved efficiency for longer operation, less power consumption, reduced energy costs and minimal heat dissipation. The new ECO Thermal Control Fan System offers fan modes to provide zero fan noise during low load operations. Backed by a 10 year warranty and Japanese capacitor design, the EVGA SuperNOVA G2 is not only the right choice for your system today, it is also the best choice for your system tomorrow."
EVGA SuperNOVA 850 G2 and 750 G2 Gold PSU Key Features:
• 10-Year Warranty and unparalleled EVGA Customer Support
• 80PLUS Gold certified, with up to 90% efficiency under typical loads
• Tight voltage regulation (2%), stable power with low AC ripple and noise
• Highest quality Japanese brand capacitors ensure long-term reliability
• Fully modular cables to reduce clutter and improve airflow
• Quiet dual-ball bearing fan for exceptional reliability and quiet operation
• ECO Intelligent Thermal Control allows silent, fan-less operation at low power
• NVIDIA SLI and AMD Crossfire Ready
• Intel 4th Generation CPU Ready (Haswell, C6/C7 sleep modes)
• Compliance with ErP Lot 6 2013 Requirement
• Active Power Factor correction (0.99) with Universal AC input
• Heavy-duty Protections: OVP, UVP, OCP, OPP, and SCP
• MSRP for the 750 G2 PSU : $129.99 USD (119.99 after mail-in rebate)
• MSRP for the 850 G2 PSU : $159.99 USD ($149.99 after mail-in rebate)
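The 2% voltage regulation figure in the list above translates into narrow allowed windows around each ATX nominal rail voltage. As a rough illustration (our own sketch, using only the 2% number quoted above):

```python
# Illustrative sketch of what a "2% voltage regulation" claim allows per rail.
# Rail voltages are the standard ATX nominals; the 2% tolerance comes from
# the spec list above.

TOLERANCE = 0.02

def regulation_window(nominal_v, tolerance=TOLERANCE):
    """Return the (min, max) allowed voltage for a rail at the given tolerance."""
    delta = nominal_v * tolerance
    return (round(nominal_v - delta, 3), round(nominal_v + delta, 3))

for rail in (12.0, 5.0, 3.3):
    lo, hi = regulation_window(rail)
    print(f"{rail}V rail: {lo} V to {hi} V")
# The +12 V rail, for example, must stay between 11.76 V and 12.24 V.
```

That is a considerably tighter band than the ±5% the ATX specification itself permits on the +12 V rail.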
AM1 Walks New Ground
After Josh's initial review of the AMD AM1 Platform and the Athlon 5350, we received a few requests to look at gaming performance with a discrete GPU installed. Even though this platform isn't being aimed at gamers looking to play demanding titles, we started to investigate this setup anyway.
While Josh liked the ASUS AM1I-A Mini ITX motherboard he used in his review, its lone x1 PCI-E slot would be less than ideal for this situation.
Luckily we had the GIGABYTE AM1M-S2H Micro ATX motherboard, which features a full-length PCI-E x16 slot as well as two x1 slots.
Don't be misled by the shape of the slot, though: the AM1 platform still only offers 4 lanes of PCI-Express 2.0, which means the graphics card will not be running at full bandwidth. However, the physical x16 slot makes it a lot easier to connect a discrete GPU, without having to worry about those ribbon riser cables that miners use.
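For a sense of how much bandwidth is actually on the table, here is a quick back-of-the-envelope sketch; the ~500 MB/s per-lane figure is the commonly cited usable throughput for PCIe 2.0 (5 GT/s with 8b/10b encoding overhead), not a number from the AM1 spec sheet:

```python
# Rough comparison of the AM1 board's x4 PCIe 2.0 link against a full x16
# link. Per-lane throughput assumes ~500 MB/s usable each direction for
# PCIe 2.0 after 8b/10b encoding overhead.

PCIE2_PER_LANE_MBPS = 500  # MB/s per lane, each direction

def link_bandwidth(lanes, per_lane=PCIE2_PER_LANE_MBPS):
    """Usable one-direction bandwidth in MB/s for a PCIe 2.0 link."""
    return lanes * per_lane

print(link_bandwidth(4))   # AM1 x4 link: 2000 MB/s (~2 GB/s)
print(link_bandwidth(16))  # a full-width x16 link: 8000 MB/s (~8 GB/s)
```

So the card sees roughly a quarter of the bandwidth it would get in a true x16 slot, though for many games at modest settings that is less of a bottleneck than it sounds.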
Competition is a Great Thing
While doing some testing with the AMD Athlon 5350 Kabini APU to determine its flexibility as a low cost gaming platform, we decided to run a handful of tests to measure something else that is getting a lot of attention right now: AMD Mantle and NVIDIA's 337.50 driver.
Earlier this week I posted a story that looked at performance scaling of NVIDIA's new 337.50 beta driver compared to the previous 335.23 WHQL. The goal was to assess the DX11 efficiency improvements that the company stated it had been working on and implemented into this latest beta driver offering. In the end, we found some instances where games scaled by as much as 35% and 26%, but other cases where there was little to no gain with the new driver. We looked at both single GPU and multi-GPU scenarios, though mostly on high end CPU hardware.
Earlier in April I posted an article looking at Mantle, AMD's answer to a lower level API that is unique to its ecosystem, and how it scaled on various pieces of hardware on Battlefield 4. This was the first major game to implement Mantle and it remains the biggest name in the field. While we definitely saw some improvements in gaming experiences with Mantle there was work to be done when it comes to multi-GPU scaling and frame pacing.
Both parties in this debate were showing promise but obviously both were far from perfect.
While we were benchmarking the new AMD Athlon 5350 Kabini based APU, an incredibly low cost processor that Josh reviewed in April, it made sense to test out both Mantle and NVIDIA's 337.50 driver in an interesting side by side.
Let's see if I can start this story without sounding too much like a broken record when compared to the news post I wrote late last week on the subject of NVIDIA's new 337.50 driver. In March, while attending the Game Developers Conference to learn about the upcoming DirectX 12 API, I sat down with NVIDIA to talk about changes coming to its graphics driver that would affect current users with shipping DX9, DX10, and DX11 games.
As I wrote then:
What NVIDIA did want to focus on with us was the significant improvements that have been made on the efficiency and performance of DirectX 11. When NVIDIA is questioned as to why they didn’t create their Mantle-like API if Microsoft was dragging its feet, they point to the vast improvements possible and made with existing APIs like DX11 and OpenGL. The idea is that rather than spend resources on creating a completely new API that needs to be integrated in a totally unique engine port (see Frostbite, CryEngine, etc.) NVIDIA has instead improved the performance, scaling, and predictability of DirectX 11.
NVIDIA claims that these fixes are not game specific and will improve performance and efficiency for a lot of GeForce users. Even if that is the case, we will only really see these improvements surface in titles that have addressable CPU limits or very low end hardware, similar to how Mantle works today.
In truth, this is something that both NVIDIA and AMD have likely been doing all along but NVIDIA has renewed purpose with the pressure that AMD's Mantle has placed on them, at least from a marketing and PR point of view. It turns out that the driver that starts to implement all of these efficiency changes is the recent 337.50 release and on Friday I wrote up a short story that tested a particularly good example of the performance changes, Total War: Rome II, with a promise to follow up this week with additional hardware and games. (As it turns out, results from Rome II are...an interesting story. More on that on the next page.)
Today I will be looking at a seemingly random collection of gaming titles, running on a reconfigured test bed we had in the office, in an attempt to get some idea of the overall robustness of the 337.50 driver and its advantages over the 335.23 release that came before it. Does NVIDIA have solid ground to stand on when it comes to the capabilities of current APIs over what AMD is offering today?
Introduction, Specifications, and Contents
Corsair has added another double-width liquid cooler to their growing lineup of all-in-one solutions with the H105, joining the existing H100i and larger H110 in this category.
Image courtesy of Corsair
Initially, the H105 might leave you scratching your head. It's listed on Corsair’s site with the same $119.99 MSRP as the H100i, and both are 240mm designs featuring the same high performance fans. The similarities end there, however, as the design of the H105 is more akin to Corsair's new 120mm H75 (which we recently reviewed) than to the existing 240mm H100/H100i. With the H75 already a solid price/performance pick in Corsair’s lineup - and the various other options still available - it's reasonable to wonder exactly where the H105 fits in.
While this new cooler uses the same pair of excellent SP120L PWM fans as the earlier H100i (and H80i), it's the radiator they are connected to that should separate the H105 from prior efforts. Corsair has implemented a much thicker 240mm radiator with the H105, at 35mm (vs. only 27mm on earlier products), and this added thickness should have a noticeable impact on cooling performance, and possibly fan noise as well.
Ultra-Speed RAM, APU-Style
In our review of the Kingston HyperX Predator 2666MHz kit, we discovered what those knowledgeable about Intel memory scaling already knew: for most applications, and specifically games, there is no significant advantage to increases in memory speed past the current 1600MHz DDR3 standard. But this was only half of the story. What about memory scaling with an AMD processor, and specifically an APU? To find out, we put AMD’s top APU, the A10-7850K, to the test!
Ready for some APU memory testing!
AMD has created a compelling option with their APU lineup, and the inclusion of powerful integrated graphics allows for interesting build options with lower power and space requirements, and even makes building tiny mini-ITX systems for gaming realistic. It’s this graphical prowess compared to any other onboard solution that creates an interesting value proposition for any gamer looking at a new low-cost build. The newest Kaveri APUs are getting a lot of attention, and they raise the question: is a discrete graphics card really needed for gaming at reasonable settings?
AMD Brings Kabini to the Desktop
Perhaps we are performing a study of opposites? Yesterday Ryan posted his R9 295X2 review, which covers the 500 watt, dual-GPU monster that will be retailing for $1499. That is a card meant only for the extreme enthusiast who has plenty of room in their case, plenty of knowledge about their power supply, and plenty of electricity and air conditioning to keep this monster at bay. The product that I am reviewing could not be any more different: inexpensive, cool-running, power-efficient, and small enough to fit pretty much anywhere. These products can almost be viewed as polar opposites.
The interesting thing, of course, is that this contrast shows how flexible AMD’s GCN architecture is. GCN can efficiently and effectively power the highest performing product in AMD’s graphics portfolio, as well as their lowest power offerings in the APU market, and performance scales nearly linearly as more GCN compute cores are added.
The products that I am of course referring to are the latest Athlon and Sempron APUs, based on the Kabini architecture, which fuses Jaguar x86 cores with GCN compute cores. These APUs were announced last month, but we did not have the chance to test them at the time. Since then these products have popped up in a couple of places around the world, but this is the first time that reviewers have officially received product from AMD and their partners.
A Powerful Architecture
AMD likes to toot its own horn. Just take a look at the not-so-subtle marketing buildup to the Radeon R9 295X2 dual-Hawaii graphics card, released today. I had photos of me shipped to…me…overnight. My hotel room at GDC was also given a package which included a pair of small Pringles cans (chips) and a bottle of volcanic water. You may have also seen some photos posted of a mysterious briefcase with its side stickered with the silhouette of a Radeon add-in board.
This tooting is not without some validity though. The Radeon R9 295X2 is easily the fastest graphics card we have ever tested, and that says a lot given the last 24 months of hardware releases. It’s big, it comes with an integrated water cooler, and it requires some pretty damn specific power supply specifications. But AMD did not compromise on the R9 295X2 and, for that, I am sure that many enthusiasts will be elated. Get your wallets ready, though: this puppy will run you $1499.
Both AMD and NVIDIA have a history of producing high quality dual-GPU graphics cards late in the product life cycle. The most recent entry from AMD was the Radeon HD 7990, a pair of Tahiti GPUs on a single PCB with a triple fan cooler. While a solid performing card, the product was released in a time when AMD CrossFire technology was well behind the curve and, as a result, real-world performance suffered considerably. By the time the drivers and ecosystem were fixed, the HD 7990 was more or less on the way out. It was also notorious for some intermittent, but severe, overheating issues, documented by Tom’s Hardware in one of the most harshly titled articles I’ve ever read. (Hey, Game of Thrones started again this week!)
The Hawaii GPU, first revealed back in September and selling today under the guise of the R9 290X and R9 290 products, is even more power hungry than Tahiti. Many in the industry doubted that AMD would ever release a dual-GPU product based on Hawaii, as the power and thermal requirements would be just too high. AMD has worked around many of these issues with a custom water cooler and by placing specific power supply requirements on buyers - all without compromising on performance. This is the real McCoy.