Manufacturer: Koolance

Introduction and Technical Specifications

Introduction

02-ext-440cu_p1-700x700.jpg

Koolance EXT-440CU Liquid Cooling System
Courtesy of Koolance

03-cpu-380i_p1-700x700.jpg

Koolance CPU-380I CPU Water Block with Intel CPU mounting plate
Courtesy of Koolance

Koolance has effectively transformed itself from a minor player in the cooling community into a powerhouse at the forefront of high performance liquid cooling products. Koolance recently released the EXT-440CU Liquid Cooling System, an apparatus integrating the cooling loop's reservoir, pump, and radiator into a single aluminum assembly. In addition to the EXT-440CU unit, Koolance provided us with their CPU-380I CPU water block so we could test the two as a complete kit. We tested the Koolance kit alongside other all-in-one and air coolers to see how well it stacks up. The EXT-440CU Liquid Cooling System retails at an MSRP of $274.99, with the CPU-380I water block available for a $74.99 MSRP. While not the cheapest solution, the adage "you get what you pay for" certainly applies to this Koolance kit.

04-ext-440cu_p2-700x700.jpg

Koolance EXT-440CU Liquid Cooling System, side view
Courtesy of Koolance

05-ext-440cu_p3-700x700.jpg

Koolance EXT-440CU Liquid Cooling System, rear view
Courtesy of Koolance

06-cpu-380a_p1-700x700.jpg

Koolance CPU-380A CPU Water Block with AMD CPU mounting plate
Courtesy of Koolance

Continue reading our review of the Koolance EXT-440CU Liquid Cooling System!

Manufacturer: NZXT

Introduction and Features

NZXT is introducing the H440 Mid-Tower case in their H Series line.  The new H440 chassis will be available in two different color schemes: white with black accents and black with red accents. Both versions exhibit clean lines and a sleek design. Gone are the 5.25” optical drive bays, and in their place you get three 120mm intake fans. In addition to providing excellent case cooling with four included fans, the H440 is also very water-cooling friendly, with support for water-cooling radiators on the top, front, and rear of the case. The left side panel features a large acrylic window to showcase the motherboard area, and the H440 can support up to eight internal 3.5”/2.5” HDDs/SSDs (6+2).

2-Sides_black_wht.jpg

NZXT H440 Mid-Tower Chassis

The lower section of the H440 case, which houses the power supply, is shrouded from view and provides a lot of room for cable management. The color accented shroud features a lighted NZXT logo and there are two LEDs built into the back panel to provide light when making connections; very nice.

The NZXT H440 Mid-Tower case comes with four NZXT FN V2 fans preinstalled: (3) 120mm intake fans in the front and (1) 140mm exhaust fan on the back. Dust filters are provided for the three front intake fans and also on the bottom of the case for the PSU intake fan. And up to three more 120mm fans (or two 140mm fans) can be added to the top panel if desired.

3-H440_wt_fan.jpg

Here is what NZXT has to say about the H440 Mid-Tower case:

The new H440 features a doorless, ODD-free front panel made entirely of steel while a large, full-view window reveals an interior specially engineered to make any build seamless and beautiful. The H440 ensures a hassle-free experience, allowing anybody to become an expert on clean cable management. The H440 comes standard with four of NZXT’s newly designed FN V2 case fans. An unheard of 3 x 120mm in front and 1 x 140mm in rear. And newly designed steel HDD drive trays can support up to eight internal 3.5”/2.5” HDDs/SSDs: 6+2. The H440 supports both 140mm and 120mm fans, the steel top and front panels come Kraken ready, fitting radiators up to 360mm in size to offer comprehensive water-cooling performance in a sleek, minimalist package.

NZXT H440 Mid-Tower Case Key Features:
•    Mid-tower PC case with sleek, clean styling
•    Available in matte black with red accents or gloss white with black accents
•    Four NZXT FN V2 case fans preinstalled
•    Two removable dust filters (front intake and under PSU intake)
•    Large side window
•    Lighted NZXT logo on side and two LED lights on back panel
•    Support for up to eight internal HDDs/SSDs
•    Support for water-cooling radiators: front, top, and/or back
•    Can mount radiators up to 360mm in length
•    Cable routing cutouts with rubber grommets
•    Large CPU backplate cut out for easy CPU cooler upgrades
•    Top panel has 2 USB 3.0, 2 USB 2.0 and audio in/out ports
•    Thumbscrew side panel removal for quick access
•    Supports ATX, Micro-ATX, and Mini-ITX motherboards
•    2-Year warranty

Please continue reading our review of the NZXT H440 Mid-Tower case!!!

Manufacturer: PC Perspective
Tagged: Mantle, interview, amd

What Mantle signifies about GPU architectures

Mantle is a very interesting concept. From the various keynote speeches, it sounds like the API is being designed to address the current state (and trajectory) of graphics processors. GPUs are generalized and highly parallel computation devices which are assisted by a little bit of specialized silicon, when appropriate. The vendors have even settled on standards, such as IEEE-754 floating point arithmetic, which means that the driver has much less reason to shield developers from the underlying architectures.

Still, Mantle is currently a private technology for an unknown number of developers. Without a public SDK, or anything beyond the half-dozen keynotes, we can only speculate on its specific attributes. I, for one, have technical questions and hunches which linger unanswered or unconfirmed, probably until the API is suitable for public development.

Or, until we just... ask AMD.

amd-mantle-interview-01.jpg

Our answers came from Guennadi Riguer, the chief architect of Mantle. In them, he discusses the API's usage as a computation language, the future of the rendering pipeline, and whether the day will come when you can get CrossFire-like benefits by leaving an older Mantle-capable GPU in your system when purchasing a new, also Mantle-supporting one.

Q: Mantle's shading language is said to be compatible with HLSL. How will optimizations made for DirectX, such as tweaks during shader compilation, carry over to Mantle? How much tuning will (and will not) be shared between the two APIs?

[Guennadi] The current Mantle solution relies on the same shader generation path that DirectX games use and includes an open-source component for translating DirectX shaders to a Mantle-accepted intermediate language (IL). This enables developers to quickly develop a Mantle code path without any changes to their shaders. This was one of the strongest requests we got from our ISV partners when we were developing Mantle.

AMD-mantle-dx-hlsl-GSA_screen_shot.jpg

Follow-Up: What does this mean, specifically, in terms of driver optimizations? Would AMD, or anyone else who supports Mantle, be able to re-use the effort they spent on tuning their shader compilers (and so forth) for DirectX?

[Guennadi] With the current shader compilation strategy in Mantle, the developers can directly leverage DirectX shader optimization efforts in Mantle. They would use the same front-end HLSL compiler for DX and Mantle, and inside of the DX and Mantle drivers we share the shader compiler that generates the shader code our hardware understands.

Read on to see the rest of the interview!

Subject: Mobile
Manufacturer: Lenovo

Introduction and Design

PC063666.jpg

Arguably some of the most thoughtful machines on the market are Lenovo’s venerable ThinkPads, which—while sporadically brave in their assertions—are still among the most conservative (yet simultaneously practical) notebooks available.  What makes these notebooks so popular with the business crowd is their longstanding refusal to compromise functionality in the interest of form, as well as their self-proclaimed legendary reliability.  And you could argue that such practical conservatism is what defines a good business notebook: a device which embraces the latest technological trends, but only with requisite caution and consideration.

Maybe it’s the shaky PC market, or maybe it’s the sheer onset of sexy technologies such as touch and clickpads, but recent ThinkPads have begun to show some uncommon progressivism, and unapologetically so, too.  First, it was the complete replacement of the traditional, critically-acclaimed ThinkPad keyboard with the Chiclet AccuType variety, a decision which irked purists but was eventually accepted by most.  Along with that came the integrated touchpad buttons, which are still lamented by many users.  Those alterations to the winning design were ultimately relatively minor, however, and for the most part, they’ve now been digested by the community.  Now, though, with the T440s (as well as the rest of Lenovo’s revamped ThinkPad lineup), we’re seeing what will perhaps constitute the most controversial change of all: the replacement of the older touchpad with a “5-button trackpad”, as well as an optional touchscreen interface.

Can these changes help to keep the T440s on the cusp of technological progress, or has the design finally crossed the threshold into the realm of counterproductivity?

specs.png

Compared with nearly any other modern notebook, these specs might not hold many surprises.  But judged side-by-side with its T430s predecessor, there are some pretty striking differences.  For starters, the T440s is the first in its line to offer only low-voltage CPU options.  While our test unit shipped with the (certainly capable enough) Core i5-4200U—a dual-core processor with up to 2.6 GHz Turbo Boost clock rate—options range up to a Core i7-4600U (up to 3.3 GHz).  Still, these options are admittedly a far cry from the i7-3520M with which top-end T430s machines were equipped.  Of course, it’s also less than half the TDP, which is likely why the decision was made.  Other notables are the lack of discrete graphics options (previously, users had the choice of either integrated graphics or an NVIDIA NVS 5200M) and the maximum supported memory of 12 GB.  And, of course, there’s the touchscreen—which is not required, but rather merely an option.  On the other hand, while we’re on the subject of the screen, this is also the first model in the series to offer a 1080p resolution, whether traditional or touch-enabled—which is very much appreciated indeed.

PC063645.jpg

That’s a pretty significant departure from the design of the T430s, which—as it currently appears—could represent the last T4xxs model to provide such powerhouse options at the obvious expense of battery life.  Although some markets already have the option of the ThinkPad S440 to fill the Ultrabook void within the 14-inch ThinkPad range, that notebook can even be outfitted with discrete graphics.  The T440s’ top-end configuration, meanwhile, consists of a 15W TDP dual-core i7 with integrated graphics and 12 GB of DDR3 RAM.  In other words, it’s powerful, but it’s just not in the same class as the T430s’ components.  What’s more important to you?

Continue reading our review of the ThinkPad T440s!!!

Author:
Manufacturer: AMD

A quick look at performance results

Late last week, EA and DICE released the long awaited patch for Battlefield 4 that enables support for the Mantle renderer.  This new API technology was introduced by AMD back in September. Unfortunately, AMD wasn't quite ready for the patch's release with their Catalyst 14.1 beta driver.  I wrote a short article that previewed the new driver's features, its expected performance with the Mantle version of BF4, and commentary about the current state of Mantle.  You should definitely read that as a primer before continuing if you haven't yet.  

Today, after really just a few short hours with a useable driver, I have only limited results.  Still, I know that you, our readers, clamor for ANY information on the topic.  I thought I would share what we have thus far.

Initial Considerations

As I mentioned in the previous story, the Mantle version of Battlefield 4 has the biggest potential to show advantages in cases where the game is more CPU limited.  AMD calls this the "low hanging fruit" for this early release of Mantle and claims that further optimizations will come, especially for GPU-bound scenarios.  That dependency on CPU limitations puts some non-standard requirements on our ability to showcase Mantle's performance capabilities.

bf42.jpg

For example, the level of the game, and even the section of that level in the BF4 single player campaign, can show drastic swings in Mantle's capabilities.  Multiplayer matches will also show more consistent CPU utilization (and thus could be improved by Mantle), though testing those levels in a repeatable, semi-scientific manner is much more difficult.  And, as you'll see in our early results, I even found a couple of instances in which the Mantle API version of BF4 ran a smidge slower than the DX11 instance.  

For our testing, we assembled two systems that differ in CPU performance in order to simulate the range of processors installed in consumers' PCs.  Our standard GPU test bed includes a Core i7-3960X Sandy Bridge-E processor, chosen specifically to remove the CPU as a bottleneck, and it has been included here today.  We added a system based on the AMD A10-7850K Kaveri APU, which presents a more processor-limited (especially per-thread) system overall and should help showcase Mantle's benefits more easily.
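To illustrate why the slower system should benefit more, here is a minimal toy model (our own simplification with illustrative numbers, not measured Battlefield 4 data) that treats each frame as gated by whichever of the CPU or GPU finishes last:

```python
# Toy model: a frame completes no faster than its slowest stage.
# All numbers below are illustrative assumptions, not measured data.

def fps(cpu_ms, gpu_ms):
    """Frame rate when each frame waits on max(CPU work, GPU work)."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Suppose Mantle cuts per-frame CPU submission cost roughly in half.
CPU_SAVINGS = 0.5

for label, cpu_ms, gpu_ms in [
    ("Fast CPU (GPU-bound)", 6.0, 12.0),
    ("Slow CPU (CPU-bound)", 18.0, 12.0),
]:
    dx11 = fps(cpu_ms, gpu_ms)
    mantle = fps(cpu_ms * CPU_SAVINGS, gpu_ms)
    print(f"{label}: DX11 {dx11:.0f} FPS -> Mantle {mantle:.0f} FPS")

# Fast CPU (GPU-bound): DX11 83 FPS -> Mantle 83 FPS
# Slow CPU (CPU-bound): DX11 56 FPS -> Mantle 83 FPS
```

In this model the GPU-bound system gains nothing from cheaper submission, which mirrors why the A10-7850K test bed, not the i7-3960X, is where Mantle should shine.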

Continue reading our early look at the performance advantages of AMD Mantle on Battlefield 4!!

Author:
Manufacturer: AMD

A troubled launch to be sure

AMD has released some important new drivers with drastic feature additions over the past year.  Remember back in August of 2013 when Frame Pacing was first revealed?  Today’s Catalyst 14.1 beta release will actually complete the goals that AMD set for itself in early 2013 with regard to introducing (nearly) complete Frame Pacing technology integration for non-XDMA GPUs, while also adding support for Mantle and HSA capability.

Frame Pacing Phase 2 and HSA Support

When AMD released the first frame pacing capable beta driver in August of 2013, it added support to existing GCN designs (HD 7000-series and a few older generations) at resolutions of 2560x1600 and below.  While that definitely addressed a lot of the market, the fact was that CrossFire users were also amongst the most likely to have Eyefinity (3+ monitors spanned for gaming) or even 4K displays (quickly dropping in price).  Neither of those advanced display options was supported with any Catalyst frame pacing technology.

That changes today as Phase 2 of the AMD Frame Pacing feature has finally been implemented for products that do not feature the XDMA technology (found in Hawaii GPUs for example).  That includes HD 7000-series GPUs, the R9 280X and 270X cards, as well as older generation products and Dual Graphics hardware combinations such as the new Kaveri APU and R7 250.  I have already tested Kaveri and the R7 250 in fact, and you can read about its scaling and experience improvements right here.  That means that users of the HD 7970, R9 280X, etc., as well as those of you with HD 7990 dual-GPU cards, will finally be able to utilize the power of both GPUs in your system with 4K displays and Eyefinity configurations!

BF3_5760x1080_PLOT.png

This is finally fixed!!

As of this writing I haven’t had time to do more testing (other than the Dual Graphics article linked above) to demonstrate the potential benefits of this Phase 2 update, but we’ll be targeting it later in the week.  For now, it appears that you’ll be able to get essentially the same performance and pacing capabilities on the Tahiti-based GPUs as you can with Hawaii (R9 290X and R9 290). 

Catalyst 14.1 beta was also slated to be the first public driver to add support for HSA technology, allowing owners of the new Kaveri APU to take advantage of appropriately enabled applications like LibreOffice and a handful of Adobe apps.  (Update: AMD has since let us know that this feature DID NOT make it into the public release of Catalyst 14.1.)

The First Mantle Ready Driver (sort of) 

A technology that has been in development for more than two years according to AMD, the newly released Catalyst 14.1 beta driver is the first to enable support for the revolutionary new Mantle API for PC gaming.  Essentially, Mantle is AMD’s attempt at creating a custom API that will replace DirectX and OpenGL in order to more directly target the GPU hardware in your PC, specifically the AMD-based designs of GCN (Graphics Core Next). 

slide2_0.jpg

Mantle runs at a lower level than DX or OGL, accessing the hardware resources of the graphics chip more directly, and with that access it can better utilize the hardware in your system, both CPU and GPU.  In fact, the primary benefit of Mantle is going to be seen in the form of less API overhead and fewer bottlenecks, such as real-time shader compiling and code translation. 

If you are interested in the meat of what makes Mantle tick and why it was so interesting to us when it was first announced in September of 2013, you should check out our first deep-dive article written by Josh.  In it you’ll get our opinion on why Mantle matters and why it has the potential for drastically changing the way the PC is thought of in the gaming ecosystem.

Continue reading our coverage of the launch of AMD's Catalyst 14.1 driver and Battlefield 4 Mantle patch!!

Author:
Manufacturer: AMD

Hybrid CrossFire that actually works

The road to redemption for AMD and its driver team has been a tough one.  Since we first started to reveal the significant issues with AMD's CrossFire technology back in January of 2013, the Catalyst driver team has been hard at work on a fix, though I will freely admit it took longer to convince them that the issue was real than I would have liked.  We saw the first steps of the fix in August of 2013 with the release of the Catalyst 13.8 beta driver.  It supported DX11 and DX10 games at resolutions of 2560x1600 and under (no Eyefinity support) but was obviously still less than perfect.  

In October, with the release of AMD's latest Hawaii GPU, the company took another step by reorganizing the internal architecture of CrossFire at the chip level with XDMA.  The result was frame pacing that worked on the R9 290X and R9 290 at all resolutions, including Eyefinity, though it still left out older DX9 titles.  

One thing that had not been addressed, at least not until today, was the issues surrounding AMD's Hybrid CrossFire technology, now known as Dual Graphics.  This is the ability of an AMD APU with integrated Radeon graphics to pair with a low cost discrete GPU to improve graphics performance and gaming experiences.  Recently, Tom's Hardware discovered that Dual Graphics suffered from the exact same scaling issues as standard CrossFire: frame rates in FRAPS looked good, but the actual perceived frame rate was much lower.
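To see how a FRAPS average can hide the problem, consider a toy example (hypothetical frame times, not capture data) in which two sequences share the same average frame rate but feel very different:

```python
# Sketch: identical average FPS, very different delivered smoothness.
# Frame times in milliseconds are hypothetical, not captured data.

paced = [20.0] * 10          # every frame arrives 20 ms apart
unpaced = [5.0, 35.0] * 5    # runt/full pairs: 5 ms, then 35 ms gaps

def avg_fps(frame_times):
    return 1000.0 * len(frame_times) / sum(frame_times)

def worst_gap_fps(frame_times):
    # Perceived smoothness tracks the longest gaps, not the average.
    return 1000.0 / max(frame_times)

for name, ft in [("paced", paced), ("unpaced", unpaced)]:
    print(f"{name}: avg {avg_fps(ft):.0f} FPS, worst gap {worst_gap_fps(ft):.0f} FPS")

# paced:   avg 50 FPS, worst gap 50 FPS
# unpaced: avg 50 FPS, worst gap 29 FPS  <- feels closer to 29 FPS
```

A FRAPS counter reports 50 FPS in both cases; frame pacing is about making the second sequence look like the first.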

drivers01.jpg

A little while ago a new driver made its way into my hands under the name of Catalyst 13.35 Beta X, a driver that promised to enable Dual Graphics frame pacing with Kaveri and R7 graphics cards.  As you'll see in the coming pages, the fix is definitely working.  And, as I learned after doing some more probing, the 13.35 driver is actually a much more important release than it at first seemed.  Not only is Kaveri-based Dual Graphics frame pacing enabled, but Richland and Trinity are included as well.  And even better, this driver will apparently fix resolutions higher than 2560x1600 for desktop graphics as well - something you can be sure we are checking on this week!

drivers02.jpg

Just as we saw with the first implementation of Frame Pacing in the Catalyst Control Center, with the 13.35 Beta we are using today you'll find a new set of options in the Gaming section to enable or disable Frame Pacing.  The default setting is On, which makes me smile inside every time I see it.

drivers03.jpg

The hardware we are using is the same basic setup from my initial review of the AMD Kaveri A8-7600 APU.  That includes the A8-7600 APU, an ASRock A88X mini-ITX motherboard, 16GB of DDR3-2133 memory, and a Samsung 840 Pro SSD.  Of course, for our testing this time we needed a discrete card to enable Dual Graphics, and we chose the MSI R7 250 OC Edition with 2GB of DDR3 memory.  This card will run you an additional $89 or so on Amazon.com.  You could use either the DDR3 or GDDR5 version of the R7 250, as well as the R7 240, but in our talks with AMD they seemed to think the R7 250 DDR3 was the sweet spot for this CrossFire implementation.

IMG_9457.JPG

Both the R7 250 and the A8-7600 actually share the same shader count of 384, otherwise known as 384 stream processors or 6 Compute Units in the new nomenclature that AMD is creating.  However, the MSI card is clocked at 1100 MHz, while the GPU portion of the A8-7600 APU runs at only 720 MHz. 
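As a rough illustration of what that clock gap means for compute throughput, the usual back-of-the-envelope estimate is two floating point operations (one fused multiply-add) per shader per clock. A quick sketch using the figures above:

```python
# Peak single-precision estimate: shaders x 2 ops (FMA) x clock.
# A standard rule of thumb, not an AMD-published specification.

def peak_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000.0

print(f"MSI R7 250 OC (384 @ 1100 MHz): {peak_gflops(384, 1100):.0f} GFLOPS")
print(f"A8-7600 iGPU  (384 @  720 MHz): {peak_gflops(384, 720):.0f} GFLOPS")

# MSI R7 250 OC (384 @ 1100 MHz): 845 GFLOPS
# A8-7600 iGPU  (384 @  720 MHz): 553 GFLOPS
```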

So the question is, has AMD truly fixed the issues with frame pacing with Dual Graphics configurations, once again making the budget gamer feature something worth recommending?  Let's find out!

Continue reading our look at Dual Graphics Frame Pacing with the Catalyst 13.35 Beta Driver!!

Manufacturer: Corsair

Introduction and Features

2-CS650M-Banner.jpg

Corsair's new CS Series Modular PSUs include four models: the CS450M, CS550M, CS650M, and CS750M.  All of the power supplies in the CS Series feature modular cables, high efficiency (80 Plus Gold certified), and quiet operation. In addition, Corsair continues to offer a full line of high quality power supplies, memory components, cases, cooling components, SSDs, and accessories for the PC market.  

Here is what Corsair has to say about their CS Series Modular PSUs: “The CS Modular Series is designed for basic and midrange PCs, but offers features and performance traditionally reserved for higher-end models. 80 Plus Gold efficiency and a thermally controlled fan ensure quiet operation and lower energy use, and the modular, detachable cable set makes installations and upgrades faster and better looking.

3-CS650M.jpg

80 Plus Gold rated efficiency saves you money on your power bill and produces less heat than less efficient power supplies. The flat black modular cables allow you to enjoy fast, neat builds. And, like all Corsair power supplies, CS Series Modular is built with high-quality components and is guaranteed to deliver clean, stable, continuous power.”
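That efficiency pitch is easy to put into numbers. Here is a minimal sketch comparing a hypothetical 325W load (50% of the CS650M's rating, where 80 Plus Gold requires roughly 90% efficiency) against an older 80%-efficient unit; the load point and the 80% comparison figure are our own assumptions:

```python
# Sketch: wall draw and waste heat at a given DC load and efficiency.
# The 325 W load and the 80% comparison unit are illustrative assumptions.

def wall_watts(dc_load_w, efficiency):
    return dc_load_w / efficiency

LOAD_W = 325.0  # ~50% load on a 650 W unit

for label, eff in [("80 Plus Gold (~90%)", 0.90), ("older 80% unit", 0.80)]:
    wall = wall_watts(LOAD_W, eff)
    print(f"{label}: {wall:.0f} W at the wall, {wall - LOAD_W:.0f} W lost as heat")

# 80 Plus Gold (~90%): 361 W at the wall, 36 W lost as heat
# older 80% unit:      406 W at the wall, 81 W lost as heat
```

Less waste heat is also part of why the thermally controlled fan can stay slow and quiet at moderate loads.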

Corsair CS Series Modular PSU Key Features: (from the Corsair website)

4-Features.jpg

Please continue reading our Corsair CS650M power supply review!!!

Subject: Storage
Manufacturer: OCZ Technology

Introduction, Specifications and Packaging

Introduction:

As of yesterday, the OCZ we all knew was officially acquired by Toshiba. They are now referred to as OCZ Storage Solutions, acting as a wholly owned subsidiary of Toshiba Group:

ocz-toshiba-logo-new.JPG

This deal has been in the works for a while now, and while some suspected OCZ might be going under, they have continued to release new drives. The acquisition is more beneficial to OCZ than you might think, in that they now have much better access to Toshiba flash memory. Further, they can likely purchase it at better costs than are available to those outside the new parent company's umbrella.

Today is no different, and OCZ is ringing in the pairing with a new product launch:

2014-01-20 21-55-27.JPG

Let's jump right into the specs:

Specifications:

specs.png

OCZ also provided a comparison against prior models:

comparison.png

This new model, just like the Vector 150, sports Toshiba 19nm flash. It uses a slightly newer version of the Barefoot 3 controller, but with a lower endurance spec and a shorter warranty period.

Continue reading our review of the OCZ Vertex 460 240GB SSD!!

Subject: Motherboards
Manufacturer: ECS

Introduction and Technical Specifications

Introduction

02-Z87H3-A3X_V1_LGA_1150_Intel_Motherboard_2new_0.jpg

Courtesy of ECS

The Z87H3-A3X is ECS' latest release in their L337 Gaming board line. Similar to the A2X Extreme board, ECS designed the Z87H3-A3X with a liberal amount of gold, from the gold plating on its capacitors to the gold tint on its integrated heat sinks. With its Intel Z87 chipset and performance-oriented components, this board is a steal at its $119.99 MSRP.

03-Z87H3-A3X_V1_LGA_1150_Intel_Motherboard_3new_0.jpg

Courtesy of ECS

ECS powers the Z87H3-A3X motherboard with a 6-phase digital power regulation system to ensure consistent power delivery to the CPU under all operating circumstances. The Z87H3-A3X includes the following integrated features: six SATA 6Gb/s ports and one eSATA port; an Intel GigE NIC; two PCI-Express x16 slots for up to dual-card support; four PCI-Express x1 slots; and USB 2.0 and 3.0 port support. The 80P button configures which information is shown on the diagnostic display once the board has successfully initialized.

04-power_0.jpg

Courtesy of ECS

05-heatsinks_0.jpg

Courtesy of ECS

As part of the L337 Gaming series, the Z87H3-A3X includes ECS' Durathon power delivery technology and an innovative cooling solution. The board's heat sinks are designed to enhance airflow over key components to aid in cooling, while the Durathon system incorporates enhanced power modules to maximize performance and minimize failure potential.

06-Z87H3-A3X_V1_LGA_1150_Intel_Motherboard_4new_0.jpg

Courtesy of ECS

Continue reading our review of the ECS Z87H3-A3X motherboard!

Author:
Subject: Mobile
Manufacturer: Lenovo

Lenovo introduces a unique form factor

Lenovo isn't a company that seems interested in slowing down.  Just when you think the world of notebooks is getting boring, it releases products like the ThinkPad Tablet 2 and the Yoga 2 Pro.  Today we are looking at another innovative product from Lenovo, the Yoga Tablet 8 and Yoga Tablet 10.  While the tablets share the Yoga branding seen in recent convertible notebooks, these are NOT Windows-based PCs - something that I fear some consumers might get confused by.  

Instead, this tablet pair is based on Android (4.2.2 at this point), which brings with it several advantages.  First, the battery life is impressive, particularly with the 8-in version that clocked in at more than 17 hours in our web browsing test!  Second, the form factor of these units is truly unique and not only allows for larger batteries but also a more comfortable in-the-hand feel than I have had with any other tablet.  

Check out the video overview below!  

You can pick up the 8-in version of the Lenovo Yoga Tablet for just $199 while the 10.1-in model starts at $274.

IMG_9325.JPG

The Lenovo Yoga Tablet is available in both 8-in and 10.1-in sizes, though the hardware is mostly identical between the two units, including screen resolution (1280x800) and SoC (MediaTek quad-core Cortex-A7).  The larger model does get an 8000 mAh battery (over the 6000 mAh in the 8-in), but that isn't enough to counterbalance the power draw of the larger screen.
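As a sanity check on those battery life numbers, a quick back-of-the-envelope calculation (assuming a typical 3.7 V nominal lithium cell voltage, which Lenovo does not publish) shows how little average power the platform must draw:

```python
# Back-of-the-envelope battery math; 3.7 V nominal is our assumption.

def watt_hours(capacity_mah, volts=3.7):
    return capacity_mah / 1000.0 * volts

def avg_draw_w(capacity_mah, hours, volts=3.7):
    return watt_hours(capacity_mah, volts) / hours

print(f"Yoga Tablet 8 pack:  {watt_hours(6000):.1f} Wh")                  # 22.2 Wh
print(f"Yoga Tablet 10 pack: {watt_hours(8000):.1f} Wh")                  # 29.6 Wh
print(f"8-in @ 17 hours:     ~{avg_draw_w(6000, 17):.1f} W average draw")  # ~1.3 W
```

An average draw near 1.3 W for screen-on web browsing is remarkably frugal, and it underlines how much the low-resolution panel and Cortex-A7 SoC contribute to that 17-hour result.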

IMG_9329.JPG

The 1280x800 resolution is a bit lower than I would like but is perfectly acceptable on the 8-in version of the Yoga Tablet.  On the 10-in model, though, the pixels are just too big and image quality suffers.  Both are currently running Android 4.2.2, which is fine, but hopefully we'll see some updates from Lenovo to more current Android versions.

Continue reading about the Lenovo Yoga Tablet 8 and Yoga Tablet 10 devices!!

Subject: General Tech
Manufacturer: PC Perspective

A Hard Decision

Welcome to our second annual (only chumps say first annual... crap) Best Hardware of the Year awards. This is where we argue the order of candidates in several categories on the podcast and, some time later, compile the results into an article. The majority of these categories select the best hardware of their grouping, but some look at more general trends in our industry.

As an aside, Google Monocle will win Best Hardware Ever 2014, 2015, and 2017. It will fail to be the best of all time for 2016, however.

If you would like to see the discussion as it unfolded, then you should definitely watch Episode 282, recorded January 2nd, 2014. You do not even need to navigate away, because we left it tantalizingly embedded below this paragraph. You know you want to enrich the next two hours of your life. Click it. Click it a few times if you have click-to-enable plugins active in your browser. You can stop clicking when you see the polygons dance. You will know it when you see it.

The categories were arranged as follows:

  • Best Graphics Card of 2013
  • Best CPU of 2013
  • Best Storage of 2013
  • Best Case of 2013
  • Best Motherboard of 2013
  • Best Price Drop of 2013
  • Best Mobile Device of 2013
  • Best Trend of 2013
  • Worst Trend of 2013

Each of the winners will be given our "Editor's Choice" award regardless of its actual badge in any review we conducted of it. This is because the product is the choice of our editors for this year even if it did not earn an "Editor's Choice" badge in its review. It may not even have been reviewed by us at all.

Also, the criteria for winning each category are left as vague as possible for maximum interpretation.

Continue reading our selection for Best Hardware of 2013!!

Author:
Manufacturer: Asus

A Refreshing Change

Refreshes are bad, right?  I guess that depends on who you talk to.  In the case of AMD, it is not a bad thing.  For people who live for cutting-edge technology in the 3D graphics world, it is not pretty.  Unfortunately for those people, reality has reared its ugly head.  Process technology is slowing down, but product cycles keep moving along at a healthy pace.  This essentially necessitates minor refreshes for both AMD and NVIDIA when it comes to their product stacks.  NVIDIA has taken the Kepler architecture to the latest GTX 700 series of cards.  AMD has done the same thing with the GCN architecture, but has radically changed the nomenclature of the products.

Gone are the days of the Radeon HD 7000 series.  Instead, AMD has renamed its GCN-based product stack as the Rx 2xx series.  The products we are reviewing here are the R9 280X and the R9 270X, formerly known as the HD 7970 and HD 7870 respectively.  These products differ slightly in clock speeds from the previous versions, but the differences are fairly minimal.  What is different are the prices: the R9 280X retails at $299 while the R9 270X comes in at $199.

asus_r9_01.png

Asus has taken these cards and applied its latest DirectCU II technology to them.  These improvements relate to design, component choices, and cooling.  They are all significant upgrades over the reference designs, especially when it comes to cooling.  It is good to see such a progression in design, but it is not entirely surprising given that the first HD 7000 series cards debuted in January 2012.

Click here to read the rest of the review!

Author:
Subject: Processors
Manufacturer: AMD

The AMD Kaveri Architecture

Kaveri: AMD’s New Flagship Processor

How big is Kaveri?  We already know the die size of it, but what kind of impact will it have on the marketplace?  Has AMD chosen the right path by focusing on power consumption and HSA?  Starting out an article with three questions in a row is a questionable tactic for any writer, but these are the things that first come to mind when considering a product the likes of Kaveri.  I am hoping we can answer a few of these questions by the end of this article, but alas it seems as though the market will have the final say as to how successful this new architecture is.

AMD has been pursuing the “Future is Fusion” line for several years, but it can be argued that Kaveri is truly the first “Fusion” product that completes the overall vision for where AMD wants to go.  The previous several generations of APUs were initially not all that integrated in a functional sense, but the complexity and completeness of that integration has been improved upon with each iteration.  Kaveri takes this integration to the next step, and one which fulfills the promise of a truly heterogeneous computing solution.  While AMD has the hardware available, we have yet to see if the software companies are willing to leverage the compute power afforded by a robust and programmable graphics unit powered by AMD’s GCN architecture.

(Editor's Note: The following two pages were written by our own Josh Walrath, discussing the technology and architecture of AMD Kaveri.  Testing and performance analysis by Ryan Shrout starts on page 3.)

Process Decisions

The first step in understanding Kaveri is taking a look at the process technology that AMD is using for this particular product.  Since AMD divested itself of their manufacturing arm, they have had to rely on GLOBALFOUNDRIES to produce nearly all of their current CPUs and APUs.  Bulldozer, Piledriver, Llano, Trinity, and Richland based parts were all produced on GF’s 32 nm PD-SOI process.  The lower power APUs such as Brazos and Kabini have been produced by TSMC on their 40 nm and 28 nm processes respectively.

kv01.jpg

Kaveri will take a slightly different approach here.  It will be produced by GLOBALFOUNDRIES, but it will forego the SOI and utilize a bulk silicon process.  28 nm HKMG is very common around the industry, but few pure-play foundries were willing to tailor their process to the direct needs of AMD and the Kaveri product.  GF was able to do so.  APUs are a different kind of animal when it comes to fabrication, primarily because the two disparate units require different characteristics to perform at the highest efficiency.  As such, compromises had to be made.

Continue reading our review of the new AMD Kaveri A8-7600 APU!!

Manufacturer: StarTech

Introduction and Design

PB063557.jpg

We’re always on the hunt for good docking stations, and sometimes it can be difficult to locate one when you aren’t afforded the luxury of a dedicated docking port. Fortunately, with the advent of USB 3.0 and the greatly improved bandwidth that comes along with it, the options have become considerably more robust.

Today, we’ll take a look at StarTech’s USB3SDOCKHDV, more specifically labeled the Universal USB 3.0 Laptop Docking Station - Dual Video HDMI DVI VGA with Audio and Ethernet (whew). This docking station carries an MSRP of $155 (currently selling for around $123 at resellers such as Amazon.com), placing it well above other StarTech options, such as the $100 USBVGADOCK2, which offers just one video output (VGA), 10/100 Ethernet, and four USB 2.0 ports.

The big selling points of the USB3SDOCKHDV are its addition of three USB 3.0 ports and Gigabit Ethernet—but most enticingly, its purported ability to provide three total screens simultaneously (including the connected laptop’s LCD) by way of dual HD video output. This video output can be achieved by way of either HDMI + DVI-D or HDMI + VGA combinations (but not by VGA + DVI-D). We’ll be interested to see how well this functionality works, as well as what sort of toll it takes on the CPU of the connected machine.
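Some quick bandwidth arithmetic hints at why the CPU gets involved at all: uncompressed 1080p60 video alone nearly fills USB 3.0's 5 Gbps signaling rate, so docks in this class typically compress frames on the host CPU before sending them over the wire. A sketch, with the display format figures as our own assumptions:

```python
# Raw video bandwidth vs. USB 3.0; display formats are assumed examples.

def raw_gbps(width, height, bpp=24, hz=60):
    """Uncompressed bandwidth for one display, in gigabits per second."""
    return width * height * bpp * hz / 1e9

one = raw_gbps(1920, 1080)        # ~2.99 Gbps per 1080p60 display
two = 2 * one                     # ~5.97 Gbps for dual displays
USB3_GBPS = 5.0                   # signaling rate, before protocol overhead

print(f"one 1080p60 display:  {one:.2f} Gbps")
print(f"two 1080p60 displays: {two:.2f} Gbps vs USB 3.0 at {USB3_GBPS} Gbps")
```

Dual uncompressed streams simply do not fit, which is why host-side compression (and therefore some CPU overhead) is unavoidable with this kind of dock.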

Continue reading our review of the StarTech USB3SDOCKHDV USB 3.0 Docking Station!!!

Author:
Subject: Storage
Manufacturer: Various

The stars are aligned

One of the most frequent questions we get at PC Perspective is some derivative of "is now the time to buy, or should I wait?"  If you listen to the PC Perspective Podcast or This Week in Computer Hardware, you'll know that I usually err on the side of purchasing now. Why should you hold yourself back from the enjoyment of technology unless something DRAMATIC is just over the horizon?

This week I got another such email that prompted me to do some thinking.  After just returning from CES 2014 in Las Vegas, I think it's fair to say that we didn't hear anything concrete about upcoming SSD plans that would really be considered monumental.  Sure, we saw plenty of PCIe SSDs as well as some M.2 options, but little for PC enthusiasts or even users looking to replace the hard drives in their PlayStation 4. Our team thinks that now is about as good a time to buy an SSD as you will get.

And while you are always going to see price drops on commodity goods like flash storage, the prices on some of our favorite SSDs are at a low that we haven't witnessed without the rebates and flash deals of Black Friday / Cyber Monday.  Let's take a look at a few:

Note: It should go without saying that all of these price discussions are as of this writing and could change...

ssd-evo1tb.jpg

Samsung 840 EVO 1TB SSD (Red: Amazon, Yellow: Newegg) - Graph courtesy HoverHound

The flagship of the Samsung 840 EVO series, and the personal favorite of Allyn and most of the rest of the PC Perspective team, is near its all-time low price at just $529 for the 1TB capacity.  That is a cost per GB of just $0.529; no rebates, no gimmicks.  

ssd-evo500gb.jpg

Samsung 840 EVO 500GB SSD (Red: Amazon, Yellow: Newegg) - Graph courtesy HoverHound

Likely the most popular purchase of the EVO series is the 500GB model, currently selling on Amazon for $309, or $0.618/GB.  Obviously that is a higher mark than the 1TB hits, but as you'll see in our tables below, in general, the higher the capacity you purchase, the better the value per GB you are going to find.  
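The value math is simple enough to run on any listing you come across; here is a minimal sketch using the two Amazon prices quoted above (prices will, of course, drift):

```python
# Cost-per-GB arithmetic; prices are the Amazon figures quoted above.

drives = {
    "Samsung 840 EVO 1TB": (529.00, 1000),
    "Samsung 840 EVO 500GB": (309.00, 500),
}

for name, (price_usd, capacity_gb) in drives.items():
    print(f"{name}: ${price_usd / capacity_gb:.3f}/GB")

# Samsung 840 EVO 1TB: $0.529/GB
# Samsung 840 EVO 500GB: $0.618/GB
```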

There are other capacities of the Samsung 840 EVO starting at 120GB, going to 250GB, and even a 750GB model; all are included in the pricing table below.  Depending on your budget and your need for the best perceived value, you can make a decision on your own.

ssd-evo.jpg

Let's not forget the other options on the market; Samsung may be the strongest player today, but companies like Intel, OCZ, and Corsair continue to have a strong presence.  The second best-selling SSD series during the holidays was the Intel 530, a line of drives that utilizes the LSI SandForce SF2281 controller.  How do they stack up price-wise?

Continue reading our analysis to determine if this is the best time to buy an SSD!!

Author:
Manufacturer: AMD

DisplayPort to Save the Day?

During an impromptu meeting with AMD this week, the company's Corporate Vice President for Visual Computing, Raja Koduri, presented me with an interesting demonstration of a technology that allowed the refresh rate of a display on a Toshiba notebook to perfectly match the render rate of the game demo being shown.  The result was an image that was smooth and free of tearing.  If that sounds familiar, it should: NVIDIA's G-Sync was announced in November of last year and does just that for desktop systems and PC gamers.

Since that November unveiling, I knew that AMD would need to respond in some way.  The company had basically been silent since learning of NVIDIA's release, but that changed for me today, and the information discussed is quite extraordinary.  AMD is jokingly calling the technology demonstration "FreeSync".

slides04.jpg

Variable refresh rates as discussed by NVIDIA.

During the demonstration, AMD's Koduri had two identical systems side by side, based on a Kabini APU.  Both were running a basic graphics demo of a rotating windmill.  One was a standard software configuration, while the other had a modified driver that communicated with the panel to enable variable refresh rates.  As you likely know from our various discussions about variable refresh rates and G-Sync technology from NVIDIA, this setup results in a much better gaming experience, as it produces smoother animation on the screen without the horizontal tearing associated with v-sync disabled.  

Obviously AMD wasn't using the same controller module that NVIDIA is using on its current G-Sync displays, several of which were announced this week at CES.  Instead, the internal connection on the Toshiba notebook was the key factor: Embedded DisplayPort (eDP) apparently has a feature to support variable refresh rates on LCD panels.  This feature was included for power savings on mobile and integrated devices, as refreshing the screen without new content can be a waste of valuable battery resources.  But, for performance and gaming considerations, this feature can be used to initiate a variable refresh rate meant to smooth out gameplay, as AMD's Koduri said.
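To make the benefit concrete, here is a toy timing model (our own illustrative numbers, not AMD's) comparing a fixed 60 Hz refresh with v-sync against a variable refresh for a frame that takes 20 ms to render:

```python
import math

# Toy model: when does a finished frame reach the screen?
# The 20 ms render time and 60 Hz panel are illustrative assumptions.

REFRESH_MS = 1000.0 / 60.0   # fixed scan every ~16.7 ms
render_ms = 20.0             # GPU finishes the frame after 20 ms

# Fixed refresh + v-sync: wait for the next scan boundary.
vsync_ms = math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# Variable refresh: the vertical blank stretches until the frame is
# ready, so the scan begins the moment rendering completes.
vrr_ms = render_ms

print(f"v-sync on 60 Hz:  shown after {vsync_ms:.1f} ms (~{1000/vsync_ms:.0f} FPS)")
print(f"variable refresh: shown after {vrr_ms:.1f} ms (~{1000/vrr_ms:.0f} FPS)")

# v-sync on 60 Hz:  shown after 33.3 ms (~30 FPS)
# variable refresh: shown after 20.0 ms (~50 FPS)
```

The v-synced frame waits for the next 16.7 ms boundary and effectively halves the frame rate, while the variable-refresh panel displays it immediately, with no tearing.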

Continue reading our thoughts on AMD's initial "FreeSync" variable refresh rate demonstration!!

Author:
Subject: Mobile
Manufacturer: NVIDIA

Once known as Logan, now known as K1

NVIDIA has bet big on Tegra.  Since the introduction of the SoC's first iteration, that much was clear.  With the industry push to mobile computing and the decreased importance of the classic PC design, developing and gaining traction with a mobile processor was not only an expansion of the company’s portfolio but a critical shift in the mindset of a graphics giant. 

The problem thus far is that while NVIDIA continues to enjoy success in the markets of workstation and consumer discrete graphics, the Tegra line of system-on-chip processors has faltered.  Design wins have been tough to come by. Other companies with feet already firmly planted on this side of the hardware fence continue to innovate and seal deals with customers.  Qualcomm is the dominant player for mobile processors, with Samsung, MediaTek, and others all fighting for the same customers NVIDIA needs.  While press conferences and releases have been all smiles and sunshine since day one, the truth is that Tegra hasn’t grown at the rate NVIDIA had hoped.

Solid products based on NVIDIA Tegra processors have been released.  The first Google Nexus 7 used the Tegra 3 processor and was considered by most to be the best Android tablet on the market until it was succeeded by the 2013 iteration of the Nexus 7.  Tegra 4 slipped backwards, though – the NVIDIA SHIELD mobile gaming device was the answer from a company eager to show the market it could build compelling and relevant hardware.  It has only partially succeeded in that task.

denver2.jpg

With today’s announcement of the Tegra K1, previously known as Logan or Tegra 5, NVIDIA hopes to once again spark a fire under partners and developers, showing them that NVIDIA’s dominance in PC graphics has clear benefits for the mobile segment as well.  During a meeting with NVIDIA about the Tegra K1, Dan Vivoli, Senior VP of marketing and a 16-year employee, equated the release of the K1 to the original GeForce GPU.  That is a lofty ambition, and it puts a lot of pressure on the entire Tegra team, not to mention the K1 product itself, to live up to.

Tegra K1 Overview

What we previously knew as Logan or Tegra 5 (and it was actually called Tegra 5 until just a couple of days ago) is now being released as the Tegra K1.  The ‘K’ designation indicates the graphics architecture that powers the SoC, in this case Kepler.  Also, it’s the first one.  So, K1.

The CPU complex of the Tegra K1 looks very familiar: four ARM Cortex-A15 “r3” cores and 2MB of L2 cache, with a fifth A15 core used for lower power situations.  This 4+1 design is the same one introduced with the Tegra 4 processor last year and allows NVIDIA to implement its own unique style of “big.LITTLE” design.  Some slight modifications to the cores are included with Tegra K1 that improve performance and efficiency, but not by much – the main CPU is very similar to the Tegra 4's.

NVIDIA also unveiled late last night another version of the Tegra K1 that replaces the four A15 cores with two of the company's custom-designed Denver CPU cores.  Project Denver, announced in early 2011, is NVIDIA's attempt at building its own core design based on the ARMv8 64-bit ISA.  This puts this iteration of the Tegra K1 on the same level as Apple's A7 and Qualcomm's Krait processors.  When these are finally available in the wild, it will be incredibly intriguing to see how well NVIDIA's architects did in the first true CPU design from the GPU giant.

Continue reading about NVIDIA's new Tegra K1 SoC with Kepler-based graphics!

PC Perspective's CES 2014 coverage is sponsored by AMD.

Follow all of our coverage of the show at http://pcper.com/ces!

Manufacturer: NVIDIA

Introduction and Unboxing

Introduction:

We've been covering NVIDIA's new G-Sync tech for quite some time now, and displays so equipped are finally shipping. With all of the excitement going on, I became increasingly interested in the technology, especially since I'm one of those guys who is extremely sensitive to input lag and the inevitable image tearing that results from vsync-off gaming. Increased discussion on our weekly podcast, coupled with the inherent difficulty of demonstrating the effects without seeing G-Sync in action in person, led me to pick up my own ASUS VG248QE panel for the purpose of this evaluation and review. We've generated plenty of other content revolving around the G-Sync tech itself, so let's get straight into what we're after today: evaluating the out-of-box installation process of the G-Sync upgrade kit.

DSC01253.JPG

Unboxing:

DSC01225.JPG

DSC01226.JPG

All items are well packed and protected.

DSC01229.JPG

Included are installation instructions, a hard plastic spudger for opening the panel, a couple of stickers, and all necessary hardware bits to make the conversion.

Read on for the full review!

Subject: Motherboards
Manufacturer: GIGABYTE

Introduction and Technical Specifications

Introduction

8022_big.jpg

Courtesy of GIGABYTE

The GIGABYTE G1.Sniper 5 motherboard is among GIGABYTE's flagship boards supporting the fourth generation of the Intel Core processor line through the integrated Z87 chipset. The board offers support for the newest generation of Intel LGA1150-based processors with all the integrated features and port support you've come to expect from a high-end board. At an MSRP of $419.99, the G1.Sniper 5's premium price is matched only by its premium and expansive feature set.

8023_big.jpg

Courtesy of GIGABYTE

8283.jpg

Courtesy of GIGABYTE

GIGABYTE packed the G1.Sniper 5 full of premium features to ensure its viability as a top-rated contender. The board features Ultra Durable 5 Plus power technology and Amp-Up Audio technology. Ultra Durable 5 Plus brings several high-end power components into the board's design: International Rectifier (IR) manufactured PowIRstage™ ICs and PWM controllers, Nippon Chemi-con manufactured Black Solid capacitors with a 10k-hour operational rating at 105°C, 15 micron gold plating on the CPU socket pins, and two 0.070mm copper layers embedded into the PCB for optimal heat dissipation. GIGABYTE's Amp-Up Audio technology integrates an op-amp socket into the board's audio PCB, giving the user the ability to customize their listening experience. Additionally, the G1.Sniper 5 has the following integrated features: 10 SATA 6Gb/s ports; dual GigE NICs - an Intel NIC and a Qualcomm Killer NIC; four PCI-Express x16 slots for up to quad-card NVIDIA SLI or AMD CrossFire support; three PCI-Express x1 slots; onboard power, reset, and BIOS reset buttons; switch-BIOS and Dual-BIOS switches; a 2-digit diagnostic LED display; integrated voltage measurement points; and USB 2.0 and 3.0 port support.

8024_big.jpg

Courtesy of GIGABYTE

Continue reading our review of the GIGABYTE G1.Sniper 5 Z87 motherboard!