The AMD Kaveri Architecture
Kaveri: AMD’s New Flagship Processor
How big is Kaveri? We already know the die size of it, but what kind of impact will it have on the marketplace? Has AMD chosen the right path by focusing on power consumption and HSA? Starting out an article with three questions in a row is a questionable tactic for any writer, but these are the things that first come to mind when considering a product the likes of Kaveri. I am hoping we can answer a few of these questions by the end of this article, but alas it seems as though the market will have the final say as to how successful this new architecture is.
AMD has been pursuing the “Future is Fusion” line for several years, but it can be argued that Kaveri is truly the first “Fusion” product that completes the overall vision for where AMD wants to go. The previous several generations of APUs were initially not all that integrated in a functional sense, but the complexity and completeness of that integration has been improved upon with each iteration. Kaveri takes this integration to the next step, and one which fulfills the promise of a truly heterogeneous computing solution. While AMD has the hardware available, we have yet to see if the software companies are willing to leverage the compute power afforded by a robust and programmable graphics unit powered by AMD’s GCN architecture.
(Editor's Note: The following two pages were written by our own Josh Walrath, discussing the technology and architecture of AMD Kaveri. Testing and performance analysis by Ryan Shrout starts on page 3.)
The first step in understanding Kaveri is taking a look at the process technology that AMD is using for this particular product. Since AMD divested itself of their manufacturing arm, they have had to rely on GLOBALFOUNDRIES to produce nearly all of their current CPUs and APUs. Bulldozer, Piledriver, Llano, Trinity, and Richland based parts were all produced on GF’s 32 nm PD-SOI process. The lower power APUs such as Brazos and Kabini have been produced by TSMC on their 40 nm and 28 nm processes respectively.
Kaveri will take a slightly different approach here. It will be produced by GLOBALFOUNDRIES, but it will forgo SOI and utilize a bulk silicon process. 28 nm HKMG is very common around the industry, but few pure-play foundries were willing to tailor their process to the direct needs of AMD and the Kaveri product. GF was willing to do exactly that. APUs are a different kind of animal when it comes to fabrication, primarily because the two disparate units require different characteristics to perform at the highest efficiency. As such, compromises had to be made.
Introduction and Design
We’re always on the hunt for good docking stations, and sometimes it can be difficult to locate one when you aren’t afforded the luxury of a dedicated docking port. Fortunately, with the advent of USB 3.0 and the greatly improved bandwidth that comes along with it, the options have become considerably more robust.
Today, we’ll take a look at StarTech’s USB3SDOCKHDV, more specifically labeled the Universal USB 3.0 Laptop Docking Station - Dual Video HDMI DVI VGA with Audio and Ethernet (whew). This docking station carries an MSRP of $155 (currently selling for $123 on Amazon.com) and is well above other StarTech options (such as the $100 USBVGADOCK2, which offers just one video output—VGA—10/100 Ethernet, and four USB 2.0 ports). In terms of street price, it is currently available at resellers such as Amazon for around $125.
The big selling points of the USB3SDOCKHDV are its addition of three USB 3.0 ports and Gigabit Ethernet—but most enticingly, its purported ability to provide three total screens simultaneously (including the connected laptop’s LCD) by way of dual HD video output. This video output can be achieved by way of either HDMI + DVI-D or HDMI + VGA combinations (but not by VGA + DVI-D). We’ll be interested to see how well this functionality works, as well as what sort of toll it takes on the CPU of the connected machine.
Continue reading our review of the StarTech USB3SDOCKHDV USB 3.0 Docking Station!
The stars are aligned
One of the most frequent questions we get at PC Perspective is some derivative of "is now the time to buy or should I wait?" If you listen to the PC Perspective Podcast or This Week in Computer Hardware you'll know that I usually err on the side of purchasing now. Why should you hold yourself back from the enjoyment of technology unless something DRAMATIC is just over the horizon?
This week I got another such email that prompted me to do some thinking. Having just returned from CES 2014 in Las Vegas, I think it's fair to say that we didn't hear anything concrete about upcoming SSD plans that could really be considered monumental. Sure, we saw plenty of PCIe SSDs as well as some M.2 options, but little for PC enthusiasts or even users who are looking to replace the hard drives in their PlayStation 4. Our team thinks that now is about as good a time to buy an SSD as you will get.
And while you are always going to see price drops on commodity goods like flash storage, the prices on some of our favorite SSDs are at a low that we haven't witnessed without the rebates and flash deals of Black Friday / Cyber Monday. Let's take a look at a few:
Note: It should go without saying that all of these price discussions are as of this writing and could change...
Samsung 840 EVO 1TB SSD (Red: Amazon, Yellow: Newegg) - Graph courtesy HoverHound
The flagship of the Samsung 840 EVO series, also the personal favorite of Allyn and most of the rest of the PC Perspective team, is near its all-time low in price at just $529 for the 1TB capacity. That is a cost per GB of just $0.529; no rebates, no gimmicks.
Samsung 840 EVO 500GB SSD (Red: Amazon, Yellow: Newegg) - Graph courtesy HoverHound
Likely the most popular of the EVO series is the 500GB model, currently selling on Amazon for $309, or $0.618/GB. Obviously that is a higher mark than the 1TB drive hits, but as you'll see in our tables below, the higher the capacity you purchase, the better the value per GB you are generally going to find.
The Samsung 840 EVO also comes in other capacities, starting at 120GB and running through 250GB and even a 750GB model; all are included in the pricing table below. Depending on your budget and your need for the best perceived value, you can make the decision on your own.
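For anyone who wants to rerun the cost-per-GB math as prices move, the calculation is trivial. A minimal sketch, using the prices quoted above (which will certainly have changed by the time you read this):

```python
# Cost-per-GB for the SSD prices quoted above (prices in USD, as of
# this writing; they will drift over time).
drives = {
    "Samsung 840 EVO 1TB": (529.00, 1000),
    "Samsung 840 EVO 500GB": (309.00, 500),
}

for name, (price, capacity_gb) in drives.items():
    print(f"{name}: ${price / capacity_gb:.3f}/GB")
# The 1TB model works out to $0.529/GB and the 500GB to $0.618/GB,
# matching the figures in the text.
```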
Let's not forget the other options on the market; Samsung may be the strongest player today, but companies like Intel, OCZ and Corsair continue to have a strong presence. The second-best-selling SSD series during the holidays was the Intel 530 series of drives, which utilize the LSI SandForce SF2281 controller. How do they stack up price-wise?
DisplayPort to Save the Day?
During an impromptu meeting with AMD this week, the company's Corporate Vice President for Visual Computing, Raja Koduri, presented me with an interesting demonstration of a technology that allowed the refresh rate of a display on a Toshiba notebook to perfectly match with the render rate of the game demo being shown. The result was an image that was smooth and with no tearing effects. If that sounds familiar, it should. NVIDIA's G-Sync was announced in November of last year and does just that for desktop systems and PC gamers.
Since that November unveiling, I knew that AMD would need to respond in some way. The company had basically been silent since learning of NVIDIA's release but that changed for me today and the information discussed is quite extraordinary. AMD is jokingly calling the technology demonstration "FreeSync".
Variable refresh rates as discussed by NVIDIA.
During the demonstration, AMD's Koduri had two identical systems side by side, each based on a Kabini APU. Both were running a basic graphics demo of a rotating windmill. One was a standard software configuration, while the other had a modified driver that communicated with the panel to enable variable refresh rates. As you likely know from our various discussions about variable refresh rates and NVIDIA's G-Sync technology, this setup results in a much better gaming experience, producing smoother animation on the screen without the horizontal tearing associated with disabling v-sync.
Obviously AMD wasn't using the same controller module that NVIDIA is using on its current G-Sync displays, several of which were announced this week at CES. Instead, the internal connection on the Toshiba notebook was the key factor: Embedded DisplayPort (eDP) apparently has a feature to support variable refresh rates on LCD panels. The feature was originally included for power savings on mobile and integrated devices, since refreshing the screen without new content can be a waste of valuable battery resources. But, as AMD's Koduri said, for performance and gaming considerations this same feature can be used to drive a variable refresh rate that smooths out game play.
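The benefit is easy to see with a toy timing model (the render times below are hypothetical, and this is an illustration of the concept, not AMD's or eDP's actual implementation). With a fixed 60 Hz scan-out and v-sync on, every finished frame waits for the next refresh boundary, so uneven render times become visible judder; a variable refresh updates the panel the moment each frame is ready:

```python
# Illustrative only: when frames reach the screen under a fixed 60 Hz
# refresh (v-sync on) versus a variable refresh rate.
REFRESH_MS = 1000 / 60                  # fixed scan-out interval, ~16.7 ms
render_ms = [20, 24, 18, 30, 22]        # hypothetical per-frame render times

fixed, variable, t = [], [], 0.0
for r in render_ms:
    t += r                              # moment the frame finishes rendering
    fixed.append(-(-t // REFRESH_MS) * REFRESH_MS)  # wait for next refresh (ceil)
    variable.append(t)                  # panel refreshes immediately

print(fixed)     # display times quantized to 16.7 ms steps -> uneven pacing
print(variable)  # display times track render completion -> smooth pacing
```

In the fixed case the frame-to-frame display intervals come out as 16.7, 16.7, 33.3, and 16.7 ms (one frame sits on screen for two full refreshes), while the variable case reproduces the 24, 18, 30, 22 ms render cadence exactly.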
Once known as Logan, now known as K1
NVIDIA has bet big on Tegra. Since the introduction of the SoC's first iteration, that much was clear. With the industry push to mobile computing and the decreased importance of the classic PC design, developing and gaining traction with a mobile processor was not only an expansion of the company’s portfolio but a critical shift in the mindset of a graphics giant.
The problem thus far is that while NVIDIA continues to enjoy success in the markets of workstation and consumer discrete graphics, the Tegra line of system-on-chip processors has faltered. Design wins have been tough to come by. Other companies with feet already firmly planted on this side of the hardware fence continue to innovate and seal deals with customers. Qualcomm is the dominant player for mobile processors, with Samsung, MediaTek, and others all fighting for the same customers NVIDIA needs. While press conferences and releases have been all smiles and sunshine since day one, the truth is that Tegra hasn't grown at the rate NVIDIA had hoped.
Solid products based on NVIDIA Tegra processors have been released. The first Google Nexus 7 used the Tegra 3 processor and was considered by most to be the best Android tablet on the market until it was succeeded by the 2013 iteration of the Nexus 7. Tegra 4 slipped backwards, though: the NVIDIA SHIELD mobile gaming device was the answer for a company eager to show the market it could build compelling and relevant hardware, and it has only partially succeeded in that task.
With today's announcement of the Tegra K1, previously known as Logan or Tegra 5, NVIDIA hopes to once again spark a fire under partners and developers, showing them that NVIDIA's dominance in PC graphics has clear benefits for the mobile segment as well. During a meeting with NVIDIA about the Tegra K1, Dan Vivoli, Senior VP of marketing and a 16-year employee, equated the release of the K1 to the original GeForce GPU. That is a lofty ambition, and it puts a lot of pressure on the entire Tegra team, not to mention the K1 product itself, to live up to the comparison.
Tegra K1 Overview
What we previously knew as Logan or Tegra 5 (and it was actually called Tegra 5 until just a couple of days ago) is now being released as the Tegra K1. The 'K' designation indicates the graphics architecture that powers the SoC, in this case Kepler. Also, it's the first one. So, K1.
The CPU cores of the Tegra K1 look very familiar: four ARM Cortex-A15 "r3" cores with 2MB of L2 cache, plus a fifth A15 core used for lower power situations. This 4+1 design is the same one introduced with the Tegra 4 processor last year and allows NVIDIA to implement its own unique style of "big.LITTLE" design. Some slight modifications to the cores are included with Tegra K1 that improve performance and efficiency, but not by much; the main CPU is very similar to the Tegra 4's.
NVIDIA also unveiled late last night another version of the Tegra K1 that replaces the quad A15 cores with two of the company's custom-designed Denver CPU cores. Project Denver, announced in early 2011, is NVIDIA's attempt at building its own core design based on the ARMv8 64-bit ISA. This puts this iteration of the Tegra K1 on the same level as custom-core designs like Apple's A7 and Qualcomm's Krait processors. When these are finally available in the wild it will be incredibly intriguing to see how well NVIDIA's architects did in the first true CPU design from the GPU giant.
Follow all of our coverage of the show at http://pcper.com/ces!
Introduction and Unboxing
We've been covering NVIDIA's new G-Sync tech for quite some time now, and displays so equipped are finally shipping. With all of the excitement going on, I became increasingly interested in the technology, especially since I'm one of those guys who is extremely sensitive to input lag and the inevitable image tearing that results from vsync-off gaming. Increased discussion on our weekly podcast, coupled with the inherent difficulty of demonstrating the effects without seeing G-Sync in action in person, led me to pick up my own ASUS VG248QE panel for the purpose of this evaluation and review. We've generated plenty of other content revolving around the G-Sync tech itself, so let's get straight into what we're after today - evaluating the out-of-box experience of the G-Sync installation kit.
All items are well packed and protected.
Included are installation instructions, a hard plastic spudger for opening the panel, a couple of stickers, and all necessary hardware bits to make the conversion.
Introduction and Technical Specifications
Courtesy of GIGABYTE
The GIGABYTE G1.Sniper 5 motherboard is among GIGABYTE's flagship boards supporting the fourth generation of the Intel Core processor line through the integrated Z87 chipset. The board offers support for the newest generation of Intel LGA1150-based processors with all the integrated features and port support you've come to expect from a high-end board. At an MSRP of $419.99, the G1.Sniper 5's premium price is matched only by its premium and expansive feature set.
Courtesy of GIGABYTE
Courtesy of GIGABYTE
GIGABYTE packed the G1.Sniper 5 full of premium features to ensure its viability as a top-rated contender. The board features Ultra Durable 5 Plus power technology and Amp-Up Audio technology. Ultra Durable 5 Plus brings several high-end power components into the board's design: International Rectifier (IR) manufactured PowIRstage™ ICs and PWM controllers, Nippon Chemi-Con manufactured Black Solid capacitors with a 10,000-hour operational rating at 105°C, 15 micron gold plating on the CPU socket pins, and two 0.070mm copper layers embedded into the PCB for optimal heat dissipation. GIGABYTE's Amp-Up Audio technology integrates an op-amp socket into the board's audio PCB, giving the user the ability to customize their audio listening experience. Additionally, the G1.Sniper 5 has the following integrated features: 10 SATA 6Gb/s ports; dual GigE NICs - an Intel NIC and a Qualcomm Killer NIC; four PCI-Express x16 slots for up to quad-card NVIDIA SLI or AMD CrossFire support; three PCI-Express x1 slots; onboard power, reset, and BIOS reset buttons; switch-BIOS and Dual-BIOS switches; a 2-digit diagnostic LED display; integrated voltage measurement points; and USB 2.0 and 3.0 port support.
Courtesy of GIGABYTE
Small form factor cases and the push to Mini ITX designs gained dramatic momentum during 2013 as the smaller PC once again became a popular trend. Though Shuttle, a company that hardly exists in the form it did in 2004, was the first PC hardware company to really drive home the idea of an SFF system design, many other players have since released compelling products, helping to strengthen the form factor as one of the unique possibilities for enthusiast PCs.
Even better, though a Mini-ITX based platform could mean limited options for hardware and performance, with companies like ASUS, EVGA, BitFenix and others in the mix, building an incredibly fast and powerful gaming machine using small hardware is not only easy but can be done at a lower price than you might expect.
One entry that found its way to our offices this December comes from Silverstone in the form of the Raven Z RVZ01 case. This case includes unique features and capabilities, including the ability to support nearly any high end graphics card on the market (dual slot or single), space for larger heatsinks and even liquid coolers, along with a home theater friendly look and style. Oh, and it's almost the same design that Valve used for its beta Steam Machines as well. (Update: It turns out the Steam Machine is actually a fair bit smaller than the Silverstone RVZ01.)
Sapphire Triple Fan Hawaii
It was mid-December when the very first custom cooled AMD Radeon R9 290X card hit our offices in the form of the ASUS R9 290X DirectCU II. It was cooler, quieter, and faster than the reference model; that is a combination that is hard to pass up (if only you could buy it). More and more of these custom models, in both R9 290 and R9 290X flavors, are filtering their way into PC Perspective. Next on the chopping block is the Sapphire Tri-X model of the R9 290X.
Sapphire's triple fan cooler already made quite an impression on me when we tested a version of it on the R9 280X retail round up from October. It kept the GPU cool but it was also the loudest of the retail cards tested at the time. For the R9 290X model, Sapphire has made some tweaks to the fan speeds and the design of the cooler which makes it a better overall solution as you will soon see.
The key goals for any custom cooled AMD R9 290/290X card are to beat AMD's reference cooler on performance, noise, and clock rate consistency. Does Sapphire meet these goals?
The Sapphire R9 290X Tri-X 4GB
While the ASUS DirectCU II card was taller and more menacing than the reference design, the Sapphire Tri-X cooler is longer and sleeker than the competition thus far. The bright yellow and black color scheme is both attractive and unique, though it does lack the LED light that the 280X showcased.
Sapphire has overclocked this model slightly, to 1040 MHz on the GPU clock, which puts it in good company.
|  | AMD Radeon R9 290X | ASUS R9 290X DirectCU II | Sapphire R9 290X Tri-X |
| --- | --- | --- | --- |
| Rated Clock | 1000 MHz | 1050 MHz | 1040 MHz |
| Memory Clock | 5000 MHz | 5400 MHz | 5200 MHz |
| TDP | ~300 watts | ~300 watts | ~300 watts |
| Peak Compute | 5.6 TFLOPS | 5.6+ TFLOPS | 5.6+ TFLOPS |
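For reference, those peak compute figures fall straight out of the shader count and clock speed: each GCN stream processor can execute one fused multiply-add (two FLOPs) per clock, and the Hawaii GPU carries 2816 of them. A quick sketch of the arithmetic:

```python
# Peak single-precision throughput for a GCN GPU:
# 2 FLOPs (one FMA) per stream processor per clock.
def peak_tflops(stream_processors: int, clock_mhz: float) -> float:
    return 2 * stream_processors * clock_mhz * 1e6 / 1e12

HAWAII_SPS = 2816  # stream processors on the R9 290X
print(peak_tflops(HAWAII_SPS, 1000))  # reference card: 5.632 TFLOPS
print(peak_tflops(HAWAII_SPS, 1040))  # Sapphire Tri-X: ~5.86 TFLOPS
```

Note that the table's figures assume the card actually sustains its rated clock, which, as we discuss elsewhere in this review, is not a given for Hawaii cards.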
There are three fans on the Tri-X design, as the name would imply, but all three are the same size, unlike the R9 280X design with its smaller central fan.
Introduction and Technical Specifications
Courtesy of Phanteks
A relative newcomer in the enthusiast space, Phanteks has won hearts and minds with its high-performance and innovatively designed thermal cooling solutions. The PH-TC12DX cooler features a massive single-tower radiator actively cooled by two 120mm fans, with a copper, nickel-plated CPU base plate. The cooler comes packaged with support for all current Intel and AMD CPU socket offerings. To properly gauge the PH-TC12DX's performance, we put it up against several similarly-classed air and water-based cooling solutions. At a retail price of $54.99, the Phanteks PH-TC12DX offers solid performance without breaking the bank.
Courtesy of Phanteks
Courtesy of Phanteks
Courtesy of Phanteks
The Phanteks PH-TC12DX cooler consists of a single tower radiator with four U-shaped heat pipes intersecting its cooling fins. The cooler uses nickel-plated copper heat pipes to transfer heat from the copper CPU base plate to the fins of the aluminum radiator for optimal heat transmission and dispersal. The tower is sandwiched between two high-airflow 120mm fans for heat dispersion from the radiator. Phanteks went to great lengths to make sure the PH-TC12DX keeps a sleek appearance, from the black coloration of the radiator to the branded top plate that hides the heat pipe termination points.
Courtesy of Phanteks
Phanteks includes everything you need to get the cooler up and running in your system: mounting kits supporting both Intel and AMD-based systems, dual PH-F120HP 120mm fans, fan mounting kits, a sleeved dual-ended fan power cable, and Phanteks PH-NDC thermal paste.
Introduction and Features
We have been reviewing Seasonic power supplies for over ten years here at PC Perspective and they have never failed to impress us. Seasonic is also one of the few companies that actually builds their own power supplies (along with supplying units to numerous other big-name brands). Seasonic has built a stellar reputation for producing some of the best PC power supplies on the market today. In their ongoing pursuit to continuously improve their products, Seasonic recently introduced the S12G Series, which includes four models: S12G-450, S12G-550, S12G-650, and the S12G-750 that we will be taking a detailed look at in this review. Here are a few of the highlights offered by the new S12G Series power supplies.
• 80Plus Gold certification
• Standard all in one, flat black cabling
• High +12V Output
• Smart and Silent Fan Control (S2FC)
• S12G-750/650: PCI-E 8P/6P x 4, SATA x 10, 4P Molex x 4, FDD x 1
• S12G-550/450: PCI-E 8P/6P x 2, SATA x 8, 4P Molex x 3, FDD x 1
• Worldwide 5-Year Warranty
The S12G Series is targeted towards gamers and PC enthusiasts who want solid performance at a user friendly price. To accomplish this, Seasonic has designed the S12G Series on the same basic platform as many of their premium products but has forgone a few features, like modular cables and fanless operation, for price-conscious consumers. Retail prices currently range from $79.99 USD for the S12G-450 to $109.99 USD for the S12G-750 (newegg.com, November 2013).
Here is what Seasonic has to say about the new S12G Series: “The S12G Series is the newest addition to Seasonic’s families of award winning retail products, representing the latest innovation of our engineering team. To meet the demands of users who are looking for reliable 80Plus Gold performance for gaming and overall usage, the S12G Series is designed to support Intel’s Haswell processors, features more SATA cables and is an affordable solution for a wide range of applications.”
Seasonic S12G Series Key Features:
(Courtesy of Seasonic)
EVGA Enters the Chassis Market
Cases are a funny thing. Some people spend more time fretting over the chassis of their new system than any other component while some builders simply could not care less about what "box" is holding the carefully selected components that power their gaming rig. While I can see both points of view, I think it is a shame to completely ignore the "look" of your system as it will be the one part of your design choices that you'll see on a daily basis.
EVGA has a great reputation in the enthusiast market thanks to its top of the line graphics cards and the emphasis of the company on enthusiast level products, water cooling and more. In recent years EVGA has branched into motherboards (again), power supplies and now cases. But rather than target a market that was saturated and dominated by a few big players, they decided to target the Mini ITX form factor. Having just recently released the Z87 Stinger Mini ITX motherboard, EVGA has created an ecosystem that allows a builder to use exclusively EVGA components with the new Hadron Air case.
In our video review below you'll see our overview of the design, the positives and negatives of the design and of small cases in general, and my final thoughts on this rather impressive mITX design. After you are done watching it, head down to the collection of photos below for a written analysis of the Hadron Air.
If you have never worked in a small form factor PC before, let me warn you up front - this is not as simple a process as building a computer in a standard ATX case. Space is tight, and doing simple things like routing cables from the motherboard to the hard drive can be a 10 minute ordeal. Additionally, sometimes the ORDER of installation can make a HUGE difference in the ease of the entire process, so pay attention to other users that might have used your particular chassis.
Don't let its small size fool you though, the EVGA Hadron Air can pack a lot of punch. Using the latest mITX motherboard and graphics cards from EVGA's lineup you can literally build one of the most powerful gaming systems around in its small 6x12x12-in space!
What is the Hardware Leaderboard?
What is a Leaderboard? If you have to ask, you really haven't clicked on enough of the tabs at the top of PC Perspective! The Leaderboard consists of four different systems, each with a price target, updated monthly. They start with the ~$500 budget system, intended for general family or dorm usage rather than heavy gaming, though it can certainly handle many online games without issue. The Mid Range machine can be yours for around $1000 and packs enough power under the hood to handle productivity software; it can also give a console a run for its money when gaming. Things start getting more serious with the High End machine: even while keeping the price around $1500, you start to see serious performance that will show you why PC gaming is still far more popular than some would have you believe. Finally there is the Dream Machine, which doesn't have a specific price cap but is limited by a certain amount of common sense; you can slap four GPUs in a system, but you really will not be getting a great return on your investment, as performance scaling does not continue to increase at a linear pace.
You may notice several components missing from the HWLB, and there is a reason for that. Enclosures are a very personal choice for system builders, and no one's desires are exactly the same. Dremel owners with a good imagination want a case that is easily moddable, while pet owners want washable filters on their systems. Some may want a giant white case, others an unobtrusive and quiet enclosure, and who can tell where you prefer your front panel connectors to be but you? Cooling solutions are again a personal choice: do you plan on getting the biggest chunk of metal you can find with three 140mm fans strapped to it, or were you thinking of watercooling, either a self-contained CPU cooler or a custom built loop that incorporates multiple components? The same applies to monitors, with some gamers preferring to sacrifice colour quality and viewing angles for the refresh rates of a TN display, while others need a professional quality display at over $1000 for when they are working. Size is always personal; just how big a machine can you fit in your place? (Editor's note: we did include a couple of case recommendations in the build guide summary tables, in case you are interested.)
So continue on to see the components that make up the current four builds of the Hardware Leaderboard. Once you have all your components you can reference Ryan's videos covering the installation of the parts into the case of your choice as well as installing your OS and Steam so you can get right to gaming and surfing.
The First Custom R9 290X
It has been a crazy launch for the AMD Radeon R9 series of graphics cards. When we first reviewed both the R9 290X and the R9 290, we came away very impressed with the GPU and the performance it provided. Our reviews of both products resulted in awards of the Gold class. The 290X was a new class of single GPU performance while the R9 290 nearly matched performance at a crazy $399 price tag.
But there were issues. Big, glaring issues. Clock speeds had a huge amount of variance depending on the game and we saw a GPU that was rated as "up to 1000 MHz" running at 899 MHz in Skyrim and 821 MHz in Bioshock Infinite. Those are not insignificant deltas in clock rate that nearly perfectly match deltas in performance. These speeds also changed based on the "hot" or "cold" status of the graphics card - had it warmed up and been active for 10 minutes prior to testing? If so, the performance was measurably lower than with a "cold" GPU that was just started.
That issue was not necessarily a deal killer; rather, it just made us rethink how we test GPUs. The much bigger deal was that many people were seeing lower performance on retail purchased cards than on the reference cards sent to the press for reviews. In our testing in November, the retail card we purchased, which used the exact same cooler as the reference model, ran 6.5% slower than we expected.
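To put those observations in perspective, here is the quick arithmetic on the clock deficits quoted above; since performance on these cards scales almost directly with clock speed, the percentages map nearly one-to-one onto lost performance (treat this as a back-of-the-envelope illustration):

```python
# Percentage deficit from the rated "up to 1000 MHz" clock for the
# observed speeds mentioned above.
RATED_MHZ = 1000
observed_mhz = {"Skyrim": 899, "Bioshock Infinite": 821}

for game, mhz in observed_mhz.items():
    deficit_pct = (RATED_MHZ - mhz) / RATED_MHZ * 100
    print(f"{game}: {deficit_pct:.1f}% below rated clock")
# Skyrim comes out 10.1% low and Bioshock Infinite 17.9% low.
```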
The obvious hope was the retail cards with custom PCBs and coolers would be released from AMD partners and somehow fix this whole dilemma. Today we see if that was correct.
A slightly smaller MARS
The NVIDIA GeForce GTX 760 was released in June of 2013. Based on the same GK104 GPU as the GTX 680, GTX 670 and GTX 770, the GTX 760 disabled a couple more of the clusters of processor cores to offer up impressive performance levels for a lower cost than we had seen previously. My review of the GTX 760 was very positive as NVIDIA had priced it aggressively against the competing products from AMD.
As for ASUS, they have a storied history with the MARS brand. Typically an over-built custom PCB with two of the highest end NVIDIA GPUs stapled together, the ASUS MARS cards have been limited edition products with a lot of cachet around them. The first MARS card was a dual GTX 285 product that was the first card to offer 4GB of memory (though 2GB per GPU, of course). The MARS II took a pair of GTX 580 GPUs and pasted them on a HUGE card, and just 1000 of them were sold worldwide. It was heavy, expensive and fast; blazing fast. But at a price of $1200+ it wasn't on the radar of most PC gamers.
Interestingly, a MARS iteration for the GTX 680 never occurred, and why that is the case is still a matter of debate. Some point the finger at poor sales, while others think that NVIDIA restricted ASUS' engineers from being as creative as they needed to be.
Today's release of the ASUS ROG MARS 760 is a bit different - this is still a high end graphics card but it doesn't utilize the fastest single-GPU option on the market. Instead ASUS has gone with a more reasonable design that combines a pair of GTX 760 GK104 GPUs on a single PCB with a PCI Express bridge chip between them. The MARS 760 is significantly smaller and less power hungry than previous MARS cards but it is still able to pack a punch in the performance department as you'll soon see.
Introduction and Features
Be Quiet! has been a market leader for PC power supplies in Germany for seven years straight, and in 2013 they are continuing to expand their PC power supply lineup into North American markets. Earlier this year, we reviewed Be Quiet!'s top-of-the-line Dark Power Pro 10 850W PSU and the value-minded Pure Power L8 Series with very good results. Now we are going to take a look at the new Power Zone Series, specifically the Power Zone 1000W PSU. Power Zone Series PSUs feature a 135mm Be Quiet! SilentWings fan, are certified for 80Plus Bronze efficiency, come with all-modular cables, and are backed by a 5-year warranty.
Be Quiet! is targeting the Power Zone Series towards discerning gamers and PC enthusiasts seeking high power, top performance and great features.
Here is what Be Quiet! has to say about their Power Zone Series: “The Power Zone Series provides the winning combination of superior performance, rock-solid stability, and advanced cooling. Whether you are assembling a high power PC or multi-GPU gaming system, your build will benefit from the Power Zone features. The Power Zone 1000W hits the sweet spot with granite stability, advanced cooling features, low noise and great value.”
Be Quiet! Power Zone 1000W PSU Key Features:
• 1000W of continuous power output @ 50°C
• Massive +12V rail design is ideal for overclocking
• Full cable management supports maximum build flexibility
• Quiet operation: 135mm SilentWings fan with 6-pole motor
• COOL*OFF feature runs fans for 3 minutes after system shutdown
• Connect up to three case fans for optimized system cooling
• 80Plus Bronze certification (up to 90% power conversion efficiency)
• Meets Energy Star 5.2 Guidelines
• Fulfills ErP 2013 Guidelines
• Supports Intel’s Deep Power Down C6 mode
• Sleeved cables for improved cooling and more attractive looks
• NVIDIA SLI Ready and AMD CrossFireX certified
• Up to six PCI-E connectors for multi-GPU support
• 5-Year warranty
• German product conception, design and quality control
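What an efficiency rating like 80Plus Bronze means in practice is easy to work out: at a given DC load, the AC draw from the wall and the waste heat inside the unit follow directly from the efficiency figure. A minimal sketch of that arithmetic, using the up-to-90% peak figure quoted above (real efficiency varies with load, so these numbers are illustrative):

```python
def wall_draw(dc_load_w, efficiency):
    """AC power pulled from the wall for a given DC load."""
    return dc_load_w / efficiency

def waste_heat(dc_load_w, efficiency):
    """Power lost as heat inside the PSU at that load."""
    return wall_draw(dc_load_w, efficiency) - dc_load_w

# At 90% efficiency, a full 1000W DC load pulls ~1111W
# from the wall and sheds ~111W as heat inside the unit.
print(round(wall_draw(1000, 0.90)))   # 1111
print(round(waste_heat(1000, 0.90)))  # 111
```

That ~111W of internal heat at full load is why the fan profile and the COOL*OFF run-on feature matter on a unit this size.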
A not-so-simple set of instructions
Valve released to the world the first beta of SteamOS, a Linux-based operating system built specifically for PC gaming, on Friday evening. We have spent quite a lot of time discussing and debating the merits of SteamOS, but this weekend we wanted to do an installation of the new OS on a system and see how it all worked.
Our full video tutorial of installing and configuring SteamOS
First up was selecting the hardware for the build. As is usually the case, we had a nearly-complete system sitting around that needed some tweaks. Here is a quick list of the hardware we used, with a discussion about WHY just below.
|Processor||Intel Core i5-4670K - $222|
|Motherboard||EVGA Z87 Stinger Mini ITX Motherboard - $257|
|Memory||Corsair Vengeance LP 8GB 1866 MHz (2 x 4GB) - $109|
|Graphics Card||NVIDIA GeForce GTX TITAN 6GB - $999
EVGA GeForce GTX 770 2GB SuperClocked - $349|
|Storage||Samsung 840 EVO Series 250GB SSD - $168|
|Case||EVGA Hadron Mini ITX Case - $189|
|Power Supply||Included with Case|
|Optical Drive||Slot loading DVD Burner - $36|
|Peak Compute||4,494 GFLOPS (TITAN), 3,213 GFLOPS (GTX 770)|
|Total Price||$1947 (GTX TITAN) $1297 (GTX 770)|
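The peak-compute row in the table comes from the standard single-precision throughput formula for a GPU: two floating-point operations (one fused multiply-add) per shader core per clock. A quick sketch using the published core counts for these two cards and approximate boost clocks (the exact clocks here are back-calculated assumptions, not measured values):

```python
def peak_gflops(shader_cores, clock_ghz):
    # 2 FLOPs per core per clock (one fused multiply-add)
    return 2 * shader_cores * clock_ghz

# GTX TITAN (GK110): 2688 cores at ~836 MHz -> ~4494 GFLOPS
# GTX 770 (GK104):   1536 cores at ~1046 MHz -> ~3213 GFLOPS
print(round(peak_gflops(2688, 0.836)))  # 4494
print(round(peak_gflops(1536, 1.046)))  # 3213
```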
We definitely weren't targeting a low-cost build here, but we did put together a very powerful system on which to test SteamOS. First up was the case, the new EVGA Hadron Mini ITX chassis. It's small, which is great for integration into your living room, yet it can still hold a full-power, full-size graphics card.
The motherboard we used was the EVGA Z87 Stinger Mini ITX - an offering that Morry just recently reviewed and recommended. Supporting the latest Intel Haswell processors, the Stinger includes strong overclocking options and a feature set that won't leave enthusiasts longing for a larger motherboard.
Introduction and Technical Specifications
Courtesy of EVGA
The EVGA Z87 Stinger is EVGA's Z87-based answer for the small form-factor crowd. Sporting the mini-ITX form factor, the board is feature packed and offers support for the latest generation of Intel LGA1150-based processors. While its MSRP of $229.99 may seem steep for its small stature, the Z87 Stinger's feature list makes it well worth the outlay.
Courtesy of EVGA
The EVGA Z87 Stinger board features a 6-phase power delivery system and an impressive 10-layer PCB. Additionally, EVGA specified a higher gold content for the CPU socket, as well as solid state capacitors throughout the board, to ensure problem-free operation under all operational circumstances. The following features are integrated into the Z87 Stinger: 4 SATA 6Gb/s ports; 1 mPCIe/mSATA 6Gb/s port; 1 eSATA 6Gb/s port; an Intel GigE NIC; 1 PCI-Express x16 slot; on board power, reset, and BIOS reset buttons; BIOS Select switch; 2-digit diagnostic LED display; and USB 2.0 and 3.0 port support.
Courtesy of EVGA
Technical Specifications (taken from the EVGA website)
|Based on Intel Z87 chipset|
|2 x 240-pin DIMM sockets
Maximum of 16GB of DDR3 (2666MHz+ in dual channel configuration)|
|4 x Serial ATA 600MB/sec (4 internal) with support for RAID 0 and RAID 1|
|Audio connectors (Line-in, Line-out, MIC)|
|6 Channel Creative Sound Core3D|
|1 x 10/100/1000 NIC (Intel i217)|
|mITX Form Factor
Length: 6.7in - 170.18mm
Width: 6.7in - 170.18mm|
|Operating System Support:
Windows 8 32/64bit
Windows 7 32/64bit
Windows Vista 32/64bit
Windows XP 32/64bit|
|This product comes with a 3 year warranty. Registration is recommended.|
Quality time with G-Sync
Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology. When it was first unveiled in October, we were at the event and were able to listen to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers. This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that offer the smoothness advantages of having V-Sync off, while providing the tear-free images normally reserved for gamers who enable V-Sync.
NVIDIA's Prototype G-Sync Monitor
We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics. All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.
Also on that day, I published a somewhat detailed editorial that dug into the background of V-Sync technology, why the 60 Hz refresh rate existed and why the system in place today is flawed. This led up to an explanation of how G-Sync works, including how it manipulates display timing by extending the Vblank signal, and detailed how NVIDIA was enabling the graphics card to retake control over the entire display pipeline.
In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync. In this video we go through quite a bit of technical explanation of how displays work today, and how the G-Sync technology changes gaming for the better. It is a 1+ hour long video, but I selfishly believe that it is the most concise and well put together collection of information about G-Sync for our readers.
The story today is more about extensive hands-on testing with the G-Sync prototype monitors. The displays that we received this week were modified versions of the 144Hz ASUS VG248QE gaming panels, the same panels that, in theory, end users will be able to upgrade themselves sometime in the future. These are 1920x1080 TN panels, and though they run at incredibly high refresh rates, they aren't usually regarded as the highest image quality displays on the market. However, the story about what you get with G-Sync is really more about stutter (or lack thereof), tearing (or lack thereof), and a better overall gaming experience for the user.
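The smoothness argument can be illustrated with a little arithmetic. With V-Sync on a fixed 60 Hz panel, a finished frame can only be swapped on a refresh boundary, so its time on screen is quantized to multiples of ~16.7 ms; a variable refresh display simply waits until the frame is ready. A hypothetical sketch of that difference (the frame times are made up for illustration, and the model ignores buffering details):

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz refresh interval, ~16.7 ms

def vsync_display_time(frame_ms):
    # With V-Sync, the swap waits for the next refresh boundary, so a
    # frame stays on screen for a whole number of refresh intervals.
    return math.ceil(frame_ms / REFRESH_MS) * REFRESH_MS

def gsync_display_time(frame_ms, min_ms=1000 / 144):
    # With variable refresh, the panel redraws when the frame is ready,
    # bounded below by the panel's maximum refresh rate (144 Hz here).
    return max(frame_ms, min_ms)

for ft in (12.0, 18.0, 25.0):
    print(f"{ft:4.1f} ms frame -> vsync {vsync_display_time(ft):4.1f} ms, "
          f"variable refresh {gsync_display_time(ft):4.1f} ms")
```

Note how an 18 ms frame, just missing the 16.7 ms boundary, is held on screen for a full 33.3 ms under fixed-refresh V-Sync - that doubling is exactly the judder G-Sync is designed to eliminate.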
Introduction and Design
Contortionist PCs are a big deal these days as convertible models take the stage to help bridge the gap between notebook and tablet. But not everyone wants to drop a grand on a convertible, and not everyone wants a 12-inch notebook, either. Meanwhile, these same people may not wish to blow their cash on an underpowered (and far less capable) Chromebook or tablet. It’s for these folks that Lenovo has introduced the IdeaPad Flex 14 Ultrabook, which occupies a valuable middle ground between the extremes.
The Flex 14 looks an awful lot like a Yoga at first glance, with the same sort of acrobatic design and a thoroughly IdeaPad styling (Lenovo calls it a “dual-mode notebook”). The specs are also similar to that of the x86 Yoga, though with the larger size (and later launch), the Flex also manages to assemble a slightly more powerful configuration:
The biggest internal differences here are the i5-4200U CPU, which is a 1.6 GHz Haswell model with a TDP of 15 W and the ability to Turbo Boost (versus the Yoga 11S’ i5-3339Y, an Ivy Bridge part with a marginally lower TDP of 13 W and no Turbo Boost), the integrated graphics improvements that come with the newer CPU, and a few more ports made possible by the larger chassis. There is also a regression to a TN panel from the Yoga 11S’ much-appreciated IPS display, which is a bummer. Externally, your wallet will appreciate a $250 drop in price: our model, as configured here, retails for just $749 (versus the $999 Yoga 11S we reviewed a few months back).
You can actually score a Flex 14 for as low as $429 (as of this writing), by the way, but if you’re after any sort of respectable configuration, that price quickly climbs above the $500 mark. Ours is the least expensive option currently available with both a solid-state drive and an i5 CPU.