Looking at the Exterior
We have had some really good experiences with Puget Systems pre-built PCs in the past, and a little while ago the company sent us a modestly priced HTPC from its Serenity line of systems. Built on the Intel Core i5 Sandy Bridge platform, the Serenity features a number of unique customizations that help keep the computer quiet.
With a cost hovering around $1800 though, does the Serenity offer enough to consumers?
The Serenity Home Theater PC
The Puget Systems Serenity line actually spans small form factor chassis, HTPC designs and even standard desktop ATX designs, one of which we have previously reviewed. Today we are going to be showing you the HTPC form factor that could fit in your home theater furniture (if you have some hefty space available). Let's look quickly at the specifications before we dive into the design.
- Intel Core i5-2500K
- ASUS H67 Motherboard
- 4GB DDR3-1333 Memory
- 120GB Intel 320 SSD
- 1.5TB Western Digital Caviar Green HDD
- ASUS 12x Blu-Ray Burner
- Windows 7 Home Premium x64
Introduction and Features
SilverStone has a well-earned reputation among PC enthusiasts for providing a full line of high quality enclosures, power supplies, cooling components, and accessories. Their Strider Gold series, which includes four models ranging from 750W up to 1,200W, was designed to meet the needs of professional users and enthusiasts alike. The Strider Gold ST1200-G that we have up for review today is rated for 1200W DC output and is 80 Plus Gold certified. It comes with a full complement of modular cables, including dual EPS connectors and eight PCI-E connectors to support multiple high-end graphics adapters. High efficiency also means less waste heat, which should result in lower operating temperatures and less fan noise.
Here is what SilverStone has to say about their new Strider Gold 1200W power supply:
“To meet the ever higher requirements imbued by professional users and enthusiasts alike, SilverStone created the Strider Gold series power supplies. The Strider Gold models are built on a similar platform as the award-winning Strider Plus series but with even higher efficiency ratings.
The Strider Gold series will encompass wattage range from 750W to 1200W for a great variety of applications. In addition to 80 Plus Gold level efficiency, all models are built to meet very high standards in electrical performance so they have ±3% voltage regulation, low ripple and noise and high amperage single +12V rail. Other notable features also included are 24/7 40°C continuous output capacity, Japanese capacitors, low-noise 135mm fan, and multiple sets of PCI-E cables. For users looking for a power supply with a faultless combination of performance, efficiency, and quality, there is no need to look beyond the Strider Gold series.”
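The ±3% regulation figure quoted above translates into a concrete voltage window on each rail. As a quick back-of-the-envelope sketch (assuming the standard ATX nominal voltages; the helper name is ours, not SilverStone's):

```python
# ±3% regulation window around the nominal ATX rail voltages
TOLERANCE = 0.03

def regulation_window(nominal_v, tolerance=TOLERANCE):
    """Return the (min, max) voltage a rail may read and still be in spec."""
    return (nominal_v * (1 - tolerance), nominal_v * (1 + tolerance))

# The +12V rail, for example, may swing between roughly 11.64V and 12.36V
for rail in (3.3, 5.0, 12.0):
    lo, hi = regulation_window(rail)
    print(f"{rail:>4.1f}V rail: {lo:.2f}V to {hi:.2f}V")
```

For comparison, the ATX specification generally allows ±5% on the main positive rails, so ±3% is a tighter, self-imposed target.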
EVGA Changes the Game Again
Dual-GPU graphics cards are becoming an interesting story. While both NVIDIA and AMD have offered their own reference dual-GPU designs for quite some time, it is the custom-built models from board vendors like ASUS and EVGA that really pique our interest because of their unique nature. Earlier this year EVGA released the GTX 460 2Win, the world's first (and only) graphics card with a pair of GTX 460 GPUs on board.
ASUS has released dual-GPU options as well, including the ARES dual Radeon HD 5870 last year and the MARS II dual GTX 580 just this past August, but both were prohibitively rare and expensive. The EVGA "2Win" series - which we can call a series now that there are two of them - is still expensive, but much more in line with the performance per dollar of the rest of the graphics card market. When the company approached us last week about the new GTX 560 Ti 2Win, we jumped at the chance to review it.
The EVGA GeForce GTX 560 Ti 2Win 2GB
The new GTX 560 Ti 2Win from EVGA follows directly in the footsteps of the GTX 460 model - we are essentially looking at a pair of GTX 560 Ti GPUs on a single PCB running in SLI multi-GPU mode. Clock speeds, memory capacity, performance - it should all be pretty much the same as if you were running a pair of GTX 560 Ti cards independently.
Just as with the GTX 460 2Win, EVGA is the very first company to offer such a product. NVIDIA didn't design a reference platform and pass it along to everyone like they did with the GTX 590 - this is all EVGA.
The Alienware M17x Giveth
Mobile graphics cards are really a different beast than their desktop variants. Despite having similar names and model numbers, the specifications vary greatly: the GTX 580M isn't equivalent to the GTX 580, and the HD 6990M isn't even a dual-GPU product. Getting the chance to do a direct head-to-head is also almost always a tougher task, thanks to the notebook market's penchant for single-vendor SKUs.
Over the past week or two, I was lucky enough to get my hands on a pair of Alienware M17x notebooks, one sporting the new AMD Radeon HD 6990M discrete graphics solution and the other with the NVIDIA GeForce GTX 580M.
AMD Radeon HD 6990M on the left; NVIDIA GeForce GTX 580M on the right
Also unlike the desktop market, the time from the announcement of a new mobile GPU to when you can actually BUY a system that includes it tends to be pretty long. Take the two GPUs we are looking at today: the HD 6990M launched in July and the GTX 580M in June, yet we are only now seeing machines ship in volume.
Well, problems be damned, we had the pair in our hands for a few short days and I decided to put them through the wringer in our GPU testing suite, adding Battlefield 3 for good measure as well. The goal was to determine which GPU was actually the "world's fastest," as both companies have claimed to be.
Introduction, Design and Display Quality
I’m a multi-monitor addict.
My addiction started many years ago. I had a rather lame customer service job, and as part of my job I needed to manage spreadsheets with customer interactions while also filling in data on a separate, large window. To accomplish this, two monitors were required. I was amazed at the efficiency of the setup, and I bought a second monitor for home use within a few months. Now, I can’t imagine using my desktop with a single display.
Laptops, however, are a different story. Multiple displays can actually be even more useful for road warriors because of the limited resolution of many laptops, yet there are issues with using multiple monitors on the road. Carrying around even a relatively small desktop monitor is out of the question, leaving few options.
Enter the Lenovo LT1421. This unique mobile monitor has a 14” panel and, according to Lenovo, it’s portable enough to be carried about with minimal hassle. Has the company managed to create a unique, must-have product for mobile productivity, or is multi-monitor use with a laptop still a concept that’s better on paper than in reality?
Down to Business
While I'm mostly a PC guy, I favor iPhones. My friends always jab at me because I'm a fairly die-hard techie, and they expect me to be sporting the latest overclocked super-smartphone with eleventy-thousand gigs and a built-in dishwasher. While my phone used to be included in the list of things I tinker with endlessly, I eventually came to the point where I just wanted my smallest mobile device to 'Just Work' for those tasks I needed it to. I just didn't have the time to tinker endlessly with the thing that handled more and more of my work-related duties, yet as a techie, I still enjoyed the ability to cram a bunch of functionality into my pocket. That led me to the iPhone, which I've used since 2008.
Like many iPhone users, I've upgraded through the various iterations over the years. The original, 3G, 3GS, 4, and most recently the 4S. I've also witnessed nearly every iteration of Apple's iOS (including many of the betas). Throughout all of these, I feel Apple did their best to prevent new versions from breaking application functionality, and while they did their best to keep bugs under control, every so often a few would creep in - typically with cross-compatibility of new features that could only be tested in a limited capacity before being released into the wild.
Apple's iPhone 4S introduced some added features to the line-up - enough to get me to bite the bullet earlier than I typically do. I was a bit of an iOS 5 veteran by then, as I'd been using it to test applications for a developer friend of mine. While some of the iOS 5 betas were a bit unruly in the battery life category, those problems appeared to be sorted out by the final beta prior to release. With my reservations about the new OS put to rest, the only thing holding me back was the possibility of the new dual core CPU drawing more power than the iPhone 4. The iFixit teardown (which revealed a higher capacity battery - presumably to counter the power draw of the additional CPU core), coupled with the improved camera and the endless toying around with Siri I'd witnessed, finally sealed the deal.
I picked up a 4S from my local Apple Store, restored from the backup of my 4, and off I went. I enjoyed the phone for a few days, but quickly realized there was some sort of issue with battery life. The most glaring indicator was one night where I forgot to plug the 4S in before bed. The next morning I was surprised to find that nearly 40% of the charge was lost while I slept - almost enough to completely shut down the phone (and failing to wake me with the alarm I'd gleefully set via Siri the night before).
Introduction, Specs, Design and Ergonomics
Samsung's Galaxy S II smartphone debuted in the U.S. with Sprint, AT&T, and T-Mobile in September, and we finally got our hands on a review sample. The Samsung smartphone runs the Android 2.3 "Gingerbread" operating system and includes an 8 MP camera with LED flash and 1080p video capture, a front-facing 2 MP camera, and Samsung’s custom TouchWiz user interface.
T-Mobile's and Sprint's versions sport a 4.52-inch display, but AT&T's version has a 4.3-inch screen that matches the original international version of the Galaxy S II. We are reviewing T-Mobile's Galaxy S II with 16GB of internal memory (it is available in 16 GB and 32 GB capacities). The Sprint and AT&T versions are outfitted with a dual-core 1.2 GHz Orion processor, but the T-Mobile version we are reviewing today sports a Qualcomm Snapdragon S3 1.5 GHz dual-core CPU.
Introduction and Design
Sound. It seems to be one of the new battlefields on which notebook computers are fighting, which is odd, because until recently audio quality has rarely been a central focus of notebook manufacturers. Today, the artillery is clearly in place. HP arrived first with Beats Audio, but others have responded, such as MSI with its Dynaudio-branded laptops and now ASUS with this Bang & Olufsen-tagged N55, which comes with an external subwoofer by default.
Yep, that’s right. It’s not a large subwoofer (that’s the point - it’s small enough to potentially be transported in the same bag as the laptop), but clearly ASUS is taking sound seriously with this laptop. Yet there’s so much more to a laptop than its sound, and that’s particularly true with a system such as this. Everywhere you look, the N55’s specifications scream performance. A Core i7 quad-core mobile processor is the heart of the machine, snuggled up with an Nvidia GT 555M graphics processor. What else is there to be had? Have a look.
Introduction and Features
It's always an exciting day when a new Corsair power supply shows up at the PC Perspective test lab, and today we have a brand new HX1050 PSU to test out. Corsair continues to bring a full line of high quality power supplies, memory components, cases, cooling components, SSDs and accessories to market for the PC enthusiast and professional alike. Corsair's Professional Series power supplies now include four models: the HX650, HX750, HX850 and HX1050. All of the power supplies in the Professional Series feature premium quality components, an energy-efficient design (80 Plus Silver certified) and quiet operation, and they are backed by a 7-year warranty and lifetime access to Corsair's comprehensive technical support and customer service.
Here is what Corsair has to say about their Professional Series PSUs:
"Corsair Professional Series power supplies are designed for enthusiasts and gamers who demand high performance, high efficiency, the flexibility of modular cables, and unflinching reliability. Designed and engineered using cutting-edge technology, each Corsair Professional Series power supply is built to the highest quality standards and delivers the features and performance that technology enthusiasts demand.
Corsair Professional Series power supplies are built using Industrial Grade electrical components, ensuring the stable, clean and reliable voltages that are essential for high-end gaming PCs and workstations. Professional Series PSUs are rigorously tested at 100% load and an ambient temperature of 50°C, guaranteeing stability and reliability even in the most demanding multi-core, multi-graphics card systems.
The Corsair Professional Series HX1050 PSU utilizes advanced DC-to-DC circuitry and exceeds the requirements of the 80 Plus Silver certification standard, for up to 88% energy efficiency. This reduces wasted energy, lowering energy bills, and cuts the amount of excess heat generated, allowing the cooling fan to spin more slowly and quietly.
Corsair Professional Series power supplies boast an innovative semi-modular cabling configuration that uses low-profile, flat modular cables, which reduce air friction to help maximize airflow through the computer chassis. The modular design also simplifies installation as it allows unprecedented flexibility in utilizing only those cables which are needed. Corsair Professional Series is backed by an industry-leading 7-year warranty and comprehensive customer support."
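Corsair's "up to 88% efficiency" claim is easy to turn into concrete numbers for wall draw and waste heat. A minimal sketch (the 500W load point is an arbitrary example for illustration, not a measured figure from this review):

```python
def wall_draw_and_heat(dc_load_w, efficiency):
    """AC power drawn from the wall and the waste heat (in watts)
    the PSU itself must dissipate at a given DC load."""
    ac_in = dc_load_w / efficiency
    return ac_in, ac_in - dc_load_w

# Compare an 88%-efficient unit against an 80%-efficient one at a 500W load
for eff in (0.88, 0.80):
    ac, heat = wall_draw_and_heat(500.0, eff)
    print(f"{eff:.0%} efficient: {ac:.0f}W from the wall, {heat:.0f}W of heat")
```

At that load the more efficient supply dissipates roughly half the heat inside its housing, which is exactly why its fan can afford to spin more slowly.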
You don't have 3D Vision 2? Loser.
In conjunction with GeForce LAN 6, currently taking place on the USS Hornet in Alameda, NVIDIA is announcing an upgrade to its lineup of 3D Vision technologies. Originally released back in January of 2009, 3D Vision was one of the company's grander attempts to change the way PC gamers, well, game. Unfortunately for NVIDIA and the gaming community, running a 3D Vision setup required a new, much more expensive display as well as glasses that originally ran $199.
While many people, including myself, were enamored with 3D technology when we first got our hands on it, the novelty kind of wore off and I found myself quickly back on the standard panels for gaming. The reasons were difficult to discern at first but it definitely came down to some key points:
- Cost
- Panel resolution
- Panel size
- Image quality
The cost issue was obvious - having to pay nearly double for a 3D Vision capable display just didn't fly for most PC gamers, and the need to purchase $200 glasses on top of that made it even less likely that you would plop down the credit card. Initial 3D Vision ready displays, besides being hard to find, were limited to a resolution of 1680x1050 and were only available in 22-in form factors. If you were interested in 3D technology you were likely a discerning gamer, and running at lower resolutions would be less than ideal.
The new glasses - less nerdy?
Yes, 24-in 1080p panels did come in 2010, but by then much of the hype surrounding 3D Vision had worn off. To top it all off, even if you did adopt a 3D Vision kit of your own, you realized that the brightness of the display was basically halved when operating in 3D mode - with one shutter of your glasses closed at any given time, each eye receives only half the total output from the screen, leaving the image kind of drab and washed out.
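The brightness penalty in 3D mode is mostly simple arithmetic: each eye's shutter is open at most half the time, and the liquid crystal shutter itself absorbs some light even when open. A rough model (the transmission figure below is an illustrative assumption, not a measured spec for NVIDIA's glasses):

```python
def perceived_brightness(panel_nits, duty_cycle=0.5, shutter_transmission=0.4):
    """Approximate per-eye brightness through active shutter glasses.

    duty_cycle: fraction of time each eye's shutter is open (0.5 at best,
    since frames alternate between the eyes).
    shutter_transmission: fraction of light the open shutter passes
    (assumed value; real glasses vary).
    """
    return panel_nits * duty_cycle * shutter_transmission

# A 300-nit panel drops to ~60 nits per eye under these assumptions
print(perceived_brightness(300.0))
```

Even with a perfectly transparent shutter, the alternating frames alone cap each eye at half the panel's output, which matches the "basically halved" impression above.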
Introduction and Features
Courtesy of MSI
Micro-Star International, better known as MSI, has been a busy bee in 2011, fending off fierce competition from ASUS, Gigabyte and other motherboard vendors. This year's launch of the Z68 chipset from Intel combined the capabilities and features of the H67 and P67 chipsets, and MSI capitalized on this when it joined forces with LucidLogix to include Lucid's Virtu technology on its latest Z68A-GD80 motherboard. Virtu provides switchable graphics, which lets users enjoy the graphics power of both the integrated GPU and a discrete GPU.
Courtesy of MSI
MSI also used the Z68A-GD80 as its first motherboard to support PCI Express 3.0, which offers up to 32GB/s of transfer bandwidth and makes this mobo a bit more future proof for users looking toward their next hardware upgrade. MSI also upgraded its BIOS to ClickBIOS II, which provides a consistent user interface both in the UEFI BIOS and in Windows. Users can control their system settings directly from Windows, and the GUI also supports touchscreen controls.
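The 32GB/s figure is the bidirectional total for a full x16 link; per direction, PCIe 3.0 works out to just under 16GB/s once its 128b/130b encoding is accounted for. A quick sketch of the arithmetic:

```python
def pcie3_bandwidth_gbs(lanes, gt_per_s=8.0, encoding=128 / 130):
    """Usable one-direction bandwidth, in GB/s, of a PCIe 3.0 link.

    Each lane signals at 8 GT/s; 128b/130b encoding means nearly every
    transferred bit is payload. Divide by 8 to convert bits to bytes.
    """
    return lanes * gt_per_s * encoding / 8

x16 = pcie3_bandwidth_gbs(16)
print(f"x16: {x16:.2f} GB/s per direction, ~{2 * x16:.0f} GB/s bidirectional")
```

That is roughly double PCIe 2.0, which signals at 5 GT/s with the far less efficient 8b/10b encoding.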
Bulldozer. Since its initial unveiling and placement on the roadmap many have called the Bulldozer architecture the savior of AMD, the processor that would finally turn the tide back against Intel and its dominance in the performance desktop market. After quite literally YEARS of waiting we have finally gotten our hands on the Bulldozer processors, now called the AMD FX series of CPUs, and can report on our performance and benchmarking of the platform.
With all of the leaks surrounding the FX processor launch you might be surprised by quite a bit of our findings - both on the positive and the negative side of things. With all of the news in the past weeks about Bulldozer, now we can finally give you the REAL information.
- Bulldozer First Release and the State of 32nm AMD Parts
- AMD Bulldozer Processor hits 8.429 GHz - New World Record!
- AMD Bulldozer FX Processor Benchmarks Leaked
Before we dive right into the performance part of our story, I think it is important to revisit the Bulldozer architecture and describe what makes it different from the Phenom II architecture as well as Intel's Sandy Bridge design. Josh wrote up a great look at the architecture earlier in the year with information that is still 100% pertinent, and we recount much of that writing here. If you are comfortable with the architecture's design points, feel free to skip ahead to the sections you are more interested in - but I highly recommend you give the data below a look first.
The below text was taken from Bulldozer at ISSCC 2011 - The Future of AMD Processors.
Bulldozer Architecture Revisited
Bulldozer carries over very little from the previous generation of CPUs, except perhaps the experience of the engineers working on those designs. Since the original Athlon, the basic floor plan of AMD's CPU architecture has remained relatively unchanged. Certainly there were significant changes throughout the years to keep up in performance, but the 10,000 foot view of the actual decode, integer, and floating point units stayed very similar: TLBs increased in size, more instructions were kept in flight, and so on. Aspects such as larger L2 caches, integrated memory controllers, and the addition of a shared L3 cache have all brought improvements to the architecture. But the overall data flow is very similar to that of the original Athlon introduced over a decade ago.
As covered in our previous article about Bulldozer, it is a modular design that will come in several flavors depending on the market it is addressing. The basic building block of the Bulldozer core is a 213 million transistor unit which features 2 MB of L2 cache. This block contains the fetch and decode unit, two integer execution units, a shared 2 x 128-bit floating point/SIMD unit, L1 data and instruction caches, and a large shared L2 unit. All of this is manufactured on GLOBALFOUNDRIES' 32nm, 11 metal layer SOI process. The entire unit, including its 2 MB of L2 cache, occupies approximately 30.9 mm² of die space.
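The figures above make for an easy density check. A short sketch using only the numbers from this paragraph (the four-module total is our own extrapolation and deliberately ignores the shared L3, memory controller and IO):

```python
# Numbers straight from the Bulldozer module description
transistors = 213_000_000  # per two-core module, including its 2 MB L2
area_mm2 = 30.9            # module area on GLOBALFOUNDRIES' 32nm SOI process

density = transistors / area_mm2
print(f"~{density / 1e6:.1f}M transistors per square mm")

# An 8-core FX part pairs four such modules (before L3 and uncore)
modules = 4
print(f"4 modules: ~{modules * transistors / 1e6:.0f}M transistors, "
      f"~{modules * area_mm2:.1f} square mm")
```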
A Pre-Built System in Your Budget
We all know that the majority of our readers enjoy building their own gaming systems - picking components, assembling the hardware, installing the software, etc. But as gamers get older and the amount of time they can dedicate to their passion decreases, some might be willing to make the move to buying a pre-built gaming rig based on industry standard components. The benefits are definitely there: quick turnaround with just a couple days of shipping, warranty and support for anything that might go wrong, and the ability to upgrade and adapt your system in any way you want.
AVADirect is a system builder that has been specializing in gaming PCs since 2003 and is based near Cleveland, Ohio. They offer a wide array of PC options including the most basic and inexpensive machines used for business computing as well as top-level gaming machines with overclocked settings and high-end water cooling configurations.
Recently AVADirect approached me with an interesting review idea: build a custom system for just around $1000 made for gaming and see if it could stand up to our testing. The result is a rig based on the P67 platform (though since our system shipped you can get Z68 motherboards for the same price) and the Core i5 Sandy Bridge processor, coupled with a Radeon HD 6950 that provides enough gaming power to tackle the best PC games.
Here is our video review of the AVADirect custom $1000 gaming machine, and check below for more images and thoughts!
RAGE is not as dependent on your graphics hardware as it is on your CPU and storage system (which may be an industry first); we will discover the reason when we talk about the texture pop-up issue on the next page.
The first id Software designed game since the release of Doom 3 in August of 2004, RAGE has a lot riding on it. Not only is it the introduction of the id Tech 5 game engine, it also culminates more than four years of development and is the first new IP from the developer since the creation of Quake. And since the first discussions and demonstrations of Carmack's MegaTexture technology, gamers have been expecting a lot as well.
Would the game's visuals be impressive enough to warrant all the delays we have seen? Would it push today's GPUs in a way that few games can? It looks like we have answers to both of those questions, and you might be a bit disappointed.
First, let's get to the heart of the performance question: will your hardware play RAGE? Chances are, very much so. I ran through some tests of RAGE on a variety of hardware including the GeForce GTX 580, 560 Ti and 460 1GB, and the Radeon HD 6970, HD 6950, HD 6870 and HD 5850. The test bed included an Intel Core i7-965 Nehalem CPU, 6GB of DDR3-1333 memory and a 600GB VelociRaptor hard drive. Here are the results from our performance tests running at 1920x1080 resolution with 4x AA enabled in the game options:
If you have been visiting PC Perspective at all over the last week there is no doubt you have seen a lot of discussion about the currently running Battlefield 3 beta. We posted an article looking at performance of several different GPUs in the game and then followed it up with a look at older cards like the GeForce 9800 GT. We did a live stream of some PC Perspective staff playing BF3 with readers and fans, showed off and tested the locked Caspian Border map and even looked at multi-GPU scaling performance. It was a lot of testing and a lot of time, but now that we have completed it, we are ready to summarize our findings in a piece that many have been clamoring for - a Battlefield 3 system build guide.
The purpose of this article is simple: gather our many hours of testing and research and present the results in a way that simply says "here is the hardware we recommend." It is the exact same philosophy that makes our PC Perspective Hardware Leaderboard so successful: it gives the reader all the information they need, all in one place.
Introduction and Design
When the Apple MacBook Air originally debuted, geeks took note, and Windows laptop manufacturers ramped up efforts to meet that super-thin laptop on its own turf. Surprisingly, one of the first to respond was MSI, a company that is still struggling to define itself among a mainstream American market dominated by the likes of HP, Toshiba, Dell, and others.
MSI managed to significantly undercut the Air with its X340, but the build quality was also nowhere near what Cupertino’s engineers had managed. Yet MSI is not one to give up, and they’ve made moves to refine the X series over the years. As the price has dropped further, and the processor selection changed, comparisons to the Air have become less obvious.
That’s particularly true with this latest MSI X370, which now makes use of the AMD E-350 Fusion APU. This processor is nothing new, and we’ve tested it before at PC Perspective. Yet this laptop is different from any product we’ve previously reviewed with this processor because of its size.
The Battlefield 3 Beta
Update 2 (9/30/11): We have some quick results from our time on the Caspian Border map as well if you are interested - check them out!
It was an exciting day at PC Perspective yesterday, with much of our time dedicated to finding, installing and playing the new Battlefield 3 public beta. Released on the morning of the 26th to those who had pre-ordered BF3 on Origin or purchased Medal of Honor prior to July 26th, the beta arrived a couple of days early for that group - which should give those of you with more FPS talent than me a leg up once the open beta starts on Thursday the 29th.
My purpose in playing Battlefield 3 yesterday was purely scientific, of course. We wanted to test a handful of cards from both AMD and NVIDIA to see how the beta performed. With all of the talk about needing to upgrade your system and the relatively high recommended system requirements, there is a lot of worry that just about anyone without a current generation GPU is going to need to shell out some cash.
Is that a justified claim? While we didn't have time yet to test EVERY card we have at the office we did put some of the more recent high end, mid-range and lower cost GPUs to the test.
Back in June of last year, OCZ released the RevoDrive, followed up rather quickly by the RevoDrive x2. A further jump was made with the introduction of VCA 2.0 architecture with the RevoDrive 3 and 3 x2. Each iteration pushed the envelope further as better implementations of VCA were introduced, using faster and greater numbers of PCIe channels, linked to faster and greater numbers of SandForce controllers.
While the line of RevoDrives was tailored more towards power users and mild server use, OCZ has taken their VCA 2.0 solution to the next level entirely, putting their sights on full blown enterprise purposing. With that, we introduce the OCZ Z-Drive R4:
Kal-El Tegra SoC to use 5 cores
Recent news from NVIDIA has unveiled some interesting new technical details about the upcoming Kal-El ARM-based Tegra SoC. While we have known for some time that this chip would include a quad-core processor and would likely be the first ARM-based quad-core part on the market, NVIDIA's Matt Wuebbling spilled the beans on a new technology called "Variable SMP" (vSMP) and a fifth core on the die.
An updated diagram shows the fifth "companion" core - Courtesy NVIDIA
This patented technology allows the upcoming Tegra processor to address a couple of key issues that affect smartphones and tablets: standby power consumption and manufacturing process variation. Even though all five of the cores on Kal-El are based on the ARM Cortex-A9 design, they will have very different power characteristics due to variations in the TSMC 40nm process technology used to build them. As is typical of most foundries and process technologies, TSMC offers both a "high performance" and a "low power" derivative of its 40nm technology, usually aimed at different projects. The high performance variant runs at faster clock speeds but also has more transistor leakage, increasing overall power consumption. The low power option does just the opposite: it lowers the frequency ceiling while using less power at idle and under load.
CPU power and performance curves - Courtesy NVIDIA
NVIDIA's answer to this dilemma is to have both - a single A9 core built on the low power transistors and quad A9s built on the higher performing transistors. The result is the diagram you saw at the top of this story with a quad-core SoC with a single ARM-based "companion." NVIDIA is calling this strategy Variable Symmetric Multiprocessing and using some integrated hardware tricks it is able to switch between operating on the lower power core OR one to four of the higher power cores. The low power process will support operating frequencies up to only 500 MHz while the high speed process transistors will be able to hit well above 1-1.2 GHz.
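NVIDIA has not published the actual switching logic, but the vSMP idea maps naturally onto a simple governor: stay on the companion core when demand is low, and migrate everything to one to four performance cores when it is not. The sketch below is purely illustrative - the thresholds and function names are our assumptions, not NVIDIA's implementation:

```python
COMPANION_MAX_HZ = 500_000_000  # low-power core ceiling, per NVIDIA's figures
PERF_CORES = 4

def pick_cluster(required_hz, runnable_threads):
    """Toy vSMP-style decision: the OS runs on the companion core OR on
    1-4 performance cores - never on both clusters at once."""
    if required_hz <= COMPANION_MAX_HZ and runnable_threads == 1:
        return {"cluster": "companion", "cores": 1}
    # Scale the performance cluster with the number of runnable threads
    return {"cluster": "performance", "cores": min(runnable_threads, PERF_CORES)}

# Idle/background work stays on the low-power core; a game wakes all four
print(pick_cluster(300e6, 1))   # companion core handles light work
print(pick_cluster(1.2e9, 6))   # performance cluster, capped at 4 cores
```

The key property the real hardware enforces - and the sketch mirrors - is that the two clusters are mutually exclusive, so standby workloads never touch the leaky high-performance transistors.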
Introduction and Specifications
Courtesy of ECS
ECS developed the HDC-I motherboard to take advantage of AMD's new Brazos platform, which is based on the Hudson M1 chipset and the latest E-350 dual-core processor with its integrated DDR3 800/1066 memory controller. The dual-core E-350 APU, which combines the CPU and GPU, brings a host of features to mini-ITX enthusiasts: USB 3.0 compatibility, SATA 6Gb/s support, Bluetooth, and Radeon HD 6310 graphics with UVD 3 to play 3D Blu-ray and 1080p HD movies.
Courtesy of ECS
Another huge advantage of going with a mini ITX motherboard for your next home theater PC is the balance of computing power and power consumption that the AMD Brazos platform adds to the ECS HDC-I. The HDC-I is an energy-efficient motherboard that has integrated computing power and graphics firepower for users looking for an "all-in-one" solution for their next small form factor build.