Introduction, Design and Ergonomics
The tablet market is starting to heat up. After a long period of dominance by the iPad and its long line of Android imitators, we have new competitors looking to spoil the tablet world order. On the high end we have the incoming volley of buff Tegra 3-based products, and on the low end we have the Kindle Fire, a simple $199 tablet that seems to prefer that its users don’t think for a second about the hardware inside.
That’s actually a bit odd, because the hardware inside is at least competitive. Though priced $300 less than the cheapest iPad 2, the Fire offers a dual core processor at the same clock speed of 1 GHz. It also provides 512MB of RAM and 8GB of storage, neither of which will blow away competitors, but all of which is competitive. While the 7” size of the Fire means there is simply less tablet to build, it’s impressive that Amazon has managed to cram reasonably impressive hardware into one of the cheapest Android tablets on the market today.
Hardware is only a small part of the equation, however. Amazon really intends the Fire to be a portal to its world of services, which includes ebooks, streaming video, apps and much more. This is very much a walled garden, even more so than Apple’s iPad, and for it to work the spoils of the garden need to be damn good. Let’s see if $200 is really a good value given that users must buy into Amazon’s services as well.
Introduction, Specifications, and Packaging
A few months back, OCZ acquired Indilinx. Ever since, we've been wondering if the next-generation Indilinx offering could stand up to the competition, which has made leaps and bounds since the first generation of SSD controllers was released.
- 128GB Max Performance
- 256GB Max Performance
- 512GB Max Performance
- 1TB Max Performance
Here's a basic block diagram of the new Everest controller from Indilinx. All of the usual bits are present, of particular note being the ability to drive 8 channels, with each channel rated at 4-way. This should mean an Everest could theoretically drive 32 flash chips.
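A quick back-of-the-envelope check on that parallelism claim (the per-package capacity below is a hypothetical figure for illustration, not an Indilinx spec):

```python
# Theoretical flash parallelism of the Everest controller,
# per the block diagram: 8 channels, each 4-way interleaved.
channels = 8
ways_per_channel = 4  # chip-enables addressable on each channel

max_flash_chips = channels * ways_per_channel
print(max_flash_chips)  # 32 flash chips, as noted above

# Assuming a hypothetical 32GB per flash package, the top capacity works out:
print(max_flash_chips * 32)  # 1024 GB, which lines up with the 1TB flagship SKU
```

That it divides so neatly into the announced SKU list suggests the capacity tiers simply scale with populated channels and ways.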
Welcome to The Inside Perspective
Welcome! The Inside Perspective is a new series on PC Perspective that will take you further behind the scenes of the industry than we typically go, with interviews of industry personnel. I hope you find these discussions to be both informative as well as eye opening into the world behind the reviews, behind the press releases and the marketing speak that everyone sees on a daily basis. http://pcper.com/tip
The Inside Perspective 001 - Antec's Jessie Lawrence
Antec was at one time the leader in enthusiast and gamer case design, and much of what the company built years ago, like the P180 series, has endured years of revisions to remain relevant; those chassis remain some of our staff's favorites. In the past several years, however, the case market has seen new entrants, new designs and new feature sets that Antec has been slow to adopt, and as a result the company has fallen out of the leader's position in the minds of enthusiasts, even if it hasn't in sales or revenue.
With the release of the new P280 and Eleven Hundred cases (the former of which we have already reviewed), Antec is hoping to make a resurgence in the enthusiast market. And while I was definitely impressed with the P280, there is still some work to do; luckily, Antec knows it and is moving in the right direction.
Recently, Antec's Communications Manager Jessie Lawrence stopped by the office and sat down for a short interview about the new case designs, what he likes about working in this cut-throat industry, and how Antec plans to turn the tide against the competition.
While we are getting The Inside Perspective started, we are including it in our PC Perspective Podcast RSS feed. The video version, which we encourage you to check out, is embedded via YouTube below. You can find all of our episodes of The Inside Perspective at http://pcper.com/tip
- iTunes - Subscribe to the podcast directly through the iTunes Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
- Video - See YouTube embed below (RSS to come soon!)
Host: Ryan Shrout
Guest: Jessie Lawrence, Antec
Program Length: 13:49
- 01:08 Introduction
- 02:15 Questions about new P280 and Eleven Hundred Cases
- 04:40 Target audience for each new case
- 05:50 Background with Jessie Lawrence
- 09:20 Memorable moments in the industry
- 11:20 How can Antec return to enthusiast roots?
- 13:45 Conclusion
Antec pushes forward
Sometimes, when you write something correctly the first time, it still fits:
Antec is one of the most storied brands when it comes to enthusiast class components like cases and power supplies. Unfortunately, the whims of the gamer change on a breeze as most will swap between company allegiances whenever the performance or features dictate. Antec has fallen into this trap over the last few years as many in the community moved on from the likes of the ever-present P180 to newer and more innovative designs from other companies. Having realized this internally and making moves to take care of that slide, Antec is going to start producing some modern cases with improved specifications.
I wrote those words as the introduction to our Antec SOLO II review and they still stand true. The SOLO II was a good case for its small size but the new Antec P280 attempts to reenergize the excitement around the Performance One series of cases, well known for the P180/190 lineage. The P280 continues with specific features like the 270 degree swinging door and sound dampening material but does it offer up enough additional features to catch the competition that has passed Antec?
Check out our video review below and continue reading if you want to see a collection of still photos and our final thoughts!
Overall, I think the new Antec P280 is a great case for the price, coming in at $139 (and likely lower) at launch. The case offers what I would now call "staple" features like the grommeted cable routing openings around the motherboard and a large CPU cooler installation opening while keeping what made the Performance One series of cases great including sound dampening material on the front door and side panels and the classic swinging door. Extras like the dedicated 2.5-in drive bays, rubber feet for the standard 3.5-in HDDs, 9 expansion slots and the side access fan filter for the power supply really show that Antec is moving in the right direction again.
Introduction, Campaign Testing
As you might have noticed, we’re a bit excited about Battlefield 3 here at PC Perspective. It promised to pay attention to what PC gamers want, and shockingly, it has come through. Dedicated servers and huge multi-player matches are supported, and the browser based interface is excellent.
If we’re honest, a lot of our hearts have been stirred simply by the way the game looks. There aren’t many titles that really let a modern mid-range graphics card stretch its legs, even at 1080p resolution. Battlefield 3, however, can be demanding - and it looks beautiful. Even with the presets at medium, it’s one of the most attractive games ever.
But what does this mean for laptops? As the resident laptop reviewer at PC Perspective, I know that gaming remains a challenge. The advancements over the last few years have been spectacular, but even so, most of the laptops we test can’t run Just Cause 2 at a playable framerate even with all detail set to low and a resolution of just 1366x768.
To find out if mobile gamers were given consideration by the developers of Battlefield 3, I installed the game on three different laptops. The results only go to show how far mobile gaming has come, and how far it has to go.
Introduction and Features
It's been a while since we reviewed our last Thermaltake power supply, so we were excited to see a new Toughpower Grand unit show up on our doorstep recently. Thermaltake's new Grand series includes five models ranging from 650W up to 1200W. All of the Toughpower Grand series power supplies feature advanced circuitry design, high efficiency (80 Plus Gold certified), smart cable management, and come backed by Thermaltake's 7-year warranty.
Here is what Thermaltake has to say about their new Toughpower Grand Series: "Designed to meet the highest expectations and standards of today's extreme PC enthusiasts and system builders while pushing the limit of clean and reliable power delivery system utilizing advanced circuitry design that can only be found on Toughpower Grand series of high-efficiency power supplies. Smart cable management not only enables professional-looking system builds, but also accelerates airflow through the computer case for highly efficient heat dissipation. All of the above are achieved while enabling the Toughpower Grand power supply to deliver continuous power output even when operating in 50C environment, making it the Toughest power supply on the block."
Introduction and Design
In the realm of hulking, powerful gaming laptops, Alienware remains king by popular vote. Whatever you think of their laptops (as always, popularity attracts criticism), there is no denying that the Alienware name stands for something among gamers. And to most, they're a dream machine, the laptop equivalent of a Ferrari 458 Italia.
The purpose of a review, of course, is to move past reputation and judge the true capabilities of a product. And in the case of the Alienware M17x, there's a lot to judge. We’ll talk about its bulk later, but I don’t think I’m spoiling anything by saying that this laptop has big bones as well as serious hardware.
On paper, the stars seem to align. But this market remains competitive, thanks not only to boutique outfits like Origin and Maingear but also to ASUS, which is still on its gaming laptop war-path, producing affordable rigs that can be had for surprisingly little dough. With competition on all sides, can the Alienware justify its $2500 price of entry? Let’s find out.
Sandy Bridge-E is just what you expect
It has been more than three years since Intel released the first Core i7 processor built around the Nehalem CPU architecture along with the X58 chipset. It quickly became the platform of choice for the enthusiast market (gamers and overclockers), and remained in that role even as the world of processors evolved around it with the release of Westmere and Sandy Bridge. Yes, we have been big supporters of the Sandy Bridge Core i7 parts for some time as the "new" platform of choice for gamers, but part of us always fondly remembered the days of Nehalem and X58.
Well, Intel shared the sentiment, and this holiday they are officially unveiling the Sandy Bridge-E platform and the X79 chipset. The "E" stands for enthusiast in this case, and you'll find that many of the same decisions and patterns apply from the Nehalem release to this one. Nehalem and X58 were really meant as a workstation design, but the performance and features were so good that Intel wanted to offer them to the high-end consumer as well. Sandy Bridge-E is the same story - this design is clearly built for the high-profit areas of computing, including workstations and servers, but those who want the best available technology will find it pretty damn attractive as well.
But what actually makes a Sandy Bridge-E processor (now using the Core i7-3xxx model naming scheme) different from the Sandy Bridge CPUs we have come to love since their release in January of this year?
The Sandy Bridge-E Architecture
The answer might surprise you, but truthfully not a whole lot has changed. In fact, from a purely architectural standpoint (looking at the x86 processor cores), Sandy Bridge-E is essentially identical to currently available Sandy Bridge CPUs. You will see the same benefits of the AVX instruction set in applications that take advantage of it, the shared L3 cache between all of the cores for data coherency, and the ring bus introduced with Sandy Bridge to move data between the cores, cache and uncore sections of the die.
900 Series: Bulldozer Ready
The rumor on the street is that Asus makes a few motherboards. They may or may not be the world’s leading motherboard manufacturer. Asus may also have a pretty good reputation for quality and innovation in their products. It is tongue-in-cheek hour at PC Perspective. All joking aside, Asus is now the gold standard for quality manufacturing and design in motherboards.
Some months ago AMD released their AM3+ capable chipsets, though the release was not nearly as exciting as we had hoped. The AMD 900 series of chipsets are essentially the same silicon as those that power the non-integrated AMD 800 series. There are three SKUs that are currently available for the 900 series that Asus makes motherboards around.
Looking at the Exterior
We have had some really good experiences with Puget Systems' pre-built PCs in the past, and a little while ago the company sent us a modestly priced HTPC from its Serenity line of systems. Based on the Intel Core i5 Sandy Bridge platform, the Serenity has a number of unique customizations that help keep the computer quiet.
With a cost hovering around $1800 though, does the Serenity offer enough to consumers?
The Serenity Home Theater PC
The Puget Systems Serenity line actually spans small form factor chassis, HTPC designs and even standard desktop ATX designs, one of which we have previously reviewed. Today we are going to be showing you the HTPC form factor that could fit in your home theater furniture (if you have some hefty space available). Let's look quickly at the specifications before we dive into the design.
- Intel Core i5-2500K
- ASUS H67 Motherboard
- 4GB DDR3-1333 Memory
- 120GB Intel 320 SSD
- 1.5TB Western Digital Caviar Green HDD
- ASUS 12x Blu-Ray Burner
- Windows 7 Home Premium x64
Introduction and Features
SilverStone has a well-earned reputation among PC enthusiasts for providing a full line of high quality enclosures, power supplies, cooling components, and accessories. Their Strider Gold series, which includes four models ranging from 750W up to 1,200W, was designed to meet the needs of professional users and enthusiasts alike. The Strider Gold ST1200-G that we have up for review today is rated for 1200W DC output and is 80 Plus Gold certified. It comes with a full complement of modular cables, including dual EPS connectors and eight PCI-E connectors for multiple high-end graphics adapter support. High efficiency also means less waste heat, which should result in lower operating temperatures and less fan noise.
Here is what SilverStone has to say about their new Strider Gold 1200W power supply:
“To meet the ever higher requirements imbued by professional users and enthusiasts alike, SilverStone created the Strider Gold series power supplies. The Strider Gold models are built on a similar platform as the award-winning Strider Plus series but with even higher efficiency ratings.
The Strider Gold series will encompass wattage range from 750W to 1200W for a great variety of applications. In addition to 80 Plus Gold level efficiency, all models are built to meet very high standards in electrical performance so they have ±3% voltage regulation, low ripple and noise and high amperage single +12V rail. Other notable features also included are 24/7 40°C continuous output capacity, Japanese capacitors, low-noise 135mm fan, and multiple sets of PCI-E cables. For users looking for a power supply with a faultless combination of performance, efficiency, and quality, there is no need to look beyond the Strider Gold series.”
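That "±3% voltage regulation" claim translates into a concrete window on each rail; here is a minimal sketch for the +12V rail (nominal spec math only, not measured results):

```python
# The allowed voltage window implied by a +/-3% regulation spec on +12V.
nominal_v = 12.0
tolerance = 0.03

low_v = round(nominal_v * (1 - tolerance), 2)
high_v = round(nominal_v * (1 + tolerance), 2)
print(low_v, high_v)  # 11.64 12.36 -- the rail must stay inside this band under load
```

We check measured rail voltages against exactly this kind of window during load testing.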
EVGA Changes the Game Again
Dual-GPU graphics cards are becoming an interesting story. While both NVIDIA and AMD have offered their own reference dual-GPU designs for quite some time, it is the custom-built models from board vendors like ASUS and EVGA that really pique our interest because of their unique nature. Earlier this year EVGA released the GTX 460 2Win, the world's first (and only) graphics card with a pair of GTX 460 GPUs on board.
ASUS has released dual-GPU options as well including the ARES dual Radeon HD 5870 last year and the MARS II dual GTX 580 just this past August but they were both prohibitively rare and expensive. The EVGA "2Win" series, which we can call it now that there are two of them, is still expensive but much more in line with the performance per dollar of the rest of the graphics card market. When the company approached us last week about the new GTX 560 Ti 2Win, we jumped at the chance to review it.
The EVGA GeForce GTX 560 Ti 2Win 2GB
The new GTX 560 Ti 2Win from EVGA follows directly in the footsteps of the GTX 460 model - we are essentially looking at a pair of GTX 560 Ti GPUs on a single PCB running in SLI multi-GPU mode. Clock speeds, memory capacity, performance - it should all be pretty much the same as if you were running a pair of GTX 560 Ti cards independently.
Just as with the GTX 460 2Win, EVGA is the very first company to offer such a product. NVIDIA didn't design a reference platform and pass it along to everyone like they did with the GTX 590 - this is all EVGA.
The Alienware M17x Giveth
Mobile graphics cards are really a different beast than their desktop variants. Despite having similar names and model numbers, the specifications vary greatly: the GTX 580M isn't equivalent to the GTX 580, and the HD 6990M isn't even a dual-GPU product. Also, getting the chance to do a direct head-to-head is almost always a tough task thanks to the notebook market's penchant for single-vendor SKUs.
Over the past week or two, I was lucky enough to get my hands on a pair of Alienware M17x notebooks, one sporting the new AMD Radeon HD 6990M discrete graphics solution and the other with the NVIDIA GeForce GTX 580M.
AMD Radeon HD 6990M on the left; NVIDIA GeForce GTX 580M on the right
Also unlike the desktop market, the time from the announcement of a new mobile GPU product to when you can actually BUY a system including it tends to be pretty long. Take the two GPUs we are looking at today: the GTX 580M launched in June and the HD 6990M in July, yet we are only just now seeing machines ship in volume.
Well, problems be damned, we had the pair in our hands for a few short days and I decided to put them through the wringer in our GPU testing suite, adding Battlefield 3 for good measure. The goal was to determine which GPU was actually the "world's fastest," as both companies claim.
Introduction, Design and Display Quality
I’m a multi-monitor addict.
My addiction started many years ago. I had a rather lame customer service job, and as part of my job I needed to manage spreadsheets with customer interactions while also filling in data on a separate, large window. To accomplish this, two monitors were required. I was amazed at the efficiency of the setup, and I bought a second monitor for home use within a few months. Now, I can’t imagine using my desktop with a single display.
Laptops, however, are a different story. Multiple displays can actually be even more useful for road warriors because of the limited resolution of many laptops, yet there are issues with using multiple monitors on the road. Carrying around even a relatively small desktop monitor is out of the question, leaving few options.
Enter the Lenovo LT1421. This unique mobile monitor has a 14” panel and, according to Lenovo, it’s portable enough to be carried about with minimal hassle. Has the company managed to create a unique, must-have product for mobile productivity, or is multi-monitor use with a laptop still a concept that’s better on paper than in reality?
Down to Business
While I'm mostly a PC guy, I favor iPhones. My friends always jab at me because I'm a fairly die-hard techie, and they expect me to be sporting the latest overclocked super-smartphone with eleventy-thousand gigs and a built-in dishwasher. While my phone used to be included in the list of things I tinker with endlessly, I eventually came to the point where I just wanted my smallest mobile device to 'Just Work' for those tasks I needed it to. I just didn't have the time to tinker endlessly with the thing that handled more and more of my work-related duties, yet as a techie, I still enjoyed the ability to cram a bunch of functionality into my pocket. That led me to the iPhone, which I've used since 2008.
Like many iPhone users, I've upgraded through the various iterations over the years. The original, 3G, 3GS, 4, and most recently the 4S. I've also witnessed nearly every iteration of Apple's iOS (including many of the betas). Throughout all of these, I feel Apple did their best to prevent new versions from breaking application functionality, and while they did their best to keep bugs under control, every so often a few would creep in - typically with cross-compatibility of new features that could only be tested in a limited capacity before being released into the wild.
Apple's iPhone 4S introduced some added features to the line-up - enough to get me to bite the bullet earlier than I typically do. I was a bit of an iOS 5 veteran by then, as I'd been using it to test applications for a developer friend of mine. While some of the iOS 5 betas were a bit unruly in the battery life category, those problems appeared to be sorted out by the final beta prior to release. With my reservations about the new OS put to rest, the only thing holding me back was the possibility of the new dual-core CPU drawing more power than the iPhone 4's. The iFixit teardown (which revealed a higher capacity battery - presumably to counter the power draw of the additional core), coupled with the improved camera and watching the endless toying around with Siri, finally won me over.
I picked up a 4S from my local Apple Store, restored from the backup of my 4, and off I went. I enjoyed the phone for a few days, but quickly realized there was some sort of issue with battery life. The most glaring indicator was one night where I forgot to plug the 4S in before bed. The next morning I was surprised to find that nearly 40% of the charge was lost while I slept - almost enough to completely shut down the phone (and failing to wake me with the alarm I'd gleefully set via Siri the night before).
Introduction, Specs, Design and Ergonomics
Samsung's Galaxy S II smartphone debuted in the U.S. with Sprint, AT&T, and T-Mobile in September, and we finally got our hands on a review sample. The Samsung smartphone runs the Android 2.3 "Gingerbread" operating system and includes an 8 MP camera with LED flash and 1080p video, a front-facing 2 MP camera, and Samsung’s custom TouchWiz user interface.
T-Mobile's and Sprint’s versions sport a 4.52-inch display, but AT&T’s version has a 4.3-inch screen that matches the original international Galaxy S II. We are reviewing T-Mobile's Galaxy S II with 16GB of internal memory (16 and 32 GB options are available). The Sprint and AT&T versions are outfitted with a dual-core 1.2 GHz Orion processor, but the T-Mobile version we are reviewing today sports a Qualcomm Snapdragon S3 1.5 GHz dual-core CPU.
Introduction and Design
Sound. It seems to be one of the new battlefields on which notebook computers are fighting, which is odd, because until recently audio quality was rarely a central focus for notebook manufacturers. Today, the artillery is clearly in place. HP arrived first with Beats Audio, but others have responded: MSI with its Dynaudio-branded laptops, and now ASUS with this Bang & Olufsen-tagged N55, which comes with an external subwoofer by default.
Yep, that’s right. It’s not a large subwoofer (that’s the point - it’s small enough to be carried in the same bag as the laptop), but clearly ASUS is taking sound seriously with this laptop. Yet there’s so much more to a laptop than its sound, and that’s particularly true with a system such as this. Everywhere you look, the N55’s specifications scream performance. A Core i7 quad-core mobile processor is the heart of the machine, and it snuggles up with an NVIDIA GT 555M graphics processor. What else is there to be had? Have a look.
Introduction and Features
It's always an exciting day when a new Corsair power supply shows up at the PC Perspective test lab, and today we have a brand new HX1050 PSU to test out. Corsair continues to bring a full line of high quality power supplies, memory components, cases, cooling components, SSDs and accessories to market for the PC enthusiast and professional alike. Corsair's Professional Series power supplies now include four models: the HX650, HX750, HX850 and HX1050. All of the power supplies in the Professional Series feature premium quality components, an energy-efficient design (80 Plus Silver certified) and quiet operation, and they are backed by a 7-year warranty and lifetime access to Corsair's comprehensive technical support and customer service.
Here is what Corsair has to say about their Professional Series PSUs:
"Corsair Professional Series power supplies are designed for enthusiasts and gamers who demand high performance, high efficiency, the flexibility of modular cables, and unflinching reliability. Designed and engineered using cutting-edge technology, each Corsair Professional Series power supply is built to the highest quality standards and delivers the features and performance that technology enthusiasts demand.
Corsair Professional Series power supplies are built using Industrial Grade electrical components, ensuring the stable, clean and reliable voltages that are essential for high-end gaming PCs and workstations. Professional Series PSUs are rigorously tested at 100% load and an ambient temperature of 50°C, guaranteeing stability and reliability even in the most demanding multi-core, multi-graphics card systems.
The Corsair Professional Series HX1050 PSU utilizes advanced DC-to-DC circuitry, and exceeds the requirements for 80 Plus Silver certification standards, for up to 88% energy-efficiency. This reduces wasted energy, reducing energy bills, and lowers the amount of excess heat generated, allowing the cooling fan to spin more slowly and quietly.
Corsair Professional Series power supplies boast an innovative semi-modular cabling configuration that uses low-profile, flat modular cables, which reduce air friction to help maximize airflow through the computer chassis. The modular design also simplifies installation as it allows unprecedented flexibility in utilizing only those cables which are needed. Corsair Professional Series is backed by an industry-leading 7-year warranty and comprehensive customer support."
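To put that "up to 88%" figure in perspective, here is a rough sketch of what it would mean at the unit's full rated load (illustrative math, not a measurement; real efficiency varies with load):

```python
# Wall draw and waste heat implied by 88% efficiency at the 1050W rating.
dc_output_w = 1050   # rated DC output at full load
efficiency = 0.88    # Corsair's claimed peak efficiency

ac_draw_w = dc_output_w / efficiency
waste_heat_w = ac_draw_w - dc_output_w
print(round(ac_draw_w))    # 1193 W pulled from the wall at full load
print(round(waste_heat_w)) # 143 W turned into heat inside the PSU
```

That waste-heat number is what the 140mm fan ultimately has to move out of the chassis, which is why higher efficiency tends to mean quieter operation.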
You don't have 3D Vision 2? Loser.
In conjunction with GeForce LAN 6, currently taking place on the USS Hornet in Alameda, NVIDIA is announcing an upgrade to its lineup of 3D Vision technologies. Originally released back in January of 2009, 3D Vision was one of the company's grander attempts to change the way PC gamers, well, game. Unfortunately for NVIDIA and the gaming community, running a 3D Vision setup required a new, much more expensive display as well as glasses that originally ran $199.
While many people, including myself, were enamored with 3D technology when we first got our hands on it, the novelty kind of wore off and I found myself quickly back on standard panels for gaming. The reasons were difficult to discern at first, but it definitely came down to some key points:
- Cost
- Panel resolution
- Panel size
- Image quality
The cost factor was obvious - having to pay nearly double for a 3D Vision capable display just didn't jibe with most PC gamers, and the need to purchase $200 glasses on top of that made it even less likely that you would plop down the credit card. Initial 3D Vision ready displays, while also hard to find, were limited to a resolution of 1680x1050 and were only available in 22-in form factors. Obviously, if you were interested in 3D technology you were likely a discerning gamer, and running at lower resolutions would be less than ideal.
The new glasses - less nerdy?
Yes, 24-in 1080p panels did arrive in 2010, but by then much of the hype surrounding 3D Vision had worn off. To top it all off, even if you did adopt a 3D Vision kit of your own, you realized that the brightness of the display was basically halved when operating in 3D mode - with one shutter of your glasses closed at any given time, you only receive half the total output from the screen, leaving the image kind of drab and washed out.
Introduction and Features
Courtesy of MSI
Micro-Star International, better known as MSI, has been a busy bee in 2011, fending off fierce competition from ASUS, Gigabyte and other motherboard vendors. This year's launch of the Z68 chipset from Intel combined capabilities and features from the H67 and P67 chipsets, and MSI capitalized on this when it joined forces with LucidLogix to include Virtu technology on its latest Z68A-GD80 motherboard. Lucid's Virtu tech provides switchable graphics, allowing users to enjoy the graphics power of both the integrated and discrete GPUs.
Courtesy of MSI
MSI also made the Z68A-GD80 its first motherboard to support PCI Express 3.0, which offers up to 32GB/s of transfer bandwidth and makes this mobo a bit more future-proof for users planning their next hardware upgrade. MSI also upgraded its BIOS to ClickBIOS II, which provides a consistent user interface both in the UEFI BIOS and in Windows. Users can control system settings directly from Windows, and the GUI also supports touchscreen controls.
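For the curious, that 32GB/s figure is the usual rounding of a PCIe 3.0 x16 link's bidirectional bandwidth; a quick sanity check from the spec's signaling rate and line coding:

```python
# PCIe 3.0 x16 bandwidth from first principles.
gt_per_s = 8.0        # PCIe 3.0 raw signaling rate per lane (GT/s)
encoding = 128 / 130  # 128b/130b line coding: nearly zero overhead
lanes = 16

per_direction_gbs = gt_per_s * encoding * lanes / 8  # bits -> bytes
print(round(per_direction_gbs, 2))      # 15.75 GB/s in each direction
print(round(per_direction_gbs * 2, 1))  # 31.5 GB/s both ways -- rounded to "32GB/s"
```

Compare that with PCIe 2.0's 5 GT/s and 8b/10b coding, which works out to half the throughput per lane; the move to 128b/130b is what lets 3.0 nearly double bandwidth without doubling the clock penalty of the old encoding.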