You don't have 3D Vision 2? Loser.
In conjunction with GeForce LAN 6, currently taking place on the USS Hornet in Alameda, NVIDIA is announcing an upgrade to its lineup of 3D Vision technologies. Originally released back in January of 2009, 3D Vision was one of the company's grander attempts to change the way PC gamers, well, game. Unfortunately for NVIDIA and the gaming community, running a 3D Vision setup required a new, much more expensive display as well as glasses that originally ran $199.
While many people, including myself, were enamored with 3D technology when we first got our hands on it, the novelty kind of wore off and I found myself quickly back on the standard panels for gaming. The reasons were difficult to discern at first but it definitely came down to some key points:
- Cost
- Panel resolution
- Panel size
- Image quality
The cost issue was obvious - having to pay nearly double for a 3D Vision capable display just didn't sit well with most PC gamers, and the need to purchase $200 glasses on top of that made it even less likely that you would plop down the credit card. Initial 3D Vision ready displays, besides being hard to find, were limited to a resolution of 1680x1050 and were only available in 22-in form factors. Obviously, if you were interested in 3D technology you were likely a discerning gamer, and running at lower resolutions would be less than ideal.
The new glasses - less nerdy?
Yes, 24-in and 1080p panels did come in 2010 but by then much of the hype surrounding 3D Vision had worn off. To top it all off, even if you did adopt a 3D Vision kit of your own you realized that the brightness of the display was basically halved when operating in 3D mode - with one shutter of your glasses covered at any given time, you only receive half the total output from the screen leaving the image quality kind of drab and washed out.
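That brightness hit is easy to estimate. Active-shutter glasses block each eye half the time, and the LCD shutter lenses absorb some additional light even when "open." A rough back-of-the-envelope sketch - the 50% duty cycle is inherent to the technique, while the panel brightness and lens transmission numbers below are illustrative assumptions, not measured figures:

```python
# Rough estimate of perceived brightness through active-shutter 3D glasses.
panel_nits = 300.0        # illustrative desktop LCD brightness
duty_cycle = 0.5          # each eye sees the panel only half the time
lens_transmission = 0.7   # assumed transmission of an "open" shutter lens

perceived = panel_nits * duty_cycle * lens_transmission
print(f"Perceived brightness per eye: {perceived:.0f} nits")  # ~105 nits
```

Even with generous assumptions, each eye ends up with roughly a third of the panel's rated output, which goes a long way toward explaining the drab, washed-out look.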
Introduction and Features
Courtesy of MSI
Micro-Star International, better known as MSI, has been a busy bee in 2011, fending off fierce competition from ASUS, Gigabyte, and other motherboard vendors. This year's launch of the Z68 chipset from Intel combined the capabilities and features of the H67 and P67 chipsets, and MSI capitalized on this by joining forces with LucidLogix to include their Virtu technology on the new Z68A-GD80 motherboard. Lucid's Virtu tech provides switchable graphics, allowing users to enjoy the power of both the integrated GPU and a discrete GPU.
Courtesy of MSI
MSI also used the Z68A-GD80 as their first motherboard to support PCI Express 3.0, which provides up to 32GB/s of transfer bandwidth and makes this mobo a bit more future proof for users looking at their next hardware upgrade. MSI also upgraded their BIOS system to ClickBIOS II, which provides a consistent user interface both in the UEFI BIOS and in Windows. Users can control their system settings directly from Windows, and the GUI also supports touchscreen controls.
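For the curious, the 32GB/s figure is the sum of both directions of a PCIe 3.0 x16 link: 8 GT/s per lane with 128b/130b encoding works out to roughly 15.75GB/s each way, which marketing rounds up to 32GB/s total. A quick sketch of the math:

```python
# Where the "32 GB/s" PCIe 3.0 marketing figure comes from (x16 link).
lanes = 16
transfer_rate_gt = 8.0   # GT/s per lane for PCIe 3.0
encoding = 128 / 130     # 128b/130b encoding overhead

per_direction = lanes * transfer_rate_gt * encoding / 8  # GB/s one way
total = per_direction * 2                                # full-duplex sum
print(f"{per_direction:.2f} GB/s per direction, {total:.1f} GB/s total")
# ~15.75 GB/s per direction, ~31.5 GB/s total
```

Note that no 2011-era GPU comes close to saturating even the PCIe 2.0 half of that figure, which is why "future proof" is the operative phrase here.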
Bulldozer. Since its initial unveiling and placement on the roadmap many have called the Bulldozer architecture the savior of AMD, the processor that would finally turn the tide back against Intel and its dominance in the performance desktop market. After quite literally YEARS of waiting we have finally gotten our hands on the Bulldozer processors, now called the AMD FX series of CPUs, and can report on our performance and benchmarking of the platform.
With all of the leaks surrounding the FX processor launch you might be surprised by quite a bit of our findings - both on the positive and the negative side of things. With all of the news in the past weeks about Bulldozer, now we can finally give you the REAL information.
- Bulldozer First Release and the State of 32nm AMD Parts
- AMD Bulldozer Processor hits 8.429 GHz - New World Record!
- AMD Bulldozer FX Processor Benchmarks Leaked
Before we dive right into the performance part of our story, I think it is important to revisit the Bulldozer architecture and describe what makes it different from the Phenom II architecture as well as Intel's Sandy Bridge design. Josh wrote up a great look at the architecture earlier in the year with information that is still 100% pertinent, and we recount much of that writing here. If you are comfortable with the architecture's design points, then feel free to skip ahead to the sections you are more interested in - but I highly recommend you give the data below a look first.
The below text was taken from Bulldozer at ISSCC 2011 - The Future of AMD Processors.
Bulldozer Architecture Revisited
Bulldozer brings very little from the previous generation of CPUs, except perhaps the experience of the engineers working on those designs. Since the original Athlon, the basic floor plan of AMD's CPU architecture has remained relatively unchanged. Certainly there were significant changes throughout the years to keep up in performance, but the 10,000 foot view of the actual decode, integer, and floating point units stayed very similar. TLBs increased in size, more instructions were kept in flight, and so on. Aspects such as larger L2 caches, integrated memory controllers, and the addition of a shared L3 cache all brought improvements to the architecture. But the overall data flow is very similar to that of the original Athlon, introduced some 12 years ago.
As covered in our previous article about Bulldozer, it is a modular design which will come in several flavors depending on the market it is addressing. The basic building block of the Bulldozer core is a 213 million transistor unit which features 2 MB of L2 cache. This block contains the fetch and decode unit, two integer execution units, a shared 2 x 128 bit floating point/SIMD unit, L1 data and instruction caches, and a large shared L2 unit. All of this is manufactured on GLOBALFOUNDRIES’ 32nm, 11 metal layer SOI process. This entire unit, plus 2 MB of L2 cache, is contained in approximately 30.9 mm² of die space.
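Those two figures give a quick sense of how densely GLOBALFOUNDRIES' 32nm process packs the module. A back-of-the-envelope division - keeping in mind this lumps SRAM and logic together, which pack very differently:

```python
# Rough transistor density of one Bulldozer module (including its 2 MB L2)
# on GLOBALFOUNDRIES' 32nm SOI process, using the figures quoted above.
transistors = 213e6   # transistors per module
area_mm2 = 30.9       # module die area in mm^2

density = transistors / area_mm2
print(f"~{density / 1e6:.1f}M transistors per mm^2")  # ~6.9M/mm^2
```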
A Pre-Built System in Your Budget
We all know that the majority of our readers enjoy building their own gaming systems - picking components, building the hardware, installing the software, etc. But as gamers get older and the amount of time they have to dedicate to their passion decreases, some might be willing to make the move to buying a pre-built gaming rig based on industry standard components. The benefits are definitely there: quicker turnaround with just a couple days of shipping, warranty and support for anything that might go wrong, and the ability to upgrade and adapt your system in any way you want.
AVADirect is a system builder that has been specializing in gaming PCs since 2003 and is based near Cleveland, Ohio. They offer a wide array of PC options including the most basic and inexpensive machines used for business computing as well as top-level gaming machines with overclocked settings and high-end water cooling configurations.
Recently AVADirect approached me with an interesting review idea: build a custom system for just around $1000 made for gaming and see if it could stand up to our testing. The result is a rig based on the P67 platform (though since our system shipped you can get Z68 motherboards for the same price) and the Core i5 Sandy Bridge processor, coupled with a Radeon HD 6950 that provides enough gaming power to tackle the best PC games.
Here is our video review of the AVADirect custom $1000 gaming machine, and check below for more images and thoughts!
RAGE is not as dependent on your graphics hardware as it is on your CPU and storage system (which may be an industry first); we will discover the reason when we discuss the texture pop-up issue on the next page.
The first id Software designed game since the release of Doom 3 in August of 2004, RAGE has a lot riding on it. Not only is this the introduction of the idTech 5 game engine, it also culminates more than 4 years of development and is the first new IP from the developer since the creation of Quake. And since the first discussions and demonstrations of Carmack's new MegaTexture technology, gamers have been expecting a lot as well.
Would this game be impressive enough on the visuals to warrant all the delays we have seen? Would it push today's GPUs in a way that few games are capable of? It looks like we have answers to both of those questions and you might be a bit disappointed.
First, let's get to the heart of the performance question: will your hardware play RAGE? Chances are, very much so. I ran through some tests of RAGE on a variety of hardware including the GeForce GTX 580, 560 Ti, 460 1GB and the Radeon HD 6970, HD 6950, HD 6870 and HD 5850. The test bed included an Intel Core i7-965 Nehalem CPU, 6GB of DDR3-1333 memory running off of a 600GB VelociRaptor hard drive. Here are the results from our performance tests running at 1920x1080 resolution with 4x AA enabled in the game options:
If you have been visiting PC Perspective at all over the last week there is no doubt you have seen a lot of discussion about the currently running Battlefield 3 beta. We posted an article looking at performance of several different GPUs in the game and then followed it up with a look at older cards like the GeForce 9800 GT. We did a live stream of some PC Perspective staff playing BF3 with readers and fans, showed off and tested the locked Caspian Border map and even looked at multi-GPU scaling performance. It was a lot of testing and a lot of time, but now that we have completed it, we are ready to summarize our findings in a piece that many have been clamoring for - a Battlefield 3 system build guide.
The purpose of this article is simple: gather our many hours of testing and research and present the results in a way that simply says "here is the hardware we recommend." It is the exact same philosophy that makes our PC Perspective Hardware Leaderboard so successful: it gives the reader all the information they need, all in one place.
Introduction and Design
When the Apple MacBook Air originally debuted, geeks took note, and Windows laptop manufacturers ramped up efforts to meet that super-thin laptop on its own turf. Surprisingly, one of the first to respond was MSI, a company that is still struggling to define itself among a mainstream American market dominated by the likes of HP, Toshiba, Dell, and others.
MSI managed to significantly undercut the Air with its X340, but the build quality was also nowhere near what Cupertino’s engineers had managed. Yet MSI is not one to give up, and they’ve made moves to refine the X series over the years. As the price has dropped further, and the processor selection changed, comparisons to the Air have become less obvious.
That’s particularly true with this latest MSI X370, which now makes use of the AMD E-350 Fusion APU. This processor is nothing new, and we’ve tested it before at PC Perspective. Yet this laptop is different from any product with this processor we’ve previously reviewed because of its size.
The Battlefield 3 Beta
Update 2 (9/30/11): We have some quick results from our time on the Caspian Border map as well if you are interested - check them out!
It was an exciting day at PC Perspective yesterday, with much of our time dedicated to finding, installing and playing the new Battlefield 3 public beta. Released on the morning of the 26th to those of you who had pre-ordered BF3 on Origin or had purchased Medal of Honor prior to July 26th, getting the beta a couple of days early should give those of you with more FPS talent than me a leg up once the open beta starts on Thursday the 29th.
My purpose in playing Battlefield 3 yesterday was purely scientific, of course. We wanted to test a handful of cards from both AMD and NVIDIA to see how the beta performed. With all of the talk about needing to upgrade your system and the relatively high recommended system requirements, there is a lot of worry that just about anyone without a current generation GPU is going to need to shell out some cash.
Is that a justified claim? While we haven't yet had time to test EVERY card we have at the office, we did put some of the more recent high end, mid-range and lower cost GPUs to the test.
Back in June of last year, OCZ released the RevoDrive, followed up rather quickly by the RevoDrive x2. A further jump was made with the introduction of VCA 2.0 architecture with the RevoDrive 3 and 3 x2. Each iteration pushed the envelope further as better implementations of VCA were introduced, using faster and greater numbers of PCIe channels, linked to faster and greater numbers of SandForce controllers.
While the line of RevoDrives was tailored more towards power users and mild server use, OCZ has taken their VCA 2.0 solution to the next level entirely, putting their sights on full blown enterprise purposing. With that, we introduce the OCZ Z-Drive R4:
Kal-El Tegra SoC to use 5 cores
Recent news from NVIDIA has unveiled some interesting new technical details about the upcoming Kal-El ARM-based Tegra SoC. While we have known for some time that this chip would include a quad-core processor and would likely be the first ARM-based quad-core part on the market, NVIDIA's Matt Wuebbling spilled the beans on a new technology called "Variable SMP" (vSMP) and a fifth core on the die.
An updated diagram shows the fifth "companion" core - Courtesy NVIDIA
This patented technology allows the upcoming Tegra processor to address a couple of key issues that affect smartphones and tablets: standby power consumption and manufacturing process deviations. Even though all five of the cores on Kal-El are going to be based on the ARM Cortex A9 design, they will have very different power characteristics due to variations in the TSMC 40nm process technology that builds them. As is typical of most foundries and process technologies, TSMC has both a "high performance" and a "low power" derivative of its 40nm technology, usually aimed at different projects. The higher performing variation will run at faster clock speeds but will also have more transistor leakage, thus increasing overall power consumption. The low power option does just the opposite: it lowers the frequency ceiling while using less power in both idle and active states.
CPU power and performance curves - Courtesy NVIDIA
NVIDIA's answer to this dilemma is to have both - a single A9 core built on the low power transistors and quad A9s built on the higher performing transistors. The result is the diagram you saw at the top of this story with a quad-core SoC with a single ARM-based "companion." NVIDIA is calling this strategy Variable Symmetric Multiprocessing and using some integrated hardware tricks it is able to switch between operating on the lower power core OR one to four of the higher power cores. The low power process will support operating frequencies up to only 500 MHz while the high speed process transistors will be able to hit well above 1-1.2 GHz.
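NVIDIA has not detailed the exact switching heuristic, but the basic policy can be sketched as a simple governor that parks the four fast cores whenever the workload fits inside the companion core's 500 MHz ceiling. Everything below - the function, the thread-count check, the threshold values - is a hypothetical illustration of the idea, not NVIDIA's implementation:

```python
# Hypothetical sketch of a vSMP-style governor: run on the low-power
# companion core when demand is light, otherwise wake the fast quad cores.
# Only one cluster is powered at a time; the hardware handles the handoff.

COMPANION_MAX_MHZ = 500   # low-power process frequency ceiling
MAIN_MAX_MHZ = 1200       # high-performance cores (well above 1-1.2 GHz)

def select_cluster(demand_mhz: float, runnable_threads: int) -> str:
    """Pick which core cluster should be powered (illustrative policy)."""
    if runnable_threads <= 1 and demand_mhz <= COMPANION_MAX_MHZ:
        return "companion"   # standby, email sync, background audio
    return "main-quad"       # active use: browsing, gaming, video

print(select_cluster(200, 1))   # light background load
print(select_cluster(900, 4))   # heavy multithreaded load
```

The payoff of a policy like this is that standby and background tasks never touch the leaky high-performance transistors at all, which is exactly where phones and tablets spend most of their battery life.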
Introduction and Specifications
Courtesy of ECS
ECS developed the HDC-I motherboard to take advantage of AMD's new Brazos platform, which is based on the Hudson M1 chipset and AMD's latest E-350 dual-core processor with its integrated DDR3 800/1066 memory controller. The dual-core E-350 APU, which combines the CPU and GPU, brings a host of features to mini ITX enthusiasts, like USB 3.0 compatibility, SATA 6Gb/s support, Bluetooth, and Radeon HD 6310 graphics with UVD 3 to play 3D Blu-ray and 1080p HD movies.
Courtesy of ECS
Another huge advantage of going with a mini ITX motherboard for your next home theater PC is the balance of computing power and power consumption that the AMD Brazos platform adds to the ECS HDC-I. The HDC-I is an energy-efficient motherboard that has integrated computing power and graphics firepower for users looking for an "all-in-one" solution for their next small form factor build.
The Mechanics of a Keyboard
During the course of this review, Razer announced two new mechanical keyboards, the BlackWidow Stealth and the BlackWidow Ultimate Stealth. This review is not for those products. Razer ninja’d me with stealth.
Keyboards are often overlooked during the purchase of a new computer; for many there does not appear to be any real difference between any two keyboards outside of wireless technology, backlighting, or extra keys. Those who game heavily, or those who are typing enthusiasts for work or hobby, might be in the market for a more personalized experience. There are whole categories of keyboard styles which allow a tailored solution to your personal style of use, right down to the type of switch used to register a keystroke. Razer is no stranger to the production of input devices, but they are stepping slightly out of their element with their recent products: the BlackWidow and the BlackWidow Ultimate, the first two keyboards from Razer based on mechanical switches.
Popping Razer’s CherryMX?
Membrane keyboards make up the majority of the cheapest keyboards on the market, with scissor-switch designs taking up the laptop and thin-profile keyboard segment. Despite being cheap, these keyboards also have the advantage of being quite silent. A mechanical keyboard, on the other hand, uses an actual mechanical switch for each and every key. While such a system costs substantially more than a membrane keyboard, the cost may be offset by the precision, the response, or the ability to type without “bottoming out” each keystroke.
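To put a number on "bottoming out": Cherry MX switches are commonly specified to actuate at roughly 2 mm of their 4 mm total travel, so a typist who releases at the actuation point moves each key about half as far. A quick illustration using those commonly cited figures:

```python
# Cherry MX switches register about halfway through their travel, so a
# typist who avoids bottoming out moves each key roughly half the distance.
ACTUATION_MM = 2.0       # commonly cited Cherry MX actuation point
TOTAL_TRAVEL_MM = 4.0    # commonly cited total key travel

keystrokes = 10_000      # an illustrative heavy day of typing
saved_mm = keystrokes * (TOTAL_TRAVEL_MM - ACTUATION_MM)
print(f"Finger travel saved: {saved_mm / 1000:.0f} m per {keystrokes} keystrokes")
```

Twenty meters of saved finger travel per ten thousand keystrokes is the kind of thing only a typing enthusiast notices, but it is exactly the crowd this keyboard is aimed at.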
If the concept of a mechanical keyboard interests you, then you will likely be dealing indirectly with Cherry Corp in the near future, most likely with their MX line of switches. I say indirectly because Cherry avoids selling their keyboards except to business, industrial, governmental, and medical suppliers. For the rest of us, there exist several companies who purchase large quantities of mechanical switches and manufacture keyboards with them for retail end-users. Some common mechanical keyboard brands include Filco, SteelSeries, XArmor, Optimus, Das Keyboard, and Ducky. Keep in mind that while there are many brands, almost all of their keyboards are produced by iOne, Datacomp, or Costar, with a few exceptions. In our case, Razer’s BlackWidow and BlackWidow Ultimate are produced by iOne, which also produces the XArmor line of mechanical keyboards.
Read on for the rest of the review including benchmarks… yes that is possible!
Bulldozer Ships for Revenue
Some months back we covered the news that AMD had released its first revenue shipments of Llano. This was a big deal back then, as it was the first 32 nm based product from AMD, and one which could help AMD achieve power and performance parity with Intel in a number of platforms. Llano has gone on to be a decent seller for AMD, and it has had a positive effect on AMD’s marketshare in laptops. Where once AMD was a distant second in overall terms of power and performance in the mobile environment, Llano now allows them to get close to the CPU performance of the Intel processors, achieve much greater performance in graphics workloads, and has matched Intel in overall power consumption.
KY Wong and Marshall Kwait hand off the first box of Bulldozer based Interlagos processors to Cray's Joe Fitzgerald. Photo courtesy of AMD.
Some five months later we are now making the same type of announcement for AMD and their first revenue shipment of the Bulldozer core. The first chips off the line are actually “Interlagos” chips; basically server processors that feature upwards of 16 cores (8 modules, each module containing two integer units and then the shared 256 bit FPU/SSE SIMD unit). The first customer is Cray, purveyor of fine supercomputers everywhere. They will be integrating these new chips into their Cray XE6 supercomputers, which have been purchased by a handful of governmental and education entities around the world.
Introduction and Design
Tablets may be the darling of the tech industry, but they’ve also received their fair share of criticism. One of the most consistent barbs thrown their way is the tablet’s inability to serve as a competent platform for content creation. While it’s technically possible to write a document or edit an image on a tablet, it’s certainly not enjoyable.
Part of the problem is the lack of a keyboard and mouse. Touchscreens are beautiful and intuitive, but they’re not precise. While third-party cases and docks have tried to solve this issue, they’re often both clunky and expensive.
It’s little surprise that a tablet designed specifically to work in conjunction with a keyboard dock has hit the market, but it is surprising that the first such device comes from ASUS, a company with relatively little experience building mobile products. The Eee Pad Transformer is already the second-best selling tablet on the market (after the iPad, of course) and reports indicate sales are constrained by supply rather than demand. What is it that has made the Transformer a quick success?
Antec SOLO II Chassis Review
Antec is one of the most storied brands when it comes to enthusiast class components like cases and power supplies. Unfortunately, the whims of the gamer change with the breeze, as most will swap company allegiances whenever performance or features dictate. Antec has fallen into this trap over the last few years, as many in the community moved on from the ever-present P180 to newer and more innovative designs from other companies. Having recognized this slide internally and made moves to address it, Antec is going to start producing modern cases with improved specifications.
First up on the list is the SOLO II, part of the Antec Sonata line of "Quiet Computing" products. The original SOLO was released in early 2006, and since then much has changed in the world of chassis for gamers and those more interested in noise reduction. Antec is hoping that the SOLO II will address a large portion of BOTH crowds with its small footprint, many noise-dampening features and several enthusiast-class nods.
Check out our video review of the SOLO II and keep reading for some images of the newest Antec chassis!
Introduction and Features
Kingwin just released their first fanless power supply, the Stryker STR-500, which is rated for 500W and can easily be pushed to 600W output with only a slight drop in efficiency. That makes the Stryker STR-500 one of the highest output fanless power supplies on the market today. The Kingwin Stryker 500W Fanless power supply is also certified 80 Plus Platinum with up to 92% efficiency. High efficiency is always good, but it is critical for a fanless power supply to help minimize waste heat. Combine that with a good mix of fixed and modular cables and a 5-year warranty, and we have a very interesting new product to review!
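The reason efficiency is so critical without a fan is easy to quantify: every watt lost inside the unit becomes heat that must be shed passively. A quick sketch using the rated figures (the 80% comparison point stands in for an illustrative older unit, not a specific competitor):

```python
# Waste heat a fanless PSU must dissipate passively at a given load.
def waste_heat(output_w: float, efficiency: float) -> float:
    """Heat (in watts) dissipated inside the PSU itself."""
    input_w = output_w / efficiency   # power drawn from the wall
    return input_w - output_w         # the difference becomes heat

# At the full 500 W load and the 92% Platinum figure:
print(f"{waste_heat(500, 0.92):.1f} W of heat")   # ~43.5 W
# The same load through an illustrative 80% efficient unit:
print(f"{waste_heat(500, 0.80):.1f} W of heat")   # ~125 W
```

Cutting the internal heat load by nearly two-thirds is the difference between a heatsink-only design being viable and it cooking itself.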
Kingwin has impressed us recently with several of their new power supplies, most notably the LZP-550W that we reviewed earlier this year, which won an Editor's Choice award. In addition to the new Stryker fanless PSU, Kingwin offers a complete lineup of PC power supplies ranging across seven categories along with PC chassis, external enclosures, docking stations, cooling systems, and accessories.
True 5.1 Surround Sound Gaming Headset
Cooler Master is a household name in the PC case world, and an established player in the cooling industry. Not content with those two areas, Cooler Master has expanded into power supplies, keyboards, mousepads, and a plethora of other accessories, where they apparently make a tidy bundle. Cooler Master is now moving into a new area: gaming audio. Under the “CM Storm” brand, Cooler Master is releasing its own set of cans.
We were sent a production-quality sample, but it did not come in the retail box that is available now.
Cooler Master is hoping to deliver a profound audio experience to users with their CM Storm Sirus (not Sirius mind) True 5.1 Surround Sound Gaming Headphones. The design and packaging certainly look impressive, but what counts in the end is the sound emanating from these products.
Is a GTX 590 just not enough for you?
A Legacy of Unique Engineering
ASUS has often been one of only a handful of companies that really push the limits of technology with custom designed products, including graphics cards, sound cards, notebooks, motherboards and more. Just a little over a year ago I wrote a review of the ASUS ARES Dual Radeon HD 5870 graphics card - the first of its kind, and it was labeled the "Ultimate Graphics Card" at the time. Life at the top of the mountain doesn't last long in the world of the GPU, though, and time (and the GTX 590 and HD 6990) left the Greek god of war in the rearview mirror.
This time around we have a successor to the MARS - the NVIDIA version that combines two top-level GPUs on a single PCB. The new ASUS MARS II we are reviewing today pairs two binned GTX 580 GPUs in full-time SLI and is built as a limited edition run of 999 units. In many ways the MARS II and the ARES share a lot of traits: custom designed cooling and PCB, a unique aesthetic design, limited edition status and significant physical weight as well. Of course, the price tag is also pretty high, and if you aren't comfortable reading about a $1300 graphics card you might as well turn around now... For those that dare though, you can be sure that the MARS II will have you dreaming about PC gaming power for years to come!
Republic of Gamers Means Business
I have got to be honest with you - most of the time, getting me excited about graphics cards anymore is a chore. Unless we are talking about a new architecture from NVIDIA or AMD, card vendors are hard pressed to get the same attention from me they used to a couple of years ago, when every card release was something to pay attention to. Over the next week or so, though, it turns out that ASUS and Gigabyte have a few noteworthy items definitely worth some grade-A analysis and reviewing, starting with today's: the ASUS ROG Matrix GTX 580 beast.
The Republic of Gamers brand is reserved for the highest-end parts from the company and is obviously targeted at three main segments. First, the serious gamers and enthusiasts that demand top level performance, either because they can't stand to lose at gaming or just want nothing but the best for their own experiences. Second, the professional overclockers that live on features and capabilities most of us could only dream of pushing, and that need LN2 to get the job done. Finally, the case modding groups that demand not only great performance but sexy designs that add to the aesthetics of the build as a whole and aren't boring. The ROG brand does a very commendable job of hitting all three of these groups in general, and specifically with the new Matrix-series GTX 580.
In the following pages we will document what makes this card different, how it performs, how it overclocks and why it might be the best GTX 580 card on the market today.
MIA or Simply Retired?
It is awfully hard to deny the value proposition of the AMD HD 6970 graphics card. The card overall matches (and sometimes exceeds) the NVIDIA GTX 570 at a slightly lower price, it has 2 GB of frame buffer, and AMD is consistently improving not just gaming performance for the new VLIW4 architecture but also adding to its GPGPU support. Throw in the extra happiness of a more manageable power draw and pretty low heat production for a top end card, and it is also the fastest single GPU card when it comes to bitcoin mining. With all of these positives, why hasn't everyone gone out to buy one? Simple: they are just hard to come by anymore.
Where Are the Graphics Cards?
Throughout the winter and spring of this year, the HD 6970 was an easy card to acquire. Prices were very reasonable, supply seemed ample, and most every manufacturer had one in a configuration that would appeal to a lot of people. The HD 6950 was also in great supply, and it came in a few unique configurations that added more for the money than just the reference design. This summer saw the pool of HD 6970 cards dry up, not to mention the complete lack of HD 6990 cards at retail altogether.