If the netbook was a shooting star, the nettop was an asteroid that never quite entered our atmosphere. Instead it flew silently by, noted by NASA, written about in a handful of articles, and now forgotten.
That doesn’t mean it has ceased to exist, however. It’s still out there, floating in space - and it occasionally swings back around for an encore. So we have the Lenovo IdeaCentre Q180.
Of course, simply advertising a small computer as - well, a small computer - isn’t particularly sexy. The Q180 is instead being sold not just as a general-purpose computer but also as a media center (with optional Blu-ray, not found on our review unit). There’s no doubting the demand for this, but so far, attempts to build PC-based media center computers have not gone well - even Boxee, with its custom Linux-based operating system, was fussy. Can the Q180 succeed where others have stumbled? Let’s start with the specs.
It’s been a while since we tested anything Atom. Since our last look at this line of processors, Intel has updated to the code-name Cedar Trail processors, allowing for higher clock speeds. The 2.13 GHz dual-core Atom D2700 looks quite robust in print. But this is still the same old architecture, so per-clock performance doesn’t come close to Intel’s Pentium and Core processors.
Also included is AMD’s Radeon HD 6450A, a version of the HD 6450 built for small systems that don’t have room for a typical PCIe graphics card. This makes up for the fact that all Atom processors are still using hopelessly outdated Intel Graphics Media Accelerator graphics, which is entirely unsuitable for HD video.
GK104 takes a step down
While the graphics power found in the new GeForce GTX 690, the GeForce GTX 680 and even the Radeon HD 7970 is incredibly impressive, if we are really honest with ourselves the real meat of the GPU market buys at price points much lower than $999. Today's not-so-well-kept-secret release of the GeForce GTX 670 attempts to bring the price of entry to the NVIDIA Kepler architecture down to a more attainable level while also resetting the performance-per-dollar metrics of the GPU world once again.
The GeForce GTX 670 is in fact a very close cousin to the GeForce GTX 680 with only a single SMX unit disabled and a more compelling $399 price tag.
The GTX 670 GPU - Nearly as fast as the GTX 680
The secret is out - GK104 finds its way onto a third graphics card in just two months - but in this iteration the hardware has been reduced slightly.
The GTX 670 block diagram we hacked together above is really just a GTX 680 diagram with a single SMX unit disabled. While the GTX 680 sported a total of 1536 CUDA cores broken up into eight 192-core SMX units, the new GTX 670 will include 1344 cores. This also drops the texture units to 112 (from 128 on the GTX 680), though the ROP count stays at 32 thanks to the continued use of a 256-bit memory interface.
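The arithmetic above is easy to verify; a minimal sketch, assuming 16 texture units per SMX (an inference from the 128-to-112 drop, not a figure stated in the text):

```python
# Unit counts for GK104 as described above; 16 texture units per SMX is
# an assumption inferred from the 128 -> 112 texture unit drop.
CORES_PER_SMX = 192
TEX_PER_SMX = 16

gtx680_smx = 8               # full GK104 on the GTX 680
gtx670_smx = gtx680_smx - 1  # one SMX disabled on the GTX 670

print(gtx670_smx * CORES_PER_SMX)  # 1344 CUDA cores
print(gtx670_smx * TEX_PER_SMX)    # 112 texture units
print(gtx680_smx * CORES_PER_SMX)  # 1536 cores on the full chip
```

Seven of eight SMX units gives exactly the 1344-core, 112-texture-unit configuration quoted above.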
Infectious fear is infectious
PCMag and others have released articles based on a blog post from Sophos. The original post discussed how frequently malware designed for Windows is found on Mac computers. What these articles mostly demonstrate is that we really need to understand security: what it is, and why it matters. The largest threats to security are complacency and misunderstanding; users need to grasp the problem rather than have it buried under weak analogies and illusions of software crutches.
Your data and computational ability can be very valuable to people looking to exploit it.
The point of security is not to avoid malware, nor is it to remove it if you failed to avoid it. Those actions are absolutely necessary components of security -- do those things -- but they are not the goal of security. The goal of security is to retain control of what is yours. At the same time, be a good neighbor and make it easier for others to do the same with what is theirs.
Your responsibility extends far beyond just keeping a current antivirus subscription.
The problem goes far beyond throwing stones...
The distinction is subtle.
Your operating system is irrelevant. You could run Windows, Mac, Android, iOS, the ‘nixes, or whatever else. Every useful operating system has vulnerabilities and runs vulnerable applications. The user is also very often tricked into loading untrusted code, either directly or by delivering it within data to a vulnerable application.
Blindly fearing malware -- such as what would happen if someone were to draw parallels to Chlamydia -- does not help you to understand it. There are reasons why malware exists; there are certain things which malware is capable of; and there are certain things which it is not.
The single biggest threat to security is complacency. Your information is valuable and you are responsible for preventing it from being exploited. The addition of a computer does not change the fundamental problem. Use the same caution on your computer and mobile devices as you would on the phone or in person. You would not leave your credit card information unmonitored on a park bench.
Introduction, Design, User Interface
Intel has decided to lead its introduction of Ivy Bridge for mobile with its most powerful quad-core parts. Many of these processors will end up in mainstream laptops, but they’re also great for gaming laptops. In our first look at Ivy Bridge we saw that it holds up well when paired with its own Intel HD 4000 graphics – if you keep the resolution around 1366x768. A bit more than that and the IGP just can’t hang.
Gamers will still want a beefy discrete GPU, and that’s what the G75 offers. Inside this beast you’ll find an Nvidia GeForce GTX 670M. Those who have been reading our Kepler coverage will remember that this is not based on Nvidia’s newest architecture but is instead a re-work of an older Fermi chip. That may seem a bit disappointing, and it is – but the performance of Nvidia’s older mobile chips was hardly lackluster.
So, this new laptop is packing a spanking-new Core i7-3720QM as well as Nvidia’s new GTX 670M. That’s an impressive combination, and ASUS has wisely backed it up with a well-rounded set of performance components.
GTX 690 Specifications
On Thursday May the 3rd at 10am PDT / 1pm EDT, stop by the PC Perspective Live page for an NVIDIA and PC Perspective hosted event surrounding the GeForce GTX 690 graphics card. Ryan Shrout and Tom Petersen will be on hand to talk about the technology, the performance characteristics as well as answer questions from the community from the chat room, twitter, etc. Be sure to catch it all at http://pcper.com/live
Okay, so it's not a surprise to you at all - or if it is, you haven't been paying attention. Today is the first on-sale date and review release for the new NVIDIA GeForce GTX 690 4GB dual-GPU Kepler graphics card that we first announced in late April. This is the dream card for any PC gamer out there, combining a pair of GTX 680 GK104 GPUs on a single PCB and running them in a single-card SLI configuration, and it is easily the fastest single card we have ever tested. It is also the most expensive reference card we have ever seen, with a hefty $999 price tag.
So how does it perform? How about efficiency and power consumption - does the GTX 690 suffer the same problems the GTX 590 did? Can AMD hope to compete with a dual-GPU HD 7990 card in the future? All that and more in our review!
Kepler Architecture Overview
For those of you that may have missed the boat on the GTX 680 launch, the first card to use NVIDIA's new Kepler GPU architecture, you should definitely head over and read my review and analysis of that before heading into the deep-dive on the GTX 690 here today.
Kepler is a 3.54 billion transistor GPU with 1536 CUDA cores / stream processors contained within, and even in a single GPU configuration it is able to produce some impressive PC gaming performance results. The new SMX-based design has some modest differences from Fermi, the most dramatic of which is the removal of the "hot clock" - the factor that ran the shaders at twice the clock speed of the rest of the GPU. Now, the entire chip runs at one speed, higher than 1 GHz on the GTX 680.
Each SMX on Kepler now includes 192 CUDA cores as opposed to the 32 cores found in each SM on Fermi - a change that has increased efficiency and performance per watt quite dramatically.
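A rough way to see how the SMX change offsets the hot-clock removal, under the simplifying assumption that one core-cycle equals one unit of shader work:

```python
# Shader work per base-clock cycle, per SM(X), before and after the
# hot-clock removal described above. One core-cycle = one unit of work;
# this ignores clock speed and architectural differences beyond counts.
fermi_cores_per_sm = 32
fermi_hot_clock = 2          # Fermi shaders ran at 2x the base clock
kepler_cores_per_smx = 192   # Kepler runs the whole chip at one clock

fermi_work = fermi_cores_per_sm * fermi_hot_clock  # 64 per base cycle
kepler_work = kepler_cores_per_smx                 # 192 per base cycle
print(kepler_work / fermi_work)  # 3.0x per-base-clock shader work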
As I said above, there are a lot more details on the changes in our GeForce GTX 680 review.
The GeForce GTX 690 Specifications
Many of the details surrounding the GTX 690 have already been revealed by NVIDIA's CEO Jen-Hsun Huang during a GeForce LAN event in China last week. The card is going to be fast and expensive, and it is built out of components and materials we haven't seen any graphics card utilize before.
Despite the high performance level of the card, the GTX 690 isn't much heavier and isn't much longer than the reference GTX 680 card. We'll go over the details surrounding the materials, cooler and output configuration on the next page, but let's take some time just to look at and debate the performance specifications.
Introduction and Features
SilverStone was one of the first PC power supply manufacturers to design and market a fanless power supply for silent operation. While many of their competitors’ fanless products have come and gone, SilverStone continues to build on their reputation and late last year released the SST-ST50NF 500W fanless power supply, the latest addition to the Nightjar series. We are a little late to the party in reviewing the ST50NF, but after talking with the good folks at SilverStone it appears the wait was worth it, as they have continued to tweak the design in recent months to improve AC ripple suppression on the DC outputs.
Here is what SilverStone has to say about the Nightjar 500W fanless power supply: The fanless Nightjar series power supplies are long favorites for professionals and enthusiasts alike that require noiseless power solution with no moving parts. With increasing power demands required from modern computers, SilverStone engineers have once again created another fanless power supply with leading output level in ST50NF. With 500W of continuous rating, near 80Plus Silver efficiency, ±3% voltage regulation, single +12V rail, multiple PCI-E connectors, and full host of safety features, the ST50NF is a great choice for mission-critical systems that need to operate in noiseless or dusty environments.
SilverStone Nightjar 500W Fanless PSU Key Features:
• Fanless thermal solution, 0 dBA Acoustics
• 500W continuous power output
• 80 PLUS Bronze certified with 84%~88% efficiency at 20%~100% load
• Compliance with ATX 12V v2.3 and EPS 12V Specifications
• Strict ±3% voltage regulation
• PCI-E 8-pin and PCI-E 6-pin connectors
• Powerful class-leading single +12V rail (38A)
• Aluminum construction
• Server-level components
• Universal AC input (100~250V) with Active PFC
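Since a fanless unit has to shed all of its losses by convection, the efficiency band in the list above translates directly into waste heat; a back-of-the-envelope sketch, where the load/efficiency pairings are illustrative assumptions rather than measured values:

```python
# Waste heat a fanless PSU must shed by convection, from the 84%~88%
# efficiency band in the feature list above. The pairing of each load
# with an efficiency figure is an illustrative assumption.
def waste_heat_w(dc_load_w, efficiency):
    """AC input minus DC output, i.e. heat dissipated inside the PSU."""
    return dc_load_w / efficiency - dc_load_w

print(round(waste_heat_w(100, 0.84), 1))  # light 100W load: ~19.0W heat
print(round(waste_heat_w(500, 0.88), 1))  # full 500W load: ~68.2W heat
```

Roughly 70W of heat at full load is why, as the editor's note stresses, the enclosure still needs some airflow even though the PSU itself has none.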
Editor’s Note: Fanless PC power supplies occupy a niche market and are targeted towards users who want a silent power supply for use in noise-sensitive areas or who need a power supply that can survive in a dusty/dirty environment that might choke and kill a conventional fan cooled PSU. Fanless power supplies rely on convection cooling and still require airflow in and around the power supply chassis to carry away the waste heat. So while the power supply itself may not have a fan, the computer enclosure must still have some means of creating airflow to keep the CPU, GPU and PSU cool. The last thing you want to do is put a fanless PSU in a closed enclosure without any fans or airflow!
Introduction, Low-Power Computing Was Never Enjoyable
It was nearly five years ago that ASUS announced the first Eee PC model at Computex. That October the first production version of what would come to be called a netbook, the ASUS Eee PC 4G, was released. The press latched on to the little Eee PC, making it the new darling of the computer industry. It was small, it was inexpensive, and it was unlike anything on the market.
Even so, the original Eee PC was a bit of a dead end. It used an Intel Celeron processor that was not suited for the application. It consumed too much power and took up a significant portion of the netbook’s production cost. If Intel’s Celeron had remained the only option for netbooks they probably would not have made the leap from press darling to mainstream consumer device.
It turned out that Intel (perhaps unintentionally) had the solution – Atom. Originally built with hopes that it might power “mobile Internet devices,” it proved to be the netbook’s savior. It allowed vendors to squeeze out cheap netbooks with Windows and a proper hard drive.
At the time, Atom and the netbook seemed promising. Sales were great – consumers loved the cute, pint-sized, affordable computers. In 2009 netbook sales jumped by over 160% quarter-over-quarter while laptops staggered along with single-digit growth. The buzz quickly jumped to other products, spawning nettops, media centers and low-power all-in-one PCs. There seemed to be nothing an Atom-powered computer could not do.
Fast forward. Earlier this year, PC World ran an article asking if netbooks are dead. U.S. sales peaked in the first quarter of 2010 and have been nose-diving since then, and while some interest remains in the other markets, only central Europe and Latin America have held steady. It appears the star that burned brightest has indeed burned the quickest.
Background and Internals
A little over two weeks back, Intel briefed me on their new SSD 910 Series PCIe SSD. Since that day I've been patiently awaiting its arrival, which happened just a few short hours ago. I've burned the midnight oil for the sake of getting some greater details out there. Before we get into the goods, here's a quick recap of the specs for the 800 (or 400) GB model:
- PCIe 2.0 x8 LSI Falcon 2008 SAS HBA driving 4 (or 2) Hitachi Ultrastar SAS controllers, each in turn driving 200GB of IMFT 25nm High Endurance Technology flash memory, all on a triple stacked half-height PCB.
- 400GB model yields (r/w) 1GB/s / 750MB/s sequential and 90,000 / 38,000 4k IOPS.
- 800GB model yields (r/w) 2GB/s / 1GB/s sequential and 180,000 / 75,000 4k IOPS.
- 800GB 'performance mode' (r/w) 2GB/s / 1.5GB/s sequential and 180,000 / 75,000 4k IOPS.
"Performance Mode" is a feature that can be enabled through the Intel Data Center Tool Software. This feature is only possible on the 800GB model, but not for the reason you might think. The 400GB model is *always* in Performance Mode, since it can go full speed without drawing greater than the standard PCIe 25W power specification. The 800GB model has twice the components to drive yet it stays below the 25W limit so long as it is in its Default Mode. Switching the 800GB model to Performance Mode increases that draw to 38W (the initial press briefing stated 28W, which appears to have been a typo). Note that this increased draw is only seen during writes.
Ok, now into the goodies:
Get Out the Microscope
AMD announced their Q1 2012 earnings last week, which turned out better than the previous numbers suggested. The bad news is that they posted a net loss of $590 million. That does sound pretty bad considering that their gross revenue was $1.59 billion, but there is more to the story than meets the eye. Of course, there are thoughts of “those spendthrift executives are burying AMD again”, but this is not the case. The loss lies squarely on the GLOBALFOUNDRIES equity and wafer agreements that have been totally retooled.
To get a good idea of where AMD stands in Q1, and for the rest of this year, we need to see how all these numbers actually get sorted out. Gross revenue is down 6% from the quarter before, which is expected due to seasonal pressures. This is right in line with Intel’s seasonal downturn, and in some ways AMD was affected slightly less than its larger competitor. They are down around 2% from last year’s quarter, and part of that can be attributed to the hard drive shortage that continued to affect the previous quarter.
An update to a great architecture
This article will focus on the new Ivy Bridge, 3rd Generation Core Processor, from a desktop perspective. If you are curious about the performance and features of the Ivy Bridge mobile processors, be sure to check out our Core i7-3720QM ASUS N56VM review here!
One of the great things about the way Intel works as a company is that we get very few surprises on an annual basis in terms of the technology they release. With the success of shows like the Intel Developer Forum permitting the release of architectural details months and often years ahead of the actual product, developers, OEMs and the press are able to learn about them over a longer period of time. As you might imagine, that results in both a much better understanding of the new processor in question and also a much less hurried one. If only GPU cycles would follow the same path...
Because of this long-tail release of a CPU, we already know quite a bit about Ivy Bridge, the new 22nm processor architecture from Intel, to be branded as the 3rd Generation Intel Core Processor Family. Ivy Bridge is the "tick" that brings a completely new process technology node, as we have seen over the last several years, but this CPU does more than move from 32nm to 22nm. Both the x86 and the processor graphics portions of the die have some changes, though the majority fall on the GPU side.
Ivy Bridge Architecture
In previous tick-tock scenarios the "tick" results in a jump in process technology (45nm to 32nm, etc.) with very little else being done. This isn't just to keep things organized in the slides above; it also keeps Intel's engineers focused on one job at a time - either a new microprocessor architecture OR a new process node, but not both.
For the x86 portion of Ivy Bridge this plan stays intact. The architecture is mostly unchanged from the currently available Sandy Bridge processors, including the continuation of a 2-chip platform solution and integrated graphics, memory controller, display engine, PCI Express and LLC alongside the IA cores.
Introduction, Overview, What is New With Ivy Bridge
This article will focus on the new Ivy Bridge, 3rd Generation Core Processor, from a mobile perspective. If you are curious about the performance and features of the Ivy Bridge desktop processors, be sure to check out our desktop Core i7-3770K review here.
It would be an understatement to say that Intel’s had a good streak over, say, the last five years. If life were commented on by the announcer from Unreal Tournament, Intel’s product releases would now be followed by the scream of “M-M-M-MONSTER KILLLLLLLL!” This is particularly true in the mobile market. Atom aside, Intel’s processors have repeatedly defeated AMD and its own preceding products.
Many companies in this position might feel it’s time to take a breather, but Intel has reached this point precisely because it doesn’t. The “tick-tock” strategy of constant improvement has made the company and its products stronger than ever before. Even the Pentium-powered Intel of the mid-90s seems weak compared to today’s juggernaut.
And so we come to the launch of Ivy Bridge. This is not a new architecture but instead an update of Sandy Bridge – however, that does not mean the under-the-hood revisions aren’t substantial. There’s a lot to talk about.
The reference system provided for our review is an ASUS N56VM, but this is not a full review of the laptop. That will be published later, after we’ve had more time to look at the laptop itself. Our focus today is on the new Intel hardware inside.
Let’s get to it.
Introduction, Design, User Interface
Dell has long tried to enter the high-end luxury laptop market. These attempts have always been met with mixed results. While Dell’s thick, powerful and relatively affordable XPS laptops are a good pick for people needing a desktop replacement, they don’t cause the thinness-obsessed media to salivate.
Enter the Dell XPS 15z. It’d be easy to think that it’s a MacBook Pro clone considering its similar pricing and silver exterior, but reality is simpler than that. This is just an XPS 15 that has been slimmed down. Like the standard XPS laptops, the 15z follows a form-balanced-by-function approach that is common among all of Dell’s laptops.
Slimming the chassis has forced the use of some less powerful components, but our review unit still arrived with some impressive hardware. Let’s have a look.
When the NVIDIA GeForce GTX 680 launched in March we were incredibly impressed with the performance and technology that the GPU was able to offer while also being power efficient. Fast forward nearly a month and we are still having problems finding the card in stock - a HUGE negative towards the perception of the card and the company at this point.
Still, we are promised by NVIDIA and its partners that they will soon have more on shelves, so we continue to look at the performance configurations and prepare articles and reviews for you. Today we are taking a look at the Galaxy GeForce GTX 680 2GB card - their most basic model that is based on the reference design.
If you haven't done all the proper reading about the GeForce GTX 680 and the Kepler GPU, you should definitely check out my article from March that goes into a lot more detail on that subject before diving into our review of the Galaxy card.
The Card, In Pictures
The Galaxy GTX 680 is essentially identical to the reference design with the addition of some branding along the front and top of the card. The card is still a dual-slot design, still requires a pair of 6-pin power connections and uses a very quiet fan in relation to the competition from AMD.
It's been a long while since we've looked at a hard drive, and how fitting that it be a new model of the Western Digital VelociRaptor! Western Digital appears to be on a somewhat fixed 2-year cycle with these, as our 600GB VelociRaptor review went up two Aprils ago, and the 300GB two years prior to that. Well then, let's take a look at this new model!
(from left) 300GB, 600GB, and finally the 1TB VelociRaptor
Here's the old school VelociRaptor logo (from back when they were less than 100GB!)
Introduction and Exterior
When we do system reviews at PC Perspective we tend to look for some specific feature, or some unique asset, that the builder includes to provide value to the consumer and potential customer. I have seen systems that provided a great cost value, ones that offer an extremely quiet experience, some that come in a small form factor, etc. Our review of the MAINGEAR Shift custom machine is here due simply to an impressive collection of hardware.
While you can grab a Shift PC starting under $2000, ours isn't going to come anywhere near that. In fact, as of this writing, the configuration we are detailing would run you about $6,200. Why? Take a look at the specifications:
- Intel Core i7-3960X Sandy Bridge-E
- 16GB Corsair Vengeance DDR3-1866
- ASUS Rampage IV Extreme X79 Motherboard
- 3 x Radeon HD 7970 3GB Graphics Cards
- 2 x Corsair Force GT 120GB SSDs (RAID-0)
- 1TB Western Digital 7200 RPM HDD
- Corsair AX1200 1200 watt power supply
- MAINGEAR Epic 180 water cooler
- MAINGEAR Epic Audio system
- Fancy White LEDs
So with a Sandy Bridge-E processor, 16GB of memory, three HD 7970s running in CrossFireX and Corsair SSDs running in a RAID-0 array, this is one of the fastest gaming PCs you can purchase today.
A Look at the Shift
The specifications are just part of the story though; MAINGEAR is well known for building a high quality machine with attention to detail and continues to push forward with unique ideas like a vertical system design (first system builder to introduce it), custom 180mm water coolers and even in-house thermal interfaces.
While MAINGEAR does offer systems in a variety of colors, our system uses the basic brushed black aluminum. The window on the side panel is another option that was included on our demo rig.
Introduction and Features
Kingwin has just released the fifth and most powerful unit in their top-of-the-line Lazer Platinum Series, the LZP-1000. As you might guess, all of the Lazer Platinum Series power supplies are 80Plus Platinum certified, which means they should deliver top efficiency (up to 90%, 92% and 89% efficiency @ 20%, 50% and 100% load). The LZP-1000 PSU we have up for review comes with a full complement of fixed and modular cables, a very quiet 140mm fan and includes universal AC input with Active PFC. A new feature found on the LZP-1000 is the ability to run in fanless mode at low to mid power levels thanks to Kingwin's ECO Thermal Control System and 2-way Thermal Control switch. The LZP-1000 is designed to support the latest Intel and AMD processors along with multiple high-end graphics adapters (Crossfire/SLI/3-Way SLI).
Kingwin LZP-1000 PSU Key Features:
• 1,000W total continuous power; Easily Overclocked to 650W
• 80 PLUS® Platinum Certified: 90%, 92% and 89% Efficiency @ 20%, 50% and 100% Load
• 140MM Fan with Intelligent Auto Fan Speed Control
• ECO Intelligent Thermal Control System (Patent)
• 2-Way Thermal Control switch (fanless at low power or fan on all the time)
• Supports Intel Core i7-980X/Core i7/Core i5/Core 2 Quad/Core 2 Duo and AMD Phenom II X6/Phenom II X4/Phenom II X3/Athlon 64 X2 CPUs
• Compliance with ATX 12V v2.2, EPS 12V v2.91, and SSI EPS 12V v2.92 Specifications
• Crystal Cube Modular Plugs w/ Patented Power Connector Cable Management System
• Six PCI-E 6+2 pin connectors to support high-end graphic cards
• Stable +12V current; 83A (996W)
• Over Power/Under Power/Over Voltage/Short Circuit Protections
• Universal AC input (115V ~ 240V) with Active PFC
• 5-Year warranty
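The rail figure in that list is easy to sanity-check; a trivial sketch using only the numbers quoted above:

```python
# Sanity check on the single +12V rail claim above: 83A at 12V.
amps, volts = 83, 12
watts = amps * volts
print(watts)  # 996W, matching the "(996W)" figure in the list
```

So effectively the unit's full 1,000W continuous rating can be delivered on the +12V rail alone, which is what modern CPU- and GPU-heavy builds want.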
Introduction, Design and Ergonomics
Tablets are growing in popularity, but the market is still immature. There are only a handful of serious contenders sold in North America (discounting the cheap knock-offs you can find on eBay and other sites).
Apple’s iPad is the clear leader in terms of sales. It is trailed by similar Android-powered options like the Samsung Galaxy Tab, the Acer Iconia and (of course) the ASUS Transformer Prime. We reviewed the Prime when it hit store shelves earlier this year and concluded that it was the best Android tablet money can buy. That makes a comparison with the new iPad obvious.
The constant stream of rants for or against Android and iOS devices in the media may lead you to think that the comparison between the two is highly subjective. I don’t believe that’s the case. There are a number of objective measurements that can be used to judge these products.
Yes, there is always going to be some degree of preference between operating systems, but we’re not really going to get into the iOS vs. Android argument here. That’s a topic that would require its own article, and most likely one several times longer than this comparison. Subjective points will be limited to design and ergonomics.
Enough talk. It’s time for the competition to begin.
More MHz for the Masses
AMD has had a rough time of it lately when it comes to CPUs. Early last year when we saw the performance of the low power Bobcat architecture, we thought 2011 would be a breakout year for AMD. Bulldozer was on the horizon and it promised performance a step above what Intel could offer. This harkened back to the heady days of the original Athlon and Athlon 64 where AMD held a performance advantage over all of Intel’s parts. On the graphics side AMD had just released the 6000 series of chips, all of which came close in performance to NVIDIA’s Fermi architecture, but had a decided advantage in terms of die size and power consumption. Then the doubts started to roll in around the April timeframe. Whispers hinted that Bulldozer was delayed, and not only was it delayed it was not meeting performance expectations.
The introduction of the first Llano products did not help things. The “improved” CPU performance was less than expected, even though the GPU portion was class leading. The manufacturing issues we saw with Llano did not bode well for AMD or the upcoming Bulldozer products. GLOBALFOUNDRIES was simply not able to achieve good yields on these new 32 nm products. Then of course the hammer struck. Bulldozer was released, well behind schedule, and with performance that barely rose above that of the previous Phenom II series of chips. The top end FX-8150 was competitive with the previous Phenom II X6 1100T, but it paled in comparison to the Intel Core i7-2600, which was right around the same price.
Introduction, Specifications, and Packaging
OCZ has been in the SSD game for quite some time now. Their first contender was the OCZ Vertex, which we reviewed back in February of 2009. While the original Vertex was powered by an Indilinx Barefoot controller, the Vertex line switched over to SandForce for the second and third generations. The fourth generation brings Indilinx back to the Vertex, this time with the Everest 2. You may recall Everest made its first appearance in the OCZ Octane, which has already proven itself to be a solid contender in the market.
Before we get into the meat and potatoes, we'll kick this off by saying this will not be a typical Vertex 4 review. We had benches run on 512GB and 256GB Vertex 4 samples, but the numbers we were seeing seemed 'off', so OCZ provided me with an alpha/engineering level firmware late last night. I suspect most other reviews you read today will include results from the 1.30 initial shipping firmware, or perhaps from the 1.31 bugfix firmware (which corrected an issue with secure erasure), but this piece will cover both 1.30 and a newer 1.52 interim build. Sometimes it's necessary to burn the midnight oil in the interest of presenting the full picture (or one as complete as possible) to our readers, and this was one of those pieces. We will revisit the Vertex 4 again very soon in the form of a more final product review, but for now we'll go with what we've got.
Introduction and Features
SilverStone's new Strider Gold Evolution series power supplies include four models: 750W, 850W, 1000W and 1200W, which are based on the original Strider Gold series. SilverStone engineers have tweaked the design to improve both efficiency and performance, and they replaced the original fan with a SilverStone Air Penetrator fan with fluid dynamic bearings to make it even quieter. The Strider Gold ST1000-G Evolution that we have up for review today is rated for 1000W DC output and is 80 Plus Gold certified. It comes with a full complement of all-modular cables, including dual EPS connectors and six PCI-E connectors for multiple high-end graphics adapter support. SilverStone also includes one of their new 140mm magnetic fan filters to help reduce dust buildup inside your PC.
Here is what SilverStone has to say about their new Strider Gold Evolution 1000W power supply:
“To make great power supplies even better, SilverStone engineers delved into the details with the Strider Gold series. In order to reduce the already low noise generated by the Strider Gold, they replaced the fan with SilverStone’s own Air Penetrator, which has superb cooling ability and is equipped with fluid dynamic bearings. This allowed for reprogramming of the fan controller to enable lower fan speed at any given loading condition compared to the predecessor. Various components were also re-tested and replaced with newer versions for minor boost in efficiency and performance. An excellent FF141 fan filter with magnets is even included in the package for use to reduce dust buildup.
With the fan and component upgrades, the Strider Gold Evolution series will again encompass wattage range from 750W to 1200W with 80 PLUS Gold level efficiency, ±3% voltage regulation, low ripple & noise, and high amperage single +12V rail as its main features. For discerning users looking to get a top-notch power supply with all features covered and more, this is the perfect solution.”
SilverStone ST1000-G Evolution PSU Key Features:
• 1,000W 24-Hour continuous power output (1,100W Peak)
• 80 Plus Gold certification: 87%~90% at 20%~100% loading
• Ultra silent 139mm Air Penetrator fan with fluid dynamic bearings
• SilverStone FF141 magnetic fan filter included
• 100% Modular cables
• Class-leading single +12V rail with 83A
• Strict ±3% voltage regulation and low ripple and noise
• Japanese made main capacitors
• Dual EPS 8-pin connectors
• Multiple PCI-E connectors (8-pin and 6-pin)
• Capacitors attached to PCI-E cables to reduce electrical noise
• Supports ATX12V 2.3 and EPS12V
• Active Power Factor Correction