Windows RT: Runtime? Or Get Up and Run Time?
Update #1, 10/26/2012: Apparently it does not take long to see the first tremors of certification woes. A Windows developer by the name of Jeffrey Harmon allegedly wrestled with Microsoft certification support six times over two months because his app did not meet minimum standards. He was not given clear and specific reasons why -- apparently little more than a copy/paste of the regulations he failed to meet. Kind of what to expect from a closed platform... right? Imagine if nonsensical terms become mandated, or other problems crop up.
Also, Microsoft has just said they will allow PEGI 18 games that would have received an ESRB M rating. Of course their regulations can and will change further over time... the point is the difference between a store refusing to carry a title versus banishing it from the whole platform, even for limited sharing. The necessity of uproars, especially so early on and so frequently, should be a red flag for censorship to come. It could be over artistically-intentioned nudity or sexual themes. It might not be about sex, language, and violence at all.
Last month, I suggested that the transition to Windows RT bears the same hurdles as a transition to Linux. Many of the obstacles blocking our path, like Adobe and PC gaming, are considering Linux; the rest have good reason to follow.
This month we receive Windows RT and Microsoft’s attempt to shackle us to it: Windows 8.
To be clear: Microsoft has large incentives to banish the legacy of Windows. The way Windows 8 is structured reduces it to a benign tumorous growth atop Windows RT. The applications we love and the openness we adore are confined to an app.
I will explain how you should hate this -- after I explain why and support it with evidence.
Microsoft is currently in the rare state of sharp and aggressive focus on a vision. Do not misrepresent this as greed: it is not. Microsoft must face countless jokes about security and stability. Microsoft designed Windows with strong slants towards convenience over security.
That ideology faded early in the life of Windows XP, and how Windows operates is now fundamentally different. Windows machines are quite secure, architecturally. Con artists are getting desperate: recent attacks are almost exclusively based on fear and deception of the user. Common examples are fake anti-virus software and fraudulent call center phone calls. We all win when attackers are forced to get innovative: survival of the fittest implies death of the weakest.
Bulldozer to Vishera
Bulldozer is the word. Ok, perhaps it is not “the” word, but it is “a” word. When AMD let that little codename slip some years back, AMD enthusiasts and tech journalists started to salivate about the possibilities. Here was a unique and very new architecture that promised excellent single thread performance and outstanding multi-threaded performance all in a package that was easy to swallow and digest. Probiotics for the PC. Some could argue that the end product for Bulldozer and probiotics are the same, but I am not overly fond of writing articles containing four letter colorful metaphors.
The long and short of Bulldozer is that it was a product that was pushed out too fast, it had specifications that were too aggressive for the time, and it never delivered on the promise of the architecture. Logically there are some very good reasons behind the architecture, but implementing these ideas into a successful product is another story altogether. The chip was never able to reach the GHz range it was supposed to and stay within reasonable TDP limits. To get the chip out in a timely manner, timings had to be loosened internally so the chip could even run. Performance per clock was pretty dismal, and the top end FX-8150 was only marginally faster than the previous top end Phenom II X6 1100T. In some cases, the X6 was still faster and a more competent “all around” processor.
There really was not a whole lot for AMD to do about the situation. They had to have a new product, and it just did not turn out as nicely as they had hoped. The reasons for this are legion, but simply put, AMD is competing with a company over ten times its size, with the R&D budgets that such a size (and margins) can afford. Engineers looking for work are a dime a dozen, and Intel can hire as many as they need. So, instead of respinning Bulldozer ad nauseam and releasing new speed grades throughout the year by tweaking the process and metal layer design, AMD let the product line sit and stagnate at the top end for a year (though they did release higher TDP models based on the dual module FX-4000 and triple module FX-6000 series). Engineers were pushed into more forward-looking projects. One of these is Vishera.
A curious new driver from AMD
In case you missed the news, AMD is going to be making a big push with their Radeon brand from now until the end of the year, starting with an incredibly strong game bundle that includes as many as three full games and 20% off the new Medal of Honor. The second part of this campaign is a new driver, specifically the 12.11 beta, that will be posted to the public later this week.
AMD is claiming to have made some substantial improvements -- as much as 15% -- in quite a few games, including the very popular Battlefield 3 and the upcoming Medal of Honor (both of which use the same base engine). But keep in mind that 15% is a LOT; it is the best-case scenario on specific maps, and you may not see benefits on others.
There are going to be some debates about the validity of these performance boosts from AMD until we can get some more specific details on WHAT has changed. Essentially the company line is that they have finally "caught up" to the GCN GPU architecture introduced with the Radeon HD 7970 in January of 2012. We traditionally see this happen with new GPU architectures from both vendors but for it to have taken this long is troublesome and will surely cause some raised eyebrows from gamers and the competition.
We decided to run through the Radeon HD 7870 GHz Edition with this new 12.11 beta driver to compare it to the 12.9 beta driver we had just completed testing on a few weeks ago. AMD claims performance advantages for all the GCN cards including the 7700/7800/7900 cards though we only had time to test a single card for our initial article. The results are on the following pages...
Some computer components get all the glory. Your normal lineup of FPS-crushing GPUs, Handbrake-dominating CPUs, and super-fast memory ends up with most of the headlines. Yet behind the scenes, there are computer components that are pivotal to our use and enjoyment of computers and receive very little fanfare. Without networking we wouldn’t have file sharing, LAN parties, or even the Internet itself. Without routers and network adapters, we wouldn’t have networking.
ASUS recently sent a whole slew of networking components our way and we’ve decided to take them for a spin and see if they’re worth your hard earned dollars. Our box of ASUS goodies included:
- ASUS RT-N66U Gigabit Router – Dual Band Wireless-N900
- ASUS PCE-N10 - Wireless N PCI-E Adapter Wireless-N
- ASUS PCE-N15 - Wireless N PCI-E Adapter Wireless-N
- ASUS USB-N53 - Dual Band Wireless N Adapter
- ASUS USB-N66 - Dual Band Wireless-N900 Adapter
Without further ado, let’s jump in and tackle each one.
ASUS RT-N66U Gigabit Router – Dual Band Wireless-N900
Routers are one of those components that most of us don’t really think about unless something goes horribly wrong. Most people will buy one they find on a big box store shelf (or even worse, just use their ISP’s router), pull it out of the box, plug a few cables into it and then forget about it in a closet for a few years.
A look outside and in
We handle a fair amount of system reviews here at PC Perspective and use them mainly as a way to feature unique and interesting designs and configurations. We know how the hardware will perform for the most part, as we do extensive CPU and GPU testing on nearly a daily basis. Sometimes we'll get systems in that are extremely budget friendly; other times vendors pass us machines that have MSRPs similar to a Kia automobile. Then there are times, like today, when we get a unique design that is a great mix of both.
AVADirect has had a Mini Gaming PC design for a while now, but it recently went through a refresh that adds support for the latest Ivy Bridge processors and NVIDIA Kepler GPUs, all housed in a new mini-ITX case from BitFenix.
The quick specifications look like this:
- BitFenix Prodigy chassis
- Intel Core i7-3770K CPU, overclocked to 4.4 GHz
- ASUS P8Z77-I Deluxe Z77 Motherboard
- EVGA GeForce GTX 680 2GB GPU
- OCZ 240GB Vertex 3 SSD
- Seagate 2TB SATA 6G HDD
- 8GB Crucial DDR3-1866 Memory
- Cooler Master 850 watt Silent Pro PSU
You'll also see a large, efficient Prolimatech cooler inside along with a Blu-ray burner and Windows 7 for a surprisingly reasonable $2100 price tag.
The BitFenix Prodigy chassis is a unique design that starts with sets of FiberFlex legs and handles surrounding the mini-ITX case. The minor flexibility of the legs absorbs sound and impact on the table while the handles work great for picking up the system for LAN events and the like. While at first I was worried about using them to support the weight of the rig, I had no problems and was assured by both BitFenix and by AVADirect it would withstand the torture.
Check out our video review before continuing on to the full article with benchmarks and pricing!
Wireless storage for PC, Mac, iOS and Android
Today we are taking a look at the new Patriot Gauntlet 320 external USB 3.0 and wireless hard drive, available starting at $149 at Newegg.com.
The premise is quite simple: take a portable hard drive with USB 3.0 support, add the ability to share the unit wirelessly with up to 8 different machines, and power it with a lithium-ion battery. Not only does the Gauntlet show up on your network as a mountable drive in Windows and Mac OS, it also supports free applications for iOS and Android devices to share and stream media.
There are some limitations you might want to consider, including the inability to access network-based devices when using the pass-through Internet capability the Gauntlet provides. Also, data transfer performance over the Gauntlet's wireless connectivity seemed pretty low, even with 802.11n support.
Potential use cases for the Gauntlet include any time you need a shared data source, like group projects for school or the office, on-the-go storage for devices like Ultrabooks with smaller hard drives, and users with large media collections they want to use with smart phones and tablets.
Check out our full video review below!
Note that in the video, our early sample of the Gauntlet 320 has the "node" label on it; the Gauntlet Node is a separate device that is only a DIY enclosure WITHOUT an included hard drive. Originally there was a sticker covering the "node" label, but we incorrectly removed it before filming. Just a heads up!
And Why the Industry Misses the Point
I am going to take a somewhat unpopular stance: I really like stereoscopic 3D. I also expect to change your mind and get you excited about stereoscopic 3D too - unless of course a circumstance such as monovision interferes with your ability to see 3D at all. I expect to succeed where the industry has failed simply because I will not ignore the benefits of 3D in my explanation.
Firstly - we see a crisp image when our brain is more clearly able to make out objects in a scene.
We typically have two major methods of increasing the crispness of an image: we either increase the resolution or we increase the contrast of the picture. As resolution increases we receive a finer grid of positional information to place and contain the objects in the scene. As contrast increases we receive a wider difference between the brightest points and the darkest points from a scene which prevents objects from blending together in a mess of grey.
We are also able to experience depth by comparing the parallax effect across both of our eyes. We encapsulate each object into a 3D volume and position each capsule at a more clearly defined distance. Encapsulated objects appear crisper because we can more clearly see them as sharply defined, independent objects.
Be careful with this stereoscopic 3D image. To see the 3D effect you must slowly cross your eyes until the two images align in the center. This should only be attempted by adults with fully developed eyes and without prior medical conditions. Also, sit a comfortable distance away so you do not need to cross your eyes too far inward, and rest your eyes whenever they feel strained. In short - do not pull an eye muscle or something; use common sense. Lastly, move your mouse cursor far away from the image, as it will break your focusing lock, and click on the image to make it full sized.
Again, be careful when crossing your eyes to see stereoscopic 3D and relax them when you are done.
The above image is a scene from Unreal Tournament 3 laid out in a cross-eyed 3D format. If you are safely able to experience the 3D image then I would like you to pay careful attention to how crisp the 3D image appeared. Compare this level of crispness to either the left or right eye image by itself.
Which has the crisper picture quality?
That is basically why 3D is awesome: it makes your picture quality appear substantially better by giving your brain more information about the object. This effect can also play with how the brain perceives the world you present it: similar to how HDR tonal mapping plays with exposure ranges we cannot see and infrared photography plays with colors we cannot see to modify the photograph - which we can see - for surreal effects.
Introduction and Internals
The Western Digital RAID Edition line of hard drives has been around for some time now and has largely impressed us with each subsequent release. Since the launches of the RE4-GP and, later, the faster-spinning RE4, WD's enterprise line had been capped at the 2TB mark. Now that has changed with the introduction of a new line, simply named the RE Series:
Yup, that's right: 4 terabytes! With the Green and Red series capped at 3TB, this new RE is the largest-capacity drive available from Western Digital. The catch is that, since it's tailored and built for enterprise use, it comes at a rather hefty price premium.
Introduction and Features
Seasonic has a well-earned reputation for producing some of the best PC power supplies on the planet. Over the years, Seasonic has been the OEM (Original Equipment Manufacturer) of choice for companies like Corsair, PC Power & Cooling, and XFX, to name just a few. But Seasonic also markets power supplies under their own brand name. The new X-Series 1050W and 1250W are Seasonic's newest and most powerful PSUs to date. Both power supplies are based on Seasonic's X-Series line, which has brought several major advancements to the standard PC power supply platform since its introduction three years ago.
• Proprietary circuit design delivers High efficiency (80Plus Gold or Platinum certified)
• Full modular DC Connector Module features integrated VRMs (3.3V and 5V)
• Hybrid Silent Fan Control (3 modes of operation: Fanless, Silent and Cooling)
• High-quality Sanyo Denki SanAce120 dual ball bearing fan with PWM
• High-reliability 105°C grade A capacitors and solid polymer capacitors
Here is what Seasonic has to say about their new X-Series Gold 1050W and 1250W power supplies:
"The X-1050 and X-1250 are the newest additions to one of our most successful retail lines currently available. Now the X-Series will extend from 400 and 460 watt fanless and then 560, 660, 760, 850 watt and now 1050 and 1250 watts for top end systems; a total of eight X-Series models in all.
80Plus Gold The X-1050W and X-1250W PSUs are certified in accordance to the 80PLUS organization's high standards, offering the newest technology and innovation for performance and energy savings with up to 90% efficiency and a true power factor of greater than 0.9 PF.
Full Modular Design (DC to DC) Common to all X-Series power supplies, the new X-1050 and X-1250 feature the unique integrated DC connector panel with onboard VRM (Voltage Regulator Module) that enables not only near perfect DC-to-DC conversion with reduction of current loss/impedance and increase of efficiency but also a fully modular DC cabling that enables maximum flexibility of integration and forward compatibility.
Seasonic Hybrid Silent Fan Control An industry first, advanced 3 phased (Fanless, Silent and Cooling Mode) thermal control balances between silence and cooling. In addition, a selector switch is provided to allow you to select between Seasonic S2FC control, without fanless mode or S3FC fan control with fanless mode.
Sanyo-Denki San Ace Silent Fan The world-renowned Sanyo Denki ball bearing fans are made of the highest quality components to ensure maximum quality and performance. The use of spoon shaped high-density plastic fan blades with smoothed leading edges, strict tolerance ball bearings and precision copper axle are just some features to ensure ultra-low noise performance and quality."
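To put those efficiency and power factor claims in perspective, here is a rough, illustrative calculation of what an X-1050 would draw at the wall under full load, assuming the quoted 90% efficiency and 0.9 power factor hold at that load (a sketch, not a measurement):

```python
# Hedged sketch: what 90% efficiency and a 0.9 power factor imply at the
# wall for a 1050 W DC output. Numbers are taken from the claims above.

def wall_draw(dc_output_w, efficiency, power_factor):
    """Return (real power in W, apparent power in VA) drawn from the mains."""
    real_w = dc_output_w / efficiency        # losses become heat inside the PSU
    apparent_va = real_w / power_factor      # what the utility circuit must supply
    return real_w, apparent_va

real_w, apparent_va = wall_draw(1050, 0.90, 0.90)
print(f"Real power from the wall:     {real_w:.0f} W")      # ~1167 W
print(f"Apparent power from the wall: {apparent_va:.0f} VA")  # ~1296 VA
```

In other words, even a Gold-class unit at full tilt pulls over a hundred watts more from the outlet than it delivers to the components.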
Another GK106 Completes the Stack
It has been an interesting year for graphics cards and 2012 still has another solid quarter of releases ahead of it. With the launch of AMD's 7000-series back in January, followed by the start of NVIDIA's Kepler lineup in March, we have had new graphics cards on a very regular basis ever since. And while AMD's Radeon HD 7000 cards seemed to be bunched together a bit better, NVIDIA has staggered the release of the various Kepler cards, either because of capacity at the manufacturing facilities or due to product marketing plans - take your pick.
Today we see the completion of the NVIDIA GeForce GTX 600 stack (if you believe the PR at NVIDIA) with the release of the GeForce GTX 650 Ti, a $150 graphics card that fills in the gap between the somewhat anemic GTX 650 and GT 640 cards and the most recently unveiled card, the GTX 660 2GB that currently sells for $229.
The GTX 650 Ti has more in common with the GTX 660 than it does the GTX 650, both being based on the GK106 GPU, but is missing some of the unique features that NVIDIA has touted of the 600-series cards like GPU Boost. Let's dive into the product and see if this new card will be the best option for those of you with $150 graphics budgets.
Thoughts about Interface Design in General
I have been in several situations where a variety of people claim the gamepad is superior for gaming because that is what it was designed for. No elaboration or further justification is given. The controller is designed for gaming and is therefore clearly better. End of – despite often being start to – discussion in their minds.
Really it is a compromise between the needs of popular games and the environment of a couch.
Interface design is complicated. When you design an interface you need to consider: the expected types of applications; the environment of the user; what you are permitted to use; what tolerances are allowed; what your audience is used to; and so on, so forth. There is a lot to consider when you design an application for a user and I could make an educated guess that it is at least as hard to design the input device itself.
The history of keyboard design is a great example of tradeoffs in input devices.
Sometimes it is better to be worse...
The first wave of keyboards were interfaces to the mechanical typewriter. These keyboards were laid out in alphabetical order because, as long as each key was accessible and the user could find the letter they wanted -- who cares, right? We already have an order for the alphabet that people understand, so users should not have too much difficulty finding the letter they need.
Another constraint quickly came to light: typists were too fast and the machines jammed.
The engineers now needed to design an input method which could keep up with the typist. Correcting the machine itself was somewhat futile, so the solution was to make the typist as slow as possible. The most common letters in the English language were spread all over the place and -- while possibly by fluke -- the left hand was favored (made to do more work) over the often dominant right hand.
Solving the problem meant making the most aggravating keyboard layout the engineers could imagine. QWERTY was born.
Introduction and Technical Specifications
Courtesy of ASUS
It's been a couple months since we've had a chance to evaluate a Z77-based motherboard, so we are taking this opportunity to throw ASUS's P8Z77-V Deluxe on our test bench to put it through our comprehensive real-world and synthetic benchmarks. This $279 board has been available for several months and supports the LGA 1155 platform that includes Sandy Bridge and Ivy Bridge processors.
There are many features to drool over on the ASUS P8Z77-V Deluxe, but my favorites include the board's unique power management features, Wi-Fi functionality with remote access, and customized UEFI BIOS. This board also includes other enhancements that focus on faster USB 3.0 and PCIe 3.0 integration, as well as extra SATA 6Gb/s ports that provide double the bandwidth of SATA 3Gb/s.
Introduction, Specifications and Packaging
Last week, Samsung flew me and a few peers in the storage review community out to Seoul, Korea. The event was the 2012 Samsung SSD Global Summit:
At this event, Samsung officially announced their new 840 Pro, which we were able to obtain early under NDA and therefore publish in concert with the announcement. The 840 Pro was largely an incremental improvement over their 830 Series. Newer, faster flash coupled with a higher clocked controller did well to improve on an already excellent product.
As the event closed, we were presented with the second model of the lineup - the 840. This model, sans the 'Pro' moniker, is meant more for general consumer usage. The first mass marketed SSD to use Triple Level Cell (TLC) flash, it sacrifices some write speed and long-term reliability in favor of what should become considerably lower cost/GB as production ramps up to full capacity. TLC flash is the next step beyond MLC, which is in turn a step after SLC. Here's a graphic to demonstrate:
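The capacity-versus-endurance tradeoff comes down to bits per cell: every extra bit doubles the number of charge levels the controller must program and sense, which is why TLC gains capacity per die at the cost of write speed and long-term reliability. A tiny illustrative sketch of the scaling (not tied to any particular drive):

```python
# Each additional bit per cell doubles the distinct charge levels the
# flash must reliably distinguish: SLC -> MLC -> TLC.
for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3)]:
    levels = 2 ** bits
    print(f"{name}: {bits} bit(s) per cell -> {levels} voltage levels to sense")
# SLC: 1 bit(s) per cell -> 2 voltage levels to sense
# MLC: 2 bit(s) per cell -> 4 voltage levels to sense
# TLC: 3 bit(s) per cell -> 8 voltage levels to sense
```

Squeezing eight levels into the same cell leaves far smaller voltage margins than two, hence the slower, more careful writes.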
Trinity Finally Comes to the Desktop
Trinity. Where to start? I find myself asking that question, as the road to this release is somewhat tortuous. Trinity, as a product code name, came around in early 2011. The first working silicon was shown that Summer. The first actual release of product was the mobile part in late Spring of this year. Throughout the summer notebook designs based on Trinity started to trickle out. Today we cover the release of the desktop versions of this product.
AMD has certainly had its ups and downs when it comes to APU releases. Their first real APU was Zacate, based on the new Bobcat CPU architecture. This product was an unmitigated success for AMD. Llano, on the other hand, had a pretty rocky start. Production and various supply issues caused it to be far less of a success than hoped. These issues were oddly enough not cleared up until late Spring of this year. By then mobile Trinity was out and people were looking towards the desktop version of the chip. AMD saw the situation, and the massive supply of Llano chips that it had, and decided to delay introduction of desktop Trinity until a later date.
To say that expectations for Trinity are high is an understatement. AMD has been on the ropes for quite a few years in terms of CPU performance. While the Phenom II series were at least competitive with the Core 2 Duo and Quad chips, they did not match up well against the latest i7/i5/i3 series of parts. Bulldozer was supposed to erase the processor advantage Intel had, but it came out of the oven as a seemingly half baked part. Piledriver was designed to succeed Bulldozer, and is supposed to shore up the architecture to make it more competitive. Piledriver is the basis of Trinity. Piledriver does sport significant improvements in clockspeed, power consumption, and IPC (instructions per clock). People are hopeful that Trinity would be able to match the performance of current Ivy Bridge processors from Intel, or at least get close.
So does it match Intel? In ways, I suppose. How much better is it than Bulldozer? That particular answer is actually a bit surprising. Is it really that much of a step above Llano? Yet another somewhat surprising answer for that particular question. Make no mistake, Trinity for desktop is a major launch for AMD, and their continued existence as a CPU manufacturer depends heavily on this part.
PhysX Settings Comparison
Borderlands 2 is a hell of a game; we actually ran a 4+ hour live event on launch day to celebrate its release and played it after our podcast that week as well. When big PC releases occur we usually like to take a look at performance of the game on a few graphics cards as well to see how NVIDIA and AMD cards stack up. Interestingly, for this title, PhysX technology was brought up again and NVIDIA was widely pushing it as a great example of implementation of the GPU-accelerated physics engine.
What you may find unique in Borderlands 2 is that the game actually allows you to enable PhysX features at Low, Medium, and High settings with either NVIDIA or AMD Radeon graphics cards installed in your system. In past titles, like Batman: Arkham City and Mafia II, PhysX could only be enabled (or at least set to higher settings) if you had an NVIDIA card. Many gamers that used AMD cards saw this as a slight, and we tended to agree. But since we could enable it with a Radeon card installed, we were curious to see what the results would be.
Of course, don't expect the PhysX effects to be able to utilize the Radeon GPU for acceleration...
Borderlands 2 PhysX Settings Comparison
The first thing we wanted to learn was just how much difference you would see by moving from Low (the lowest setting, there is no "off") to Medium and then to High. The effects were identical on both AMD and NVIDIA cards and we made a short video here to demonstrate the changes in settings.
Or: the countdown to a fresh Start.
Over time – and not necessarily much of it – usage of a platform can become a marriage. I trusted Windows, née MS-DOS, with guardianship over all of my precious applications which depend upon it. Chances are you too have trusted Microsoft or a similar proprietary platform holder to provide a household for your content.
It is time for a custody hearing.
These are the reasons why I still use Windows – and who could profit as home wreckers.
1st Reason – Games
The most obvious leading topic.
Computer games have been dominated by Windows for quite some time now. When you find a PC game at retail or online you will find either a Windows trademark or the occasional half-eaten fruit somewhere on the page or packaging.
One of the leading reasons for the success of the PC platform is the culture of backwards compatibility. Though the platform has been rumored dead ad infinitum, it still exists – surrounded by a wasteland of old, deprecated consoles. I still play games from past decades on their original platform.
Ahead of the release of Windows 8 and the onslaught of Windows 8-based tablets that will hit the market next month, Intel is taking the cover off the processor that many of these new devices will be powered by: the Intel Atom Z2760, previously known by the codename Clover Trail. Intel is claiming that the Atom Z2760 is the beginning of a completely new Atom direction, now a complete SoC (system-on-a-chip) design that lowers power requirements, extends battery life, and allows Intel's x86 architecture to find its way into smaller and more portable devices.
At its heart, Clover Trail is based on the same Saltwell CPU core design found in the Medfield processor powering a handful of smartphones in Europe. That means the Atom lineup remains an in-order architecture with a dual-issue command structure - nothing incredibly revolutionary there.
Unlike Medfield though, the Atom Z2760 is a dual-core design that still enables HyperThreading for four-threaded operating system integration. The cores will run at 1.8 GHz and it includes 1MB of L2 cache divided between the two cores evenly. Memory is connected through a dual-channel 32-bit bus to low power DDR2 memory running at 800 MHz and capacities up to 2GB.
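From those figures we can do a back-of-the-envelope calculation of the platform's theoretical peak memory bandwidth (a rough sketch of the arithmetic; real-world throughput will be lower):

```python
# Theoretical peak bandwidth for the memory setup described above:
# two 32-bit channels of low-power DDR2 at 800 MT/s.
channels = 2
bus_width_bits = 32
transfers_per_sec_millions = 800  # 800 MT/s effective data rate

bytes_per_transfer = bus_width_bits // 8  # 4 bytes per channel per transfer
peak_mb_s = channels * bytes_per_transfer * transfers_per_sec_millions
print(f"Theoretical peak: {peak_mb_s} MB/s ({peak_mb_s / 1000:.1f} GB/s)")  # 6400 MB/s
```

That 6.4 GB/s ceiling is modest next to desktop parts, but it fits the low-power tablet role Intel is targeting here.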
Trinity's GPU Performance
Editor's Note: Right before the release of this story, some discussion was ongoing at other hardware sites about the methods AMD employed with this NDA and release of information. Essentially, AMD allowed us to write about only the gaming benchmarks and specifications for the Trinity APU, rather than allowing the full gamut of results including CPU tests, power consumption, etc. Why? Obviously AMD wants to see a good message released about their product; by releasing info in stages they can at least allow a brief window for that.
Does it suck that they did this? Yes. Do I feel like we should have NOT published this because of those circumstances? Not at all. Information is information and we felt that getting it to you as soon as possible was beneficial. Also, because the parts are not on sale today we are not risking adversely affecting your purchasing decision with these limited benchmarks. When the parts DO go on sale, you will have our full review with all the positives and negatives laid out before you, in the open.
This kind of stuff happens often in our world - NVIDIA sent out GTX 660 cards but not GTX 650s because of lackluster performance, for example - and we balance it and judge it on a case-by-case basis. I don't think anyone looking at this story sees a "full review" and would think to make a final decision about ANY product from it. That's not the goal. But just as we sometimes show you rumored specs and performance numbers on upcoming parts before the NDAs expire, we did this today with Trinity - it just so happens it was with AMD's blessing.
AMD has graciously allowed us the chance to give readers a small glimpse at the performance of the upcoming A series APUs based on the Trinity processor. Today we are covering the SKUs that will be released, general gaming performance, and what kind of power consumption we are seeing as compared to the previous Llano processor and any Intel processor we can lay hands upon.
Trinity is based on the updated Piledriver architecture, which is an update to Bulldozer. Piledriver improves upon IPC by a small amount over Bulldozer, but the biggest impact is that of power consumption and higher clockspeeds. It was pretty well known that Bulldozer did not hit the performance expectations of both AMD and consumers. Part of this was due to the design pulling more power at the target clockspeeds than was expected. To remedy this, AMD lowered clockspeeds. Piledriver fixes most of those power issues, as well as sprinkles some extra efficiency into the design, so that clockspeeds can scale to speeds that will make these products more competitive with current Intel offerings.
The top end model that AMD will be offering of the socket FM2 processors (for the time being) is the A10 5800K. This little number is a dual module/quad core processor running at 3.8 GHz with a turbo speed of 4.2 GHz. We see below the exact model range of products that AMD will be offering. This does not include the rumored Athlon II editions that will have a disabled GPU onboard. Each module features 2 MB of L2 cache, for a total of 4 MB on the processor. The A10 series does not feature a dedicated L3 cache as the FX processors do. This particular part is unlocked as well, so expect some decent overclocking right off the bat.
The A10-5800K features a VLIW4-based graphics portion, which is significantly more efficient than the previous VLIW5-based unit in Llano (the A8-3870K and its brethren). Even with a similar stream processor count to the 3870K, AMD is confident that this particular unit is upwards of 20% faster than the previous model. The GPU portion runs at a brisk 800 MHz. The GPU core is also unlocked, so expect some significant leaps in that piece of the puzzle as well.
That is about all I can give out at this time, since this is primarily based on what we see in the diagram and what we have learned from the previous Trinity release (for notebooks).
Introduction and Features
EVGA might not be the first name that comes to mind when looking for a high-end power supply, but they are about to change that with the introduction of the SuperNOVA NEX1500 Classified 1500W power supply. Not only is the SuperNOVA NEX1500 Classified the highest-capacity power supply we have reviewed to date, it also comes bundled with EVGA’s SuperNOVA software, which allows you to monitor all of the power supply’s functions in real time from your desktop.
EVGA was founded in 1999 and is headquartered in Brea, California. They currently specialize in producing NVIDIA-based graphics adapters and Intel-based motherboards, and they are now expanding their product line to include enthusiast-grade power supplies, starting with the NEX1500. EVGA plans to follow with 750W and 650W models.
The EVGA SuperNOVA NEX1500 Classified power supply can deliver up to 1500W of combined load while operating on 120VAC mains and can be “overclocked” to 1650W when operated on 240VAC mains. It used to be that ~1200W of DC output was about as high as you could go on 120VAC mains, but as overall PSU efficiency has increased, higher outputs are now possible. The NEX1500 supports either single or multiple +12V rail modes (DIP switch selectable) and can deliver up to 124A on the +12V rail (133A in OC mode). WOW – you can weld ½” steel plate with 120A!! (OK, I might have trouble keeping the arc stable at 12V, but that is still some serious current.) This bad boy even comes with a handle and is backed by a 10-year warranty. And just to tickle your interest, here are a few stats you might be wondering about:
• 1500W Continuous power output @50°C (1650W in OC mode)
• 124A +12V rail (133A in OC mode)
• SuperNOVA control and monitoring software included (USB interface)
• (19) PCI-E connectors and (2) EPS12V connectors
• OEM is Etasis Electronics Corp. (well-known in the server industry)
• MSRP $449.99 USD (available now)
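To put those headline numbers in perspective, here is a quick back-of-the-envelope sanity check. The rail voltage, current, and wattage figures come from the spec list above; the ~90% efficiency figure is an assumption based on the 80PLUS Gold rating, not a measured value.

```python
# Sanity-check the NEX1500's headline numbers.
# Spec values are from the list above; efficiency is an ASSUMPTION
# (~90% at typical load, per the 80PLUS Gold rating).

RAIL_VOLTAGE = 12.0       # volts
RAIL_CURRENT = 124.0      # amps on the +12V rail (133 A in OC mode)
TOTAL_DC_OUTPUT = 1500.0  # watts, continuous
EFFICIENCY = 0.90         # assumed efficiency at typical load

rail_power = RAIL_VOLTAGE * RAIL_CURRENT    # power available from +12V alone
ac_draw = TOTAL_DC_OUTPUT / EFFICIENCY      # approximate draw from the wall
circuit_limit = 120.0 * 15.0                # a standard US 15 A / 120 V circuit

print(f"+12V rail alone: {rail_power:.0f} W")
print(f"Estimated AC draw at full load: {ac_draw:.0f} W")
print(f"Headroom on a 15 A / 120 V circuit: {circuit_limit - ac_draw:.0f} W")
```

The +12V rail alone accounts for nearly 1500W, and at full load the unit would pull roughly 1670W from the wall, uncomfortably close to the 1800W ceiling of a standard US 15A circuit. That is why the 1650W OC mode is only available on 240VAC mains.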
Here is what EVGA has to say about the new SuperNOVA NEX1500 Classified PSU:
“The EVGA NEX1500 Classified is the ultimate enthusiast power supply. Designed to support the toughest hardware, the EVGA NEX1500 Classified supports ground-breaking new features like SuperNOVA enthusiast software control, Overclock Mode that increases the maximum power output up to 1650W, and fully modular, individually sleeved cables.
You can also count on EVGA to provide the utmost reliability and performance, with 100% Japanese capacitors and a durable ball bearing fan. The NEX1500 Classified is designed from start to finish to be the best choice for today’s most demanding high-end computers. Get to the next level with the EVGA NEX1500 Classified Power Supply!"
EVGA SuperNOVA NEX1500 Classified 1500W PSU Key Features:
• SuperNOVA, exclusive power supply control and monitoring software
• Control and adjust +12V voltage for maximum overclocking potential
• Switch between single or multiple +12V rails for ultimate control
• Overclock Mode allows PSU to deliver up to 1650W with 230VAC input
• Unbeatable 10-Year Warranty and unparalleled EVGA Customer Support
• 80PLUS Gold certified, with up to 90% efficiency under typical loads
• Highest quality Japanese brand capacitors ensure long-term reliability
• Individually sleeved cables for outstanding looks and cable management
• Fully modular to reduce clutter and improve airflow
• NVIDIA SLI Certified
• Sanyo Denki ball bearing fan for exceptional reliability and quiet operation
• Universal AC input (100-240V) with Active PFC
• Heavy-duty Protections: OVP, UVP, OCP, OPP, SCP and OTP
• Dimensions: 150mm (W) x 86mm (H) x 200mm (L)
Introduction, Specifications and Packaging
Samsung has been at this SSD thing for quite some time now. The first SSD I bought was in fact a Samsung unit meant for an ultraportable laptop. Getting it into my desktop was a hack and a half, involving a ZIF to IDE adapter, which then passed through yet another adapter to convert to SATA. The drive was wicked fast at the time, and while it handily slaughtered my RAID-0 pair of 74GB VelociRaptors in random reads, any writes caused serious stuttering of the drive, and therefore the entire OS. I was clearly using the drive outside of its intended use, but hey, I was an early adopter.
Several SSDs later came the Intel X25-M. It was a great drive, but in its earliest form it was not without fault. Luckily, those kinks were worked out industry-wide, and everyone quickly accelerated their firmware optimizations to better handle random writes. Samsung took a few generations to get this under control; the first of their drives to truly get over the hump was the 830 Series, which launched earlier this year. It utilized a triple-core ARM9 CPU that could effectively brute-force heavy random write workloads. That also significantly increased the speed and nimbleness of the 830 across the board, which, combined with Samsung's excellent reliability record, quickly made it my most recommended series as of late.
...and now we have the 840 Series, which launched today. Well, technically it launched yesterday if you're reading from the USA. Here in Korea the launch started at 10 AM and spanned a day of product press briefings leading to the product NDA expiration at 8 PM Korea time. This review will focus on the 512GB capacity of the 840 Pro model. We will follow on with the 840 (non-pro) at a later date:
Read on for the full review!