Author:
Subject: Storage, Mobile
Manufacturer: Patriot

Wireless storage for PC, Mac, iOS and Android

Today we are taking a look at the new Patriot Gauntlet 320 external USB 3.0 and wireless hard drive, available starting at $149 at Newegg.com.

Gauntlet_320_Pkg_Left.jpg

The premise is quite simple: take a portable hard drive with USB 3.0 support, add the ability to share the unit wirelessly with up to 8 different machines, and power it with a lithium-ion battery.  Not only does the Gauntlet show up on your network as a mountable drive in Windows and Mac OS, it also supports free iOS and Android applications for sharing and streaming media.

Gauntlet_320-Left.png

There are some limitations you might want to consider, including the inability to access other network-based devices when using the Gauntlet's pass-through Internet capability.  Also, data transfer performance over the Gauntlet's wireless connection seemed pretty low, even with 802.11n support.

Potential use cases for the Gauntlet include any time you need a shared data source, such as group projects for school or the office, on-the-go storage for devices like Ultrabooks with smaller hard drives, and large media collections you want to access from smartphones and tablets.

Check out our full video review below!

Note that in the video, our early sample of the Gauntlet 320 has the "node" label on it; the Gauntlet Node is a separate device that is only a DIY enclosure WITHOUT an included hard drive.  Originally there was a sticker covering the "node" label, but we mistakenly removed it before filming.  Just a heads up!

Manufacturer: PC Perspective

And Why the Industry Misses the Point

3d_01_title2.png

I am going to take a somewhat unpopular stance: I really like stereoscopic 3D. I also expect to change your mind and get you excited about stereoscopic 3D too - unless of course a circumstance such as monovision interferes with your ability to see 3D at all. I expect to succeed where the industry has failed simply because I will not ignore the benefits of 3D in my explanation.

Firstly - we see a crisp image when our brain is more clearly able to make out objects in a scene.

We typically have two major methods of increasing the crispness of an image: we either increase the resolution or we increase the contrast of the picture. As resolution increases we receive a finer grid of positional information to place and contain the objects in the scene. As contrast increases we receive a wider difference between the brightest points and the darkest points from a scene which prevents objects from blending together in a mess of grey.

We are also able to experience depth by comparing the parallax effect across both of our eyes. We can encapsulate each object into a 3D volume and position each capsule at a more clearly defined distance. Encapsulated objects appear crisper because we can more clearly see them as sharply defined, independent objects.
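To put a rough number on that parallax effect (a back-of-the-envelope illustration on my part, not something pulled from any game or driver), the angular disparity between two objects separated in depth falls off with the square of the viewing distance:

```latex
% Angular binocular disparity between two points separated in depth
%   b        : interocular baseline (roughly 65 mm for adult eyes)
%   Z        : distance to the nearer object
%   \Delta Z : depth separation between the two objects
\delta \approx \frac{b \, \Delta Z}{Z^{2}}
% Example: b = 0.065 m, Z = 2 m, \Delta Z = 0.1 m
% gives \delta \approx 0.0016 rad, or roughly 5.6 arcminutes.
```

Nearby objects therefore produce much larger disparities than distant ones, which is why the foreground of a stereoscopic scene "pops" while the background stays comparatively flat.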

Be careful with this stereoscopic 3D image. To see the 3D effect you must slowly cross your eyes until the two images align in the center. This should only be attempted by adults with fully developed eyes and without prior medical conditions. Also, sit a comfortable distance away so you do not need to cross your eyes too far inward, and rest your eyes until they no longer feel strained. In short - do not pull an eye muscle or something. Use common sense. Also, move your mouse cursor far away from the image, as it will break your focusing lock, and click on the image to view it at full size.

3d_03.png

Again, be careful when crossing your eyes to see stereoscopic 3D and relax them when you are done.

The above image is a scene from Unreal Tournament 3 laid out in a cross-eyed 3D format. If you are safely able to experience the 3D image then I would like you to pay careful attention to how crisp the 3D image appeared. Compare this level of crispness to either the left or right eye image by itself.

Which has the crisper picture quality?

That is basically why 3D is awesome: it makes your picture quality appear substantially better by giving your brain more information about the object. This effect can also play with how the brain perceives the world you present it: similar to how HDR tonal mapping plays with exposure ranges we cannot see and infrared photography plays with colors we cannot see to modify the photograph - which we can see - for surreal effects.

So what goes terribly wrong? Read on to find out.

Subject: Storage

Introduction and Internals

Introduction:

The Western Digital RAID Edition line of hard drives has been around for some time now, and it has largely impressed us with each subsequent release. Since the launches of the RE4-GP and, later, the faster-spinning RE4, WD's enterprise line had been capped at the 2TB mark. Now that has changed with the introduction of a new line, simply named the RE Series:

121010-163545-4.8.jpg

Yup, that's right: 4 terabytes! With the Green and Red series capped at 3TB, this new RE is the largest-capacity drive available from Western Digital. The catch is that, since it's tailored and built for enterprise use, it comes at a rather hefty price premium.

Read on for the full review!

Manufacturer: Seasonic

Introduction and Features

2-X_1050-1250_Banner.jpg

Seasonic has a well-earned reputation for producing some of the best PC power supplies on the planet. Over the years, Seasonic has been the OEM (Original Equipment Manufacturer) of choice for companies like Corsair, PC Power & Cooling, and XFX, to name just a few. But Seasonic also markets power supplies under its own brand name. The new X-Series 1050W and 1250W are Seasonic's newest and most powerful PSUs to date. Both power supplies are based on Seasonic's X-Series line, which has brought several major advancements to the standard PC power supply platform since its introduction three years ago.

• Proprietary circuit design delivers High efficiency (80Plus Gold or Platinum certified)
• Full modular DC Connector Module features integrated VRMs (3.3V and 5V)
• Hybrid Silent Fan Control (3 modes of operation: Fanless, Silent and Cooling)
• High-quality Sanyo Denki SanAce120 dual ball bearing fan with PWM
• High-reliability 105°C grade A capacitors and solid polymer capacitors

3-Side-Nameplate.jpg

Here is what Seasonic has to say about their new X-Series Gold 1050W and 1250W power supplies:

"The X-1050 and X-1250 are the newest additions to one of our most successful retail lines currently available. Now the X-Series will extend from 400 and 460 watt fanless and then 560, 660, 760, 850 watt and now 1050 and 1250 watts for top end systems; a total of eight X-Series models in all.

80Plus Gold: The X-1050W and X-1250W PSUs are certified in accordance to the 80PLUS organization's high standards, offering the newest technology and innovation for performance and energy savings with up to 90% efficiency and a true power factor of greater than 0.9 PF.

Full Modular Design (DC to DC): Common to all X-Series power supplies, the new X-1050 and X-1250 feature the unique integrated DC connector panel with onboard VRM (Voltage Regulator Module) that enables not only near perfect DC-to-DC conversion with reduction of current loss/impedance and increase of efficiency but also fully modular DC cabling that enables maximum flexibility of integration and forward compatibility.

Seasonic Hybrid Silent Fan Control: An industry first, advanced 3-phase (Fanless, Silent and Cooling Mode) thermal control balances between silence and cooling. In addition, a selector switch is provided to allow you to select between Seasonic S2FC fan control, without fanless mode, or S3FC fan control, with fanless mode.

Sanyo-Denki San Ace Silent Fan: The world-renowned Sanyo Denki ball bearing fans are made of the highest quality components to ensure maximum quality and performance. The use of spoon-shaped high-density plastic fan blades with smoothed leading edges, strict-tolerance ball bearings and a precision copper axle are just some of the features that ensure ultra-low noise performance and quality."
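To put the quoted efficiency and power factor figures into perspective, here is a quick back-of-the-envelope calculation (my own illustration, simply plugging in the quoted 90% efficiency and 0.9 power factor at the X-1250's full rated load):

```python
# Rough input-side numbers for an X-1250 at full load, using the quoted figures.
dc_output_w = 1250      # rated DC output of the X-1250
efficiency = 0.90       # "up to 90% efficiency" (80 Plus Gold)
power_factor = 0.90     # "a true power factor of greater than 0.9"

real_input_w = dc_output_w / efficiency            # watts pulled from the wall
apparent_input_va = real_input_w / power_factor    # volt-amps the circuit/UPS must handle

print(f"Real input power:     {real_input_w:.0f} W")        # ~1389 W
print(f"Apparent input power: {apparent_input_va:.0f} VA")  # ~1543 VA
```

In other words, an X-1250 running flat out would draw roughly 1.4kW from the wall - worth keeping in mind when sizing a UPS or sharing a household circuit.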

Please continue reading our Seasonic X-1250 PSU review!

Author:
Manufacturer: NVIDIA

Another GK106 Completes the Stack

It has been an interesting year for graphics cards and 2012 still has another solid quarter of releases ahead of it.  With the launch of AMD's 7000-series back in January, followed by the start of NVIDIA's Kepler lineup in March, we have had new graphics cards on a very regular basis ever since.  And while AMD's Radeon HD 7000 cards seemed to be bunched together a bit better, NVIDIA has staggered the release of the various Kepler cards, either because of capacity at the manufacturing facilities or due to product marketing plans - take your pick. 

Today we see the completion of the NVIDIA GeForce GTX 600 stack (if you believe the PR at NVIDIA) with the release of the GeForce GTX 650 Ti, a $150 graphics card that fills the gap between the somewhat anemic GTX 650 and GT 640 cards and the most recently unveiled card, the GTX 660 2GB, which currently sells for $229. 

01.jpg

The GTX 650 Ti has more in common with the GTX 660 than with the GTX 650, both being based on the GK106 GPU, but it is missing some of the unique features NVIDIA has touted for the 600-series cards, like GPU Boost.  Let's dive into the product and see if this new card will be the best option for those of you with $150 graphics budgets.

Continue reading our review of the NVIDIA GeForce GTX 650 Ti 1GB!!!

Manufacturer: PC Perspective

Thoughts about Interface Design in General

I have been in several situations where a variety of people claim the gamepad is superior for gaming because that is what it was designed for. No elaboration or further justification is given. The controller is designed for gaming and is therefore clearly better. End of – despite often being start to – discussion in their minds.

Really it is a compromise between the needs of popular games and the environment of a couch.

Interface design is complicated. When you design an interface you need to consider: the expected types of applications; the environment of the user; what you are permitted to use; what tolerances are allowed; what your audience is used to; and so on, so forth. There is a lot to consider when you design an application for a user and I could make an educated guess that it is at least as hard to design the input device itself.

The history of keyboard design is a great example of tradeoffs in input devices.

Sometimes it is better to be worse...

Dvorak.svg_.png

The first wave of keyboards were interfaces to the mechanical typewriter. These keyboards were laid out in alphabetical order because, as long as each key was accessible and the user could find the letter they wanted – who cares, right? We already have an order for the alphabet that people understand, so users should not have too much difficulty finding the letter they need.

Another constraint quickly came to light: typists were too fast and the machines jammed.

The engineers now needed an input method that the machine could keep up with. Correcting the machine itself was somewhat futile, so the solution was to make the typist as slow as possible. The most common letters in the English language were spread all over the place and – while possibly by fluke – the left hand was favored, as in made to do more work, over the often dominant right hand.

Solving that problem required making the most aggravating keyboard layout the engineers could imagine. QWERTY was born.

What has been designed to threaten QWERTY? Read on to find out.

Author:
Subject: Motherboards
Manufacturer: ASUS

Introduction and Technical Specifications

Introduction

image1.jpg

Courtesy of ASUS

It's been a couple of months since we've had a chance to evaluate a Z77-based motherboard, so we are taking this opportunity to throw ASUS's P8Z77-V Deluxe on our test bench and put it through our comprehensive real-world and synthetic benchmarks. This $279 board has been available for several months and supports the LGA 1155 platform, which includes Sandy Bridge and Ivy Bridge processors.

 

image2.jpg

Courtesy of ASUS

There are many features to drool over on the ASUS P8Z77-V Deluxe, but my favorites include the board's unique power management features, Wi-Fi functionality with remote access, and customized UEFI BIOS.  This board also includes other enhancements that focus on faster USB 3.0 and PCIe 3.0 integration as well as extra SATA 6Gb/s ports, which provide double the bandwidth of SATA 3Gb/s.
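For context on that bandwidth claim, here is a rough sketch of the theoretical ceilings (my own numbers; the 8b/10b encoding overhead is a property of the SATA spec, not something ASUS quotes):

```python
# Theoretical throughput ceilings for SATA 3Gb/s vs. 6Gb/s links.
# SATA uses 8b/10b encoding, so only 8 of every 10 line bits carry data.

def sata_max_mb_per_s(line_rate_gbps: float) -> float:
    data_bits_per_s = line_rate_gbps * 1e9 * 8 / 10  # strip encoding overhead
    return data_bits_per_s / 8 / 1e6                 # convert bits to megabytes

print(f"SATA 6Gb/s: ~{sata_max_mb_per_s(6.0):.0f} MB/s")  # ~600 MB/s
print(f"SATA 3Gb/s: ~{sata_max_mb_per_s(3.0):.0f} MB/s")  # ~300 MB/s
```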

 

Continue reading more about our review of the ASUS P8Z77-V Deluxe!

Subject: Storage
Tagged: tlc, ssd, Samsung, pro, mdx, 840

Introduction, Specifications and Packaging

Introduction

Last week, Samsung flew me and a few of my peers in the storage review community out to Seoul, Korea. The event was the 2012 Samsung SSD Global Summit:

120924-104020-4.28.jpg

At this event, Samsung officially announced their new 840 Pro, which we were able to obtain early under NDA and therefore publish in concert with the announcement. The 840 Pro was largely an incremental improvement over their 830 Series. Newer, faster flash coupled with a higher-clocked controller did well to improve on an already excellent product.

121002-220858-6.35.jpg

As the event closed, we were presented with the second model of the lineup - the 840. This model, sans the 'Pro' moniker, is meant more for general consumer usage. The first mass-marketed SSD to use Triple Level Cell (TLC) flash, it sacrifices some write speed and long-term reliability in favor of what should become considerably lower cost/GB as production ramps up to full capacity. TLC flash is the next step beyond MLC, which is in turn a step beyond SLC. Here's a graphic to demonstrate:

slc-mlc-tlc-glass.jpg
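The short version of the graphic is how many bits each NAND cell stores; here is a quick sketch of the tradeoff (general flash background on my part, not Samsung-specific figures):

```python
# Bits stored per NAND cell vs. the number of charge states the controller
# must reliably distinguish (general flash background, not Samsung figures).
flash_types = {"SLC": 1, "MLC": 2, "TLC": 3}

for name, bits_per_cell in flash_types.items():
    charge_states = 2 ** bits_per_cell
    print(f"{name}: {bits_per_cell} bit(s)/cell, {charge_states} charge states, "
          f"{bits_per_cell}x the data per cell of SLC")
```

Packing more states into each cell is what drives cost/GB down, but the tighter voltage margins are also where the write-speed and endurance sacrifices come from.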

Read on for the full review!

Author:
Subject: Processors
Manufacturer: AMD

Trinity Finally Comes to the Desktop

Trinity.  Where to start?  I find myself asking that question, as the road to this release is somewhat tortuous.  Trinity, as a product code name, came around in early 2011.  The first working silicon was shown that Summer.  The first actual release of product was the mobile part in late Spring of this year.  Throughout the summer notebook designs based on Trinity started to trickle out.  Today we cover the release of the desktop versions of this product.

trin_01.jpg

AMD has certainly had its ups and downs when it comes to APU releases.  Their first real APU was Zacate, based on the new Bobcat CPU architecture.  This product was an unmitigated success for AMD.  Llano, on the other hand, had a pretty rocky start.  Production and various supply issues caused it to be far less of a success than hoped.  These issues were oddly enough not cleared up until late Spring of this year.  By then mobile Trinity was out and people were looking towards the desktop version of the chip.  AMD saw the situation, and the massive supply of Llano chips that it had, and decided to delay introduction of desktop Trinity until a later date.

To say that expectations for Trinity are high is an understatement.  AMD has been on the ropes for quite a few years in terms of CPU performance.  While the Phenom II series was at least competitive with the Core 2 Duo and Quad chips, it did not match up well against the latest i7/i5/i3 parts.  Bulldozer was supposed to erase the processor advantage Intel had, but it came out of the oven as a seemingly half-baked part.  Piledriver was designed to succeed Bulldozer and is supposed to shore up the architecture to make it more competitive.  Piledriver is the basis of Trinity, and it sports significant improvements in clockspeed, power consumption, and IPC (instructions per clock).  People are hopeful that Trinity will be able to match the performance of current Ivy Bridge processors from Intel, or at least get close.

So does it match Intel?  In some ways, I suppose.  How much better is it than Bulldozer?  That particular answer is actually a bit surprising.  Is it really that much of a step above Llano?  Yet another somewhat surprising answer to that particular question.  Make no mistake, Trinity for the desktop is a major launch for AMD, and their continued existence as a CPU manufacturer depends heavily on this part.

Continue reading our review of the AMD Trinity A10 APUs!!

Author:
Manufacturer: Various

PhysX Settings Comparison

Borderlands 2 is a hell of a game; we actually ran a 4+ hour live event on launch day to celebrate its release and played it after our podcast that week as well.  When big PC releases occur we usually like to take a look at the game's performance on a few graphics cards to see how NVIDIA and AMD stack up.  Interestingly, PhysX technology came up again for this title, with NVIDIA widely pushing the game as a great example of its GPU-accelerated physics engine in action.

What you may find unique in Borderlands 2 is that the game actually allows you to enable PhysX features at Low, Medium and High settings with either NVIDIA or AMD Radeon graphics cards installed in your system.  In past titles, like Batman: Arkham City and Mafia II, PhysX could only be enabled (or at least set to the higher settings) if you had an NVIDIA card.  Many gamers with AMD cards saw this as a slight, and we tended to agree.  But since we could enable it with a Radeon card installed, we were curious to see what the results would be.

screenshot-16.jpg

Of course, don't expect the PhysX effects to be able to utilize the Radeon GPU for acceleration...

Borderlands 2 PhysX Settings Comparison

The first thing we wanted to learn was just how much difference you would see by moving from Low (the lowest setting, there is no "off") to Medium and then to High.  The effects were identical on both AMD and NVIDIA cards and we made a short video here to demonstrate the changes in settings.

Continue reading our article that compares PhysX settings on AMD and NVIDIA GPUs!!

Manufacturer: PC Perspective
Tagged: windows 8, linux, bsd

Or: the countdown to a fresh Start.

Over time – and not necessarily much of it – usage of a platform can become a marriage. I trusted Windows, née MS-DOS, with guardianship over all of my precious applications, which depend upon it. Chances are you too have trusted Microsoft or a similar proprietary platform holder to provide a household for your content.

It is time for a custody hearing.

These are the reasons why I still use Windows – and who could profit by playing home wrecker.

Windows8TheEnd.png
Windows 8 -- keep your rings. You are not ready for commitment.

1st Reason – Games

Win8_End_Steam.png

The most obvious leading topic.

Computer games have been dominated by Windows for quite some time now. When you find a PC game at retail or online you will find either a Windows trademark or the occasional half-eaten fruit somewhere on the page or packaging.

One of the leading reasons for the success of the PC platform is the culture of backwards compatibility. Though the platform has been rumored dead ad infinitum, it still exists – surrounded by a wasteland of old, deprecated consoles. I still play games from past decades on their original platform.

Check in after the break to find out why I still use Windows.

Author:
Subject: Processors, Mobile
Manufacturer: Intel

Hardware Specifications

Ahead of the release of Windows 8 and the onslaught of Windows 8-based tablets that will hit the market next month, Intel is taking the cover off the processor that many of these new devices will be powered by: the Intel Atom Z2760, previously known by the codename Clover Trail.  Intel is claiming that the Atom Z2760 is the beginning of a completely new Atom direction, now a complete SoC (system-on-a-chip) design that lowers power requirements, extends battery life and allows Intel's x86 architecture to find its way into smaller and more portable devices. 

atom_b_rgb_3000.png

At its heart, Clover Trail is based on the same Saltwell CPU core design found in the Medfield processor powering a handful of smartphones over in Europe.  That means the Atom lineup remains an in-order architecture with a dual-issue command structure - nothing incredibly revolutionary there. 

die_diagram.jpg

Unlike Medfield though, the Atom Z2760 is a dual-core design that still enables HyperThreading, presenting four threads to the operating system.  The cores run at 1.8 GHz, and the chip includes 1MB of L2 cache divided evenly between the two cores.  Memory is connected through a dual-channel 32-bit bus to low-power DDR2 (LPDDR2) running at 800 MHz, in capacities up to 2GB. 
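That memory description works out to a fairly modest peak bandwidth; here is the quick math (my own calculation, assuming the quoted 800 MHz refers to the effective LPDDR2 data rate):

```python
# Peak theoretical memory bandwidth for the Atom Z2760's dual-channel,
# 32-bit LPDDR2-800 configuration (assumes 800 MT/s effective data rate).
channels = 2
bus_width_bytes = 32 / 8      # 32-bit channel = 4 bytes per transfer
transfers_per_s = 800e6       # 800 million transfers per second

peak_bandwidth = channels * bus_width_bytes * transfers_per_s
print(f"{peak_bandwidth / 1e9:.1f} GB/s peak")  # ~6.4 GB/s
```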

Continue reading our information on Intel's Clover Trail Atom Z2760 SoC!!

Author:
Subject: Processors
Manufacturer: AMD

Trinity's GPU Performance

Editor's Note: Right before the release of this story, some discussion has been ongoing at other hardware sites about the methods AMD employed with this NDA and release of information.  Essentially, AMD allowed us to write about only the gaming benchmarks and specifications for the Trinity APU, rather than allowing the full gamut of results including CPU tests, power consumption, etc.  Why?  Obviously AMD wants to see a good message released about their product; by releasing info in stages they can at least allow a brief window for that.  

Does it suck that they did this?  Yes.  Do I feel like we should NOT have published this because of those circumstances?  Not at all.  Information is information, and we felt that getting it to you as soon as possible was beneficial.  Also, because the parts are not on sale today, we are not risking adversely affecting your purchasing decision with these limited benchmarks.  When the parts DO go on sale, you will have our full review with all the positives and negatives laid out before you, in the open.  

This kind of stuff happens often in our world - NVIDIA sent out GTX 660 cards but not GTX 650s because of lackluster performance, for example - and we balance it and judge it on a case-by-case basis.  I don't think anyone looking at this story sees a "full review" and would think to make a final decision about ANY product from it.  That's not the goal.  But just as we sometimes show you rumored specs and performance numbers on upcoming parts before the NDAs expire, we did this today with Trinity - it just so happens it was with AMD's blessing.  

AMD has graciously allowed us the chance to give readers a small glimpse at the performance of the upcoming A series APUs based on the Trinity processor.  Today we are covering the SKUs that will be released, general gaming performance, and what kind of power consumption we are seeing as compared to the previous Llano processor and any Intel processor we can lay hands upon.

Trinity is based on the updated Piledriver architecture, which is an update to Bulldozer.  Piledriver improves IPC by a small amount over Bulldozer, but the biggest gains are in power consumption and achievable clockspeeds.  It was pretty well known that Bulldozer did not hit the performance expectations of either AMD or consumers.  Part of this was due to the design pulling more power at the target clockspeeds than expected, and to remedy this, AMD lowered clockspeeds.  Piledriver fixes most of those power issues and sprinkles some extra efficiency into the design, so clockspeeds can scale high enough to make these products more competitive with current Intel offerings.

 

The Lineup

The top-end socket FM2 processor that AMD will be offering (for the time being) is the A10-5800K.  This little number is a dual-module/quad-core processor running at 3.8 GHz with a turbo speed of 4.2 GHz.  Below we see the exact model range of products that AMD will be offering; this does not include the rumored Athlon II editions that will have the onboard GPU disabled.  Each module features 2 MB of L2 cache, for a total of 4 MB on the processor.  The A10 series does not feature a dedicated L3 cache as the FX processors do.  This particular part is unlocked as well, so expect some decent overclocking right off the bat.

 

trin_line.jpg

The A10-5800K features a VLIW4-based graphics portion, which is significantly more efficient than the previous VLIW5-based unit in Llano (the A8-3870K and its brethren).  Even though it features the same number of stream processors as the 3870K, AMD is confident that this particular unit is upwards of 20% faster than the previous model.  The GPU runs at a brisk 800 MHz, and its clock is also unlocked, so expect some significant leaps in that piece of the puzzle as well.

trin_perf.jpg

That is about all I can give out at this time, since this is primarily based on what we see in the diagram and what we have learned from the previous Trinity release (for notebooks).

Click to read the entire post here.

Manufacturer: EVGA

Introduction and Features

EVGA might not be the first name that comes to mind when looking for a high-end power supply, but they are about to change that with the introduction of the SuperNOVA NEX1500 Classified 1500W power supply. Not only is the SuperNOVA NEX1500 Classified the highest-capacity power supply we have reviewed to date, it also comes bundled with EVGA’s SuperNOVA software, which allows you to monitor all of the power supply’s functions in real time from your desktop.

EVGA was founded in 1999 and is headquartered in Brea, California. They currently specialize in producing NVIDIA-based graphics adapters and Intel-based motherboards, and they are now expanding their product line to include enthusiast-grade power supplies, starting with the NEX1500. EVGA plans to follow with 750W and 650W models.

2-SuperNOVA-Banner.jpg

The EVGA SuperNOVA NEX1500 Classified power supply can deliver up to 1500W combined load while operating on 120VAC mains and can be “overclocked” to 1650W if operated on 240VAC mains.  It used to be that ~1200W of DC output was about as high as you could go on 120VAC mains, but as overall PSU efficiency has increased, higher outputs are now possible. The NEX1500 supports either single or multiple +12V rail modes (DIP switch selectable) and can deliver up to 124A on the +12V rail (133A in OC mode) - a quick sanity check on those numbers follows the spec list below. WOW – you can weld ½” steel plate with 120A!! (OK, I might have trouble keeping the arc stable at 12V, but that is still some serious current.) This bad boy even comes with a handle and is backed by a 10-year warranty. And just to tickle your interest, here are a few stats you might be wondering about:
• 1500W Continuous power output @50°C (1650W in OC mode)
• 124A +12V rail (133A in OC mode)
• SuperNOVA control and monitoring software included (USB interface)
• (19) PCI-E connectors and (2) EPS12V connectors
• OEM is Etasis Electronics Corp. (well-known in the server industry)
• MSRP $449.99 USD (available now)
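As promised above, here is that quick sanity check on the +12V figures (my own arithmetic, not EVGA's): essentially the entire rated capacity is available on the single +12V rail.

```python
# Sanity check: how much of the NEX1500's rating the +12V rail can carry.
rail_voltage = 12.0
stock_current_a = 124   # amps in normal mode (1500W rating)
oc_current_a = 133      # amps in "overclock" mode on 240VAC (1650W rating)

print(f"Stock: {rail_voltage * stock_current_a:.0f} W on +12V")  # 1488 W
print(f"OC:    {rail_voltage * oc_current_a:.0f} W on +12V")     # 1596 W
```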

Here is what EVGA has to say about the new SuperNOVA NEX1500 Classified PSU:
“The EVGA NEX1500 Classified is the ultimate enthusiast power supply. Designed to support the toughest hardware, the EVGA NEX1500 Classified supports ground-breaking new features like SuperNOVA enthusiast software control, Overclock Mode that increases the maximum power output up to 1650W, and fully modular, individually sleeved cables.

You can also count on EVGA to provide the utmost reliability and performance, with 100% Japanese capacitors and a durable ball bearing fan. The NEX1500 Classified is designed from start to finish to be the best choice for today’s most demanding high-end computers. Get to the next level with the EVGA NEX1500 Classified Power Supply!"

3-NEX1500.jpg

EVGA SuperNOVA NEX1500 Classified 1500W PSU Key Features:

• SuperNOVA, exclusive power supply control and monitoring software
• Control and adjust +12V voltage for maximum overclocking potential
• Switch between single or multiple +12V rails for ultimate control
• Overclock Mode allows PSU to deliver up to 1650W with 230VAC input
• Unbeatable 10-Year Warranty and unparalleled EVGA Customer Support
• 80PLUS Gold certified, with up to 90% efficiency under typical loads
• Highest quality Japanese brand capacitors ensure long-term reliability
• Individually sleeved cables for outstanding looks and cable management
• Fully modular to reduce clutter and improve airflow
• NVIDIA SLI Certified
• Sanyo Denki ball bearing fan  for exceptional reliability and quiet operation
• Universal AC input (100-240V) with Active PFC
• Heavy-duty Protections: OVP, UVP, OCP, OPP, SCP and OTP
• Dimensions: 150mm (W) x 86mm (H) x 200mm (L)

Please continue reading our review of the EVGA SuperNOVA NEX1500 PSU!!!

Subject: Storage
Tagged: 840, mdx, pro, Samsung, ssd

Introduction, Specifications and Packaging

Introduction

Samsung has been at this SSD thing for quite some time now. The first SSD I bought was in fact a Samsung unit meant for an ultraportable laptop. Getting it into my desktop was a hack and a half, involving a ZIF to IDE adapter, which then passed through yet another adapter to convert to SATA. The drive was wicked fast at the time, and while it handily slaughtered my RAID-0 pair of 74GB VelociRaptors in random reads, any writes caused serious stuttering of the drive, and therefore the entire OS. I was clearly using the drive outside of its intended use, but hey, I was an early adopter.

Several SSDs later came the Intel X25-M. It was a great drive, but in its earliest form it was not without fault. Luckily, those kinks were worked out industry-wide, and everyone quickly accelerated their firmware optimizations to better handle random writes. Samsung took a few generations to get this under control. The first Samsung drive to truly get over this hump was the 830 Series, which launched earlier this year. It utilized a triple-core ARM9 CPU that was able to effectively brute-force heavy random write workloads. It also significantly increased the speed and nimbleness of the 830 across the board, which, combined with Samsung's excellent reliability record, quickly made it my most recommended series as of late.

DSC00687.JPG

...and now we have the 840 Series, which launched today. Well, technically it launched yesterday if you're reading from the USA. Here in Korea the launch started at 10 AM and spanned a day of product press briefings leading to the product NDA expiration at 8 PM Korea time. This review will focus on the 512GB capacity of the 840 Pro model. We will follow on with the 840 (non-pro) at a later date:

DSC00688.JPG

Read on for the full review!

Author:
Subject: Mobile
Manufacturer: Lenovo

Introduction, Design

u410-1.jpg

Before Intel released the ultrabook standard there were already laptops that were close to what Intel would envision, and while some had gained attention on their own, most were not given any special notice. One of these was the IdeaPad U series, part of Lenovo’s consumer lineup, which had long focused on thin and light design.

I reviewed one of those laptops, the Lenovo U260, in 2010. That 12.5-inch laptop weighed in at just 3.04 pounds and is - to this very day - among the thinnest and lightest laptops we’ve reviewed at PC Perspective.

Alas, the U260 was not long for this world, but its larger siblings live on. Now we’re taking a look at the U410, Lenovo’s 14-inch ultrabook and the largest product in the U-Series. Let’s see what kind of hardware it brings to this suddenly crowded category.

u410table.png

Well, there are no surprises here, but you shouldn’t have expected any. Intel’s moves to make cool, thin laptops more widespread have ironically robbed them of their excitement. They’re all roughly the same in size and weight, and they can all be equipped with identical Intel processors.

This makes it hard for any particular ultrabook - even those with a bloodline that starts prior to Intel’s ultrabook push - to stand out. Let’s see if the Lenovo IdeaPad U410 can conjure some magic.

Continue reading our review of the Lenovo IdeaPad U410!!

Manufacturer: Corsair

Introduction and Features

Corsair continues to bring a full line of high-quality power supplies, memory components, cases, cooling components, SSDs and accessories to market for the PC enthusiast and professional alike.  Corsair's updated Professional Series HX power supplies include four models: the HX650, HX750, HX850 and HX1050.  All of the power supplies in the Professional Series feature modular cables, premium-quality components, an energy-efficient design (now 80 Plus Gold certified) and quiet operation, and they are backed by a 7-year warranty and lifetime access to Corsair's comprehensive technical support and customer service. The most obvious differences between the new models and the old Professional Series HX PSUs are the new 80 Plus Gold efficiency certification (upgraded from 80 Plus Silver) and the ability to operate in fanless mode.

2-hx850_psu_sideview_cable_.jpg

Here is what Corsair has to say about their new Professional Series HX PSUs:

"Legendary Performance and Reliability

Corsair Professional Series HX power supplies are designed for PC builders and upgraders who need a highly efficient, quiet, and supremely-reliable power supply, with a modular cable-set that makes installation a breeze.

Quiet Operation at Low Loads

Thanks to their highly-efficient design, Corsair Professional Series power supplies generate minimal heat, and are able to operate in a silent, fully-fanless mode at up to 20% of the PSU’s maximum load (170W for the HX850). This means that Professional Series HX PSUs will be completely silent when you’re performing less intensive tasks, such as web browsing or chatting in forums. And the thermally-controlled fan spins up gradually above 20% load, so that it still operates quietly during normal use and when gaming. Basic PC power supplies have fans that spin all the time your PC is on – whether you’re pushing your graphics card to the limit or just surfing the web – making them noisier and more intrusive.

Modular Cables for Easy Installation

Professional Series power supplies have a comprehensive modular cable set that allows you to use only the cables you need for your particular set of components. The benefits of this include a cleaner, neater installation, and that ‘professionally-built’ look, plus increased airflow through the case due to reduced cable clutter. The cables are also long enough to support full-tower cases.

80 PLUS Gold: High Efficiency – Low Heat

Efficiency is the measurement of how effectively a power supply converts AC power from your wall outlet to the DC power used by your PC’s components. If your power supply isn’t efficient, it will generate more heat, which requires more cooling and more fan noise. And, it might even affect your power bill.

Professional Series HX PSUs are among the most efficient on the market. Each model has 80 Plus Gold certification, which ensures up to 90% energy-efficiency. This helps to keep your PC cool and quiet, and it may even save you money too.

Reliable

Professional Series HX PSUs are built with premium components, such as 105°C capacitors, and are capable of continuous power delivery at a temperature rating of 50°C, ensuring maximum performance and reliability even in the most demanding and hot-running performance PCs.

The Corsair Advantage

Corsair Professional Series PSUs are backed by a reassuring 7-year warranty and comprehensive customer support via telephone, email, forum and the Tech Support Express helpdesk."

Please continue reading our review of the Corsair Professional Series HX850!

Author:
Manufacturer: NVIDIA

GK106 Completes the Circle

The release of the various Kepler-based graphics cards has been interesting to watch from the outside.  Though NVIDIA certainly spiced things up with the release of the GeForce GTX 680 2GB card back in March, and then with the dual-GPU GTX 690 4GB graphics card, for quite some time NVIDIA was content to leave the sub-$400 markets to AMD's Radeon HD 7000 cards - and, of course, NVIDIA's own GTX 500-series.

But gamers and enthusiasts are fickle beings - knowing that the GTX 660 was always JUST around the corner, many of you were simply not willing to buy into the GTX 560s floating around Newegg and other online retailers.  AMD benefited greatly from this lack of competition and only recently has NVIDIA started to bring their latest generation of cards to the price points MOST gamers are truly interested in. 

Today we are going to take a look at the brand new GeForce GTX 660, a graphics card with 2GB of frame buffer and a starting MSRP of $229.  Coming in $80 under the GTX 660 Ti released just last month, does the more vanilla GTX 660 have what it takes to repeat the success of the GTX 460?

The GK106 GPU and GeForce GTX 660 2GB

NVIDIA's GK104 GPU is used in the GeForce GTX 690, GTX 680, GTX 670 and even the GTX 660 Ti.  We saw the much smaller GK107 GPU with the GT 640 card, a release I was not impressed with at all.  With the GTX 660 Ti starting at $299 and the GT 640 at $120, there was a WIDE gap in NVIDIA's 600-series lineup that the GTX 660 addresses with an entirely new GPU, the GK106.

First, let's take a quick look at the reference card from NVIDIA for the GeForce GTX 660 2GB - it doesn't differ much from the reference cards for the GTX 660 Ti and even the GTX 670.

01.jpg

The GeForce GTX 660 uses the same half-length PCB that we saw for the first time with the GTX 670 and this will allow retail partners a lot of flexibility with their card designs. 

Continue reading our review of the GeForce GTX 660 graphics card!

Author:
Subject: Processors
Manufacturer:

Apple Produces the new A6 for the iPhone 5

 

Today is the day the world gets introduced to the iPhone 5.  I of course was very curious about what Apple would bring to market the year after the death of Steve Jobs.  The excitement leading up to the iPhone announcement was somewhat muted compared to years past, and a lot of that can be attributed to what has been happening in the Android market.  Companies like Samsung and HTC have released new high-end phones that are not only faster and more expansive than previous versions, but that also work really well and are packed with features.  While the iPhone 5 will be another success for Apple, those somewhat dispassionate about the cellphone market will likely just shrug and say to themselves, “It looks like Apple caught up for the year, but too bad they didn’t introduce anything really groundbreaking.”

a6_01.jpg

If there was one area many were anxiously awaiting, it was the SoC (system on a chip) Apple would use for the iPhone 5.  Speculation ranged from a fresh piece of silicon based on the A5X (faster clocks, smaller graphics portion) to a quad-core monster running at high speeds but still sipping power.  It seems we actually got something in between.  This is not a bad thing, but as we go forward we will likely see that the silicon again only matches what other manufacturers have been using since earlier this year.

Click here to read the entire article.

Author:
Subject: Processors
Manufacturer: Intel

Core Philosophy

Ah, IDF – the Intel Developer Forum.  Almost every year, while I sit in slightly uncomfortable chairs and stare at outdated, color-washed projector screens, information is passed on about Intel's future architectures, products and technologies.  Last year we learned the final details about Ivy Bridge, and this year we are getting the first details about Haswell, the first architecture designed by Intel from the ground up for servers, desktops, laptops, tablets and phones. 

Design Philosophy

While Sandy Bridge and Ivy Bridge were really derivatives of prior designs and thought processes, the Haswell design is something completely different for the company.  Yes, the microarchitecture of Haswell is still very similar to Sandy Bridge (SNB), but the differences are more philosophical than technological. 

IMG_8192.JPG

Intel's target is a converged core: a single design that is flexible enough to be utilized in mobility devices like tablets while also scaling to the performance levels required for workstations and servers.  They retain the majority of the architecture design from Sandy Bridge and Ivy Bridge including the core design as well as the key features that make Intel's parts unique: HyperThreading, Intel Turbo Boost, and the ring interconnect. 

The three pillars that Intel wanted to address with Haswell were performance, modularity, and power innovations.  Each of these has its own key goals, including improving the performance of existing (legacy) code and extracting greater parallelism with less coding work for developers. 

IMG_8193.JPG

Continue reading our preview of the upcoming Intel Haswell architecture!!