Subject: Processors
Manufacturer: AMD

2013 Elite Mobility APU - Temash

AMD has a lot to say today.  At an event up in Toronto this month, we got to sit down with AMD’s marketing leadership and key engineers to learn about the company’s plans for 2013 mobility processors.  These include a refreshed high-performance APU known as Richland that will replace Trinity, as well as two brand new APUs based on Jaguar CPU cores and the GCN graphics architecture for low power platforms. 

Josh has put together an article that details the Jaguar + GCN design of Temash and Kabini, and I have also posted some initial performance results from the Kabini reference system AMD handed me in May.  This article will detail the plans AMD has for each of these three mobile segments, starting with the newest entry, AMD’s Elite Mobility APU platform – Temash.

temash01_0.jpg

The goal of the APU, the combination of traditional x86 processing cores and a discrete-style graphics system, was to offer unparalleled performance in smaller and more efficient form factors.  AMD believes that their leadership on the graphics front will give them a good-sized advantage in areas including performance tablets, hybrids and small screen clamshells that may or may not be touch enabled.  They acknowledge, though, that getting into the smallest tablets (like the Nexus 7) is not on the table quite yet and that content creation desktop replacements are probably outside the scope of Richland. 


2013 Elite Mobility APU – Temash

Temash will be the first x86 quad-core SoC design, and AMD thinks it will make a big splash in a relatively new market known as the “high performance” tablet. 

temash02_0.jpg

Temash, built around Jaguar CPU cores and the graphics technology of GCN, will be able to offer fully accelerated video playback with transcode support, as well as features like image stabilization and Perfect Picture.  Temash will also be the only SoC in its class to offer support for DX11 graphics, and even though some games might not show off the added effects, there are quite a few performance advantages to DX11 over DX10/DX9.  With a claimed GPU performance uplift of more than 100%, you’ll be able to drive displays at 2560x1600 for productivity use and even take advantage of wireless display options. 

Continue reading our preview of the new 2013 mobility platforms featuring AMD APUs!!

Subject: Mobile
Manufacturer: Lenovo

Introduction and Design

P4212143.jpg

While Lenovo hasn’t historically been known for its gaming PCs, it’s poised to make quite a splash with the latest entry in its IdeaPad line. Owing little to the company’s business-oriented roots, the Y500 aims to be all about power, more so than any other laptop from the manufacturer to date, tactfully squeezed into a price tag that would normally be unattainable given the promised performance. But can it succeed?

Our Y500 review unit can be had for $1,249 at Newegg and other retailers, or for as low as $1,180 at Best Buy. Lenovo also sells customizable models, though the price is generally higher. Here’s the full list of specifications:

specs.png

The configurations offered by Lenovo vary fairly widely in price, starting as low as $849 for a model sporting 8 GB of RAM and a single GT 650M with 2 GB of GDDR5. The best value, however, is certainly the configuration we received.

What’s so special about it? Well, apart from the obvious (powerful quad-core CPU and 16 GB RAM), this laptop actually includes two NVIDIA GeForce GT 650M GPUs (both with 2 GB GDDR5) configured in SLI. Seeing as it’s just a 15.6-inch model, how does it manage to do that? By way of a clever compromise: the exchange of the usual optical drive for an Ultrabay, something normally only seen in Lenovo’s ThinkPad line of laptops. So I guess the Y500 does owe a little bit of its success to its business-grade brethren after all.

P4212175.jpg

In our review unit (and in the particular configuration noted above), this Ultrabay comes prepopulated with the second GT 650M, equipped with its own heatsink/fan and all. The addition of this GPU effectively launches the Y500 into high-end gaming laptop territory—at least on the spec sheet. Other options for the Ultrabay also exist (sold separately), including a DVD burner and a second hard drive. The bay is easily removable via a switch on the back of the PC (see below).

P4212148.jpg

Continue reading our review of the Lenovo IdeaPad Y500!

Subject: Motherboards
Manufacturer: MSI

Introduction and Technical Specifications

Introduction

02-board-pic.gif

Courtesy of MSI

With the Z77A-GD65 Gaming motherboard, MSI takes its award-winning Intel Z77-based board design and melds it with a Killer - a Killer NIC, that is. MSI integrated the Killer e2205 GigE NIC into the board's design for the ultimate solution for online gaming. The Killer NIC is well known in gaming circles for its superior hardware-based network traffic prioritization engine, making it a natural integration choice for a top-end gaming board. We put the MSI Z77A-GD65 Gaming through our rigorous suite of tests to measure its performance and were not disappointed. At a retail price of $179, this board is a steal.

03-profile-pic.gif

Courtesy of MSI

In designing the Z77A-GD65 Gaming board, MSI provided a total of 12 digital power phases for the CPU and packed the board full of features: SATA 2 and SATA 3 ports; a Killer e2205 GigE NIC; three PCI-Express x16 slots for up to tri-card support; and USB 2.0 and 3.0 ports.

04-rear-panel-pic.gif

Courtesy of MSI

Continue reading our review of the MSI Z77A-GD65 Gaming motherboard!

Subject: Motherboards
Manufacturer: GIGABYTE

Introduction

Today, Gigabyte unveiled their Intel Z87-based board lineup to select members of the press at a live event held at their headquarters in City of Industry, CA. Their Z87 boards are broken down into four series - the Extreme OC series, the Gaming series, the Thunderbolt series, and the Standard series.

02-z87-line.gif

Intel Z87 motherboard lineup
Courtesy of GIGABYTE

New Features

Gigabyte builds both a new interface for their UEFI BIOS and a new power delivery design, dubbed Ultra Durable 5 Plus, into each of their Intel Z87 boards.

UEFI BIOS Enhancement

03-uefi-intro.gif

UEFI BIOS explanation
Courtesy of GIGABYTE

Gigabyte also fully redesigned their UEFI BIOS interface to make it more customizable, easier to use, and capable of real-time feedback on settings changes.

Continue reading our preview of Gigabyte's upcoming Z87 motherboards!!

Subject: Motherboards
Manufacturer: ASUS

Introduction

Today, ASUS introduces their Intel Z87-based motherboard lineup with board refreshes across all of their product lines: ASUS (mainstream), Republic of Gamers (ROG), The Ultimate Force (TUF), and Workstation (WS). With the exception of the TUF and ROG board lines, ASUS decided to introduce a new and improved color scheme for their boards - black and gold. The motherboard surfaces are black with gold-colored heat sinks. While black and gold may not seem like the best match-up, don't judge the boards until you have seen them firsthand - the two colors go very well together.

ROG Motherboards

03-ROG MAXIMUS VI GENE.jpg

ASUS Maximus VI Gene
Courtesy of ASUS

Their ROG line will include the Maximus VI Extreme, the Maximus VI Formula, the Maximus VI Gene, and the Maximus VI Hero. All ROG boards feature the standard red and black color scheme common to that brand. Additionally, ASUS includes SupremeFX audio as standard on all ROG boards, along with their Sonic Radar on-screen overlay technology. Sonic Radar is an in-game overlay that can be used to accurately pinpoint game-based sound sources. For powering these boards, ASUS includes 60 amp-rated BlackWing chokes and NexFET MOSFETs that operate at 90% power efficiency. Use of these power components was seen to reduce on-board temperatures in the ASUS labs by as much as 5 degrees Celsius.

02-ROG MAXIMUS VI EXTREME.jpg

ASUS Maximus VI Extreme
Courtesy of ASUS

ASUS upped the ante even more with their Maximus VI Extreme board by including the ASUS OC Panel. This panel includes a display and can be mounted in a 5.25" drive bay or used externally for real-time voltage and temperature monitoring, as well as for tweaking various frequency and voltage BIOS settings. The ASUS OC Panel is supported on all ROG boards and will be available for after-market purchase for the non-Extreme boards.

04-ROG MAXIMUS VI HERO.jpg

ASUS Maximus VI Hero
Courtesy of ASUS

The Maximus VI Hero motherboard is the newest member of the ROG line, branded as a more affordable solution for the gamer. This board is marketed as a head-to-head competitor for MSI's MPOWER board.

Continue reading our preview of the ASUS Z87 motherboard lineup!

Subject: Storage
Tagged: sshd, Seagate, hybrid

Introduction

Seagate recently announced and released their third generation of laptop Solid State Hybrid Drives. Originally their hybrids carried the Momentus (laptop HDD) name forward, tacking on 'XT' to denote the on-board caching ability. The Momentus XT was introduced in 500GB (1st gen) and 750GB (2nd gen) models. The new line gets a new and simple title - Laptop SSHD.

laptop-sshd-family-gallery-500x500.jpg

In addition to the new name, we now have two capacity points available. The 'Laptop SSHD' retains the old 9.5mm form factor and now pushes a full 1TB of capacity, while the 'Laptop Thin SSHD' drops a platter and reduces available capacity to 500GB. The bonus with the 500GB model is that it maintains similar performance yet shaves off some thickness, making it Seagate's first 7mm hybrid.

Today we will take a look at the new Thin SSHD, comparing it to the performance of the older generation Seagate Hybrids, as well as to Intel's RST caching solution:

130513-075713-4.8.jpg

Read on for the full review!

Subject: Systems
Manufacturer: iBuyPower

Power and Value

We have seen our fair share of mini-ITX cases and system builds over the last six months, including rigs from Digital Storm and AVADirect.  They all attempt to strike a balance between performance, power, noise and size, and some do it better than others.  With the continued development of the mini-ITX form factor, more users than ever are realizing you can get nearly top-end gaming performance in a smaller package. 

Today we are taking a look at the iBuyPower Revolt, in particular the Revolt R770, the highest-end base configuration of the system.  Built around a small, but not tiny, PC chassis, iBuyPower is able to include some pretty impressive specifications:

  • Intel Core i7-3770K processor
  • Custom built Z77 mini-ITX motherboard
  • NZXT CPU water cooler
  • NVIDIA GeForce GTX 670 2GB graphics card
  • 8GB DDR3-1600 memory
  • 120 GB Intel 320 Series SSD
  • 1TB Western Digital Blue hard drive

site1.jpg

You get all of this in a case that is only 16-in x 16-in x 4.5-in, finished in a glossy black and white color scheme.  The company claims that the Revolt was "designed to be a gaming system for any location" including a home theater, a dorm room or your study.  It includes "vents and air channels positioned precisely to deliver cool ambient air exactly where it is needed" and an "integrated atmospheric lighting system [that] is customizable in color."

Check out our quick video review below and then follow on to the full post for more photos of the system and a quick check of performance!

Continue reading our review of the iBuyPower Revolt Mini-ITX gaming system!!

Manufacturer: Enermax/Ecomaster

Introduction and Features

2-TriAthlor-Banner.jpg

Enermax has a well-earned reputation for delivering reliable power supplies, enclosures, and other accessories to the PC enthusiast market.  Their new TriAthlor Series includes three power supplies: 550W, 650W and 700W models.  We will be taking a detailed look at the TriAthlor 550W power supply in this review.  All TriAthlor power supplies are certified to deliver 80 Plus Bronze efficiency and feature modular cables and quiet operation.  Ecomaster is the authorized US agent for Enermax branded products.

3-TriAthlor-Banner2.jpg

Enermax TriAthlor Series PSU Key Features:

4a-Features-table2.jpg

(Courtesy of Enermax)

Please continue reading our Enermax TriAthlor 550W PSU review!!!

Subject: Processors, Mobile
Manufacturer: Intel

A much-needed architecture shift

It has been almost exactly five years since the release of the first Atom branded processors from Intel, starting with the Atom 230 and 330 based on the Diamondville design.  Built for netbooks and nettops at the time, the Atom chips were a reaction to a unique market that the company had not planned for.  While the early Atoms were great sellers, they were universally criticized by the media for slow performance and sub-par user experiences. 

Atom has seen numerous refreshes since 2008, but they were all modifications of the simplistic, in-order architecture that was launched initially.  With today's official release of the Silvermont architecture, the Atom processors see their first complete redesign from the ground up.  With the focus on tablets and phones rather than netbooks, can Intel finally find a foothold in the growing markets dominated by ARM partners? 

I should note that even though we are seeing the architectural reveal today, Intel doesn't plan on having shipping parts until late in 2013 for embedded, server and tablet products, and not until 2014 for smartphones.  Why the early reveal on the design then?  I think that pressure from ARM-based designs (Krait, Exynos) as well as the upcoming release of AMD's own Kabini is forcing Intel's hand a bit.  Certainly they don't want to be perceived as having fallen behind, and getting news about the potential benefits of their own x86 option out in public will help.

silvermont26.jpg

Silvermont will be the first Atom processor built on the 22nm process, leaving the 32nm designs of Saltwell behind it.  This also marks a change in the Atom design process: the adoption of the tick/tock model we have seen on Intel's consumer desktop and notebook parts.  After the next node drop to 14nm, we'll see an annual cadence that first focuses on the node change, then an architecture change at the same node. 

By keeping Atom on the same process technology as Core (Ivy Bridge, Haswell, etc), Intel can put more of a focus on the power capabilities of their manufacturing.

Continue reading about the new Intel Silvermont architecture for tablets and phones!!

Manufacturer: Intel

The Intel HD Graphics are joined by Iris

Intel gets a bad rap on the graphics front.  Much of it is warranted, but a lot of it is really just poor marketing of the technologies and features they implement and improve on.  When AMD or NVIDIA update a driver, fix a bug or bring a new gaming feature to the table, they make sure that every single PC hardware website knows about it and thus, that as many PC gamers as possible know about it.  The same cannot be said about Intel though - they are much more understated when it comes to tooting their own horn.  Maybe that's because they are afraid of being called out on some aspects or because they have a little bit of performance envy compared to the discrete options on the market. 

Today might be the start of something new from the company though - a bigger focus on the graphics technology in Intel processors.  More than a month before the official public unveiling of the Haswell processors, Intel is opening up about SOME of the changes coming to the Haswell-based graphics products. 

We first learned about the changes to Intel's Haswell graphics architecture way back in September of 2012 at the Intel Developer Forum.  It was revealed then that the GT3 design would essentially double theoretical output over the existing GT2 design found in Ivy Bridge.  GT2 will continue to exist (though slightly updated) on Haswell, and only some versions of Haswell will actually step up to the higher-performing GT3 options.  

01.jpg

In 2009 Intel announced a drive to increase graphics performance at an exceptional rate from generation to generation.  Not long after, they released the Sandy Bridge CPU with the most significant performance increase in processor graphics ever.  Ivy Bridge followed with a nice increase in graphics capability, though not nearly as dramatic as the SNB jump.  Now, according to this graphic, the graphics capability of Haswell will be as much as 75x better than the chipset-based graphics of 2006.  The real question is which variants of Haswell will have that performance level...

02.jpg

I should note right away that even though we are showing you general performance data on graphics, we still don't have all the details on what SKUs will have what features on the mobile and desktop lineups.  Intel appears to be trying to give us as much information as possible without really giving us any information. 

Read more on Haswell's new graphics core here.

Manufacturer: Oculus

Our first thoughts and impressions

Since first hearing about the Kickstarter project that raised nearly 2.5 million dollars from over 9,500 contributors, I have eagerly been awaiting the arrival of my Oculus Rift development kit.  Not because I plan on quitting the hardware review business to start working on a new 3D, VR-ready gaming project but just because as a technology enthusiast I need to see the new, fun gadgets and what they might mean for the future of gaming.

I have read other users' accounts of their time with the Oculus Rift, including a great write-up in Q&A form from Ben Kuchera over at the Penny Arcade Report, but I needed my own hands-on time with the consumer-oriented VR (virtual reality) product.  Having tried it for very short periods of time at both Quakecon 2012 and CES 2013 (less than 5 minutes each), I wanted to see how it performed and, more importantly, how my body reacted to it.

I don't consider myself a person that gets motion sick.  Really, I don't.  I fly all the time, sit in the back of buses, ride roller coasters, watch 3D movies and play fast-paced PC games on large screens.  The only time I tend to get any kind of unease with motion is on what I call "roundy-round" rides, the kind that simply go in circles over and over.  Think about something like this, The Scrambler, or the Teacups at Disney World.  How would I react to time with the Oculus Rift?  That was my biggest fear... 

For now I don't want to get into the politics of the Rift, like how John Carmack was initially a huge proponent of the project and then backed off on how close we might be to the higher-quality consumer version of the device.  We'll cover those aspects in a future story.  For now I only had time for some first impressions.

Watch the video above for a walk-through of the development kit as well as some of the demos, as best they can be demonstrated on a 2D plane! 

Continue on to the full story for some photos and my final FIRST impressions of the Oculus Rift!

Subject: Editorial

Good effort goes a long way

The wait for Heart of the Swarm, the expansion to 2010's StarCraft 2: Wings of Liberty, has been long and anxious. Blizzard originally hinted at a very rapid release schedule, which did not exactly come to fruition. The nearly three years of development time for Heart of the Swarm is longer than a single studio spends on a full Call of Duty title, although one could make a very credible argument that a Blizzard expansion requires more effort to create than a complete Call of Duty title.

But as Duke Nukem Forever demonstrated, a long time in development does not guarantee a fully baked product coming out the other end.

Blizzard games have always been highly entertaining albeit without deep artistic substance; their games are not first on the list for a university literature syllabus. But, there is a lot of room in life for engaging entertainment. In terms of the PC, Blizzard has always been one of the leading developers for the platform; they know how to deliver an exceptional PC experience if they choose to.

Watch the video and read on to find out if they did!

Manufacturer: Various

Our 4K Testing Methods

You may have recently seen a story and video on PC Perspective about a new TV that made its way into the office.  Of particular interest is the fact that the SEIKI SE50UY04 50-in TV is a 4K television; it has a native resolution of 3840x2160.  For those unfamiliar with the upcoming TV and display standards, 3840x2160 is exactly four times the pixel count of current 1080p TVs and displays.  Oh, and this TV only cost us $1300.

seiki5.jpg

In that short preview we validated that both NVIDIA and AMD current generation graphics cards support output to this TV at 3840x2160 using an HDMI cable.  You might be surprised to find that HDMI 1.4 can support 4K resolutions, but it can do so only at 30 Hz (4K TVs with 60 Hz support most likely won't be available until 2014) - half the 60 Hz refresh rate of most TVs and monitors.  That doesn't mean we are limited to 30 FPS of performance though, far from it.  As you'll see in our testing on the coming pages, we were able to push out much higher frame rates using some very high end graphics solutions.
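
As a rough sanity check on that limit, here is a quick sketch of the pixel-rate arithmetic in C.  These figures count active pixels only; a real HDMI link also spends clock on blanking intervals, which is why 4K at 30 Hz already runs a ~297 MHz pixel clock against HDMI 1.4's ~340 MHz ceiling.

    #include <stdio.h>

    int main(void) {
        double uhd = 3840.0 * 2160.0;   /* 8,294,400 pixels */
        double fhd = 1920.0 * 1080.0;   /* 2,073,600 pixels */

        printf("4K is %.0fx the pixels of 1080p\n", uhd / fhd);    /* 4   */
        printf("4K @ 30 Hz: %.0f Mpixels/s\n", uhd * 30.0 / 1e6);  /* 249 */
        printf("4K @ 60 Hz: %.0f Mpixels/s\n", uhd * 60.0 / 1e6);  /* 498 */
        return 0;
    }

Doubling the refresh rate doubles the pixel rate and blows well past what the link can carry, which is why 4K at 60 Hz has to wait for newer display standards.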

I should point out that I am not a TV reviewer and I don't claim to be one, so I'll leave the technical merits of the display itself to others.  Instead I will only report on my experiences with it while using Windows and playing games - it's pretty freaking awesome.  The only downside I have found in my time with the TV as a gaming monitor thus far is the combination of the 30 Hz refresh rate and Vsync-disabled situations.  Because you are seeing fewer screen refreshes over the same amount of time than you would with a 60 Hz panel, all else being equal, you are getting twice as many "frames" of the game pushed to the monitor each refresh cycle.  This means that the horizontal tearing associated with disabling Vsync will likely be more apparent than it would be otherwise. 
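
Here is the arithmetic behind that, sketched in C and assuming, purely for illustration, a game rendering at a steady 120 FPS with Vsync off.  Every extra slice of a frame shown during a single refresh adds one horizontal tear line to that refresh.

    #include <stdio.h>

    int main(void) {
        double fps  = 120.0;              /* illustrative render rate */
        double hz[] = { 60.0, 30.0 };     /* panel refresh rates      */

        for (int i = 0; i < 2; i++) {
            double per_refresh = fps / hz[i];
            printf("%2.0f Hz: %.0f frame slices per refresh, ~%.0f tears\n",
                   hz[i], per_refresh, per_refresh - 1.0);
        }
        return 0;
    }

At 60 Hz you would see roughly one tear line per refresh; at 30 Hz, roughly three.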

4ksizes.png

Image from Digital Trends

I would likely recommend enabling Vsync for a tear-free experience on this TV once you are happy with performance levels, but obviously for our testing we wanted to keep it off to gauge performance of these graphics cards.

Continue reading our results from testing 4K 3840x2160 gaming on high end graphics cards!!

Subject: Processors
Manufacturer: AMD

heterogeneous Uniform Memory Access


Several years back we first heard AMD’s plans for creating a uniform memory architecture that would allow the CPU to share address spaces with the GPU.  The promise here is a very efficient architecture providing excellent performance in a mixed environment of serial and parallel programming loads.  When GPU computing came on the scene it was full of great promise.  The idea of a heavily parallel processing unit that could accelerate both integer and floating point workloads looked like a potential gold mine in a wide variety of applications.  Alas, judging by the results so far, the technology has not met expectations.  There are many problems with combining serial and parallel workloads between CPUs and GPUs, and a lot of this has to do with very basic programming and the communication of data between two separate memory pools.

huma_01.jpg

CPUs and GPUs do not share common memory pools.  Instead of using pointers to tell each individual unit where data is stored in memory, the current implementation of GPU computing requires the CPU to copy the contents of an address range over to the standalone memory pool of the GPU.  This is time-consuming and wastes cycles.  It also increases programming complexity.  Typically, only very advanced programmers with a lot of expertise in this subject can write code that works effectively around these limitations.  The lack of unified memory between CPU and GPU has hindered the adoption of the technology for a lot of applications which could potentially use the massively parallel processing capabilities of a GPU.
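
To make that copy dance concrete, here is a minimal sketch using the CUDA runtime's C API.  The buffer size is arbitrary and error checking is omitted; the point is the staging pattern the current model forces on you.

    #include <cuda_runtime.h>
    #include <stdlib.h>

    int main(void) {
        size_t bytes = (1 << 20) * sizeof(float);
        float *host_data = (float *)malloc(bytes);
        float *gpu_data;

        /* The GPU cannot dereference host_data, so the CPU must allocate
           a second buffer in the GPU's own memory pool and copy the data
           across the bus before any kernel can touch it. */
        cudaMalloc((void **)&gpu_data, bytes);
        cudaMemcpy(gpu_data, host_data, bytes, cudaMemcpyHostToDevice);

        /* ... run a kernel against gpu_data ... */

        /* And copy the results back out when the GPU is done. */
        cudaMemcpy(host_data, gpu_data, bytes, cudaMemcpyDeviceToHost);

        /* Under hUMA the two processors share one address space, so a
           kernel could simply be handed host_data itself - no staging
           buffer, no copies in either direction. */
        cudaFree(gpu_data);
        free(host_data);
        return 0;
    }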

The idea for GPU compute has been around for a long time (comparatively).  I still remember getting very excited about the idea of pairing a high-end video card with a card like the old GeForce 6600 GT acting as a coprocessor to handle heavy math operations and PhysX.  That particular plan never quite came to fruition, but the idea was planted years before the actual introduction of modern DX9/10/11 hardware.  It seems as if this step with hUMA could actually provide a great amount of impetus to implement a wide range of applications which can actively utilize the GPU portion of an APU.

Click here to continue reading about AMD's hUMA architecture.

Manufacturer: Corsair

Obsidian Series for under $100

If you need a case for your next PC build, the chances are good that Corsair has a model that you'll like.  Ranging from the obscenely large Obsidian 900D to the $69 Carbide 200R and just about everything in between, Corsair has a ton of options.  Today we are reviewing the brand new entrant to the Obsidian series, the 350D, which brings Corsair to the Micro-ATX form factor. 

The Obsidian series is the flagship chassis line from Corsair and typically means you are getting the best of the best from the expanding components company.  With an MSRP of just $99, you are definitely making some sacrifices on features and on size, with support limited to Micro-ATX and Mini-ITX motherboards. 

IMG_973601.jpg

The front panel has an attractive brushed finish and is removable (along with the fan filter behind it).

IMG_974203.jpg

Connections up top include headphone and microphone jacks as well as a pair of USB 3.0 ports.  The power button is right in the center with LEDs on either side.  The reset button is just to the right of the mic port and is recessed enough to prevent accidental presses.

Continue reading our review of the Corsair Obsidian 350D chassis!!

Subject: Processors
Manufacturer: AMD

Jaguar Hits the Embedded Space


It has long been known that AMD has simply not had a lot of luck going head to head against Intel in the processor market.  Some years back they worked on differentiating themselves, and in so doing have been able to stay afloat through hard times.  The acquisitions that AMD has made in the past decade are starting to make a difference in the company, especially now that the PC market that they have relied upon for revenue and growth opportunities is suddenly contracting.  This of course puts a cramp in AMD’s style, but with better than expected results in their previous quarter, things are not nearly as dim as some would expect.

Q1 was still pretty harsh for AMD, but they maintained their market share in both processors and graphics chips.  One area that looks to get a boost is that of embedded processors.  AMD has offered embedded processors for some time, but with the way the market is heading they look to really ramp up their offerings to fit a variety of applications and SKUs.  The last generation of G-Series processors was based upon the Bobcat/Brazos platform.  This two-chip design (APU and media hub) came in a variety of wattages with good performance from both the CPU and GPU portions.  While the setup looked pretty good on paper, it was not widely implemented because of the added complexity of a two-chip design plus thermal concerns versus performance.

soc_arch.jpg

AMD looks to address these problems with one of their first true SoC designs.  The latest G-Series SoCs are based upon the brand new Jaguar core from AMD.  Jaguar is the successor to the successful Bobcat core, a low power design with two cores and integrated DX11/VLIW5-based graphics.  Jaguar improves CPU performance over Bobcat by 6% to 13% when clocked identically, and because it is manufactured on a smaller process node, it does so without using as much power.  Jaguar comes in both dual core and quad core configurations.  The graphics portion is based on the latest GCN architecture.

Read the rest of the AMD G-Series release by clicking here!

Manufacturer: AMD

The card we have been expecting

Despite all the issues brought up by our new graphics performance testing methodology we call Frame Rating, there is little debate in the industry that AMD is making noise once again in the graphics field - from the elaborate marketing and game bundles attached to Radeon HD 7000 series cards over the last year to the hiring of Roy Taylor, a VP of sales who is also the company's most vocal supporter. 

slide1_0.jpg

Along with the marketing goes plenty of technology and important design wins.  With the dominance of the APU on the console side (Wii U, PlayStation 4 and the next Xbox), AMD is making sure that developers' familiarity with its GPU architecture there pays dividends on the PC side as well.  Developers will be focusing on AMD's graphics hardware for the 5-10 year life of this console generation, and that could result in improved performance and feature support for Radeon graphics for PC gamers. 

Today's release of the Radeon HD 7990 6GB Malta dual-GPU graphics card marks AMD's first renewed push into the high-end graphics market since the release of the Radeon HD 7970 in January of 2012.  And while you may have seen something for sale previously with the HD 7990 name attached, those were custom designs built by partners, not by AMD. 

slide2_0.jpg

Both ASUS and PowerColor currently have high-end dual-Tahiti cards for sale.  The PowerColor HD 7990 Devil 13 used the brand directly, but ASUS' ARES II kept away from the name and focused on its own high-end card branding instead. 

The "real" Radeon HD 7990 card was first teased at GDC in March and takes a much less dramatic approach to its design without being any less impressive technically.  The card includes a pair of Tahiti (HD 7970-class) GPUs on a single PCB with 6GB of total memory.  The raw specifications are listed here:

slide6_0.jpg

Considering there are two HD 7970 GPUs on the HD 7990, the doubling of the major specs shouldn't be surprising, though it is a little deceiving.  There are 8.6 billion transistors, yes, but only 4.3 billion on each GPU.  There are 4096 stream processors, yes, but only 2048 on each GPU, requiring software GPU scaling to increase performance.  The same goes for texture fill rate, compute performance, memory bandwidth, etc.  The same could be said of all dual-GPU graphics cards though.

Continue reading our review of the AMD Radeon HD 7990 6GB Graphics Card!!

Manufacturer: Various

A very early look at the future of Catalyst

Today is a very interesting day for AMD.  It marks both the release of the reference design of the Radeon HD 7990 graphics card, a dual-GPU Tahiti behemoth, and the first sample of a change to CrossFire technology that will improve animation performance across the board.  Both stories are incredibly interesting and, as it turns out, they feed off of each other in a very important way: the HD 7990 depends on CrossFire, and CrossFire depends on this driver. 

If you already read our review (or any review using the FCAT / frame capture system) of the Radeon HD 7990, you likely came away somewhat unimpressed.  The combination of two AMD Tahiti GPUs on a single PCB with 6GB of frame buffer SHOULD have been an incredibly exciting release for us and would likely have become the single fastest graphics card on the planet.  That didn't happen though, and our results clearly state why that is the case: AMD CrossFire technology has some serious issues with animation smoothness, runt frames and giving users what they are promised. 

Our first results using our Frame Rating performance analysis method were shown during the release of the NVIDIA GeForce GTX Titan card in February.  Since then we have been in constant talks with the folks at AMD to figure out what was wrong, how they could fix it, and what it would mean for gamers to implement frame metering technology.  We followed that story up with several more that used Frame Rating to show the current state of the GPU market, and they painted CrossFire in a very negative light.  Even though some outlets accused us of being biased, or claimed that AMD wasn't doing anything incorrectly, we stuck by our results, and as it turns out, so does AMD. 

Today's preview of a very early prototype driver shows that the company is serious about fixing the problems we discovered. 

If you are just catching up on the story, you really need some background information.  The best place to start is our article published in late March that goes into detail about how game engines work, how our completely new testing methods work and, very specifically, the problems with AMD CrossFire technology.  From that piece:

It will become painfully apparent as we dive through the benchmark results on the following pages, but I feel that addressing the issues that CrossFire and Eyefinity are creating up front will make the results easier to understand.  As we showed you for the first time in Frame Rating Part 3, AMD CrossFire configurations have a tendency to produce a lot of runt frames, and in many cases in a nearly perfectly alternating pattern.  Not only does this mean that frame time variance will be high, but it also tells me that the performance gained by adding a second GPU is completely useless in this case.  Obviously the story would become then, “In Battlefield 3, does it even make sense to use a CrossFire configuration?”  My answer based on the below graph would be no.

runt.jpg

An example of a runt frame in a CrossFire configuration
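
To make "runt" concrete: our capture-based analysis measures how many scanlines each frame occupies on screen during a refresh, and frames below a minimum height are discarded before the observed frame rate is computed.  Here is a simplified sketch of that filtering step in C; the 21-scanline cutoff and the sample heights are illustrative assumptions, not our exact pipeline.

    #include <stdio.h>

    int main(void) {
        /* Hypothetical per-frame heights (in scanlines) showing the
           alternating tall/runt pattern CrossFire tends to produce. */
        int height[] = { 521, 17, 534, 14, 528, 19, 540, 16 };
        int n = sizeof(height) / sizeof(height[0]);
        const int runt_cutoff = 21;       /* illustrative threshold */
        int visible = 0;

        for (int i = 0; i < n; i++)
            if (height[i] >= runt_cutoff)
                visible++;

        /* Raw FPS counts all 8 frames; only 4 are tall enough to
           actually contribute to the animation you see. */
        printf("raw frames: %d, visible frames: %d\n", n, visible);
        return 0;
    }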

NVIDIA's solution for getting around this potential problem with SLI was to integrate frame metering, a technology that balances frame presentation to the user and to the game engine in a way that enabled smoother, more consistent frame times and thus smoother animations on the screen.  For GeForce cards, frame metering began as a software solution but was actually integrated as a hardware function on the Fermi design, taking some load off of the driver.
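
Conceptually, frame metering simply delays any frame that finishes too soon after its predecessor so that presentation lands on an even cadence.  Here is a toy sketch of the idea in C; the completion times and the fixed pacing interval are made up for illustration, and NVIDIA's real implementation lives in the driver and hardware.

    #include <stdio.h>

    int main(void) {
        /* Completion times (ms) from two GPUs rendering in alternation:
           frames arrive in tightly spaced pairs, which is exactly what
           produces runts when each one is shown immediately. */
        double done[] = { 0.0, 2.0, 16.0, 18.0, 32.0, 34.0 };
        double interval = 8.0;            /* target frame spacing (ms) */
        double last = -interval;

        for (int i = 0; i < 6; i++) {
            /* Present immediately, or hold the frame until a full
               interval has passed since the last presented frame. */
            double shown = (done[i] > last + interval) ? done[i]
                                                       : last + interval;
            printf("frame %d: rendered %5.1f ms, presented %5.1f ms\n",
                   i, done[i], shown);
            last = shown;
        }
        return 0;
    }

The tightly bunched pairs (0/2 ms, 16/18 ms, 32/34 ms) come out the other end evenly spaced at 8 ms intervals, which is the smooth cadence your eye actually wants.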

Continue reading our article on the new prototype driver from AMD to address frame pacing issues in CrossFire!!

Subject: Motherboards
Manufacturer: GIGABYTE

Introduction and Technical Specifications

Introduction

02-6641_big.jpg

Courtesy of GIGABYTE

The Z77N-WiFi is GIGABYTE's latest addition to their Mini-ITX lineup. Although the board is not as packed with features as some of the other enthusiast-minded mini-ITX boards, GIGABYTE did some interesting things with the layout to space components out more evenly across the board. The Z77N-WiFi even comes standard with dual Realtek GbE NICs and an Intel 802.11n-based WiFi mPCIe card. We put the board through our normal gamut of tests to see how well this mighty mite sized up against its full-sized brethren. The Z77N-WiFi also comes with a very reasonable retail price of $129.99.

Continue reading our review of the GIGABYTE Z77N-WiFi motherboard!

Subject: Storage

Introduction, Specifications and Packaging

Introduction

A while back, we saw OCZ undergo a major restructuring. 150+ product SKUs were removed from their lineup, leaving a solid core group of products for the company to focus on. The Vertex and Agility lines were spared, and the Vector was introduced and well received by the community. With all of that product trimming, we were bound to see another release at some point:

ext-front.JPG

Today we see a branch from one of those tree limbs in the form of the Vertex 3.20. This is basically a Vertex 3, but with the 25nm IMFT Sync flash replaced by newer 20nm IMFT Sync flash. The drop to 20nm comes with a slight penalty in write endurance (a 3000 cycle rating, down from 5000 for 25nm) in exchange for cheaper production costs (more dies per 300mm wafer).
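
Some quick back-of-envelope math on what those cycle ratings mean for the 240GB model, assuming an idealized write amplification of 1.0 (real-world amplification is higher, which shrinks these totals):

    #include <stdio.h>

    int main(void) {
        double capacity_gb = 240.0;
        double write_amp   = 1.0;     /* idealized assumption */

        /* Total host writes = capacity * P/E cycles / write amplification */
        printf("20nm @ 3000 cycles: ~%.0f TB\n",
               capacity_gb * 3000.0 / write_amp / 1000.0);    /* ~720  */
        printf("25nm @ 5000 cycles: ~%.0f TB\n",
               capacity_gb * 5000.0 / write_amp / 1000.0);    /* ~1200 */
        return 0;
    }

Even the lower figure dwarfs what a typical consumer writes over the life of a drive, which is why trading some endurance for a lower cost per GB is an easy call.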

imft 20 nm.jpg

IMFT has been cooking up 20nm flash for a while now, and it is becoming mature enough to enter the mainstream. The first entrant was Intel's own 335 Series, which debuted late last year. 20nm flash has no real groundbreaking improvements other than the reduced size, so the hope is that this shrink will translate to lower cost/GB to the end user. Let's see how the new Vertex shakes out.

Specifications:

  • Capacity: 120GB, 240GB
  • Sequential read: 550 MB/sec
  • Sequential write: 520 MB/sec
  • Random read (up to): 35,000 IOPS
  • Random write (up to): 65,000 IOPS

Packaging:

packaging.JPG

This simple plastic packaging does away with the 3.5" bracket previously included with all OCZ models.

Continue reading our review of the OCZ Vertex 3.20 240GB SSD!!