GK106 Completes the Circle
The releases of the various Kepler-based graphics cards have been interesting to watch from the outside. Though NVIDIA certainly spiced things up with the release of the GeForce GTX 680 2GB card back in March, and then with the dual-GPU GTX 690 4GB graphics card, for quite some time NVIDIA was content to leave the sub-$400 markets to AMD's Radeon HD 7000 cards and to its own GTX 500-series.
But gamers and enthusiasts are fickle beings - knowing that the GTX 660 was always JUST around the corner, many of you were simply not willing to buy into the GTX 560s floating around Newegg and other online retailers. AMD benefited greatly from this lack of competition and only recently has NVIDIA started to bring their latest generation of cards to the price points MOST gamers are truly interested in.
Today we are going to take a look at the brand new GeForce GTX 660, a graphics card with 2GB of frame buffer that will have a starting MSRP of $229. Coming in $80 under the GTX 660 Ti card released just last month, does the more vanilla GTX 660 have what it takes to replicate the success of the GTX 460?
The GK106 GPU and GeForce GTX 660 2GB
NVIDIA's GK104 GPU is used in the GeForce GTX 690, GTX 680, GTX 670 and even the GTX 660 Ti. We saw the much smaller GK107 GPU with the GT 640 card, a release I was not impressed with at all. With the GTX 660 Ti starting at $299 and the GT 640 at $120, there was a WIDE gap in NVIDIA's 600-series lineup that the GTX 660 addresses with an entirely new GPU, the GK106.
First, let's take a quick look at the reference card from NVIDIA for the GeForce GTX 660 2GB - it doesn't differ much from the reference cards for the GTX 660 Ti and even the GTX 670.
The GeForce GTX 660 uses the same half-length PCB that we saw for the first time with the GTX 670 and this will allow retail partners a lot of flexibility with their card designs.
Apple Produces the new A6 for the iPhone 5
Today is the day the world gets introduced to the iPhone 5. I of course was very curious about what Apple would bring to market the year after the death of Steve Jobs. The excitement leading up to the iPhone announcement was somewhat muted compared to years past, and a lot of that could be attributed to what has been happening in the Android market. Companies like Samsung and HTC have released new high end phones that are not only faster and more expansive than previous versions, but also work really well and are feature packed. While the iPhone 5 will be another success for Apple, those somewhat dispassionate about the cellphone market will likely just shrug and say to themselves, “It looks like Apple caught up for the year, but too bad they didn’t introduce anything really groundbreaking.”
If there was one area that many were anxiously awaiting, it was the SOC (system on a chip) that Apple would use for the iPhone 5. Speculation ranged from a fresh piece of silicon based on the A5X (faster clocks, smaller graphics portion) to a quad core monster running at high speeds but still sipping power. It seems that we actually got something in between. This is not a bad thing, but as we go forward we will likely see that the silicon again only matches what other manufacturers have been using since earlier this year.
Ah, IDF – the Intel Developer Forum. Almost every year–while I sit in slightly uncomfortable chairs and stare at outdated and color washed projector screens–information is passed on about Intel's future architectures, products and technologies. Last year we learned the final details about Ivy Bridge, and this year we are getting the first details about Haswell, which is the first architecture designed by Intel from the ground up for servers, desktops, laptops, tablets and phones.
While Sandy Bridge and Ivy Bridge were really derivatives of prior designs and thought processes, the Haswell design is something completely different for the company. Yes, the microarchitecture of Haswell is still very similar to Sandy Bridge (SNB), but the differences are more philosophical rather than technological.
Intel's target is a converged core: a single design that is flexible enough to be utilized in mobility devices like tablets while also scaling to the performance levels required for workstations and servers. They retain the majority of the architecture design from Sandy Bridge and Ivy Bridge including the core design as well as the key features that make Intel's parts unique: HyperThreading, Intel Turbo Boost, and the ring interconnect.
The three pillars that Intel wanted to address with Haswell were performance, modularity, and power innovations. Each of these has its own key goals, including improving the performance of existing legacy code and having the ability to extract greater parallelism with less coding work for developers.
Lenovo is one of several companies–including Acer and HP–that have embraced the ultrabook concept with both arms. Lenovo has not simply released a few high-end models similar to what ASUS has done. Rather, it has released a fleet of products including the U300/U310, the U410 and the U300S. And now there is a new high end product in the lineup called the ThinkPad X1 Carbon.
There are several traits that mark the ThinkPad X1 Carbon as unique. This is the first ThinkPad to bear the ultrabook title, for example. Further, the 430u (which was initially announced all the way back in January at CES 2012) is not yet available. The X1 Carbon is also one of only a few laptops to use carbon fiber in its frame. And this laptop is the only ultrabook on the market with a TrackPoint. As I’ve mentioned before, unique design traits are kind of a big deal, and a peek inside the X1 reminds us why.
There’s nothing in that spec sheet that stands out. Yes, the Core i7 low-voltage processor will prove faster than the Core i5s in most competitors, but you can usually option up to an i7 if that’s what you desire. The solid state drive is completely standard for an ultrabook above $1000, and four gigabytes of RAM can be had in virtually any laptop on store shelves today–even those selling for $500.
This means Lenovo needs to bring something special if it wants to justify a premium price. You can buy this laptop right now for about $1250, but grabbing the upgrades found in our review unit raises the price to about $1500. That’s in line with the HP Envy 14 Spectre, an editor’s choice winner, and the ASUS Zenbook Prime UX31A, which would have won an editor’s choice if ASUS had a handle on its quality control. Let’s see if the Carbon can deal with these top-tier competitors.
I say let the world go to hell
… but I should always have my tea. (Notes From Underground, 1864)
You can praise video games as art to justify their impact on your life – but do you really consider them art?
Best before the servers are taken down, because you're probably not playing it after.
Art allows the author to express their humanity and permits the user to consider that perspective. We become cultured when we experiment with and to some extent understand difficult human nature problems. Ideas are transmitted about topics which we cannot otherwise understand. We are affected positively as humans in society when these issues are raised in a safe medium.
Video games, unlike most other mediums, encourage the user to coat the creation with their own expressions. The player can influence the content through their dialogue and decision-tree choices. The player can accomplish challenges in their own unique way and talk about it over the water cooler. The player can also embed their own content as a direct form of expression. The medium will also mature as we further learn how to leverage interactivity to open a dialogue for these artistic topics in completely new ways and not necessarily in a single direction.
Consciously or otherwise – users will express themselves.
With all of the potential for art that the medium allows it is a shame that – time and time again – the industry and its users neuter its artistic capabilities in the name of greed, simplicity, or merely fear.
Introduction and Externals
Corsair manufactures a wide variety of components and peripherals for PC enthusiasts. They essentially target the most enthusiastic customers in whatever market they enter – breaking the ice with the coldest and harshest critics, who are never above nitpicking faults and flaws. Despite tossing their first generation products to the sharks, they perform uncharacteristically well for a new contender almost every time. They look before they leap.
The Corsair K60 and K90 were launched simultaneously and represent Corsair’s first attempt at producing a mechanical keyboard. Corsair has included media keys, a metal volume wheel, and a Windows-key lock on both keyboards if you find yourself yelling, “I HATE THIS KEY!” at your desktop because your game is now minimized and cannot receive your hatred.
Rubberized when down, not when up -- but stable either way.
I never said I wasn't one of the nitpickers.
Both keyboards are built around an aluminum chassis with a nonslip coating on each key. Each keycap has sharply defined edges compared to the rounder edges found on a Razer BlackWidow and other similar keyboards. Neither keyboard has rubberized tips on its ergonomic flaps, although slipping has not been an issue in my testing.
Introduction, Virtual V-Sync Testing
In my recent review of the Origin EON11-S portable gaming laptop I noted that the performance of the laptop was far behind that of a larger 15.6” or 17.3” model. The laptop won a gold award despite this, as all laptops of this size are bound by physics, but it was an issue worth noting.
Origin surprised me by responding that they had something in the works that might buff up performance. This confused me. Were they going to cast a spell on it? Would they beam in a beefier GPU? What could they possibly do that would increase performance without changing the hardware?
Now I have the answer. It’s called Lucid VirtuMVP and it uses your existing integrated GPU to improve performance. As with Lucid’s other products, VirtuMVP makes it possible for two different GPUs – in this case, your integrated GPU and your discrete GPU – to work together. It’s not magic – just ingenuity. Let’s take a closer look.
Introduction and Features
Manufacturers are continually looking for ways to differentiate their products from the rest of the field and highlight new or improved features to make you want to buy their products. Leave it to Corsair to actually come up with a truly new and unique power supply for the PC enthusiast market! The Corsair AX1200i is a smart PSU that incorporates a Digital Signal Processor (DSP) to deliver extremely clean, efficient power with the ability to make real-time adjustments to various internal parameters. The included Corsair Link software can be used to monitor and adjust performance, noise (fan speed), and Over Current Protection (OCP) settings.
It’s been almost two years since we reviewed Corsair’s flagship power supply, the Professional Series Gold AX1200. While the new AX1200i retains many of the original features, it now comes with 80 Plus Platinum efficiency certification and a built-in DACS (Data Acquisition and Control System) thanks to Corsair Link technology.
(Courtesy of Corsair)
Here is what Corsair has to say about their new AX1200i Digital ATX power supply unit: "The revolutionary AX1200i is the first desktop PC power supply to use digital (DSP) control and Corsair Link to bring you an unprecedented level of monitoring and performance customization. The DSP in the AX1200i makes on-the-fly adjustments for incredibly tight voltage regulation, 80 PLUS Platinum efficiency, and clean, stable power.
Real-time monitoring and control with Corsair Link. Put the AX1200i under your control by connecting it directly to a USB header on your motherboard with the included cable, or to a Corsair Commander (available separately). Then, download the free Corsair Link Dashboard software for unrivaled power supply monitoring and control options.
Monitor power input and output, efficiency, fan speed, and internal temperature, directly from the Windows based application. Or, take it to the next level and set up and modify fan speed profiles, or even select from virtual “single rail” or “multi-rail” software modes, with selectable OCP points."
Corsair AX1200i Digital ATX PSU Key Features:
• Digital Signal Processor (DSP) for extremely clean and efficient power
• Corsair Link Integration for monitoring and adjusting performance
• 1,200 watts continuous power output (50°C)
• Dedicated single +12V rail with user-configurable virtual rails
• 80Plus Platinum certified, delivering up to 92% efficiency
• ZVS / ZCS technology for high efficiency
• Independent DC-to-DC converters
• Ultra quiet 140mm double ball bearing fan
• Silent, Fanless mode up to ~30% load
• Self-test switch to verify power supply functionality
• Premium quality components
• Fully modular cable system
• Conforms to ATX12V v2.31 and EPS 2.92 standards
• Universal AC input (90-264V) with Active PFC
• Over-current, over-voltage, under-voltage and short circuit protection
• Dimensions: 150mm (W) x 86mm (H) x 200mm (L)
• 7-Year warranty and legendary Corsair customer service
Multiple Contenders - EVGA SC
One of the most anticipated graphics card releases of the year occurred this month in the form of the GeForce GTX 660 Ti from NVIDIA, and as you would expect we were there on day one with an in-depth review of the card at reference speeds.
The GeForce GTX 660 Ti is based on GK104, and what you might find interesting is that it is nearly identical to the specifications of the GTX 670. Both utilize 7 SMX units for a total of 1344 stream processors – or CUDA cores – and both run at a reference clock speed of 915 MHz base and 980 MHz Boost. Both include 112 texture units though the GeForce GTX 660 Ti does see a drop in ROP count from 32 to 24. Also, L2 cache drops from 512KB to 384KB along with a memory bus width drop from 256-bit to 192-bit.
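The memory bus reduction is the change with the most practical impact, since peak memory bandwidth scales linearly with bus width. A rough sketch of the arithmetic (the 6.0 Gbps GDDR5 data rate is an assumption for illustration; the article does not state the memory speed):

```python
# Peak memory bandwidth = (bus width in bytes) x per-pin data rate.
# The 6.0 Gbps GDDR5 rate below is assumed, not taken from the article.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s for a GDDR memory interface."""
    return bus_width_bits / 8 * data_rate_gbps

gtx_670    = peak_bandwidth_gbs(256, 6.0)  # 256-bit bus -> 192.0 GB/s
gtx_660_ti = peak_bandwidth_gbs(192, 6.0)  # 192-bit bus -> 144.0 GB/s
print(gtx_670, gtx_660_ti)
```

At the same memory clock, the 192-bit bus gives up a quarter of the GTX 670's peak bandwidth, which is why the two otherwise near-identical cards can diverge at high resolutions.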
We already spent quite a lot of time talking about the GTX 660 Ti compared to the other NVIDIA and AMD GPUs in the market in our review (linked above) as well as on our most recent episode of the PC Perspective Podcast. Today's story is all about the retail cards we received from various vendors including EVGA, Galaxy, MSI and Zotac. We are going to show you each card's design, the higher clocked settings that were implemented, performance differences between them and finally the overclocking comparisons of all four.
Ah, the end of August. School is about to start. American college football is about to get underway. Hot Chips is now in full swing. I guess the end of August caters to all sorts of people. For the people who are most interested in Hot Chips, the amount of information on next generation CPU architectures is something to really look forward to. AMD is taking this opportunity to give us a few tantalizing bits of information about their next generation Steamroller core which will be introduced with the codenamed “Kaveri” APU due out in 2013.
AMD is seemingly on the brink of releasing the latest architectural update with Vishera. This is a Piledriver+ based CPU that will find its way into AM3+ sockets. On the server side it is expected that the Abu Dhabi processors will also be released in a late September timeframe. Trinity was the first example of a Piledriver based product, and it showed markedly improved thermals as compared to previous Bulldozer based products, and featured a nice little bump in IPC in both single and multi-threaded applications. Vishera and Abu Dhabi look to be Piledriver+, which essentially means that there are a few more tweaks in the design that *should* allow it to go faster per clock than Trinity. There have been a few performance leaks so far, but nothing that has been concrete (or has shown final production-ready silicon).
Until that time when Vishera and its ilk are released, AMD is teasing us with some Steamroller information. This presentation is being featured at Hot Chips today (August 28). It is a very general overview of improvements, with very few details about how AMD is achieving increased performance with this next gen architecture. So with that, I will dive into what information we have.
I often think of ASUS as the PC’s answer to Apple. Their products are not up to Apple’s rigorous engineering, nor is the customer service as accessible, but ASUS does offer a number of products that were obviously designed to meet a set of high standards. I’ve always enjoyed the company’s G-Series gaming laptops, ultraportables like the U33 Bamboo and high-end multimedia laptops like the N53 and N56.
The original Zenbook didn’t impress me, however. PC Perspective never reviewed it, but I did have some hands-on time with one courtesy of Intel’s CES 2012 ultrabook giveaway. The build quality wasn’t great, the touchpad was quite poor and the overall look and feel proved a bit tacky (the cursive lettering below the display panel being the most obvious example).
ASUS has now followed up the original Zenbook with the new Zenbook Prime. There are a couple of different variants. We received the 13-inch UX31A, which comes equipped with the 1080p IPS display panel. As for the rest? Well, see below.
This is one well equipped ultrabook, which explains why it comes with a nearly $1500 price tag. You don’t have to spend that much, however. The basic Zenbook Prime, which still has the IPS display but downgrades to a Core i5, is $999 on Amazon.
Does the flagship of ASUS design deliver the goods? Let’s find out.
Introduction and Features
Earlier this year, SilverStone released two brand new enclosures designed for HTPC enthusiasts, the Grandia Series GD07 and GD08. The two cases are nearly identical except for the front panel and some differences in the internal drive cage layout. The GD07 sports a stealthy clean look with a lockable front door for security that conceals all the drives and I/O functions while the GD08 comes with a traditional exposed front and classic black anodized aluminum front panel. Both enclosures have been completely re-engineered and include the latest features like oversized motherboard and expansion card support, USB 3.0 ports, and excellent case cooling while being less expensive than some of SilverStone’s premium HTPC enclosures (the GD07 is currently selling for $139.99 and the GD08 is selling for $149.99 USD). We are taking a detailed look at the SilverStone GD07 HTPC case in this review.
SilverStone Technology was one of the original manufacturers to enter the HTPC case market and they continue to offer one of the largest selections of HTPC enclosures available today. SilverStone currently offers 18 different HTPC enclosures spanning three different series including the Crown, Grandia and Lascala lines. In addition to designing premium HTPC enclosures, SilverStone has a long-standing reputation among PC enthusiasts for providing a full line of high quality computer chassis, power supplies, cooling components, and accessories.
SilverStone Grandia Series GD07 HTPC Chassis Key Features:
• Full-size HTPC enclosure
• Positive air pressure design for excellent cooling with minimal noise
• Quick-access air filters help prevent dust buildup
• Lockable front door and power button ensure system security
• Supports Micro-ATX, ATX and Extended-ATX motherboards
• Supports extended length graphics cards up to 13.6” long
• Four external 5.25" optical drive bays behind front door
• Five internal 3.5" and two internal 2.5” drive bays
• Excellent case ventilation with multiple dedicated vents
• Three 120mm cooling fans included
• USB 3.0 ports
Lenovo has become an important player in the mainstream laptop market. Five years ago the offerings from Lenovo were not great, but today the IdeaPad line has matured. This has been reflected in Lenovo’s growth. The company has posted gains in global market share over the last few years.
In this review we’re looking at the Z580, a laptop that’s smack dab in the middle of the company’s IdeaPad brand. It’s a 15.6” laptop that starts at $469 but can be optioned to around $900. Our review unit is a well configured version which includes an Intel Core i5-3210M processor. Lenovo’s website prices it out at a cool $599.
What else will six Benjamin Franklins buy you? Let’s take a look.
The $600 price point is important. Studies of the laptop market have consistently shown that the average price of a new laptop hovers around $600 (much to the dismay of manufacturers, who’d rather people spent more).
This market is extremely competitive as a result. If you want a portable laptop with an IPS display you don’t have many options, but consumers who want a powerful and competent laptop for $600 have a buffet to choose from. Can the Z580 make room for itself in this crowd?
Most IT workers or computer enthusiasts tend to ‘accumulate’ computer and electronics gear over time. Over the years it is easy to end up with piles of old and outdated computer parts, components and electronics–whether it’s an old Pentium machine that your work was throwing out, RAM chips you no longer needed after your last upgrade, or an old CRT monitor that your cousin wasn’t sure what to do with. Tossing the accumulated hardware out with the next trash pickup doesn’t even enter the equation, because there’s that slight possibility you might need it someday.
I myself have one (or two, and maybe half an attic…) closet full of old stuff ranging from my old Commodore 64/1541 Floppy disk drive with Zork 5.25” floppies, to a set of four 30-pin 1 MB/70ns SIMM chips that cost $100 each as upgrades to my first 486 DX2/50 MHz Compudyne PC back in 1989. (Yes, you read that right, $100 for 1 MB of memory.) No matter if you have it all crammed into one closet or spread all over your house, you likely have a collection of gear dating back to the days of punch cards, single button joysticks, and InvisiClues guides.
Occasionally I’ll look into my own closet and lament all the ‘wasted’ technology that resides there. I’m convinced much of the hardware still has some sparks of life left. As a result, I am always looking for a reason to revive some of it from the dead. Since it has already been bought and paid for, it feels almost blasphemous to the technology gods not to do something with the hardware. In some cases, it might not be worth the effort (Windows Vista on an old Micron Transport Trek2 PII-300 laptop doesn’t end well for anyone). In other cases, you can build something fun or useful using parts that you have sitting around, waiting for a new lease on life.
Introduction, Design And Features
The gaming keyboard market seems to rigorously follow a common rule of consumer products - more is more. If a keyboard is for gamers it should include lots of fancy gaming related features, and the more that are included, the more hardcore the keyboard. Macro buttons, customizable back-lighting and LCD screens are all features of modern gaming keyboards–and you don’t see many companies going the other direction.
But there are products that buck the trend. One of them is the CMStorm QuickFire Rapid, a mechanical gaming keyboard that became available in North America earlier this year. Unlike most competitors, the QuickFire Rapid cuts features instead of adding them. Back-lighting? Macro keys? You’re kidding me, right? This keyboard doesn’t even include a numpad.
Cooler Master (the company behind CMStorm) has not cut out the features that matter, however. This keyboard comes with Cherry MX switches (the Blue variant, in this case) and also supports PS/2 connections for full NKRO. For those who’ve seen the light of day recently, this gobbledygook means the QuickFire Rapid scans each key individually and can therefore detect new key presses even while other keys are still depressed. It’s a feature hardcore gamers love because of their tendency to press multiple keys simultaneously.
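The rollover difference is easy to illustrate with a toy model: think of a keyboard report as the set of keys currently held down. An NKRO keyboard can report them all, while a typical USB keyboard in boot protocol truncates the report to six keys. A simplified sketch (the key names and the six-key limit shown here are illustrative, not a real HID implementation):

```python
# Toy model of key rollover: the keyboard reports the set of keys that
# are currently down. NKRO (e.g. over PS/2) reports every key; a common
# 6-key-rollover USB keyboard truncates the report to six keys.
def report(keys_down, rollover=None):
    """Return the list of keys the host actually sees for one scan."""
    keys = sorted(keys_down)
    return keys if rollover is None else keys[:rollover]

held = {"w", "a", "shift", "space", "q", "e", "r"}  # seven keys at once
print(report(held))              # NKRO: all seven presses register
print(report(held, rollover=6))  # 6KRO: one press is silently dropped
```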
Cutting back on unneeded features has a notable side effect–it reduces price. Currently this keyboard is available for $79.99 at retail or as low as $65 on Amazon.com. Only Razer’s bare-bones version of the BlackWidow keyboard sells for less, and it beats the QuickFire by only $5.
So can you really buy a decent gaming keyboard for $65, and will you miss the numpad? Let’s find out.
Introduction and Features
Cooler Master is a well-known name among PC enthusiasts and they are currently celebrating their 20th anniversary! In addition to a full line of PC enclosures, Cooler Master offers power supplies, coolers, and numerous accessories for mobile computing, notebooks, servers, and gaming. In this review we will be taking a detailed look at the new Cooler Master HAF XM enclosure, which is a very interesting mid-tower version of their popular HAF X series full-tower enclosure.
Here is what Cooler Master has to say about the new HAF XM enclosure: “The pure essence of the HAF X packed into a mid-tower size. The HAF XM supports up to four huge 200mm fans (Top x 2, Front, and Side), a large 140mm rear exhaust fan, and ample space for even the most demanding system builds. Liquid cooling support, an easy-access 90 degree rotated latched side panel, front HDD and SSD X-Docks, USB 3.0, and support for a hidden 2.5” HDD/SSD behind the motherboard tray make the HAF XM the ultimate setup for those that want enthusiast-grade features in a mid-tower package.”
Another GK104 Option for $299
If you missed our live stream with PC Perspective's Ryan Shrout and NVIDIA's Tom Petersen discussing the new GeForce GTX 660 Ti you can find the replay at this link!!
While NVIDIA doesn't like us to use the codenames anymore, very few GPUs are as flexible and as stout as the GK104. Originally released with the GTX 680, then with the dual-GPU beast known as the GTX 690, and THEN with the more modestly priced GTX 670, this single chip has caused AMD quite a few headaches. It appears things will only get worse with the release of the new GeForce GTX 660 Ti today, once again powered by GK104 and the Kepler architecture at the $299 price point.
While many PC gamers lament the lack of games that really push hardware today, NVIDIA has been promoting the GTX 660 Ti as the upgrade option of choice for gamers on a 2-4 year cycle. Back in 2008 the GTX 260 was the mid-range enthusiast option, while in 2010 it was the GTX 470 based on Fermi. NVIDIA claims GTX 260 users will see more than 3x the performance with the 660 Ti, all while generating those pixels more efficiently.
I mentioned that the GeForce GTX 660 Ti is based on GK104, and what you might find interesting is that it is nearly identical to the specifications of the GTX 670. Both utilize 7 SMXs for a total of 1344 stream processors or CUDA cores, and both run at a reference clock speed of 915 MHz base and 980 MHz Boost. Both include 112 texture units, though the GeForce GTX 660 Ti does see a drop in ROP count from 32 to 24, and L2 cache drops from 512KB to 384KB. Why?
Spicing up the GTX 670
The Power Edition graphics card series from MSI is a relatively new addition to its lineup. The Power Edition often mimics the higher-end Lightning series, but at a far lower price (and perhaps with a smaller feature set). This allows MSI to split the difference between the reference class boards and the high end Lightning GPUs.
Doing this gives users a greater variety of products to choose from, letting them better tailor their purchases to their needs and financial means. Not everyone wants to pay $600 for a GTX 680 Lightning, but what if someone could get similar cooling, quality, and overclocking potential for a much lower price? This is what MSI has done with one of its latest Power Edition cards.
The GTX 670 Power Edition
The NVIDIA GTX 670 cards have received accolades throughout the review press. The card is a great combination of performance, power consumption, heat production, and price. It certainly caused AMD a great amount of alarm, and the company hurriedly cut prices on its HD 7900 series of cards in response. The GTX 670 is a slightly cut-down version of the full GTX 680, and it runs very close to the clock speed of its bigger brother. In fact, other than texture and stream unit counts, the cards are nearly identical.
Overclocked and 4GB Strong
Even though the Kepler GK104 GPU is now matured in the market, there is still a ton of life left in this not-so-small chip and Galaxy sent us a new graphics card to demonstrate just that. The Galaxy GeForce GTX 670 GC 4GB card that we are reviewing today takes the GTX 670 GPU (originally released and reviewed on May 10th) and juices it up on two different fronts: clock rates and memory capacity.
The Galaxy GTX 670 GC 4GB graphics card is based on GK104 as mentioned above and meets most of the same specifications as the reference GTX 670. That includes 1344 CUDA cores or stream processors, 112 texture units and 32 ROP units along with a 256-bit GDDR5 memory bus.
The GC title indicates that the Galaxy GTX 670 GC 4GB is overclocked as well - this card runs at 1006 MHz base clock, 1085 MHz Boost clock and 1500 MHz memory clock. Compared to the defaults of 915 MHz, 980 MHz and 1500 MHz (respectively) this Galaxy model gets a 10% increase in clock speed though we'll see how much that translates into gaming performance as we go through our review.
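Those percentages are easy to verify. A quick sketch of the arithmetic behind the factory overclock:

```python
# Express a factory overclock as a percentage uplift over reference clocks.
def uplift_pct(oc_mhz, ref_mhz):
    """Percentage increase of an overclocked frequency over the reference."""
    return (oc_mhz / ref_mhz - 1) * 100

print(round(uplift_pct(1006, 915), 1))   # base clock:  ~9.9%
print(round(uplift_pct(1085, 980), 1))   # Boost clock: ~10.7%
print(round(uplift_pct(1500, 1500), 1))  # memory: untouched, 0.0%
```

So "10%" is accurate for the core clocks, while the memory is left at reference speed, which matters when weighing the bandwidth-hungry resolutions this card targets.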
Of course, as the review title also indicates, the Galaxy GTX 670 GC includes 4GB of frame buffer, twice as much as the reference cards. The goal is obviously to attract gamers with high resolution screens (2560x1600 or 2560x1440) as well as users interested in triple panel NVIDIA Surround gaming. We test both of those resolutions in our game collection on the following pages to see just how that works out.
7950 gets a quick refresh
Back in June, AMD released (or at least announced) an update to the Radeon HD 7970 3GB card called the GHz Edition. Besides the higher clock speeds, the card was the first AMD offering to include PowerTune with Boost–a dynamic clock scaling capability that allowed the GPU to increase clock speeds when power and temperature allowed.
While similar in ideology to the GPU Boost that NVIDIA invented with the GTX 680 Kepler launch, AMD's Boost is completely predictable and repeatable. Everyone's HD 7970 GHz Edition performs exactly the same regardless of your system or environment.
Here is some commentary that I had on the technology back in June that remains unchanged:
AMD's PowerTune with Boost technology differs from NVIDIA's GPU Boost in a couple of important ways. First, true to its original premise, AMD can guarantee exactly how all Radeon HD 7970 3GB GHz Edition graphics cards will operate, and at what speeds, in any given environment. There should be no variability between the card that I get and the card that you can buy online. Using digital temperature estimation in conjunction with voltage control, the PowerTune implementation of boost is completely deterministic.
As the above diagram illustrates, the "new" part of PowerTune with the GHz Edition is the ability to vary the voltage of the GPU in real-time to address a wider range of qualified clock speeds. On the previous HD 7970s the voltage was a locked static voltage in its performance mode, meaning that it would not increase or decrease during load operations. As AMD stated to us in a conversation just prior to launch, "by having multiple voltages that can be invoked, we can be at a more optimal clock/voltage combination more of the time, and deliver higher average performance."
The problem I have with AMD's boost technology is that the company is obviously implementing it as a reaction to NVIDIA's technology. That isn't necessarily a bad thing, but the tech feels a little premature because of it. We were provided no tools prior to launch to actually monitor the exact clock speed of the GPU in real-time. The ability to monitor these very small changes in clock speed is paramount to our ability to verify the company's claims, and without it we will have questions about the validity of results. GPU-Z and other applications we usually use to monitor clock speeds (including AMD's driver) only report 1050 MHz as the clock speed–no real-time dynamic changes are being reported.
(As a side note, AMD has promised to showcase their internal tool to show real-time clock speed changes in our Live Review at http://pcper.com/live on Friday the 22nd, 11am PDT / 2pm EDT.) [It has since been archived for your viewing pleasure.]
A couple of points to make here: AMD still has not released that tool to show us internal steps of clock speeds, and instead told me today that they were waiting for an updated API to allow other software (including their own CCC) to be able to report the precise results.
Today AMD is letting us preview the new HD 7950 3GB card that will be shipping soon with updated clock speeds and Boost support. The new base clock speed of the HD 7950 will be 850 MHz, compared to the 800 MHz of the original reference HD 7950. The GPU will be able to boost as high as 925 MHz. That should give the new 7950s a solid performance gain over the original with a clock speed increase of as much as 15%.
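For reference, the arithmetic behind that "as much as 15%" figure, measured against the original reference card's 800 MHz clock:

```python
# Clock speed gains of the refreshed HD 7950 over the original 800 MHz part.
def uplift_pct(new_mhz, old_mhz):
    """Percentage increase of a new clock speed over the old one."""
    return (new_mhz / old_mhz - 1) * 100

base_gain  = uplift_pct(850, 800)  # new 850 MHz base:  6.25%
boost_gain = uplift_pct(925, 800)  # 925 MHz Boost:    15.625%
print(base_gain, boost_gain)
```

The base clock alone is only a 6.25% bump; the ~15% headline number assumes the GPU is spending its time at the full 925 MHz Boost state.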