The Intel HD Graphics are joined by Iris
Intel gets a bad rap on the graphics front. Much of it is warranted, but a lot of it comes down to poor marketing of the technologies and features the company implements and improves on. When AMD or NVIDIA updates a driver, fixes a bug, or brings a new gaming feature to the table, they make sure that every PC hardware website knows about it, and thus that as many PC gamers as possible know about it. The same cannot be said about Intel, though - the company is much more understated when it comes to tooting its own horn. Maybe that's because it is afraid of being called out on some aspects, or because it has a little bit of performance envy toward the discrete options on the market.
Today might be the start of something new from the company, though - a bigger focus on the graphics technology in Intel processors. More than a month before the official public unveiling of the Haswell processors, Intel is opening up about SOME of the changes coming to Haswell-based graphics products.
We first learned about the changes to Intel's Haswell graphics architecture back in September of 2012 at the Intel Developer Forum. It was revealed then that the GT3 design would essentially double theoretical throughput over the existing GT2 design found in Ivy Bridge. GT2 will continue to exist (though slightly updated) on Haswell, and only some versions of Haswell will actually get the higher-performing GT3 option.
In 2009 Intel announced a drive to increase graphics performance at an exceptional rate from generation to generation. Not long after, it released the Sandy Bridge CPU with the most significant performance increase in processor graphics ever. Ivy Bridge followed with a nice increase in graphics capability, though not nearly as dramatic as the SNB jump. Now, according to this graphic, the graphics capability of Haswell will be as much as 75x better than the chipset-based graphics of 2006. The real question is which variants of Haswell will reach that performance level...
I should note right away that even though we are showing you general performance data on graphics, we still don't have all the details on what SKUs will have what features on the mobile and desktop lineups. Intel appears to be trying to give us as much information as possible without really giving us any information.
Zotac has announced a new GTX TITAN graphics card that will fall under the company’s AMP! Edition branding. This new Titan graphics card will feature factory overclocks on both the GPU and GDDR5 memory. However, due to NVIDIA’s restrictions, the Zotac GeForce GTX TITAN AMP! Edition does not feature a custom cooler or PCB.
The Zotac TITAN AMP! Edition card features a single GK110 GPU with 2,688 CUDA cores clocked at 902MHz base and 954MHz boost. That is a healthy boost over the reference TITAN’s 836MHz base and 876MHz boost clock speeds. Further, while Zotac’s take on the TITAN continues the reference specification’s 6GB of GDDR5 memory, it is impressively overclocked to 6,608 MHz (especially since Zotac has not changed the cooler). Many reference cards may be able to replicate the GPU clocks on their own, though. For example, Ryan managed to get his card up to 992MHz boost in his review of the NVIDIA GTX TITAN.
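For a sense of what that memory overclock buys, peak bandwidth can be estimated from the effective clock and bus width. A rough sketch - the formula is the standard GDDR5 estimate, and the 6,008 MHz figure is the published reference TITAN memory clock:

```python
# Rough peak memory bandwidth math for the Zotac TITAN AMP! Edition.
# Peak GB/s = effective data rate (MHz) * bus width in bytes.

def gddr5_bandwidth_gbs(effective_mhz: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s for a GDDR5 interface."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

zotac = gddr5_bandwidth_gbs(6608, 384)      # factory memory overclock
reference = gddr5_bandwidth_gbs(6008, 384)  # stock TITAN memory clock

print(f"Zotac:     {zotac:.1f} GB/s")      # ~317.2 GB/s
print(f"Reference: {reference:.1f} GB/s")  # ~288.4 GB/s
```

So the 600 MHz effective memory bump is worth roughly 29 GB/s of theoretical bandwidth, about a 10% gain.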
The card has two DL-DVI, one HDMI, and one DisplayPort video outputs. The cooler, PCB, and PCI-E power specifications are the same as the reference design. You can find more details on the heatsink in the TITAN review. Not allowing vendors to use custom coolers is disappointing and possibly limits the factory GPU overclocks they are able/willing to offer and support, but within that restriction the Zotac AMP! Edition looks to be a decent card, so long as the (not yet announced) price premium over the $999 NVIDIA reference card is minimal.
Our 4K Testing Methods
You may have recently seen a story and video on PC Perspective about a new TV that made its way into the office. Of particular interest is the fact that the SEIKI SE50UY04 50-in TV is a 4K television; it has a native resolution of 3840x2160. For those unfamiliar with the upcoming TV and display standards, 3840x2160 has exactly four times the pixels of current 1080p TVs and displays. Oh, and this TV only cost us $1300.
In that short preview we validated that both NVIDIA and AMD current generation graphics cards support output to this TV at 3840x2160 using an HDMI cable. You might be surprised to find that HDMI 1.4 can support 4K resolutions, but it can do so only at 30 Hz - half the refresh rate of most 60 Hz TVs and monitors (native 60 Hz 4K TVs most likely won't be available until 2014). That doesn't mean we are limited to 30 FPS of performance, though - far from it. As you'll see in our testing on the coming pages, we were able to push out much higher frame rates using some very high-end graphics solutions.
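The resolution and refresh limits above are easy to verify with a bit of arithmetic. The 297 MHz and 594 MHz timing figures below are the standard CTA video timings for 4K, quoted here as context rather than from the article:

```python
# Quick sanity check on the resolution and refresh claims above.
uhd = 3840 * 2160  # 8,294,400 pixels
fhd = 1920 * 1080  # 2,073,600 pixels
print(uhd / fhd)   # 4.0 -- exactly four times the pixels of 1080p

# HDMI 1.4 tops out at a 340 MHz TMDS clock. The standard timing for
# 3840x2160 @ 30 Hz needs a 297 MHz pixel clock, which fits; doubling the
# refresh to 60 Hz would need ~594 MHz, which is why HDMI 1.4 is stuck
# at 30 Hz for 4K.
```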
I should point out that I am not a TV reviewer and don't claim to be one, so I'll leave the technical merits of the display itself to others. Instead I will only report on my experiences using it with Windows and playing games - it's pretty freaking awesome. The only downside I have found in my time with the TV as a gaming monitor thus far is the combination of the 30 Hz refresh rate and disabled Vsync. Because you are seeing half as many screen refreshes over the same amount of time as you would with a 60 Hz panel, all else being equal, twice as many "frames" of the game are being pushed to the monitor each refresh cycle. This means the horizontal tearing associated with disabling Vsync will likely be more apparent than it would be otherwise.
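As a simplified model of that effect: with Vsync off, every frame the GPU delivers mid-scan-out creates a potential tear line, so the average number of tears per refresh scales with the ratio of render rate to refresh rate.

```python
# Why tearing is more visible at 30 Hz: with Vsync disabled, each frame
# boundary that lands during a single scan-out is a potential tear line.
# This is a simplified model, ignoring vertical blanking intervals.

def tear_lines_per_refresh(render_fps: float, refresh_hz: float) -> float:
    """Average frame boundaries (potential tears) per screen refresh."""
    return render_fps / refresh_hz

print(tear_lines_per_refresh(120, 60))  # 2.0 on a 60 Hz panel
print(tear_lines_per_refresh(120, 30))  # 4.0 -- twice as many at 30 Hz
```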
I would likely recommend enabling Vsync for a tear-free experience on this TV once you are happy with performance levels, but obviously for our testing we wanted to keep it off to gauge performance of these graphics cards.
Subject: Graphics Cards | April 24, 2013 - 10:14 PM | Tim Verry
Tagged: xfx, malta, hd 7990, GCN, dual gpu, amd
Now that AMD’s dual-GPU Malta graphics card is official, cards from Add-In Board (AIB) partners are starting to roll in. One such recently announced card is the XFX Radeon HD 7990. The XFX card is based on the reference AMD design, which includes two Radeon HD 7970 GPUs in a Crossfire configuration.
The two GPUs can boost up to 1GHz clock speeds and feature a total of 4096 stream processors, 256 texture units, 64 ROPs, and 8.6 billion transistors. The card also includes 3GB of GDDR5 memory per GPU, each running on a 384-bit bus. It supports AMD’s Eyefinity technology and offers one DL-DVI and four mini-DisplayPort video outputs.
The XFX HD 7990 uses the reference AMD heatsink as well, which includes a massive aluminum fin stack with five copper heatpipes that run the length of the heatsink and directly touch the two 7970 GPUs. Three shrouded fans, in turn, keep the heatsink cool.
The dual-GPU monster is eligible for AMD’s Never Settle bundle which includes eight free games. With purchase of the HD 7990 (from any eligible AIB), you get free key codes for the following games:
- Bioshock Infinite
- Crysis 3
- Deus Ex: Human Revolution
- Far Cry 3
- Far Cry 3: Blood Dragon
- Hitman: Absolution
- Sleeping Dogs
- Tomb Raider
The XFX press release further assures gamers that the card can, in fact, play Crysis 3 at maximum settings at a resolution of 3840 x 2160. The company did not mention pricing, however.
For those interested in AMD’s new Malta GPU, check out our review as well as how the card performs when paired with a prototype AMD driver that seeks to address some of the frame rating issues exhibited by AMD's Crossfire multi-GPU solution.
Subject: Graphics Cards | April 24, 2013 - 07:09 PM | Tim Verry
Tagged: amd, powercolor, hd 7990, malta, dual gpu, crossfire
PowerColor (a TUL corporation brand) launched its dual-GPU Radeon HD 7990 V2 graphics card, and this time the card is based on the (recently reviewed) official AMD dual-GPU “Malta” design announced at the Game Developers Conference (GDC). The new HD 7990 V2 graphics card features two HD 7970-class GPUs in a Crossfire configuration. That means that the Malta-based card features a total of 4096 stream processors and a rated 8.2 TFLOPS of peak performance.
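The 8.2 TFLOPS rating falls directly out of the stream processor count and clock. A quick sanity check, assuming the usual two FLOPs (one fused multiply-add) per GCN stream processor per clock:

```python
# Verifying AMD's quoted ~8.2 TFLOPS figure for the dual-GPU HD 7990:
# each GCN stream processor executes one fused multiply-add (2 FLOPs)
# per clock, so peak single-precision throughput is SPs * 2 * clock.

def peak_tflops(stream_processors: int, clock_ghz: float) -> float:
    """Peak single-precision TFLOPS for a GCN-style GPU."""
    return stream_processors * 2 * clock_ghz / 1000

print(peak_tflops(4096, 1.0))   # 8.192 -- matches the quoted ~8.2 TFLOPS
print(peak_tflops(4096, 0.95))  # ~7.78 at the 950 MHz base clock
```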
The PowerColor HD 7990 V2 joins the company’s existing Devil 13 and HD 7990 graphics cards. The new card sports a triple-fan shrouded heatsink that is somewhat tamer-looking than the custom Devil 13's. Other hardware includes 3GB of GDDR5 RAM per GPU clocked at 1500MHz and running on a 384-bit bus (again, per GPU) for a total of 6GB. Both GPUs have clock speeds of 950MHz base and up to 1GHz boost.
The new card has a single DL-DVI and four mini-DisplayPort video outputs. PowerColor is touting the card’s Eyefinity prowess as well as its ZeroCore support for reducing power usage when idle. The board has a TDP of 375W and is powered by two PCI-E power connections. In all, the HD 7990 V2 graphics card measures 305 x 110 x 38mm. While PowerColor has not released pricing or availability, expect the card to be available soon and around the same price as (or a bit lower than) its existing (custom) HD 7990.
The full press release can be found here.
The card we have been expecting
Despite all the issues that were brought up by our new graphics performance testing methodology, which we are calling Frame Rating, there is little debate in the industry that AMD is making noise once again in the graphics field. The signs range from the elaborate marketing and game bundles attached to Radeon HD 7000 series cards over the last year to the hiring of Roy Taylor, VP of sales and the company's most vocal supporter.
Along with the marketing comes plenty of technology and important design wins. With the dominance of the APU on the console side (Wii U, PlayStation 4, and the next Xbox), AMD is making sure that developers' familiarity with its GPU architecture there pays dividends on the PC side as well. Developers will be focusing on AMD's graphics hardware for the 5-10 years of this console generation, and that could result in improved performance and feature support for Radeon graphics for PC gamers.
Today's release of the Radeon HD 7990 6GB Malta dual-GPU graphics card shows a renewed focus on high-end graphics markets since the release of the Radeon HD 7970 in January of 2012. And while you may have seen something for sale previously with the HD 7990 name attached, those were custom designs built by partners, not by AMD.
Both ASUS and PowerColor currently have high-end dual-Tahiti cards for sale. The PowerColor HD 7990 Devil 13 used the HD 7990 brand directly, but ASUS' ARES II kept away from the name and focused on the company's own high-end card brands instead.
The "real" Radeon HD 7990 card was first teased at GDC in March and takes a much less dramatic approach to its design without being less impressive technically. The card includes a pair of Tahiti, HD 7970-class GPUs on a single PCB with 6GB of total memory. The raw specifications are listed here:
Considering there are two HD 7970 GPUs on the HD 7990, the doubling of the major specs shouldn't be surprising, though it is a little deceiving. Yes, there are 8.6 billion transistors, but 4.3 billion sit on each GPU. Yes, there are 4096 stream processors, but only 2048 per GPU, which means software multi-GPU scaling is required to turn them into added performance. The same goes for texture fill rate, compute performance, memory bandwidth, etc. - though the same could be said of any dual-GPU graphics card.
A very early look at the future of Catalyst
Today is a very interesting day for AMD. It marks both the release of the reference design of the Radeon HD 7990 graphics card, a dual-GPU Tahiti behemoth, and the first sample of a change to the CrossFire technology that will improve animation performance across the board. Both stories are incredibly interesting and as it turns out both feed off of each other in a very important way: the HD 7990 depends on CrossFire and CrossFire depends on this driver.
If you already read our review (or any review using the FCAT / frame capture system) of the Radeon HD 7990, you likely came away somewhat unimpressed. The combination of two AMD Tahiti GPUs on a single PCB with 6GB of frame buffer SHOULD have been an incredibly exciting release for us, and the card would likely have become the single fastest graphics card on the planet. That didn't happen, though, and our results clearly show why: AMD CrossFire technology has some serious issues with animation smoothness, runt frames, and giving users what they are promised.
Our first results using our Frame Rating performance analysis method were shown during the release of the NVIDIA GeForce GTX Titan card in February. Since then we have been in constant talks with the folks at AMD to figure out what was wrong, how they could fix it, and what implementing frame metering technology would mean for gamers. We followed that story up with several more that showed the current state of performance on the GPU market using Frame Rating, and they painted CrossFire in a very negative light. Even though some outlets accused us of being biased, or claimed that AMD wasn't doing anything incorrectly, we stuck by our results - and as it turns out, so does AMD.
Today's preview of a very early prototype driver shows that the company is serious about fixing the problems we discovered.
If you are just catching up on the story, you really need some background information. The best place to start is our article published in late March that goes into detail about how game engines work, how our completely new testing methods work and the problems with AMD CrossFire technology very specifically. From that piece:
It will become painfully apparent as we dive through the benchmark results on the following pages, but I feel that addressing the issues that CrossFire and Eyefinity are creating up front will make the results easier to understand. As we showed you for the first time in Frame Rating Part 3, AMD CrossFire configurations have a tendency to produce a lot of runt frames, in many cases in a nearly perfect alternating pattern. Not only does this mean that frame time variance will be high, but it also tells me that the performance gained by adding a second GPU is completely useless in this case. Obviously the story then becomes, “In Battlefield 3, does it even make sense to use a CrossFire configuration?” My answer, based on the graph below, would be no.
An example of a runt frame in a CrossFire configuration
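For illustration, spotting runts in FCAT-style capture data comes down to counting how many scanlines of the final output each frame actually occupies. A minimal sketch - the 21-scanline cutoff and the sample numbers here are illustrative assumptions, not official constants:

```python
# Illustrative runt-frame check in the spirit of frame-capture analysis:
# each rendered frame occupies some number of scanlines on the captured
# output; a frame that gets only a sliver of the screen contributes
# essentially nothing visible. The threshold below is an assumption.

RUNT_THRESHOLD_SCANLINES = 21

def count_runts(scanlines_per_frame: list) -> int:
    """Count frames that occupy too few scanlines to matter visually."""
    return sum(1 for lines in scanlines_per_frame
               if lines < RUNT_THRESHOLD_SCANLINES)

# An alternating full/runt pattern like the CrossFire captures described:
capture = [540, 8, 532, 12, 544, 6, 538, 10]
print(count_runts(capture))  # 4 -- half the "frames" are effectively useless
```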
NVIDIA's solution for getting around this potential problem with SLI was to integrate frame metering, a technology that balances frame presentation to the user and to the game engine in a way that enables smoother, more consistent frame times and thus smoother animation on the screen. For GeForce cards, frame metering began as a software solution but was later integrated as a hardware function in the Fermi design, taking some load off of the driver.
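A toy sketch of what metering might look like in software: rather than flipping each frame the moment it is ready, delay presentation toward the running average frame interval. The function name and pacing rule are illustrative, not NVIDIA's actual algorithm:

```python
# Toy frame metering sketch: space out presents toward the average frame
# interval so an uneven short/long cadence becomes smoother animation.
# This is an illustrative model, not NVIDIA's implementation.

def meter_presents(ready_times: list) -> list:
    """Given frame-ready timestamps (ms), return paced present times."""
    presented = [ready_times[0]]
    # Pace to the average interval across the captured window.
    avg = (ready_times[-1] - ready_times[0]) / (len(ready_times) - 1)
    for ready in ready_times[1:]:
        target = presented[-1] + avg
        # Never present before the frame is actually ready.
        presented.append(max(ready, target))
    return presented

# Frames arriving in an uneven short/long pattern (times in ms):
raw = [0.0, 2.0, 20.0, 22.0, 40.0, 42.0]
print([round(t, 1) for t in meter_presents(raw)])
# [0.0, 8.4, 20.0, 28.4, 40.0, 48.4] -- the 2 ms gaps are stretched out
```

Note the trade-off the paragraph above implies: the metered frames reach the screen slightly later than they could, trading a little latency for far more even animation.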
New GeForce Game-Ready Drivers Just in Time for 'Dead Island: Riptide,' 'Star Trek', 'Neverwinter'; Boost Performance up to 20%
Subject: Graphics Cards | April 23, 2013 - 03:53 PM | Jeremy Hellstrom
Tagged: nvidia, graphics drivers, geforce, 320.00 beta
GeForce 320.00 beta drivers are now available for automatic download and installation using GeForce Experience, the easiest way to keep your drivers up to date.
With a single click in GeForce Experience, gamers can also optimize the image quality of top new games like Dead Island: Riptide and have it instantly tuned to take full advantage of their PC’s hardware.
Here are examples of the performance increases in GeForce 320.00 drivers (measured with GeForce GTX 660):
- Up to 20% in Dirt: Showdown
- Up to 18% in Tomb Raider
- Up to 8% in StarCraft II
- Up to 6% in other top games like Far Cry 3
For more details, refer to the release highlights on the driver download pages and read the GeForce driver article on GeForce.com.
Enjoy the new GeForce Game Ready drivers and let us know what you think.
Windows Vista/Windows 7 Fixed Issues
- The Windows 7 Magnifier window flickers.
- Games default to stereoscopic 3D mode after installing the driver.
- [GeForce 330M][Notebook]: The display goes blank when rebooting the notebook after installing the driver.
- [Crysis 3]: There are black artifacts in the game.
- [Dirt 3]: When ambient occlusion is enabled, there is rendering corruption in the game while in split-screen mode.
- [3DTV Play][Mass Effect]: The NVIDIA Control Panel “override antialiasing” setting does not work when stereoscopic 3D is enabled.
- [Microsoft Flight Simulator]: Level D Simulations add-on aircraft gauges are not drawn correctly.
- [GeForce 500 series][Stereoscopic 3D][Two Worlds 2]: The application crashes when switching to windowed mode with stereoscopic 3D enabled.
- [GeForce 660 Ti][All Points Bulletin (APB) Reloaded]: The game crashes occasionally, followed by a black/grey/red screen.
- [GeForce GTX 680][Red Orchestra 2: Heroes of Stalingrad]: Red-screen crash occurs after exiting the game.
- [GeForce 6 series][Final Fantasy XI]: TDR crash occurs in the game when using the Smite of Rage ability.
- [SLI][Surround][GeForce GTX Titan][Tomb Raider]: There is corruption in the game and the system hangs when played at high resolution and Ultra or Ultimate settings.
- [3D Surround][SLI][GeForce 500 series]: With Surround enabled, all displays may not be activated when selecting Activate All Displays from the NVIDIA Control Panel -> Set SLI Configuration page.
- [SLI][StarCraft II][3D Vision]: The game crashes when run with 3D Vision enabled.
- [SLI][GeForce GTX 680][Tomb Raider (2013)]: The game crashes and a TDR occurs while running the game at Ultra settings.
- [SLI][StarCraft II][3D Vision]: The game crashes when played with 3D Vision and SLI enabled.
- [SLI][Call of Duty: Black Ops 2]: The player emblems are not drawn correctly.
Subject: Graphics Cards | April 23, 2013 - 10:05 AM | Ryan Shrout
Tagged: amd, never settle, never settle reloaded, bundle
While browsing around on Twitter today I saw mention of a leaked slide on the Tech Report forums that seems to point in the direction of upcoming games to be included in future AMD Never Settle gaming bundles. AMD has been knocking it out of the park when it comes to software bundled with graphics card releases, having secured essentially every major PC game of the last 12 months.
This slide indicates that Grid 2, Company of Heroes 2, Total War: Rome II, Splinter Cell Blacklist, Lost Planet 3, Battlefield 4, Raven's Cry and Watch Dogs will all eventually make their way to the AMD bundle list at some point this year. Whether they will come in one mega-bundle or several different promotions throughout the year isn't known, but AMD is clearly serious about keeping up its presence on the PC gaming front.
Subject: General Tech, Graphics Cards | April 19, 2013 - 02:51 PM | Ryan Shrout
Tagged: raja koduri, apple, amd
Interesting information has surfaced today about the addition of a new executive at AMD. Raja Koduri, who previously worked for ATI and AMD as Chief Technology Officer, departed the company in 2009 for a four-year stint at Apple, helping to turn that company into an SoC powerhouse. Developing its own processors has enabled Apple to stand apart from the competition in many mobile spaces, and Koduri is partly responsible for that technological shift at Apple.
Starting on Monday, though, Raja Koduri is officially back at AMD, taking over as CVP (Corporate Vice President) of Visual Computing. The position gives him more complete control over the entirety of the hardware and software platforms AMD is developing, including desktop discrete, mobile, and APU/SoC designs. This marks the second major visionary executive to return to AMD in recent memory, the first being Jim Keller in August of 2012 (also returning from a period with Apple).
It will take some time for Koduri to have effect on AMD's current roadmap
Having known Raja Koduri for quite a long time, I have always seen the man as an incredibly intelligent engineer who can find strengths in designs that others cannot. Much of the success of the ATI/AMD GPU divisions during the 2000s was due to Koduri's leadership (among others, of course), and I think having him back at AMD in an even more senior role is great news for both discrete graphics fans and APU users.
In a discussion with Koduri recently, Anandtech got some positive feedback for PC gamers:
Raja believes there’s likely another 15 years ahead of us for good work in high-end discrete graphics, so we’ll continue to see AMD focus on that part of the market.
Koduri sees 15 years more GPU evolution
So even though this hiring isn't going to change AMD's position on the APU and SoC strategy, it is good to have someone at the CVP level that sees the importance and value of discrete, high power GPU technology.
In many talks with AMD over the last 6 months we kept hearing about the healthy influx of quality personnel though much of it was still under wraps. Keller was definitely one of them and Koduri is another and both of the hires give a lot of hope for AMD as a company going forward. Some in the industry have already written AMD off but I find it hard to believe that this caliber of executive would return to a sinking ship.