PCPer Live! Frame Rating and FCAT - Your Questions Answered!

Subject: Editorial, Graphics Cards | May 8, 2013 - 11:37 PM |
Tagged: video, nvidia, live, frame rating, fcat

Update: Did you miss the live stream?  Watch the on-demand replay below and learn all about the Frame Rating system, FCAT, input latency and more!!

I know, based solely on the amount of traffic and forum discussion, that our readers have really adopted and accepted our Frame Rating graphics testing methodology.  Based on direct capture of GPU output via an external system and a high-end capture card, our new methods have helped users see GPU performance in a more "real-world" light than previous benchmarks allowed.

I also know that there are lots of questions about the process, the technology and the results we have shown.  To address these questions and to gather new ideas from the community, we are hosting a PC Perspective Live Stream on Thursday afternoon.

Joining me will be NVIDIA's Tom Petersen, a favorite of the community, to talk about NVIDIA's stance on FCAT and Frame Rating, as well as the science of animation and input.

fr-2.png

The primary part of this live stream will be about education - not about bashing one particular product line or talking up another.  And part of that education is your ability to interact with us live, ask questions and give feedback.  During the stream we'll be monitoring the chat room embedded on http://pcper.com/live and I'll be watching my Twitter feed for questions from the audience.  The easiest way to get your question addressed, though, is to leave a comment or inquiry below this post.  It doesn't require registration, and it allows us to think about the questions beforehand, giving them a better chance of being answered during the stream.

Frame Rating and FCAT Live Stream

11am PT / 2pm ET - May 9th

PC Perspective Live! Page

So, stop by at 2pm ET on Thursday, May 9th to discuss the future of graphics performance and benchmarking!

AMD to erupt Volcanic Islands GPUs as early as Q4 2013?

Subject: Editorial, General Tech, Graphics Cards, Processors | May 8, 2013 - 09:32 PM |
Tagged: Volcanic Islands, radeon, ps4, amd

So the Southern Islands might not be entirely stable throughout 2013 as we originally reported; seismic activity being analyzed suggests the eruption of a new GPU micro-architecture as early as Q4. These Volcanic Islands, as they have been codenamed, should explode onto the scene opposing NVIDIA's GeForce GTX 700-series products.

It is times like these where GPGPU-based seismic computation becomes useful.

The rumor is based upon a source which leaked a fragment of a slide outlining the processor in block diagram form and specifications of its alleged flagship chip, "Hawaii". Of primary note, Volcanic Islands is rumored to be organized with both Serial Processing Modules (SPMs) and a Parallel Compute Module (PCM).

Radeon9000.jpg

So apparently a discrete GPU can have serial processing units embedded on it now.

Heterogeneous Systems Architecture (HSA) is a set of initiatives to bridge the gap between massively parallel workloads and branching logic tasks. We usually make reference to this in terms of APUs and bringing parallel-optimized hardware to the CPU. In this case, we are discussing it in terms of bringing serial processing to the discrete GPU. According to the diagram, the chip would contain 8 processor modules, each with two processing cores and an FPU, for a total of 16 cores. There is no definite indication of whether these cores would be based on AMD's license to produce x86 processors or its other license to produce ARM processors. Unlike an APU, this design is heavily skewed towards parallel computation rather than a relatively even balance between CPU, GPU, and chipset features.

Now of course, why would they do that? Graphics processors can do branching logic but it tends to sharply cut performance. With an architecture such as this, a programmer might be able to more efficiently switch between parallel and branching logic tasks without doing an expensive switch across the motherboard and PCIe bus between devices. Josh Walrath suggested a server containing these as essentially add-in card computers. For gamers, this might help out with workloads such as AI which is awkwardly split between branching logic and massively parallel visibility and path-finding tasks. Josh seems skeptical about this until HSA becomes further adopted, however.

Still, there is a reason why they are implementing this now. I wonder, if the SPMs are based upon simple x86 cores, how the PS4 will influence PC gaming. Technically, a Volcanic Islands GPU would be an oversized PS4 within an add-in card. This could give AMD an edge, particularly in games ported to the PC from the Playstation.

This chip, Hawaii, is rumored to have the following specifications:

  • 4096 stream processors
  • 16 serial processor cores on 8 modules
  • 4 geometry engines
  • 256 TMUs
  • 64 ROPs
  • 512-bit GDDR5 memory interface (the PS4 also uses GDDR5, though on a narrower 256-bit bus)
  • 20 nm Gate-Last silicon fab process
    • Unclear if TSMC or "Common Platform" (IBM/Samsung/GLOBALFOUNDRIES)

Softpedia is also reporting on this leak. Their addition claims that the GPU will be designed on a 20nm gate-last fabrication process. While gate-last is often considered not worth the extra production effort, Fully Depleted Silicon On Insulator (FD-SOI) apparently works "amazingly" well with gate-last at 28nm and smaller nodes. This could mean that AMD is eyeing that technology and making this design with the intent of switching to an FD-SOI process later, without the large redesign that starting on an initially easier gate-first process would require.

Well, that is a lot to process... so I will leave you with an open question for our readers: what do you think AMD has planned with this architecture, and what do you like and/or dislike about what that would mean?

Source: TechPowerUp

Zotac Announces Overclocked GTX TITAN AMP! Edition Graphics Card

Subject: Graphics Cards | May 1, 2013 - 05:41 AM |
Tagged: zotac, nvidia, gtx titan, AMP!

Zotac has announced a new GTX TITAN graphics card that will fall under the company’s AMP! Edition branding. This new Titan graphics card will feature factory overclocks on both the GPU and GDDR5 memory. However, due to NVIDIA’s restrictions, the Zotac GeForce GTX TITAN AMP! Edition does not feature a custom cooler or PCB.

The Zotac TITAN AMP! Edition card features a single GK110 GPU with 2,688 CUDA cores clocked at 902MHz base and 954MHz boost. That is a healthy boost over the reference TITAN’s 836MHz base and 876MHz boost clock speeds. Further, while Zotac’s take on the TITAN keeps the reference specification’s 6GB of GDDR5 memory, it is impressively overclocked to 6,608MHz (especially since Zotac has not changed the cooler). Many reference cards may be able to match those GPU clocks, though; for example, Ryan managed to get his card up to a 992MHz boost in his review of the NVIDIA GTX TITAN.

Zotac GeForce GTX Titan AMP! Edition.jpg

The card has two DL-DVI, one HDMI, and one DisplayPort video outputs. The cooler, PCB, and PCI-E power specifications are the same as the reference design; you can find more details on the heatsink in the TITAN review. Not allowing vendors to use custom coolers is disappointing and possibly limits the factory GPU overclocks they are able (or willing) to offer and support. Within that restriction, though, the Zotac AMP! Edition looks to be a decent card, so long as the (not yet announced) price premium over the $999 NVIDIA reference card is minimal.

Source: Zotac

XFX Announces Malta Dual-GPU Radeon HD 7990

Subject: Graphics Cards | April 24, 2013 - 10:14 PM |
Tagged: xfx, malta, hd 7990, GCN, dual gpu, amd

Now that AMD’s dual-GPU Malta graphics card is official, cards from Add-In Board (AIB) partners are starting to roll in. One such recently announced card is the XFX Radeon HD 7990. The XFX card is based on the reference AMD design, which includes two Radeon HD 7970 GPUs in a Crossfire configuration.

The two GPUs can boost up to 1GHz clock speeds and feature a total of 4096 stream processors, 256 texture units, 64 ROPs, and 8.6 billion transistors. The card also includes 3GB of GDDR5 memory per GPU running off a 384-bit bus. It supports AMD’s Eyefinity technology and offers up one DL-DVI and four mini-DisplayPort video outputs.

XFX Radeon HD 7990.jpg

The XFX HD 7990 uses the reference AMD heatsink as well, which includes a massive aluminum fin stack with five copper heatpipes that run the length of the heatsink and directly touch the two 7970 GPUs. Three shrouded fans, in turn, keep the heatsink cool.

The dual-GPU monster is eligible for AMD’s Never Settle bundle which includes eight free games. With purchase of the HD 7990 (from any eligible AIB), you get free key codes for the following games:

  • Bioshock Infinite
  • Crysis 3
  • Deus Ex: Human Revolution
  • Far Cry 3
  • Far Cry 3: Blood Dragon
  • Hitman: Absolution
  • Sleeping Dogs
  • Tomb Raider

The XFX press release further assures gamers that the card can, in fact, play Crysis 3 at maximum settings at a resolution of 3840 x 2160. The company did not mention pricing, however.

For those interested in AMD’s new Malta GPU, check out our review as well as how the card performs when paired with a prototype AMD driver that seeks to address some of the frame rating issues exhibited by AMD's Crossfire multi-GPU solution.

Source: XFX

PowerColor Launches HD 7990 V2 Based On Official AMD Malta GPU

Subject: Graphics Cards | April 24, 2013 - 07:09 PM |
Tagged: amd, powercolor, hd 7990, malta, dual gpu, crossfire

PowerColor (a TUL Corporation brand) launched its dual-GPU Radeon HD 7990 V2 graphics card, and this time the card is based on the (recently reviewed) official dual-GPU AMD “Malta” design announced at the Game Developers Conference (GDC). The new HD 7990 V2 features two AMD HD 7970 GPUs in a Crossfire configuration, which means the Malta-based card offers a total of 4096 stream processors and a rated 8.2 TFLOPS of peak performance.
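For the curious, the 8.2 TFLOPS rating falls straight out of the stream processor count and clock speed: each GCN stream processor can retire one fused multiply-add (two floating-point operations) per cycle. A quick back-of-the-envelope sketch (the helper function name is ours, not AMD's):

```python
# Peak single-precision throughput: each stream processor executes
# one fused multiply-add (2 floating-point ops) per clock cycle.
def peak_tflops(stream_processors, clock_ghz, ops_per_clock=2):
    return stream_processors * ops_per_clock * clock_ghz / 1000.0

# HD 7990 V2: 4096 stream processors at up to 1 GHz boost
print(peak_tflops(4096, 1.0))  # 8.192, i.e. the rated ~8.2 TFLOPS
```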

PowerColor HD 7990 V2 Malta.jpg

The PowerColor HD 7990 V2 joins the company’s existing Devil 13 and HD 7990 graphics cards. The new card sports a triple-fan shrouded heatsink that is somewhat tamer-looking than the custom Devil 13's. Other hardware includes 3GB of GDDR5 RAM per GPU clocked at 1500MHz and running on a 384-bit bus (again, per GPU) for a total of 6GB. Both GPUs are clocked at 950MHz base and up to 1GHz boost.
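Those memory numbers also let you estimate per-GPU bandwidth: GDDR5 transfers data four times per memory clock, so a 1500MHz clock gives a 6 Gbps effective rate per pin. A rough sketch (the helper name is ours):

```python
# GDDR5 is quad-pumped: effective data rate = 4 x memory clock.
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    effective_gbps_per_pin = mem_clock_mhz * 4 / 1000
    return effective_gbps_per_pin * bus_width_bits / 8  # GB/s

# 1500 MHz on a 384-bit bus
print(gddr5_bandwidth_gbs(1500, 384))  # 288.0 GB/s per GPU
```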

PowerColor HD 7990 V2 Malta GPU.jpg

The new card has a single DL-DVI and four mini-DisplayPort video outputs. PowerColor is touting the card’s Eyefinity prowess as well as its ZeroCore support for reducing power usage when idle. The board is powered by two PCI-E power connectors, and PowerColor lists a 750W power supply requirement. In all, the HD 7990 V2 measures 305 x 110 x 38mm. While PowerColor has not released pricing or availability, expect the card to be available soon at around the same price as (or a bit lower than) its existing (custom) HD 7990.

The full press release can be found here.

Source: PowerColor

New GeForce Game-Ready Drivers Just in Time for 'Dead Island: Riptide,' 'Star Trek', 'Neverwinter'; Boost Performance up to 20%

Subject: Graphics Cards | April 23, 2013 - 03:53 PM |
Tagged: nvidia, graphics drivers, geforce, 320.00 beta

NVIDIA rolled out a new set of beta drivers that provide up to 20% faster performance and have been optimized for a handful of new titles, including Dead Island: Riptide, Star Trek, and Neverwinter.

GeForce 320.00 beta drivers are now available for automatic download and installation using GeForce Experience, the easiest way to keep your drivers up to date.

With a single click in GeForce Experience, gamers can also optimize the image quality of top new games like Dead Island: Riptide and have it instantly tuned to take full advantage of their PC’s hardware.

Here are examples of the performance increases in GeForce 320.00 drivers (measured with GeForce GTX 660):

  • Up to 20% in Dirt: Showdown
  • Up to 18% in Tomb Raider
  • Up to 8% in StarCraft II
  • Up to 6% in other top games like Far Cry 3

For more details, refer to the release highlights on the driver download pages and read the GeForce driver article on GeForce.com.

Enjoy the new GeForce Game Ready drivers and let us know what you think.

nvidia-geforce-320-00-beta-drivers-gtx-680-performance.png

Windows Vista/Windows 7 Fixed Issues

  • The Windows 7 Magnifier window flickers. [1058231]
  • Games default to stereoscopic 3D mode after installing the driver. [1261633]
  • [GeForce 330M][Notebook]: The display goes blank when rebooting the notebook after installing the driver. [1239252]
  • [Crysis 3]: There are black artifacts in the game. [1251495]
  • [Dirt 3]: When ambient occlusion is enabled, there is rendering corruption in the game while in split-screen mode. [1253727]
  • [3DTV Play][Mass Effect]: The NVIDIA Control Panel “override antialiasing” setting does not work when stereoscopic 3D is enabled. [1220312]
  • [Microsoft Flight Simulator]: Level D Simulations add-on aircraft gauges are not drawn correctly. [899771]
  • [GeForce 500 series][Stereoscopic 3D][Two Worlds 2]: The application crashes when switching to windowed mode with stereoscopic 3D enabled. [909749]
  • [GeForce 660 Ti][All Points Bulletin (APB) Reloaded]: The game crashes occasionally, followed by a black/grey/red screen. [1042342]
  • [GeForce GTX 680][Red Orchestra 2: Heroes of Stalingrad]: Red-screen crash occurs after exiting the game. [1021046]
  • [GeForce 6 series][Final Fantasy XI]: TDR crash occurs in the game when using the Smite of Rage ability. [1037744]
  • [SLI][Surround][GeForce GTX Titan][Tomb Raider]: There is corruption in the game and the system hangs when played at high resolution and Ultra or Ultimate settings. [1254359]
  • [3D Surround][SLI][GeForce 500 series]: With Surround enabled, all displays may not be activated when selecting Activate All Displays from the NVIDIA Control Panel -> Set SLI Configuration page. [905544]
  • [SLI][Starcraft II][3D Vision]: The game crashes when run with 3D Vision and SLI enabled. [1253206]
  • [SLI][GeForce GTX 680][Tomb Raider (2013)]: The game crashes and TDR occurs while running the game at Ultra settings. [1251578]
  • [SLI][Call of Duty: Black Ops 2]: The player emblems are not drawn correctly.

Source: NVIDIA

Upcoming Never Settle Bundle Games Leaked

Subject: Graphics Cards | April 23, 2013 - 10:05 AM |
Tagged: amd, never settle, never settle reloaded, bundle

While browsing around on Twitter today I saw mention of a leaked slide on the Tech Report forums that seems to point in the direction of games to be included in future AMD Never Settle bundles.  AMD has been knocking the ball out of the park when it comes to software bundled with graphics card releases, having secured essentially every major PC game of the last 12 months.

neversettle.jpg

This slide indicates that Grid 2, Company of Heroes 2, Total War: Rome II, Splinter Cell Blacklist, Lost Planet 3, Battlefield 4, Raven's Cry and Watch Dogs will all eventually make their way to the AMD bundle list at some point this year.  Whether it will be one mega-bundle or several different promotions throughout the year isn't known, but AMD is serious about keeping up appearances on the PC gaming front.

Source: Tech Report

Raja Koduri Returns to AMD After 4 Years at Apple

Subject: General Tech, Graphics Cards | April 19, 2013 - 02:51 PM |
Tagged: raja koduri, apple, amd

Interesting information has surfaced today about the addition of a new executive at AMD.  Raja Koduri, who previously worked for ATI and AMD as Chief Technology Officer, departed the company in 2009 for a four-year stint at Apple, helping to turn that company into an SoC powerhouse.  Developing its own processors has enabled Apple to stand apart from the competition in many mobile spaces, and Koduri is partly responsible for that technological shift at Apple.

Starting on Monday, though, Raja Koduri is officially back at AMD, taking over as Corporate Vice President (CVP) of Visual Computing.  The position gives him control over the entirety of the hardware and software platforms AMD is developing, including desktop discrete, mobile and APU/SoC designs.  This marks the second major returning visionary executive at AMD in recent memory, the first being Jim Keller in August of 2012 (also returning from a period at Apple).

koduri1.JPG

It will take some time for Koduri to have an effect on AMD's current roadmap

Having known Raja Koduri for quite a long time, I have always seen him as an incredibly intelligent engineer who was able to find strengths in designs that others could not.  Much of the success of the ATI/AMD GPU divisions during the 2000s was due to Koduri's leadership (among others', of course), and I think having him back at AMD in an even more senior role is great news for both discrete graphics fans and APU users.

In a discussion with Koduri recently, Anandtech got some positive feedback for PC gamers:

Raja believes there’s likely another 15 years ahead of us for good work in high-end discrete graphics, so we’ll continue to see AMD focus on that part of the market.

koduri2.jpg

Koduri sees 15 years more GPU evolution

So even though this hiring isn't going to change AMD's position on the APU and SoC strategy, it is good to have someone at the CVP level who sees the importance and value of discrete, high-power GPU technology.

In many talks with AMD over the last 6 months we kept hearing about the healthy influx of quality personnel though much of it was still under wraps.  Keller was definitely one of them and Koduri is another and both of the hires give a lot of hope for AMD as a company going forward.  Some in the industry have already written AMD off but I find it hard to believe that this caliber of executive would return to a sinking ship. 

Source: CNET

SEIKI SE50UY04 50-in 4K 3840x2160 TV Unboxing and Preview

Subject: General Tech, Graphics Cards, Displays | April 18, 2013 - 08:52 PM |
Tagged: video, seiki, se50UY04, hdtv, hdmi 1.4, displays, 4k, 3840x2160

This just in!  We have a 4K TV in the PC Perspective Offices!

seiki1.jpg

While we are still working on the ability to test graphics card performance at this resolution with our Frame Rating capture system, we decided to do a live stream earlier today as we unboxed, almost dropped and then eventually configured our new 4K TV. 

The TV in question?  A brand new SEIKI SE50UY04, a 50-in 3840x2160-ready display.  Haven't heard of it?  Neither have we.  I picked it up over the weekend from TigerDirect for $1299, though it is actually a bit higher now at $1499.

seiki2.jpg

The TV itself is pretty unassuming and other than looking for the 4K label on the box you'd be hard pressed to discern it from other displays.  It DID come with a blue, braided UHD-ready HDMI cable, so there's that.

seiki3.jpg

One point worth noting is that the stand on the TV is pretty flimsy; there was definitely wobble after installation and setup.

seiki4.jpg

Connecting the TV to our test system was pretty easy - only a single HDMI cable was required, and the GeForce GTX 680s in SLI we happened to have on our test bed recognized it as a 3840x2160-capable display.  Keep in mind that you are limited to a 30 Hz refresh rate, though, due to the bandwidth limitations of HDMI 1.4.  The desktop was clear and sharp, and if you like screen real estate...this has it.
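That 30 Hz ceiling comes from HDMI 1.4's TMDS link, which tops out at a 340 MHz pixel clock. Assuming the standard CEA 4K timing with a total raster of 4400 x 2250 pixels (active picture plus blanking), a quick sketch shows why 30 Hz fits and 60 Hz does not:

```python
# Required pixel clock = total horizontal x total vertical x refresh rate.
def required_pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

HDMI_14_MAX_MHZ = 340  # HDMI 1.4 TMDS pixel-clock limit

for hz in (30, 60):
    clock = required_pixel_clock_mhz(4400, 2250, hz)
    print(f"{hz} Hz needs {clock:.0f} MHz -> fits: {clock <= HDMI_14_MAX_MHZ}")
# 30 Hz needs 297 MHz -> fits: True
# 60 Hz needs 594 MHz -> fits: False
```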

The first thing we wanted to try was some 4K video playback and we tried YouTube videos, some downloaded clips we found scattered across the Internet and a couple of specific examples I had been saving.  Isn't that puppy cute?  It was by far the best picture I had seen on a TV that close up - no other way to say it.

We did have issues with video playback in some cases due to high bit rates.  In one case we had a YUV uncompressed file that was hitting our SSD so hard on read speeds that we saw choppiness.  H.265 save us!

seiki5.jpg

And of course we demoed some games as well - Battlefield 3, Crysis 3, Skyrim and Tomb Raider.  Each was able to run at 3840x2160 without any complaints or INI hacks.  They all looked BEAUTIFUL in still scenes, but we did notice some flickering that might be the result of the TV's 120 Hz interpolation and possibly SEIKI's "dynamic luminance control" feature.

We'll definitely test some more on this in the coming days to see if we can find a solution as I know many PC gamers are going to be excited about the possibility of using this as a gaming display!  We are working on a collection of benchmarks on some of the higher end graphics solutions like the GeForce TITAN, GTX 680s, HD 7990 and HD 7970s!

If you want to check out the full experience of our unboxing and first testing, check out the full live stream archived below!!

A good value card that can become great once overclocked, ASUS' HD 7850 DirectCU II

Subject: Graphics Cards | April 18, 2013 - 04:15 PM |
Tagged: asus, HD 7850 DirectCU II

With a custom cooler, one DisplayPort, two DVI-I connectors and one HDMI connector, and requiring only a single 6-pin PCIe power connector, the ASUS HD 7850 DirectCU II is a great blend of efficiency and flexibility for those looking for a card costing around $200.  On the other hand, if you have no plans to overclock the card, the GTX 660 which [H]ard|OCP compared this card to is slightly more powerful, costs the same, and is a better choice for those planning on running dual GPUs.  Check out the overclocked performance of this HD 7850 in the full review.

H_cu2.jpg

"ASUS has refreshed its AMD Radeon HD 7850 DirectCU II video card with DirectCU and DIGI+ VRM with Super Alloy Power, poised to give you a robust video card with an improved overclocking experience. We will see whether this new revision brings new value to the Radeon HD 7850 GPU and we will compare it to the NVIDIA GeForce GTX 660."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

NVIDIA Bundles Metro: Last Light with GTX 660 and Higher

Subject: Graphics Cards | April 16, 2013 - 10:24 PM |
Tagged: nvidia, metro last light, Metro

Late this evening we got word from NVIDIA about an update to its game bundle program for GeForce GTX 600 series cards.  Replacing the previously running Free to Play bundle, which included $50 in credit for each of World of Tanks, Hawken and Planetside 2, NVIDIA is moving back to AAA games with Metro: Last Light.

metro.jpg

Metro: Last Light is the sequel to the surprise hit of 2010, Metro 2033, and I am personally really looking forward to the game and to seeing how it can stress PC hardware like the first did.

metro2.jpg

This bundle is only good for GTX 660 cards and above with the GTX 650 Ti sticking with the Free to Play $75 credit offer.

NVIDIA today announced that gamers who purchase an NVIDIA GeForce GTX 660 or above will also receive a copy of the highly anticipated Metro: Last Light, which is published by Deep Silver and is the sequel to the multi-award-winning Metro 2033. Metro: Last Light will be available May 14, 2013 within the US and May 17, 2013 across Europe.

The deal is already up and running on Newegg.com but with the release date of Metro: Last Light set at May 14th, you'll have just about a month to wait before you can get your hands on it.

How do you think this compares to AMD's currently running bundle with Bioshock Infinite and more?  Did NVIDIA step up its game this time around?

Source: NVIDIA

Poll: Visual Effects of Vsync on Gaming Animation

Subject: Graphics Cards | April 16, 2013 - 03:01 PM |
Tagged: vsync, stutter, smoothness, microstutter, frame rating, animation

We are running a poll in conjunction with our Frame Rating: Visual Effects of Vsync on Gaming Animation story that compares animation smoothness between fixed 30 FPS and 60 FPS captures and Vsync enabled versions. 

If you haven't read the story linked above, these questions won't make any sense to you so please go read it and then stop back here to answer the polls!

NVIDIA Rumored To Release 700-Series GeForce Cards At Computex 2013

Subject: Graphics Cards | April 15, 2013 - 03:34 PM |
Tagged: rumor, nvidia, kepler, gtx 700, geforce 700, computex

Recent rumors seem to suggest that NVIDIA will release its desktop-class GeForce 700 series of graphics cards later this year. The new card will reportedly be faster than the currently-available GTX 600 series, but will likely remain based on the company's Kepler architecture.

NVIDIA GeForce Logo.jpg

According to the information presented during NVIDIA's GTC keynote, the Kepler architecture will carry the company through 2012 and 2013, with Maxwell-based cards following in 2014. Notably absent from the slides are product names, so the publicly-available information at least leaves open the possibility of a refreshed Kepler GTX 700 lineup in 2013.

Fudzilla further reports that NVIDIA will release the cards as soon as May 2013, with an official launch as soon as Computex. Having actual cards available for sale by Computex is a bit unlikely, but a summer launch could be possible if the new 700 series is merely a tweaked Kepler-based design with higher clocks and/or lower power usage. The company is rumored to be accelerating the launch of the GTX 700 series in the desktop space in response to AMD's heavy game-bundle marketing, which seems to be working well at persuading gamers to choose the red team.

What do you make of this rumor? Do you think a refreshed Kepler is coming this year?

Source: Fudzilla

A New Gigabyte WindForce 450W GPU Cooler May Be Coming to a GTX 680 and Titan Near You

Subject: Graphics Cards | April 14, 2013 - 07:59 PM |
Tagged: windforce, nvidia, gtx titan, gtx 680, gpu cooler, gigabyte

Earlier this week, PC component manufacturer Gigabyte showed off its new graphics card cooler at its New Idea Tech Tour event in Berlin, Germany. The new triple-slot cooler is built for this generation's highest-end graphics cards: it is capable of cooling cards with up to 450W TDPs while keeping them cooler and quieter than reference heatsinks.

The Gigabyte WindForce 450W cooler is a triple-slot design that combines a large heatsink with three 80mm fans. The heatsink features two aluminum fin arrays connected to the GPU block by three 10mm copper heatpipes. Gigabyte stated during the cooler's reveal that it keeps an NVIDIA GTX 680 graphics card 2°C cooler and 23.3 dB quieter during a Furmark benchmark run. Further, the cooler should allow high-end cards like the GTX Titan to achieve higher (stable) boost clocks.

Gigabyte WindForce 450W GPU Cooler for NVIDIA GTX Titan and GTX 680 Graphics Cards.jpg

ComputerBase.de was on hand at Gigabyte's event in Berlin to snap shots of the upcoming GPU cooler.

The company has not announced which graphics cards will use the new cooler or when it will be available, but a Gigabyte GTX 680 and a custom-cooled Titan seem to be likely candidates, considering both cards were mentioned in the examples given in the presentation. Note that NVIDIA has so far prohibited AIB partners from putting custom coolers on the Titan, but other rumored custom-cooled Titan cards suggest that the company will allow them to be sold at retail at some point. Beyond the top-end NVIDIA cards, I think a GTX 670 or GTX 660 Ti using this cooler would also be great, as it would likely be one of the quieter options available (you could spin the three 80mm fans much slower than the single reference fan and still get the same temperatures).

What do you think about Gigabyte's new 450W GPU cooler? You can find more photos over at Computer Base (computerbase.de).

AMD Never Settles: Far Cry 3 Blood Dragon Bundled

Subject: Editorial, General Tech, Graphics Cards | April 14, 2013 - 02:22 AM |
Tagged: never settle, never settle reloaded, amd, far cry 3

So when AMD reloaded their Never Settle bundles, they left an extra round in the barrel.

Some of my favorite games were given to me in a bundle with some piece of computer hardware. You might remember from the PC Perspective game night that I am a major fan of the Unreal Tournament franchise. My first Unreal Tournament game was an unexpected surprise when I purchased my first standalone GPU. My 166MHz Pentium computer also came bundled with Mechwarrior 2 and Wipeout.

As we discussed, AMD considers bundle offers a way to keep the software industry rolling forward. The quantity and quality of the games in the recent Never Settle bundles certainly deserve the credit they are due. Bioshock: Infinite is a game that just about every PC gamer needs to experience, and there are about a half-dozen other great titles in the promotion, depending upon which card or cards you purchase.

As it turns out, AMD negotiated with Ubisoft and added Far Cry 3: Blood Dragon to their Never Settle bundle. The coolest part is that AMD will retroactively email codes for this new title to anyone who has redeemed a Never Settle: Reloaded code.

So if you have ever Reloaded your Never Settle in the past, check your email as apparently you can Never Settle your reloads again.

Source: AMD

PowerColor Launches Revised Factory Overclocked Radeon HD 7790 OC V2 Graphics Card

Subject: Graphics Cards | April 13, 2013 - 10:07 PM |
Tagged: radeon hd7790, powercolor, GCN, amd, 7790

PowerColor launched a new factory overclocked graphics card recently that is a revision of a previous model. The PowerColor HD7790 OC V2 is based on AMD’s Graphics Core Next (GCN) architecture and measures a mere 180 x 150 x 38mm.

PowerColor Radeon HD7790 1GB GDDR5 OC V2 Graphics Card.jpg

The AMD Radeon HD 7790 GPU features 896 stream processors, 56 texture units, and 16 ROP units. The GPU is clocked at 1000 MHz base and 1030 MHz boost while the 1GB of GDDR5 memory is clocked at the 6Gbps reference speed. PowerColor has fitted the overclocked card with an aluminum heatsink cooled by a single 8mm copper heatpipe and 70mm fan.

The new card features two DL-DVI, one HDMI, and one DisplayPort video outputs. Its model number is AX7790-1GBD5-DHV2/OC. According to Guru3D, the new/revised card is priced at 120 pounds sterling. However, considering the currently available OC (non-V2) card is $150, the revised card is likely to come in around that price when it hits US retailers.

Also: If you have not already, read our latest Frame Rating article to see how the Radeon HD 7790 graphics card stacks up against the competition!

Source:

Refreshing the non-Ti GTX 660

Subject: Graphics Cards | April 8, 2013 - 07:33 PM |
Tagged: gk106, gtx660, asus, GTX 660 DirectCU II OC

Not everyone can afford to spend $400+ on a GPU in one shot, but sometimes they can manage it if the purchase is split into two.  For those considering a multi-GPU setup, it has become obvious from Ryan's testing that NVIDIA is the way to go.  The 660 Ti is a favourite, but even it might be too rich for some people's wallets, which is why it is nice to see ASUS offer the GTX 660 DirectCU II OC for $215 after MIR.  [H]ard|OCP just put up a review of this card covering its FPS performance both as it arrived and after they pushed the base clock up almost as high as the original boost clock.  If you are on a limited GPU budget you should check out the full review.

h660.jpg

"ASUS has delivered a factory overclocked GeForce GTX 660 DirectCU II OC to our doorstep to run through the wringer. We match this ASUS video card up against AMD's Radeon HD 7870 GHz Edition and Radeon HD 7850 to see which will prevail in the battle of the mainstream cards. There are good values at this price point."

Here are some more Graphics Card articles from around the web:


Source: [H]ard|OCP

(Not) The End of DirectX

Subject: Editorial, General Tech, Graphics Cards, Systems, Mobile | April 7, 2013 - 10:21 PM |
Tagged: DirectX, DirectX 12

Microsoft DirectX is a series of interfaces that programmers typically utilize when designing gaming or entertainment applications. Over time it became synonymous with Direct3D, the portion which mostly handles graphics processing by offloading those tasks to the video card. At one point, DirectX even handled networking through DirectPlay, although that has been handled by Games for Windows Live or other APIs since Vista.

AMD Corporate Vice President Roy Taylor was recently interviewed by the German press, "c't magazin". When asked about the future of "Never Settle" bundles, Taylor claimed that games such as Crysis 3 and Bioshock: Infinite keep their consumers happy and also keep the industry innovating.

gfwl.png

Keep in mind, the article was translated from German, so my understanding of his argument may not be entirely accurate.

In a slight tangent, he discussed how new versions of DirectX tend to spur demand for new graphics processors with more processing power and more RAM. He has not heard anything about DirectX 12 and, in fact, does not believe there will be one. As such, he is turning to bundled games to keep the industry moving forward.

Neowin, upon seeing this interview, reached out to Microsoft who committed to future "innovation with DirectX".

This exchange has obviously sparked a lot of... polarized... online discussion. One claim is that Microsoft is abandoning the PC to gain a foothold in the mobile market, where it currently has practically zero share, and that this is why it is dropping DirectX.

Unfortunately, this does not make sense: DirectX would be one of Microsoft's main advantages in the mobile market. Mobile devices have access to fairly decent GPUs, which can use DirectX to draw web pages and applications far more smoothly and power-efficiently than their CPUs can. If anything, DirectX would only increase in relevance if Microsoft were blindly making a play for mobile.

The major threat to DirectX is still quite far off on the horizon. At some point we might begin to see C++ AMP or OpenCL nibble away at what DirectX does best: offloading highly parallel tasks to specialized processing units.
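To make "highly parallel tasks" concrete: compute APIs like OpenCL and C++ AMP (and Direct3D's own compute shaders) all run the same small per-element kernel across thousands of GPU threads at once. Here is a CPU-only Python sketch of that per-element pattern, using a hypothetical brightness kernel as the example workload:

```python
# A per-pixel kernel: the embarrassingly parallel work that OpenCL,
# C++ AMP, and D3D compute shaders all compete to offload to the GPU.
def brighten(pixel, gain=1.2):
    """Scale one grayscale value, clamping to the 8-bit maximum."""
    return min(int(pixel * gain), 255)

pixels = [10, 100, 200, 250]            # a tiny grayscale "image"
result = [brighten(p) for p in pixels]  # on a GPU: one thread per element
print(result)  # [12, 120, 240, 255]
```

The point of the compute APIs is that nothing in the kernel depends on any other element, so the loop can be replaced wholesale by a dispatch across the GPU's processing units.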

Still, releases such as DirectX 11.1 are quite focused on back-end tweaks and adjustments. What do you think a DirectX 12 API would even do that would not already be possible with DirectX 11?

Source: c't magazin

AMD and Adobe Show OpenCL Support for next version of Adobe Premiere Pro

Subject: General Tech, Graphics Cards | April 5, 2013 - 11:48 AM |
Tagged: premiere pro, opencl, firepro, amd, Adobe

As we prepare for the NAB show (National Association of Broadcasters) this week, AMD and Adobe have released a fairly substantial news release concerning the future of Premiere Pro, Adobe's flagship professional video editing suite. 

Earlier today Adobe revealed some of its next generation professional video and audio products, including the next version of Adobe® Premiere Pro. Basically Adobe is giving users a sneak peek at the new features coming to the next versions of its software. And we’ve decided to give you a sneak peek too, providing a look at how the next version of Premiere Pro performs when accelerated by AMD FirePro™ 3D workstation graphics and OpenCL™ versus Nvidia Quadro workstation graphics and CUDA.

This will be the first time that OpenCL is used as the primary rendering engine for Premiere, something AMD has been hoping to see for many years.  Previous versions of the software integrated support for NVIDIA's CUDA GPGPU programming model, and the resulting Mercury Playback Engine was truly industry-changing for video production.  However, because it used CUDA, AMD users were left out of these performance improvements in favor of the proprietary NVIDIA software solution.

Adobe's next version of Premiere Pro (though we aren't told when that will be released) switches from CUDA to OpenCL and the performance of the AMD GCN architecture is being shown off by AMD today. 

Adobe-Premiere-OpenCL-vs-Cuda.png

Using 4K TIFF 24-bit sequence content on Microsoft Windows® 7 64-bit with an Intel Xeon E5530 @ 2.40 GHz and 12GB of system memory, AMD compared several FirePro graphics cards (using OpenCL) against NVIDIA Quadro options (using CUDA).  Ideally we would like to see some OpenCL NVIDIA benchmarks as well, but I assume we'll have to wait to test that here at PC Perspective.

Adobe-Premiere-GPU-Utilization.png

AMD also claims that by utilizing OpenCL rather than CUDA, the AMD FirePro GPUs are running at a lower utilization, opening up more graphics processing power for other applications and development work.

While this performance testing is conducted on a pre-release version of the next Adobe Premiere Pro, we’re really pleased with the results. As with all of the professional applications we support, we’ll continue to make driver optimizations for Adobe Premiere Pro that can only help to improve the overall user experience and application performance. So if you’re considering a GPU upgrade as part of your transition to the next version of Adobe Premiere Pro, definitely consider taking a look at AMD FirePro™ 3D workstation graphics cards.

You can continue on to read the full press release from AMD and Adobe on the collaboration or check out the complete blog post posted on AMD.com.

Source: AMD

Factory Overclocked ASUS GTX 660 Ti Dragon Pictured

Subject: Graphics Cards | April 3, 2013 - 11:24 AM |
Tagged: nvidia, kepler, gtx 660 Ti, 660 ti

Two new photos recently popped up on Cowcotland, showing off an unreleased "Dragon Edition" GTX 660 Ti graphics card from ASUS. The new card boasts some impressive factory overclocks on both the GPU and memory as well as a beefy heatsink and a new blue and black color scheme.

ASUS-nvidia-gtx-660-ti-dragon.jpg

The ASUS GTX 660 Ti Dragon will feature a custom cooler with two fans and an aluminum heatsink. The back of the card includes a metal backplate to secure the cooler and help dissipate a bit of heat itself. However, there is also a cutout in the backplate, likely to make room for additional power management circuitry. The card also features the company's power phase technology, NVIDIA's GK104 GPU, and 2GB of GDDR5 memory. The graphics core is reportedly clocked at 1150MHz (no word on whether that is the base or boost figure) while the memory is overclocked to 6100MHz. For comparison, the reference GTX 660 Ti clocks are 915MHz base, 980MHz boost, and 6000MHz memory. The new card will support DVI, DisplayPort, and HDMI video outputs.
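Those numbers put the Dragon's factory overclock in perspective; a quick calculation against the reference clocks listed above:

```python
# Factory overclock relative to NVIDIA's reference GTX 660 Ti clocks.
ref_base, ref_boost, ref_mem = 915, 980, 6000  # MHz, reference figures above
oc_core, oc_mem = 1150, 6100                   # MHz, reported Dragon clocks

def pct_over(oc, ref):
    """Percentage increase of the overclock over the reference clock."""
    return round((oc / ref - 1) * 100, 1)

print(pct_over(oc_core, ref_base))   # 25.7 (% over reference base clock)
print(pct_over(oc_core, ref_boost))  # 17.3 (% over reference boost clock)
print(pct_over(oc_mem, ref_mem))     # 1.7  (% memory overclock)
```

In other words, the core overclock is substantial even against the reference boost clock, while the memory bump is comparatively modest.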

asus-nvidia-gtx-660-ti-dragon-1.jpg

There is no word on pricing or availability, but the Dragon looks like it will be one of the fastest GTX 660 Ti cards available when (if?) it is publicly released!

Source: Cowcotland