Trimming the TITAN: NVIDIA's GTX 780

Subject: Graphics Cards | May 24, 2013 - 06:10 PM |
Tagged: nvidia, gtx 780, gk110, geforce

With 768 more CUDA cores than the GTX 680 but 384 fewer than the TITAN, the 780 offers improvements over the previous generation and will be available for about $350 less than the TITAN.  As you can see in [H]ard|OCP's testing, it does outperform the 680 and the 7970, but not by a huge margin, which hurts the price-to-performance ratio and makes it more attractive for 680 owners to simply pick up a second card for SLI.  AMD owners with previous-generation cards and deep pockets might be tempted to pick up a pair of these cards instead, as they show very good frame rating results in Ryan's review.

H_780.jpg

"NVIDIA's new GeForce GTX 780 video card has finally been unveiled. We review the GTX 780 with real world gaming with the most intense 3D games, including Metro: Last Light. If the GTX TITAN had you excited but was a bit out of your price range, the GTX 780 should hold your excitement while being a lot less expensive."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Epic Games is disappointed in the PS4 and Xbox One?

Subject: Editorial, General Tech, Graphics Cards, Systems | May 23, 2013 - 06:40 PM |
Tagged: xbox one, xbox, unreal engine, ps4, playstation 4, epic games

Unreal Engine 4 was presented at the PlayStation 4 announcement conference through a new Elemental Demo. We noted how the quality seemed to have dropped in the eight months following E3 while the demo was being ported to the console hardware. The most noticeable differences were the severely reduced particle counts and the non-existent fine lighting details; of course, Epic pumped up the contrast in the PS4 version, which masked the lack of complexity as if it were a stylistic choice.

Still, the demo was clearly weakened. The immediate reaction was to assume that Epic Games simply did not have enough time to optimize the demo for the hardware. That is true to some extent, but there are theoretical limits on how much performance you can push out of hardware at 100% perfect utilization.

Now that we know the specifications of both the PS4 and, more recently, the Xbox One, it is time to dissect more carefully.

A recent LinkedIn post from EA Executive VP and CTO Rajat Taneja claims that the Xbox One and PS4 are a generation ahead of the highest-end PCs on the market. While there are many ways to interpret that statement, in terms of raw performance it is not valid.

To the best of our current knowledge, the PlayStation 4 contains an eight-core AMD "Jaguar" CPU alongside an AMD GPU containing 18 GCN compute units, for a total of 1152 shader units. Even without knowing driving frequencies, this chip should be slightly faster than the Xbox One's 768 shader units within 12 GCN compute units. Sony claims the PS4 has a total of 2 theoretical teraFLOPs of performance, and the Xbox One is almost certainly slightly behind that.

Back in 2011, Epic Games created the Samaritan Demo to persuade console manufacturers; it represented how Epic expected the next generation of consoles to perform. They said, at the time, that the demo would theoretically require 2.5 teraFLOPs of performance to run at 30 FPS at true 1080p; ultimately the demo ran on a PC with a single GTX 680, approximately 3.09 teraFLOPs.

That required performance, again approximately 2.5 teraFLOPs, is higher than what is theoretically possible for either console, both of which sit below 2 teraFLOPs. The PC may have more overhead than consoles, but the PS4 and Xbox One would be too slow even with zero overhead.
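
For the curious, the back-of-the-envelope math behind all of these figures is simply shader count x 2 FLOPs (one fused multiply-add) per cycle x clock speed. Here is a minimal sketch, assuming the rumored ~800 MHz console clocks, which neither Sony nor Microsoft has confirmed:

    # Peak single-precision throughput: shaders x 2 FLOPs (one FMA) per
    # cycle x clock speed. Console clocks are rumored, not confirmed.
    def peak_tflops(shaders: int, clock_mhz: float) -> float:
        return shaders * 2 * clock_mhz * 1e6 / 1e12

    print(f"PS4      (1152 @ ~800 MHz): {peak_tflops(1152, 800):.2f} TFLOPS")
    print(f"Xbox One ( 768 @ ~800 MHz): {peak_tflops(768, 800):.2f} TFLOPS")
    print(f"GTX 680  (1536 @ 1006 MHz): {peak_tflops(1536, 1006):.2f} TFLOPS")
    # -> ~1.84, ~1.23, and ~3.09 TFLOPS; both consoles land below the
    #    ~2.5 TFLOPS Epic estimated for Samaritan at a true 1080p30.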

Now, of course, this does not account for reducing quality where it will be the least noticeable and other cheats. Developers are able to reduce particle counts and texture resolutions in barely-noticeable places; they are also able to render below 1080p or even below 720p, as was the norm for our current console generation, to save performance for more important things. Perhaps developers might even use different algorithms which achieve the same, or better, quality for less computation at the expense of more sensitivity to RAM, bandwidth, or what-have-you.
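
To put the render-resolution cheat in numbers: shading work scales roughly with pixel count, so dropping below native 1080p buys back a great deal of performance. A quick illustration:

    # Pixel counts relative to native 1080p; per-frame shading cost
    # scales roughly with the number of pixels rendered.
    for w, h in ((1920, 1080), (1600, 900), (1280, 720)):
        share = (w * h) / (1920 * 1080)
        print(f"{w}x{h}: {w * h / 1e6:.2f} MP ({share:.0%} of 1080p)")
    # -> 720p is ~44% of the pixels of 1080p, a ~2.25x reduction in
    #    shading work before any other tricks are applied.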

But, in the end, Epic Games did not get the ~2.5 teraFLOPs they originally hoped for when they created the Samaritan Demo. This likely explains, at least in part, why the Elemental Demo looked a little sad at Sony's press conference: it was a little FLOP.

Update, 5/24/2013: Mark Rein of Epic Games responds to the statement made by Rajat Taneja of EA. While we do not know his opinion on consoles... we know his opinion on EA's opinion:

PCPer Live! GeForce GTX 780 3GB Graphics Card - 2pm EDT / 11am PDT

Subject: Editorial, Graphics Cards | May 23, 2013 - 12:08 PM |
Tagged: video, live, gtx 780, gk110

Missed the LIVE stream?  You can catch the video replay of it below!

Hopefully by now you have read over our review of the new NVIDIA GeForce GTX 780 graphics card that launched this morning.  Taking the GK110 GPU, cutting off some more cores, and setting a price point of $650 will definitely create some interesting discussion. 

IMG_9886.JPG

Join me today at 2pm ET / 11am PT as we discuss the GTX 780 and our review, and take your questions.  You can leave them in the comments below, no registration required. 

GTX 780 Stream - PC Perspective Live! - 2pm ET / 11am PT

Xbox One announced, the games: not so much.

Subject: Editorial, General Tech, Graphics Cards, Processors, Systems | May 21, 2013 - 05:26 PM |
Tagged: xbox one, xbox

xbox-one-head.jpg

Almost exactly three months have passed since Sony announced the Playstation 4 and just three weeks remain until E3. Ahead of the event, Microsoft unveiled their new Xbox console: The Xbox One. Being so close to E3, they are saving the majority of games until that time. For now, it is the box itself as well as its non-gaming functionality.

First and foremost, the raw specifications:

  • AMD APU (5 billion transistors, 8 core, on-die eSRAM)
  • 8GB RAM
  • 500GB Storage, Bluray reader
  • USB 3.0, 802.11n, HDMI out, HDMI in

The hardware is a definite win for AMD. The Xbox One is based upon an APU which is quite comparable to what the PS4 will offer. Unlike previous generations, there will not be too much differentiation based on available performance; I would not expect to see much of a fork in terms of splitscreen and other performance-sensitive features.

xbox-one-controller.jpg

A new version of the Kinect sensor will also ship with every unit, so developers can depend upon its presence. Technically speaking, the camera is higher resolution and more wide-angle; up to six skeletons can be tracked, with joints able to rotate rather than just hinge. Microsoft is also finally permitting developers to use the Kinect along with a standard controller to, as they imagine, allow a user to raise their controller to block with a shield. That is the hope, but near the launch of the original Kinect, Microsoft filed a patent for sign language recognition, and that has not happened yet. Who knows whether the device will be successfully integrated into gaming applications.

Of course, Microsoft is best known for system software, and the Xbox One runs three lightweight operating environments. In Windows 8, you have the Modern interface, which runs WinRT applications, and you have the desktop, which runs x86-compatible applications.

The Xbox One borrows more than a little from this model.

The home screen, which I am tempted to call the Start Screen, has a very familiar tiled interface. The tiles are not identical to Windows but they are definitely consistent with it. This interface allows for access to Internet Explorer and an assortment of apps. These apps can be pinned to the side of the screen, much like snapped Windows 8 Modern apps. I am expecting "a lot of crossover" (to say the least) between this and the Windows Store; I would not be surprised if it is basically the same API. This works both when viewing entertainment content as well as within a game.

Xbox_Home_UI_EN_US_Male_SS.jpg

These three operating systems run at the same time. The main operating system is basically a Hyper-V environment which runs the two other operating systems simultaneously in sort-of virtual machines. These operating systems can be layered with low latency, since all you are doing is compositing them in a different order.
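
As a rough illustration of what "compositing them in a different order" means, picture each environment rendering into its own surface while the host simply restacks them; the structure below is purely hypothetical, since Microsoft has not published this interface:

    # Hypothetical sketch: each OS draws to its own surface and the host
    # hypervisor composites the stack bottom-to-top. Snapping an app over
    # a running game is a z-order change, not a VM switch.
    surfaces = ["app_os", "game_os"]  # game currently drawn on top

    def bring_to_front(stack: list, name: str) -> list:
        stack.remove(name)
        stack.append(name)
        return stack

    print(bring_to_front(surfaces, "app_os"))
    # -> ['game_os', 'app_os']: the app layer now draws over the game.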

Lastly, they made reference to Xbox Live, go figure. Microsoft is seriously increasing their server capacity and expects developers to utilize Azure infrastructure to offload "latency-insensitive" computation for games. While Microsoft promises that you can play games offline, this obviously does not apply to features (or whole games) which rely upon the back-end infrastructure.

xbox-one-live.jpg

And yes, I know you will all beat up on me if I do not mention the SimCity debacle. Maxis claimed that much of the game requires an online connection due to the complicated server requirements; after a crack allowed offline functionality, it was clear that the game mostly operates fine on a local client. How much will the Xbox Live cloud service offload? Who knows, but that is at least their official word.

Now to tie up some loose ends. The Xbox One will not be backwards compatible with Xbox 360 games, although that is no surprise. Also, Microsoft says they are allowing users to resell and lend games. That said, games will be installed and will not require the disc, from what I have heard. Apart from the concerns about how much you can fit on a single 500GB drive, rumor has it that once the game is installed, loading it elsewhere (the rumor is even more unclear about whether "elsewhere" means another account or another machine) will require paying a fee to Microsoft. In other words? Basically not a used game.

Well, that about covers it. You can be sure we will add more as information comes forth. Comment away!

Source: Xbox.com

JPR Releases Q1 2013 GPU Market Share Numbers, Good News for NVIDIA

Subject: Graphics Cards | May 20, 2013 - 12:54 PM |
Tagged: VIA, Q1 2013, nvidia, jpr, Intel, gpu market share, amd

Market analytics firm Jon Peddie Research recently released estimated market share and GPU shipment numbers for Q1 2013. The report includes information on AMD, NVIDIA, Intel, and VIA and covers IGPs, processor graphics, and discrete GPUs included in desktop and mobile systems powered by x86 hardware. The report includes x86 tablets but otherwise does not factor in GPUs used in ARM devices like NVIDIA's Tegra chips. Year over year, the PC market is down 12.6% and the GPU market declined by 12.9%. It is not all bad news for the PC market and discrete GPU makers, however. Discrete GPUs are expected to exhibit a compound annual growth rate (CAGR) of 2.6% through 2016, with as many as 394 million shipped in 2016 alone.
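
As a sanity check on what a 2.6% compound annual growth rate implies, the standard CAGR formula can be run backwards from the 394 million figure; the 2013 baseline below is derived from the quoted numbers, not taken from the report itself:

    # CAGR compounding: units_future = units_now * (1 + rate)^years.
    def project(units: float, cagr: float, years: int) -> float:
        return units * (1 + cagr) ** years

    units_2016 = 394e6
    implied_2013 = units_2016 / (1 + 0.026) ** 3
    print(f"Implied 2013 baseline: {implied_2013 / 1e6:.0f} million units")
    print(f"Check: {project(implied_2013, 0.026, 3) / 1e6:.0f} million in 2016")
    # -> roughly 365 million units growing at ~2.6% per year reaches
    #    394 million three years later.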

In Q1 2013, the PC market was down 13.7% versus the previous quarter (Q4 2012), but the GPU market only declined 3.2%. This discrepancy is explained as the result of people adding multiple GPUs to a single PC system, including adding a single discrete card to a system that already has processor graphics or an APU. By the end of Q1 2013, Intel holds 61.8% market share, followed by AMD in second place with 20.2% and NVIDIA with 18%. Notably, VIA is out of the game with 0.0% market share.

Q12013 Market Share.jpg

In terms of GPU shipments, NVIDIA had a relatively good first quarter of this year, with an increase of 7.6% for notebook GPUs and desktop GPU shipments that remained flat. Overall, NVIDIA saw an increase in PC graphics shipments of 3.6%. On the other hand, x86 CPU giant Intel saw desktop and notebook GPUs slip by 3% and 6.3% respectively; overall, that amounts to PC graphics shipments that fell by 5.3%. In between NVIDIA and Intel, AMD moved 30% more desktop chips (including APUs) versus Q4 2012, while its notebook chips (including APUs) fell by 7.3%. AMD's overall PC graphics shipments fell by 0.3%.

In all, this is decent news for the PC market as it shows that there is still interest in desktop GPUs. The PC market itself is declining and taking the GPU market with it, but it is far from the death of the desktop PC. It is interesting that NVIDIA (which announced Q1'13 revenue of $954.7 million) managed to push more chips while AMD and Intel were on the decline, since NVIDIA doesn't have an x86 CPU with integrated graphics. I'm looking forward to seeing where NVIDIA stands in the mobile GPU market, which does include ARM-powered products.

HP SlateBook x2: Tegra 4 on Android 4.2.2 in August

Subject: General Tech, Graphics Cards, Processors, Mobile | May 15, 2013 - 09:02 PM |
Tagged: tegra 4, hp, tablets

Sentences containing the words "Hewlett-Packard" and "tablet" can end in a question mark, an exclamation mark, or, on occasion, a period. The gigantic multinational technology company tried to own a whole mobile operating system with its purchase of Palm, then abandoned those plans just as abruptly with a $99 liquidation of $500 tablets so successful, go figure, that they to some extent did it twice. The operating system was open sourced, and at some point LG swooped in and bought it, minus patents, for use in Smart TVs.

So how about that Android?

HP-slatex2-01.jpg

The floodgates are open on Tegra 4, with HP announcing their SlateBook x2 hybrid tablet just a single day after NVIDIA's SHIELD moved out of the projects, dropping the "Project" from its name. The SlateBook x2 uses the Tegra 4 processor to power Android 4.2.2 Jelly Bean along with the full Google experience, including the Google Play store. Alongside Google Play, the SlateBook and its Tegra 4 processor also get access to TegraZone and NVIDIA's mobile gaming ecosystem.

As for the device itself, it is a 10.1" Android tablet which can dock into a keyboard for extended battery life, I/O ports, and, well, a hardware keyboard. You are able to attach this tablet to a TV via HDMI, alongside the typical USB 2.0, combo audio jack, and a full-sized SD card slot; which half of the device hosts any given port is anyone's guess, however. Wirelessly, you have WiFi a/b/g/n and some unspecified version of Bluetooth.

HP-slatex2-02.jpg

The raw specifications list follows:

  • NVIDIA Tegra 4 SoC
    • ARM Cortex A15 quad core @ 1.8 GHz
    • 72 "Core" GeForce GPU @ ~672MHz, 96 GFLOPS
  • 2GB DDR3L RAM ("Starts at", maybe more upon customization?)
  • 64GB eMMC SSD
  • 1920x1200 10.1" touch-enabled IPS display
  • HDMI output
  • 1080p rear camera, 720p front camera with integrated microphone
  • 802.11a/b/g/n + Bluetooth (4.0??)
  • Combo audio jack, USB 2.0, SD Card reader
  • Android 4.2.2 w/ Full Google and TegraZone experiences.

If this excites you, then you only have to wait until some point in August; you will also, of course, need to wait until you save up about $479.99 plus tax and shipping.

Source: HP

Never Settle Reloaded Bundle Gets a Level Up Bonus

Subject: Graphics Cards | May 15, 2013 - 12:00 PM |
Tagged: tomb raider, never settle reloaded, never settle, level up, Crysis 3, bundle, amd

AMD dropped us a quick note to let us in on another limited-time offer for buyers of AMD Radeon graphics cards.  Starting today, the Never Settle Reloaded bundle that we first told you about in February is getting an upgrade for select tiers.  For new buyers of the Radeon HD 7970, 7950 and 7790, AMD will be adding Tomb Raider into the mix.  Also, the Radeon HD 7870 will be getting Crysis 3.

Here is the updated, currently running AMD Radeon Level Up bundle matrix.

nslevelup2.jpg

Now if you buy a new AMD Radeon HD 7970, HD 7950 or HD 7870 today you will get four top-level PC games including Crysis 3, Bioshock Infinite, Far Cry 3: Blood Dragon and Tomb Raider. 

This is a limited time offer though that will end when supplies run out and we don't really have any idea when that will be.  Check out AMD's Level Up site for more details and to find retailers offering the updated bundles. 

I am curious to find out how successful these bundles have been for AMD and whether NVIDIA has had any feedback on the Free-to-Play bundle it offered or the new Metro: Last Light option.  Do gamers put much emphasis on the game bundles that come with each graphics card or does the performance and technology make the difference?

UPDATE: I have seen a couple of questions on whether this Level Up promotion would be retroactive. According to the details I have from AMD, this promotion is NOT retroactive.  If you have already purchased any of the affected cards you will not be getting the additional games.

Source: AMD

Haswell Laptop specs! NEC LaVie L to launch in Japan

Subject: General Tech, Graphics Cards, Processors, Systems, Mobile | May 14, 2013 - 03:54 PM |
Tagged: haswell, nec

While we are not sure when it will be released or whether it will be available in North America, we have found a Haswell laptop. Actually, NEC will release two products in this lineup: a high-end 1080p unit and a lower-end 1366x768 model. Unfortunately, the article is in Japanese.

nec_haswell_01.jpg

IPS displays have really wide viewing angles, even top and bottom.

NEC is known for their higher-end monitors; most people equate Dell UltraSharp panels with professional photo and video production, but Dell's top-end offerings are often a tier below the best from companies like NEC and Eizo. The laptops we are discussing today both contain touch-enabled IPS panels with apparently double the contrast ratio of what NEC considers standard. While these may or may not be the tip-top NEC offerings, NEC should at least be putting in decent screens.

Obviously the headliner for us is the introduction of Haswell. While we do not know exactly which product NEC decided to embed, we do know that they are relying upon it for graphics performance. With the aforementioned higher-end displays, it seems likely that NEC is intending this device for the professional market. A price tag of 190,000 yen (just under $1900 USD) for the lower-end model and 200,000 yen (just under $2000 USD) for the higher-end model further suggests this is their target demographic.

nec_haswell_02.jpg

Clearly a Japanese model.

The professional market does not exactly have huge requirements for graphics performance, but to explicitly see NEC trust Intel for GPU performance is an interesting twist. Intel HD 4000 has been nibbling away, to say the least, at the discrete GPU market share in laptops. I would expect this laptop to contain one of the BGA-based parts, which are soldered onto the motherboard, for the added graphics performance.

As a final note, the higher-end model will also contain a draft 802.11ac antenna. It is expected that network performance could be up to 867 megabits as a result.
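
That 867-megabit figure falls straight out of the 802.11ac PHY math, assuming a two-stream radio on an 80 MHz channel (the typical configuration for early draft-ac parts):

    # 802.11ac PHY rate = data subcarriers x bits per symbol x coding
    # rate x spatial streams / symbol duration.
    subcarriers = 234      # data subcarriers in an 80 MHz channel
    bits_per_symbol = 8    # 256-QAM (MCS 9)
    coding_rate = 5 / 6
    streams = 2            # assumed two-antenna radio
    symbol_time_us = 3.6   # OFDM symbol with short guard interval

    rate = subcarriers * bits_per_symbol * coding_rate * streams / symbol_time_us
    print(f"{rate:.1f} Mbit/s")  # -> 866.7, rounded to the familiar 867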

Of course I could not get away without publishing the raw specifications:

LL850/MS (Price: 200,000 yen):

  • Fourth-generation Intel Core processor with onboard video
  • 8GB DDR3 RAM
  • 1TB HDD w/ 32GB SSD caching
  • BDXL (100-128GB BluRay disc) drive
  • IEEE 802.11ac WiFi adapter, Bluetooth 4.0
  • SDXC, Gigabit Ethernet, HDMI, USB3.0, 2x2W stereo Yamaha speakers
  • 1080p IPS display with touch support
  • Office Home and Business 2013 preinstalled?

LL750/MS (Price: 190,000 yen):

  • Fourth-generation Intel Core processor with onboard video
  • 8GB DDR3 RAM
  • 1TB HDD (no SSD cache)
  • (Optical disc support not mentioned)
  • IEEE 802.11a/b/g/n WiFi adapter, Bluetooth 4.0
  • SDXC, Gigabit Ethernet, HDMI, USB3.0, 2x2W stereo Yamaha speakers
  • 1366x768 (IPS?) touch-enabled display

Pushing $1000 cards to 5760x1200

Subject: Graphics Cards | May 10, 2013 - 07:25 PM |
Tagged: titan, radeon hd 7990, nvidia, amd

If you have been wondering how the two flagship GPUs fare in a battle royal of pure frame rate, you can satisfy your curiosity at [H]ard|OCP.  They have tested both NVIDIA's TITAN and the finally released HD 7990 in one of their latest reviews.  Both cards were forced to push out pixels at 5760x1200 and for the most part they tied, which makes sense as they both cost $1000.  The real winner was a pair of HD 7970s in CrossFire, which kept up with the big guns but cost $200 less to purchase.

If that isn't extreme enough for you, they also overclocked the TITAN in a separate review.

HOVERclock.gif

"We follow-up with a look at how the $999 GeForce GTX TITAN compares to the new $999 AMD Radeon HD 7990 video card. What makes this is unique is that the GeForce GTX TITAN is a single-GPU running three displays in NV Surround compared to the same priced dual-GPU CrossFire on a card Radeon HD 7990 in Eyefinity."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

PCPer Live! Frame Rating and FCAT - Your Questions Answered!

Subject: Editorial, Graphics Cards | May 8, 2013 - 11:37 PM |
Tagged: video, nvidia, live, frame rating, fcat

Update: Did you miss the live stream?  Watch the on-demand replay below and learn all about the Frame Rating system, FCAT, input latency and more!!

I know, based solely on the amount of traffic and forum discussion, that our readers have really adopted and accepted our Frame Rating graphics testing methodology.  Based on direct capture of GPU output via an external system and a high-end capture card, our new systems have helped users see GPU performance in a more "real-world" light than previous benchmarks would allow.
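
For readers new to the approach, the core idea is that per-frame render times expose stutter that a plain FPS average hides. A simplified sketch (made-up numbers, and nothing like our actual FCAT tooling):

    # Toy frame-time analysis: the average looks fine, the outliers do not.
    frame_times_ms = [16.7, 16.9, 16.5, 45.0, 16.6, 16.8, 44.0, 16.7]

    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    p95 = sorted(frame_times_ms)[int(0.95 * len(frame_times_ms)) - 1]

    print(f"Average FPS:     {avg_fps:.1f}")   # looks like a steady ~42 FPS
    print(f"95th percentile: {p95:.1f} ms")    # yet some frames take ~44 ms,
    print(f"Worst frame:     {max(frame_times_ms):.1f} ms")  # visible stutter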

I also know that there are lots of questions about the process, the technology and the results we have shown.  In order to try and address these questions and to facilitate new ideas from the community, we are hosting a PC Perspective Live Stream on Thursday afternoon.

Joining me will be NVIDIA's Tom Petersen, a favorite of the community, to talk about NVIDIA's stance on FCAT and Frame Rating, as well as just talk about the science of animation and input. 

fr-2.png

The primary part of this live stream will be about education - not about bashing one particular product line or talking up another.  And part of that education is your ability to interact with us live, ask questions and give feedback.  During the stream we'll be monitoring the chat room embedded on http://pcper.com/live and I'll be watching my Twitter feed for questions from the audience.  The easiest way to get your question addressed, though, will be to leave a comment or inquiry here in this post below.  It doesn't require registration and it will allow us to think about the questions beforehand, giving them a better chance of being answered during the stream.

Frame Rating and FCAT Live Stream

11am PT / 2pm ET - May 9th

PC Perspective Live! Page

So, stop by at 2pm ET on Thursday, May 9th to discuss the future of graphics performance and benchmarking!

AMD to erupt Volcanic Islands GPUs as early as Q4 2013?

Subject: Editorial, General Tech, Graphics Cards, Processors | May 8, 2013 - 09:32 PM |
Tagged: Volcanic Islands, radeon, ps4, amd

So the Southern Islands might not be entirely stable throughout 2013 as we originally reported; seismic activity being analyzed suggests the eruption of a new GPU micro-architecture as early as Q4. These Volcanic Islands, as they have been codenamed, should explode onto the scene opposing NVIDIA's GeForce GTX 700-series products.

It is times like these where GPGPU-based seismic computation becomes useful.

The rumor is based upon a source which leaked a fragment of a slide outlining the processor in block diagram form and specifications of its alleged flagship chip, "Hawaii". Of primary note, Volcanic Islands is rumored to be organized with both Serial Processing Modules (SPMs) and a Parallel Compute Module (PCM).

Radeon9000.jpg

So apparently a discrete GPU can have serial processing units embedded on it now.

Heterogeneous Systems Architecture (HSA) is a set of initiatives to bridge the gap between massively parallel workloads and branching logic tasks. We usually make reference to this in terms of APUs and bringing parallel-optimized hardware to the CPU. In this case, we are discussing it in terms of bringing serial processing to the discrete GPU. According to the diagram, the chip would contain 8 processor modules, each with two processing cores and an FPU, for a total of 16 cores. There does not seem to be any definite indication of whether these cores would be based upon AMD's license to produce x86 processors or its other license to produce ARM processors. Unlike an APU, this is heavily skewed towards parallel computation rather than a relatively even balance between CPU, GPU, and chipset features.

Now of course, why would they do that? Graphics processors can do branching logic but it tends to sharply cut performance. With an architecture such as this, a programmer might be able to more efficiently switch between parallel and branching logic tasks without doing an expensive switch across the motherboard and PCIe bus between devices. Josh Walrath suggested a server containing these as essentially add-in card computers. For gamers, this might help out with workloads such as AI which is awkwardly split between branching logic and massively parallel visibility and path-finding tasks. Josh seems skeptical about this until HSA becomes further adopted, however.
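
A toy model makes the intuition concrete. Suppose every branchy AI decision currently costs a round trip between CPU and GPU over PCIe, while an on-die serial module would make that handoff nearly free; every number below is invented purely for illustration:

    # Invented latencies: the point is the ratio, not the absolute values.
    PCIE_ROUND_TRIP_US = 10.0  # assumed host <-> discrete GPU hop
    ON_DIE_HANDOFF_US = 0.1    # assumed GPU <-> on-die SPM switch
    PARALLEL_WORK_US = 5.0     # visibility/path-finding work per step

    def frame_cost_us(steps: int, hop_us: float) -> float:
        """Each step alternates parallel work with one branchy decision."""
        return steps * (PARALLEL_WORK_US + hop_us)

    steps = 200  # branchy AI decisions per frame
    print(f"Across PCIe: {frame_cost_us(steps, PCIE_ROUND_TRIP_US):.0f} us")
    print(f"On-die SPM:  {frame_cost_us(steps, ON_DIE_HANDOFF_US):.0f} us")
    # -> 3000 vs 1020 us per frame: once decisions are frequent, the hop
    #    across the bus dominates the cost.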

Still, there is a reason why they are implementing this now. I wonder, if the SPMs are based upon simple x86 cores, how the PS4 will influence PC gaming. Technically, a Volcanic Islands GPU would be an oversized PS4 on an add-in card. This could give AMD an edge, particularly in games ported to the PC from the PlayStation.

This chip, Hawaii, is rumored to have the following specifications:

  • 4096 stream processors
  • 16 serial processor cores on 8 modules
  • 4 geometry engines
  • 256 TMUs
  • 64 ROPs
  • 512-bit GDDR5 memory interface (GDDR5, much like the PS4, though the console's bus is 256-bit)
  • 20 nm Gate-Last silicon fab process
    • Unclear if TSMC or "Common Platform" (IBM/Samsung/GLOBALFOUNDRIES)

Softpedia is also reporting on this leak. Their addition claims that the GPU will be designed on a 20nm gate-last fabrication process. While gate-last is generally considered not worth the extra effort in production, Fully Depleted Silicon On Insulator (FD-SOI) is apparently "amazing" on gate-last at 28nm and smaller nodes. This could mean that AMD is eyeing that technology and making this design with the intent of switching to an FD-SOI process later, without the large redesign that an initially easier gate-first production would require.

Well that is a lot to process... so I will leave you with an open question for our viewers: what do you think AMD has planned with this architecture, and what do you like and/or dislike about what your speculation would mean?

Source: TechPowerUp

Zotac Announces Overclocked GTX TITAN AMP! Edition Graphics Card

Subject: Graphics Cards | May 1, 2013 - 05:41 AM |
Tagged: zotac, nvidia, gtx titan, AMP!

Zotac has announced a new GTX TITAN graphics card that will fall under the company’s AMP! Edition branding. This new Titan graphics card will feature factory overclocks on both the GPU and GDDR5 memory. However, due to NVIDIA’s restrictions, the Zotac GeForce GTX TITAN AMP! Edition does not feature a custom cooler or PCB.

The Zotac TITAN AMP! Edition card features a single GK110 GPU with 2,688 CUDA cores clocked at 902MHz base and 954MHz boost. That is a healthy boost over the reference TITAN's 836MHz base and 876MHz boost clock speeds. Further, while Zotac's take on the TITAN retains the reference specification's 6GB of GDDR5 memory, that memory is impressively overclocked to 6,608 MHz (especially impressive since Zotac has not changed the cooler). Many reference cards might be able to replicate the GPU clocks, though; for example, Ryan managed to get his card up to a 992MHz boost in his review of the NVIDIA GTX TITAN.
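
Expressed as percentages (assuming the reference TITAN's standard 6,008 MHz effective memory clock), the factory overclock works out as follows:

    # Zotac AMP! clocks versus NVIDIA reference TITAN clocks, in MHz.
    # The reference memory clock is assumed at 6,008 MHz effective.
    reference = {"base": 836, "boost": 876, "memory": 6008}
    amp = {"base": 902, "boost": 954, "memory": 6608}

    for key in reference:
        gain = amp[key] / reference[key] - 1
        print(f"{key:6s}: {reference[key]} -> {amp[key]} (+{gain:.1%})")
    # -> base +7.9%, boost +8.9%, memory +10.0%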

Zotac GeForce GTX Titan AMP! Edition.jpg

The card has two DL-DVI, one HDMI, and one DisplayPort video outputs. The cooler, PCB, and PCI-E power specifications are all the same as the reference design; you can find more details on the heatsink in the TITAN review. Not allowing vendors to use custom coolers is disappointing and possibly limits the factory GPU overclocks that they are able/willing to offer and support, but within that restriction the Zotac AMP! Edition looks to be a decent card so long as the (not yet announced) price premium over the $999 NVIDIA reference card is minimal.

Source: Zotac

XFX Announces Malta Dual-GPU Radeon HD 7990

Subject: Graphics Cards | April 24, 2013 - 10:14 PM |
Tagged: xfx, malta, hd 7990, GCN, dual gpu, amd

Now that AMD’s dual-GPU Malta graphics card is official, cards from Add-In Board (AIB) partners are starting to roll in. One such recently announced card is the XFX Radeon HD 7990. The XFX card is based on the reference AMD design, which includes two Radeon HD 7970 GPUs in a CrossFire configuration.

The two GPUs can boost up to 1GHz clock speeds and feature a total of 4096 stream processors, 256 texture units, 64 ROPs, and 8.6 billion transistors. The card also includes 3GB of GDDR5 memory per GPU running off a 384-bit bus. It supports AMD’s Eyefinity technology and offers up one DL-DVI and four mini-DisplayPort video outputs.

XFX Radeon HD 7990.jpg

The XFX HD 7990 uses the reference AMD heatsink as well, which includes a massive aluminum fin stack with five copper heatpipes that run the length of the heatsink and directly touch the two 7970 GPUs. Three shrouded fans, in turn, keep the heatsink cool.

The dual-GPU monster is eligible for AMD’s Never Settle bundle which includes eight free games. With purchase of the HD 7990 (from any eligible AIB), you get free key codes for the following games:

  • Bioshock Infinite
  • Crysis 3
  • Deus Ex: Human Revolution
  • Far Cry 3
  • Far Cry 3: Blood Dragon
  • Hitman: Absolution
  • Sleeping Dogs
  • Tomb Raider

The XFX press release further assures gamers that the card can, in fact, play Crysis 3 at maximum settings at a resolution of 3840 x 2160. The company did not mention pricing, however.

For those interested in AMD’s new Malta GPU, check out our review as well as how the card performs when paired with a prototype AMD driver that seeks to address some of the frame rating issues exhibited by AMD's Crossfire multi-GPU solution.

Source: XFX

PowerColor Launches HD 7990 V2 Based On Official AMD Malta GPU

Subject: Graphics Cards | April 24, 2013 - 07:09 PM |
Tagged: amd, powercolor, hd 7990, malta, dual gpu, crossfire

PowerColor (a TUL Corporation brand) has launched its dual-GPU Radeon HD 7990 V2 graphics card, and this time the card is based on the (recently reviewed) official dual-GPU AMD “Malta” design announced at the Game Developers Conference (GDC). The new HD 7990 V2 features two AMD HD 7970 GPUs in a CrossFire configuration. That means the Malta-based card features a total of 4096 stream processors and a rated 8.2 TFLOPS of peak performance.

PowerColor HD 7990 V2 Malta.jpg

The PowerColor HD 7990 V2 joins the company’s existing Devil 13 and HD 7990 graphics cards. The new card sports a triple-fan shrouded heatsink that is somewhat tamer-looking than the custom Devil 13’s. Other hardware includes 3GB of GDDR5 RAM per GPU clocked at 1500MHz and running on a 384-bit bus (again, per GPU) for a total of 6GB. Both GPUs have clock speeds of 950MHz base and up to 1GHz boost.

PowerColor HD 7990 V2 Malta GPU.jpg

The new card has a single DL-DVI and four mini-DisplayPort video outputs. PowerColor is touting the card’s Eyefinity prowess as well as its ZeroCore support for reducing power usage when idle. The card is powered by two PCI-E power connections, with PowerColor recommending a 750W power supply. In all, the HD 7990 V2 measures 305 x 110 x 38mm. While PowerColor has not released pricing or availability, expect the card to be available soon at around the same price as (or a bit lower than) its existing (custom) HD 7990.

The full press release can be found here.

Source: PowerColor

New GeForce Game-Ready Drivers Just in Time for 'Dead Island: Riptide,' 'Star Trek', 'Neverwinter'; Boost Performance up to 20%

Subject: Graphics Cards | April 23, 2013 - 03:53 PM |
Tagged: nvidia, graphics drivers, geforce, 320.00 beta

NVIDIA rolled out a new set of beta drivers that provide up to 20% faster performance and have been optimized for a handful of new titles, including Dead Island: Riptide, Star Trek, and Neverwinter.

GeForce 320.00 beta drivers are now available for automatic download and installation using GeForce Experience, the easiest way to keep your drivers up to date.

With a single click in GeForce Experience, gamers can also optimize the image quality of top new games like Dead Island: Riptide and have it instantly tuned to take full advantage of their PC’s hardware.

Here are examples of the performance increases in GeForce 320.00 drivers (measured with GeForce GTX 660):

  • Up to 20% in Dirt: Showdown
  • Up to 18% in Tomb Raider
  • Up to 8% in StarCraft II
  • Up to 6% in other top games like Far Cry 3

For more details, refer to the release highlights on the driver download pages and read the GeForce driver article on GeForce.com.

Enjoy the new GeForce Game Ready drivers and let us know what you think.

nvidia-geforce-320-00-beta-drivers-gtx-680-performance.png

Windows Vista/Windows 7 Fixed Issues

  • The Windows 7 Magnifier window flickers. [1058231]
  • Games default to stereoscopic 3D mode after installing the driver. [1261633]
  • [GeForce 330M][Notebook]: The display goes blank when rebooting the notebook after installing the driver. [1239252]
  • [Crysis 3]: There are black artifacts in the game. [1251495]
  • [Dirt 3]: When ambient occlusion is enabled, there is rendering corruption in the game while in split-screen mode. [1253727]
  • [3DTV Play][Mass Effect]: The NVIDIA Control Panel “override antialiasing” setting does not work when stereoscopic 3D is enabled. [1220312]
  • [Microsoft Flight Simulator]: Level D Simulations add-on aircraft gauges are not drawn correctly. [899771]
  • [GeForce 500 series][Stereoscopic 3D][Two Worlds 2]: The application crashes when switching to windowed mode with stereoscopic 3D enabled. [909749]
  • [GeForce 660 Ti][All Points Bulletin (APB) Reloaded]: The game crashes occasionally, followed by a black/grey/red screen. [1042342]
  • [GeForce GTX 680][Red Orchestra 2: Heroes of Stalingrad]: Red-screen crash occurs after exiting the game. [1021046]
  • [GeForce 6 series][Final Fantasy XI]: TDR crash occurs in the game when using the Smite of Rage ability. [1037744]
  • [SLI][Surround][GeForce GTX TITAN][Tomb Raider]: There is corruption in the game and the system hangs when played at high resolution and Ultra or Ultimate settings. [1254359]
  • [3D Surround][SLI][GeForce 500 series]: With Surround enabled, all displays may not be activated when selecting Activate All Displays from the NVIDIA Control Panel -> Set SLI Configuration page. [905544]
  • [SLI][StarCraft II][3D Vision]: The game crashes when played with 3D Vision and SLI enabled. [1253206]
  • [SLI][GeForce GTX 680][Tomb Raider (2013)]: The game crashes and a TDR occurs while running the game at Ultra settings. [1251578]
  • [SLI][Call of Duty: Black Ops 2]: The player emblems are not drawn correctly.

Source: NVIDIA

Upcoming Never Settle Bundle Games Leaked

Subject: Graphics Cards | April 23, 2013 - 10:05 AM |
Tagged: amd, never settle, never settle reloaded, bundle

While browsing around on Twitter today I saw mention of a leaked slide on the Tech Report forums that seems to point in the direction of upcoming games to be included in future AMD Never Settle gaming bundles.  AMD has been knocking the ball out of the park when it comes to software bundled with graphics card releases, as they have secured essentially every major PC game of the last 12 months. 

neversettle.jpg

This slide indicates that GRID 2, Company of Heroes 2, Total War: Rome II, Splinter Cell Blacklist, Lost Planet 3, Battlefield 4, Raven's Cry and Watch Dogs will all eventually make their way to the AMD bundle list at some point this year.  Whether it will be in one mega-bundle or several different promotions throughout the year isn't known, but AMD is serious about keeping up appearances on the PC gaming front.

Source: Tech Report

Raja Koduri Returns to AMD After 4 Years at Apple

Subject: General Tech, Graphics Cards | April 19, 2013 - 02:51 PM |
Tagged: raja koduri, apple, amd

Interesting information has surfaced today about the addition of a new executive at AMD.  Raja Koduri, who previously worked for ATI and AMD as Chief Technology Officer, departed the company in 2009 for a four-year stint at Apple, helping to turn that company into an SoC powerhouse.  Developing its own processors has enabled Apple to stand apart from the competition in many mobile spaces, and Koduri is partly responsible for that technological shift at Apple.

Starting on Monday, though, Raja Koduri is officially back at AMD, taking over as CVP (Corporate Vice President) of Visual Computing.  This position will give him more complete control over the entirety of the hardware and software platforms AMD is developing, including desktop discrete, mobile, and APU/SoC designs.  This marks the second major returning visionary executive in recent memory for AMD, the first of which was Jim Keller in August of 2012 (also returning from a period with Apple). 

koduri1.JPG

It will take some time for Koduri to have an effect on AMD's current roadmap

Having known Raja Koduri for quite a long time, I have always seen the man as an incredibly intelligent engineer who was able to find strengths in designs that others could not.  Much of the success of the ATI/AMD GPU divisions during the 2000s was due to Koduri's leadership (among others, of course) and I think having him back at AMD in an even more senior role is great news for both discrete graphics fans and APU users. 

In a discussion with Koduri recently, Anandtech got some positive feedback for PC gamers:

Raja believes there’s likely another 15 years ahead of us for good work in high-end discrete graphics, so we’ll continue to see AMD focus on that part of the market.

koduri2.jpg

Koduri sees 15 years more GPU evolution

So even though this hiring isn't going to change AMD's position on the APU and SoC strategy, it is good to have someone at the CVP level who sees the importance and value of discrete, high-power GPU technology. 

In many talks with AMD over the last 6 months we kept hearing about a healthy influx of quality personnel, though much of it was still under wraps.  Keller was definitely one of them and Koduri is another; both hires give a lot of hope for AMD as a company going forward.  Some in the industry have already written AMD off, but I find it hard to believe that this caliber of executive would return to a sinking ship. 

Source: CNET

SEIKI SE50UY04 50-in 4K 3840x2160 TV Unboxing and Preview

Subject: General Tech, Graphics Cards, Displays | April 18, 2013 - 08:52 PM |
Tagged: video, seiki, se50UY04, hdtv, hdmi 1.4, displays, 4k, 3840x2160

This just in!  We have a 4K TV in the PC Perspective Offices!

seiki1.jpg

While we are still working on the ability to test graphics card performance at this resolution with our Frame Rating capture system, we decided to do a live stream earlier today as we unboxed, almost dropped and then eventually configured our new 4K TV. 

The TV in question?  A brand new SEIKI SE50UY04 50-in 3840x2160-ready display.  Haven't heard of it?  Neither have we.  I picked it up over the weekend from TigerDirect for $1299, though it is actually a bit higher now at $1499.

seiki2.jpg

The TV itself is pretty unassuming and other than looking for the 4K label on the box you'd be hard pressed to discern it from other displays.  It DID come with a blue, braided UHD-ready HDMI cable, so there's that.

seiki3.jpg

One point worth noting is that the stand on the TV is pretty flimsy; there was definitely wobble after installation and setup.

seiki4.jpg

Connecting the TV to our test system was pretty easy - only a single HDMI cable was required, and the GeForce GTX 680s in SLI we happened to have on our test bed recognized it as a 3840x2160-capable display.  Keep in mind that you are limited to a refresh rate of 30 Hz, though, due to current limitations of HDMI 1.4.  The desktop was clear and sharp, and if you like screen real estate...this has it. 
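
The 30 Hz ceiling is simple bandwidth math: HDMI 1.4's TMDS link tops out at a 340 MHz pixel clock, and the standard UHD timings (4400 x 2250 total, including blanking, around the 3840x2160 active area) only fit under that limit at 30 Hz:

    # Pixel clock = total horizontal x total vertical x refresh rate.
    HDMI_14_MAX_MHZ = 340          # TMDS character rate ceiling
    H_TOTAL, V_TOTAL = 4400, 2250  # CTA-861 timing for 3840x2160 active

    for hz in (30, 60):
        clock_mhz = H_TOTAL * V_TOTAL * hz / 1e6
        verdict = "fits" if clock_mhz <= HDMI_14_MAX_MHZ else "exceeds HDMI 1.4"
        print(f"3840x2160 @ {hz} Hz -> {clock_mhz:.0f} MHz ({verdict})")
    # -> 297 MHz at 30 Hz squeaks under the limit; 594 MHz at 60 Hz is
    #    far beyond what HDMI 1.4 can carry.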

The first thing we wanted to try was some 4K video playback and we tried YouTube videos, some downloaded clips we found scattered across the Internet and a couple of specific examples I had been saving.  Isn't that puppy cute?  It was by far the best picture I had seen on a TV that close up - no other way to say it.

We did have issues with video playback in some cases due to high bit rates.  In one case we had an uncompressed YUV file that was hitting our SSD so hard on reads that we saw choppiness.  H.265 save us!

seiki5.jpg

And of course we demoed some games as well - Battlefield 3, Crysis 3, Skyrim and Tomb Raider.  Each was able to run at 3840x2160 without any complaints or INI hacks.  They all looked BEAUTIFUL when in a still position but we did notice some flickering on the TV that might be the result of the 120 Hz interpolation and possibly the "dynamic luminance control" feature that SEIKI has. 

We'll definitely test some more on this in the coming days to see if we can find a solution as I know many PC gamers are going to be excited about the possibility of using this as a gaming display!  We are working on a collection of benchmarks on some of the higher end graphics solutions like the GeForce TITAN, GTX 680s, HD 7990 and HD 7970s!

If you want to check out the full experience of our unboxing and first testing, check out the full live stream archived below!!

A good value card that can become great once overclocked, ASUS' HD 7850 DirectCU II

Subject: Graphics Cards | April 18, 2013 - 04:15 PM |
Tagged: asus, HD 7850 DirectCU II

With a custom cooler, one DisplayPort, two DVI-I connectors, and one HDMI connector, all while requiring only a single 6-pin PCIe power connector, the ASUS HD 7850 DirectCU II is a great blend of efficiency and flexibility for those looking for a card that costs around $200.  On the other hand, if you have no plans to overclock the card, the GTX 660 which [H]ard|OCP compared this card to is slightly more powerful, costs the same, and is a better choice for those planning on running dual GPUs.  Check out the overclocked performance of this HD 7850 in the full review. 

H_cu2.jpg

"ASUS has refreshed its AMD Radeon HD 7850 DirectCU II video card with DirectCU and DIGI+ VRM with Super Alloy Power, poised to give you a robust video card with an improved overclocking experience. We will see whether this new revision brings new value to the Radeon HD 7850 GPU and we will compare it to the NVIDIA GeForce GTX 660."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

NVIDIA Bundles Metro: Last Light with GTX 660 and Higher

Subject: Graphics Cards | April 16, 2013 - 10:24 PM |
Tagged: nvidia, metro last light, Metro

Late this evening we got word from NVIDIA about an update to its game bundle program for GeForce GTX 600 series cards.  Replacing the previously running Free to Play bundle, which included $50 of in-game credit for each of World of Tanks, Hawken and PlanetSide 2, NVIDIA is moving back to a AAA game with Metro: Last Light.

metro.jpg

Metro: Last Light is the sequel to the surprise hit from 2010, Metro 2033, and I am personally really looking forward to the game and to seeing how it can stress PC hardware like the first did. 

metro2.jpg

This bundle is only good for GTX 660 cards and above with the GTX 650 Ti sticking with the Free to Play $75 credit offer.

NVIDIA today announced that gamers who purchase an NVIDIA GeForce GTX 660 or above will also receive a copy of the highly anticipated Metro: Last Light, published by Deep Silver and the sequel to the multi-award-winning Metro 2033. Metro: Last Light will be available May 14, 2013 within the US and May 17, 2013 across Europe.

The deal is already up and running on Newegg.com but with the release date of Metro: Last Light set at May 14th, you'll have just about a month to wait before you can get your hands on it.

How do you think this compares to AMD's currently running bundle with Bioshock Infinite and more?  Did NVIDIA step up its game this time around?

Source: NVIDIA