Manufacturer: NVIDIA

A powerful architecture

In March of this year, NVIDIA announced the GeForce GTX Titan Z at its GPU Technology Conference. It was touted as the world's fastest graphics card thanks to its pair of full GK110 GPUs, but it came with an equally stunning price of $2999. NVIDIA claimed it would be available by the end of April for gamers and CUDA developers to purchase, but the launch was pushed back slightly and the card finally went on sale at the very end of May at the promised $2999.

The specifications of the GTX Titan Z are damned impressive - 5,760 CUDA cores, 12GB of total graphics memory, 8.1 TFLOPs of peak compute performance. But something happened between the announcement and the product release that perhaps NVIDIA hadn't accounted for. AMD's Radeon R9 295X2, a dual-GPU card with full-speed Hawaii chips on board, was released at $1499. I think it's fair to say that AMD took some chances that NVIDIA was surprised to see them take, including going the route of a self-contained water cooler and blowing past the PCI Express recommended power limits to offer a ~500 watt graphics card. The R9 295X2 was damned fast, and I think it caught NVIDIA a bit off-guard.

As a result, the GeForce GTX Titan Z release was a bit quieter than most of us expected. Yes, the Titan Black card was released without sampling the gaming media, but that card was nearly a mirror of the GeForce GTX 780 Ti, just with a larger frame buffer, and the performance of that GPU was well known. For NVIDIA to release a flagship dual-GPU graphics card, admittedly the most expensive one I have ever seen with the GeForce brand on it, and NOT send out samples, was telling.

NVIDIA is adamant, though, that the primary target of the Titan Z is not just gamers but the CUDA developer who needs the most performance possible in as small a space as possible. For that specific user, one who doesn't quite have the budget to invest in a lot of Tesla hardware but wants to develop and run CUDA applications with a significant amount of horsepower, the Titan Z fits the bill perfectly.

Still, the company was touting the Titan Z as "offering supercomputer class performance to enthusiast gamers" and telling gamers in launch videos that the Titan Z is the "fastest graphics card ever built" and that it was "built for gamers." So, interest piqued, we decided to review the GeForce GTX Titan Z.

The GeForce GTX TITAN Z Graphics Card

Cost and performance notwithstanding, the GeForce GTX Titan Z is an absolutely stunning looking graphics card. The industrial design that started with the GeForce GTX 690 (the last dual-GPU card NVIDIA released) and continued with the GTX 780 and Titan family lives on with the Titan Z.

IMG_0270.JPG

The all-metal finish looks good and stands up to abuse, keeping that PCB straight even with the heft of the heatsink. There is only a single fan on the Titan Z, center mounted, with a large heatsink covering the two GPUs on either side of it. The GeForce logo up top illuminates, as we have seen on all similar designs, which adds a nice touch.

Continue reading our review of the NVIDIA GeForce GTX Titan Z 12GB Graphics Card!!

NVIDIA Finally Launches GeForce GTX Titan Z Graphics Card

Subject: Graphics Cards | May 28, 2014 - 11:19 AM |
Tagged: titan z, nvidia, gtx, geforce

Though delayed by a month, today marks the official release of NVIDIA's Titan Z graphics card, the dual-GK110 beast with the $3000 price tag. The massive card was shown for the first time in March at NVIDIA's GPU Technology Conference, and our own Tim Verry was on the ground to get the information.

The details remain the same:

Specifically, the GTX TITAN Z is a triple-slot graphics card that marries two full GK110 (big Kepler) GPUs for a total of 5,760 CUDA cores, 480 TMUs, and 96 ROPs with 12GB of GDDR5 memory in total (6GB per GPU, each on its own 384-bit bus). For the truly adventurous, it appears possible to SLI two GTX Titan Z cards using the single SLI connector. Display outputs include two DVI, one HDMI, and one DisplayPort connector.

The difference now of course is that all the clock speeds and pricing are official. 

titanzspecs.png

A base clock speed of 705 MHz with a Boost rate of 876 MHz places it well behind the individual GPU performance of a GeForce GTX 780 Ti or GTX Titan Black (rated at 889/980 MHz). The memory clock speed remains the same at 7.0 Gbps and you are still getting a massive 6GB of memory per GPU.
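
For a sense of where the 8.1 TFLOPs figure from the announcement comes from, the math falls straight out of these numbers. Below is a quick back-of-the-envelope sketch (assuming the usual 2 single-precision operations per CUDA core per clock; the variable and function names are ours, purely for illustration):

```python
# Back-of-the-envelope math for the Titan Z figures quoted above.
# Assumes the standard 2 FP32 operations per CUDA core per clock (one fused multiply-add).

CUDA_CORES_TOTAL = 5760   # 2880 per GK110, two GPUs
BASE_CLOCK_GHZ = 0.705
BOOST_CLOCK_GHZ = 0.876
MEM_RATE_GBPS = 7.0       # effective GDDR5 data rate per pin
BUS_WIDTH_BITS = 384      # per GPU

def peak_tflops(cores: int, clock_ghz: float) -> float:
    """Peak single-precision throughput in TFLOPs: cores * 2 ops * clock."""
    return cores * 2 * clock_ghz / 1000.0

def mem_bandwidth_gbs(rate_gbps: float, bus_bits: int) -> float:
    """Memory bandwidth in GB/s for one GPU: data rate * bus width / 8."""
    return rate_gbps * bus_bits / 8.0

print(f"At base clock:  {peak_tflops(CUDA_CORES_TOTAL, BASE_CLOCK_GHZ):.1f} TFLOPs")   # ~8.1
print(f"At boost clock: {peak_tflops(CUDA_CORES_TOTAL, BOOST_CLOCK_GHZ):.1f} TFLOPs")  # ~10.1
print(f"Bandwidth: {mem_bandwidth_gbs(MEM_RATE_GBPS, BUS_WIDTH_BITS):.0f} GB/s per GPU")  # 336
```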

Maybe the most interesting thing about the release of the GeForce GTX Titan Z is that NVIDIA seems to have fixated completely on non-DIY consumers with the card. We did not receive a sample of the Titan Z (nor did we get one of the Titan Black), and when I inquired as to why, NVIDIA PR stated that cards were "only going to CUDA developers and system builders."

geforce-gtx-titan-z-3qtr.png

I think it is more than likely that after the release of AMD's Radeon R9 295X2 dual-GPU graphics card on April 8th, with a price tag of $1500 (half that of the Titan Z), the target audience was redirected. NVIDIA already had its eye on the professional markets that weren't willing to dive into the Quadro/Tesla lines (CUDA developers will likely drop $3k at the drop of a hat to get this kind of performance density). But a side benefit of creating the best flagship gaming graphics card on the planet was probably part of the story - and that title was promptly taken away by AMD.

geforce-gtx-titan-z-bracket.png

I still believe the Titan Z will be an impressive graphics card to behold, both in terms of look and style and in terms of performance. But it would take the BIGGEST of NVIDIA fans to pass up a pair of Radeon R9 295X2 cards for a single GeForce GTX Titan Z. At least that is our assumption until we can test one for ourselves.

I'm still working to get my hands on one of these for some testing as I think the ultra high end graphics card coverage we offer is incomplete without it. 

Several of NVIDIA's partners are going to be offering the Titan Z, including EVGA, ASUS, MSI, and Zotac. Maybe the most interesting, though, is EVGA's water-cooled option!

evgatitanz.jpg

So, what do you think? Anyone lining up for a Titan Z when they show up for sale?

NVIDIA Joins the Bundle Game: Up to $150 in credit on Free-to-Play games for all GTX buyers

Subject: Graphics Cards | February 11, 2013 - 12:33 PM |
Tagged: world of tanks, planetside 2, nvidia, Hawken, gtx, geforce, bundle

AMD has definitely been winning the "game" of game bundles and bonus content with graphics card purchases, as is evident from the recent Never Settle Reloaded campaign that includes titles like Crysis 3, Bioshock Infinite, and Tomb Raider.  I made comments that NVIDIA was falling behind and might even start to look like it had moved away from a focus on PC gamers, since it hadn't made any reply over the last year...

After losing a bidding war with AMD over Crysis 3, today NVIDIA is unveiling a bundle campaign that attacks from a different angle; rather than bundling retail games, NVIDIA is working with free-to-play titles.  How do you give gamers bonuses with free-to-play games?  Credits!  Cold hard cash!

bundle1.png

Starting today, if you pick up any GeForce GTX graphics card you'll be eligible for free in-game credit to use in each of the three free-to-play titles partnering with NVIDIA.  A GTX 650 or GTX 650 Ti will net you $25 in each game for a total bonus of $75, while buying a GTX 660 or higher, all the way up to the GTX 690, results in $50 per game for a total of $150.

Also, after asking NVIDIA about it, this is a PER CARD bundle so if you get an SLI pair of anything, you'll get double the credit.  A pair of GeForce GTX 660s for an SLI rig results in $100 per game, $300 total!
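
Since the credit stacks per game and per card, the totals above fall out of some simple multiplication. A throwaway sketch (the tier labels and function name are just ours for illustration, not anything NVIDIA publishes):

```python
# Quick sanity check on the bundle math described above.

CREDIT_PER_GAME = {"GTX 650 / 650 Ti": 25, "GTX 660 and up": 50}  # dollars, per game
GAMES = ["Planetside 2", "World of Tanks", "Hawken"]

def bundle_credit(tier: str, cards: int = 1) -> int:
    """Total in-game credit: per-game amount x number of games x number of cards."""
    return CREDIT_PER_GAME[tier] * len(GAMES) * cards

print(bundle_credit("GTX 650 / 650 Ti"))         # $75
print(bundle_credit("GTX 660 and up"))           # $150
print(bundle_credit("GTX 660 and up", cards=2))  # $300 for an SLI pair
```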

bundle3.png

This is a very interesting approach for NVIDIA to take, and I am eager to get feedback from our readers on the differences between AMD's and NVIDIA's bundles.  I have played quite a bit of Planetside 2 and definitely enjoyed it; it is a graphics showcase as well, with huge, expansive levels and hundreds of people per server.  I am less familiar with World of Tanks and Hawken, but they are also extremely popular.

bundle2.png

Leave us your comments below!  Do you think NVIDIA's new GeForce GTX gaming bundle for free-to-play game credits can be successful?

If you are looking for a new GeForce GTX card today and this bundle convinced you to buy, feel free to use the links below. 

Manufacturer: MSI

Spicing up the GTX 670

The Power Edition graphics card series is a relatively new addition to MSI's lineup. The Power Edition often mimics the design of the higher-end Lightning series, but at a far lower price (and perhaps with a smaller feature set). This allows MSI to split the difference between reference-class boards and the high-end Lightning cards.

msi_gtx670_pe_01.jpg

Doing this gives users a greater variety of products to choose from and lets them better tailor their purchases to their needs and financial means. Not everyone wants to pay $600 for a GTX 680 Lightning, but what if someone could get similar cooling, quality, and overclocking potential for a much lower price?  This is what MSI has done with one of its latest Power Edition cards.

The GTX 670 Power Edition

The NVIDIA GTX 670 has received accolades throughout the review press. It is a great combination of performance, power consumption, heat production, and price. It certainly caused AMD a great amount of alarm, and the company hurriedly cut prices on its HD 7900 series of cards in response. The GTX 670 is a slightly cut-down version of the full GTX 680, and it runs very close to the clock speed of its bigger brother. In fact, other than texture and stream unit count, the cards are nearly identical.

Continue reading the entire review!

NVIDIA Announces New GTX 680M, The World’s Fastest Mobile GPU

Subject: Mobile | June 4, 2012 - 08:15 PM |
Tagged: nvidia, laptop, gtx, gpu

gtx680m.jpg

NVIDIA recently revised its notebook GPU lineup, but there was one part notably missing: the GTX 680M. The x80M part has been NVIDIA's fastest mobile GPU in each generation for some time, so we knew that a GTX 680M was coming. We had only two questions. When? And what architecture would it be based on?

Now we have the answers to both questions. NVIDIA has pulled the wraps off its flagship component. The new GTX 680M is a Kepler component (unlike other high-end 600M series parts, which are Fermi) packing 1344 CUDA cores and up to 4GB of GDDR5.

gtx680slide1.jpg

The green team is laying out some big numbers in its press release by claiming that performance is up about 80% in comparison to the GTX 580M and 30% in comparison to the AMD Radeon 7970M. NVIDIA also says that the new part will play every game available today at 1080p with maximum in-game settings. 

Other selling points include NVIDIA’s FXAA and TXAA, Adaptive V-Sync, Optimus, SLI, PhysX and 3D Vision. The company is clearly making a strong effort to distinguish itself from AMD not only with performance but also with features. 

gtx680mslide2.jpg

Five launch laptops were announced. They include the Alienware M17x and M18x, the MSI GT70 and the Clevo P150EM/P170EM. The Clevo units are the chassis used by companies like Maingear, Origin and other boutiques. Only the M18x has confirmed SLI support and only the M17x has confirmed 3D Vision support. Pricing has not been announced. 

We will of course be looking to obtain a review unit so we can see if the GTX 680M is as mighty as claimed.

Source: Nvidia

Podcast #187 - Our thoughts on Ultrabooks, the Radeon HD 7950, ASUS DirectCU GTX cards, and more!

Subject: Editorial, General Tech | February 2, 2012 - 03:11 PM |
Tagged: ssd, sandforce, radeon, podcast, patriot, nvidia, Intel, gtx, arm, amd, 7950

PC Perspective Podcast #187 - 02/02/2012

Join us this week as we talk about our thoughts on Ultrabooks, the Radeon HD 7950, ASUS DirectCU GTX cards, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom, and Allyn Malventano

This Podcast is brought to you by MSI Computer, and their all new Sandy Bridge Motherboards!

Program length: 58:02

Program Schedule:

  1. 0:00:40 Introduction
  2. 1-888-38-PCPER or podcast@pcper.com
  3. http://pcper.com/podcast
  4. http://twitter.com/ryanshrout and http://twitter.com/pcper
  5. 0:01:20 Ultrabooks: Intel Knows What's Good For You
  6. 0:08:30 Patriot Pyro and Wildfire SSD Review - IMFT Async vs. Toshiba Toggle-mode Flash
  7. 0:14:20 AMD Radeon HD 7950 3GB Graphics Card Review
  8. 0:25:50 This Podcast is brought to you by MSI Computer, and their all new Sandy Bridge Motherboards!
  9. 0:26:38 Asus DirectCU II Roundup: ENGTX560, ENGTX570, and ENGTX580 Review
  10. 0:40:35 Raspberry Pi Linux Computer Will Have Fast GPU For The Price
  11. 0:44:20 If you thought Intel did well wait until you see ARM
  12. 0:47:00 AMD 7700 and 7800 Release Dates Leak To Web
  13. 0:51:20 Live Blog: AMD Financial Analyst Day
  14. 0:52:20 Hardware / Software Pick of the Week
    1. Ryan: Radeon HD 7950 Cards
    2. Jeremy: I'm giddy as a schoolgirl, albeit a very mercenary one
    3. Josh: And it is on sale! $770 off!
    4. Allyn: Corsair Force 3 - very good pricing.
  15. 1-888-38-PCPER or podcast@pcper.com
  16. http://pcper.com/podcast   
  17. http://twitter.com/ryanshrout and http://twitter.com/pcper
  18. Closing

Manufacturer: Asus

3 NV for DCII

The world of video cards has changed a great deal over the past few years.  Where once we saw only "sticker versions" of cards mass-produced by a handful of manufacturers, we are now seeing some really nice differentiation from the major board partners.  While the first iterations of a new card are typically mass-produced by NVIDIA or AMD and then distributed to their partners for initial sales, these partners are now more consistently getting their own unique versions out to retail in record time.  MSI was one of the first to put out its own unique designs, but now we are seeing Asus become much more aggressive with products of its own.

adcII_01.jpg

The DirectCU II line is Asus' response to the growing number of original designs from other manufacturers.  The easiest way to categorize these designs is that they sit nicely between very high-end, extreme products like the MSI Lightning series and reference-design boards with standard cooling.  These are unique designs that integrate features and cooling solutions well above those of reference cards.

DirectCU II refers primarily to the cooling solutions on these boards.  The copper heatpipes in the DirectCU II cooler are in direct contact with the GPU.  These heatpipes then feed two separate aluminum fin arrays, each with its own fan.  So each card has either a dual-slot or triple-slot cooling solution with two 80 mm fans that dynamically adjust to the temperature of the chip.  The second part of this is branded "Super Alloy Power," in which Asus has upgraded most of the electrical components on the board to meet higher specifications.  Hi-C caps, proadlizers, polymer caps, and higher-quality chokes round out the upgraded components, which should translate into more stable overclocked performance and a longer lifespan.

Read the entire article here.

Battlefield 3 Frame Rate Drop Issue with GeForce GPUs

Subject: Graphics Cards | December 28, 2011 - 09:24 PM |
Tagged: gtx, geforce, bf3

Every once in a while we come across a gaming issue where, when we approach those responsible for it (NVIDIA, AMD, the game developer), they seem as lost as we do.  For the last few days I have been banging my head on the table trying to figure out an issue with GeForce GTX graphics cards and Battlefield 3, and I am hoping that some of YOU might have seen it and can confirm.

While testing our new X79-based GPU test bed, we continued to find that while playing Battlefield 3, frame rates would drop from 30+ to ~10 while running at 2560x1600 with the Ultra quality preset.  It could happen when walking down an empty hallway or in the middle of a huge, dramatic shootout with some enemies.  And sometimes the issue would reverse and the frame rate would jump back up to 30+ FPS.

08.jpg

A 10 frame per second tank?  No thanks...

Even more odd, and something the normal user doesn't monitor: the power consumption of the system would drop significantly during this time.  At 30+ FPS the power draw might be 434 watts, while at the ~10 FPS level the system would draw 100 watts less!  The first theory was that the GPU was dropping into a lower "p-state" due to overheating or some other bug, but when monitoring our GPU-Z logs we saw no clock speed decreases, and temperatures never went above 75C - pretty tame for a GPU.
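
For anyone who wants to log the same counters we were watching in GPU-Z, here is a rough sketch using NVIDIA's NVML library via the pynvml bindings (pip install nvidia-ml-py). This is just one way to do it, and it assumes your driver exposes these queries; wall power in our testing was measured externally, since per-card power readings are not reported on every consumer GPU.

```python
# Poll GPU clock, temperature, and utilization once per second, similar to a GPU-Z log.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        sm_clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_SM)        # MHz
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)   # degrees C
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu                       # percent
        print(f"{time.strftime('%H:%M:%S')}  {sm_clock} MHz  {temp} C  {util}% load")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

A drop in the reported clock speed here would point to a p-state change; in our case the clocks held steady while the frame rate and power draw fell.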

To demonstrate this phenomenon we put together a quick video. 

In the video, you are seeing the "tearing" of Vsync in a much more dramatic fashion because of our capture method.  We were actually outputting a 2560x1600 signal (!!) to an external system to be recorded locally at a very high bit rate.  Unfortunately, we could only muster a ~30 FPS capture frame rate which, coupled with the 60 Hz signal being sent, results in a bit of doubling up on the tearing you might usually see.  Still, the FRAPS-reported frame rates are accurate, and we use an external capture system to remove the possibility of any interference on performance during the capture process.

The hardware used in this video was actually based on an ASUS X58 motherboard and a Nehalem Core i7-965 processor.  But wasn't I just talking about an X79 rig?  Yes, but I rebuilt our old test bed to make sure this bug was NOT related to X79 or Sandy Bridge-E.  The systems that exhibited the issue were:

  • Intel Core i7-3960X
  • ASUS P9X79 Pro
  • 16GB DDR3-1600
  • 600GB VelociRaptor HDD
  • Windows 7 x64 SP1
  • GeForce GTX 580 (two different cards tested)
  • 290.53 Driver

Also:

  • Intel Core i7-965
  • ASUS X58 WS 
  • 6GB DDR3-1600
  • 600GB VelociRaptor HDD
  • Windows 7 x64 SP1
  • GeForce GTX 580 (two different cards tested)
  • 290.53 Driver

For me, this is only occurring at 2560x1600, though I am starting to see more reports of the issue online.

  • Another 560 ti and BF3 FPS Low Or Drop!
    • Well I just Installed my 2nd evga 560 ti DS running SLI and When I play battlefield 3 i get about 60 to 90 fps then drops at
      20 to 30. Goes Up and down, I look at the evga precision looks like each gpu is running at 40% each and changes either up or down.
      Temp. is under 60 degrees c.
  • GTX 560 Ti dramatic FPS drops on BF3 only
    • "having any setting on Ultra will cue dramatic and momentary fps drops into the 30's. if i set everything to High, i will stay above 70 fps with the new beta 285.79 drivers released today (which i thought would fix this problem but didn't). i've been monitoring things with Afterburner and i've noticed that GPU usage will also drop at the same time these FPS drops happen. nothing is occurring in the game or on the screen to warrant these drops, FPS will just drop even when nothing is going on or exploding and i'm not even moving or looking around, just idle. they occur quite frequently as well."
  • BF3 Frame Drops
    • "When i use 4xAA i get abnormal framedrops, even while nothing is going on, on the screen.
      The weird thing is that, when it drops, it always drops to 33/32fps, not higher, not lower.
      It usually happens for a few seconds."
  • BF3 @ 2560x1600 Ultra Settings Preset Unplayable
    • "I know its a beta, but i haven't heard any problems yet about framedrops.
      Sometimes my frames drop from 75fps way back to 30/20 fps, even when nothing is going on, on the screen."

So what gives?  Is this a driver issue?  Is it a Battlefield 3 issue?  Many of these users are running at resolutions other than the 2560x1600 that I am seeing it at - so either there is another problem for them or it affects different cards at different quality levels.  It's hard to say, but doing a search for "radeon bf3 frame drop" pulls up much less incriminating evidence that gamers on that side of the fence are having similar discussions.  

I have been talking with quite a few people at NVIDIA about this and while they are working hard to figure out the source of the frame rate inconsistencies, those of us with GeForce GTX cards may just want to back off and play at a lower resolution or lower settings until the fix is found.