Author:
Manufacturer: NVIDIA

A powerful architecture

In March of this year, NVIDIA announced the GeForce GTX Titan Z at its GPU Technology Conference. It was touted as the world's fastest graphics card with its pair of full GK110 GPUs, but it came with an equally stunning price of $2999. NVIDIA claimed it would be available by the end of April for gamers and CUDA developers to purchase, but it was pushed back slightly and released at the very end of May at the promised $2999 price.

The specifications of the GTX Titan Z are damned impressive - 5,760 CUDA cores, 12GB of total graphics memory, and 8.1 TFLOPS of peak compute performance. But something happened between the announcement and the product release that perhaps NVIDIA hadn't accounted for. AMD's Radeon R9 295X2, a dual-GPU card with full-speed Hawaii chips on-board, was released at $1499. I think it's fair to say that AMD took some chances NVIDIA was surprised to see, including going the route of a self-contained water cooler and blowing past the PCI Express recommended power limits to offer a ~500 watt graphics card. The R9 295X2 was damned fast, and I think it caught NVIDIA a bit off-guard.
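As a quick sanity check on that 8.1 TFLOPS number, here is a back-of-the-envelope sketch (assuming 2 single-precision FLOPs per CUDA core per clock, i.e. one fused multiply-add, at the card's 705 MHz base clock):

    # Rough peak FP32 throughput for the GTX Titan Z
    cuda_cores = 5760             # 2 x 2880 cores (one full GK110 per GPU)
    base_clock_ghz = 0.705        # 705 MHz base clock
    flops_per_core_per_clock = 2  # one fused multiply-add counts as 2 FLOPs

    peak_tflops = cuda_cores * flops_per_core_per_clock * base_clock_ghz / 1000
    print(f"Peak FP32: {peak_tflops:.2f} TFLOPS")  # ~8.12 TFLOPS, matching NVIDIA's 8.1 figure

The same math at the 876 MHz boost clock works out to roughly 10.1 TFLOPS, which is why peak figures for this card vary depending on which clock is quoted.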

As a result of that one-two punch, the GeForce GTX Titan Z release was a bit quieter than most of us expected. Yes, the Titan Black card was released without sampling the gaming media, but that card was nearly a mirror of the GeForce GTX 780 Ti, just with a larger frame buffer, and the performance of that GPU was well known. For NVIDIA to release a flagship dual-GPU graphics card, admittedly the most expensive one I have ever seen with the GeForce brand on it, and NOT send out samples was telling.

NVIDIA is adamant, though, that the primary target of the Titan Z is not just gamers but the CUDA developer who needs the most performance possible in as small a space as possible. For that specific user, one who doesn't quite have the budget to invest in a lot of Tesla hardware but wants to develop and run CUDA applications with a significant amount of horsepower, the Titan Z fits the bill perfectly.

Still, the company was touting the Titan Z as "offering supercomputer class performance to enthusiast gamers" and telling gamers in launch videos that the Titan Z is the "fastest graphics card ever built" and that it was "built for gamers." So, interest piqued, we decided to review the GeForce GTX Titan Z.

The GeForce GTX TITAN Z Graphics Card

Cost and performance notwithstanding, the GeForce GTX Titan Z is an absolutely stunning looking graphics card. The industrial design that started with the GeForce GTX 690 (the last dual-GPU card NVIDIA released) and continued with the GTX 780 and Titan family lives on with the Titan Z.


The all metal finish looks good and stands up to abuse, keeping that PCB straight even with the heft of the heatsink. There is only a single fan on the Titan Z, center mounted, with a large heatsink covering both GPUs on opposite sides. The GeForce logo up top illuminates, as we have seen on all similar designs, which adds a nice touch.

Continue reading our review of the NVIDIA GeForce GTX Titan Z 12GB Graphics Card!!

E3 2014: NVIDIA SHIELD Tablet Will Exist?

Subject: General Tech, Mobile, Shows and Expos | June 9, 2014 - 02:10 PM |
Tagged: shield tablet, shield, nvidia, E3 14, E3

The Tech Report had their screenshot-fu tested today with the brief lifespan of NVIDIA's SHIELD Tablet product page. As you can see, it is fairly empty. We know that it will have at least one bullet point of "Features" and that its name will be "SHIELD Tablet".

nvidia-shield-tablet-il.jpg

Image Credit: The Tech Report

Of course, being the first day of E3, it is reasonable to expect that such a device will be announced in the next couple of days. The tablet is expected to be based on the Tegra K1 with 2GB of RAM and to have a 2048x1536 touch display.

It does raise the question of what exactly a "SHIELD" is, however. Apart from being a first-party device, how would it be any different from other TegraZone devices? We know that Half Life 2 and Portal have been ported to the SHIELD product line exclusively and will not be available on other Tegra-powered devices. Now that the SHIELD line is extending to tablets, I wonder how NVIDIA will handle this seemingly two-tier class of products (SHIELD vs. Tegra OEM devices). It might even depend on how many design wins they achieve, along with their overall mobile market share.

Source: Tech Report

Google's Project Tango Announced, Uses NVIDIA Tegra K1

Subject: General Tech, Mobile | June 5, 2014 - 02:51 PM |
Tagged: tegra k1, tegra, project tango, nvidia, google, Android

Today, Google announced their "Project Tango" developer kit for tablets with spatial awareness. With a price tag of $1,024 USD, it is definitely aimed at developers. In fact, the form to be notified about the development kit has a required check box that is labeled, "I am a developer". Slightly above the form is another statement, "These development kits are not a consumer device and will be available in limited quantities".

So yes, you can only buy these if you are a developer.

The technology is the unique part. Project Tango is aimed at developers making apps that understand the 3D world around the tablet. Two example categories Google has already experimented with are robotics and computer vision. Of course, this could also translate to augmented reality games and mapping.

While Google has not been too friendly with OpenCL on its Android platform, it makes sense that they would choose a flexible GPU with a wide (and deep) range of API support. Although other SoCs are probably capable enough, the Kepler architecture in the Tegra K1 is about as feature-complete as you can get in a mobile chip, because it is essentially a desktop GPU architecture.

google-project-tango.jpg

Google's Project Tango is available to developers, exclusively, for $1,024 and ships later this month.

Also, that price is clearly a pun (1,024 being 2^10).

Source: Google

NVIDIA Launches GeForce Experience 2.1

Subject: General Tech, Graphics Cards | June 2, 2014 - 05:52 PM |
Tagged: nvidia, geforce, geforce experience, ShadowPlay

NVIDIA has just launched another version of their GeForce Experience, incrementing the version to 2.1. This release allows video capture at up to "2500x1600", which I assume means 2560x1600, as well as better audio-video synchronization in Adobe Premiere. Also, because why stop going after FRAPS once you start, this version adds an in-game framerate indicator, along with push-to-talk for recording the microphone.

nvidia-geforce-experience.png

Another note: when GeForce Experience 2.0 launched, it introduced streaming of the user's desktop. This allowed recording of OpenGL and windowed-mode games by simply capturing an entire monitor. That mode was not capable of "Shadow Mode", which I believed was because NVIDIA thought users didn't want a constant rolling recording of their desktop just in case they wanted to save a few minutes of it at some point. Turns out I was wrong; the feature was coming, and it arrived with GeForce Experience 2.1.

GeForce Experience 2.1 is now available at NVIDIA's website, unless it already popped up a notification for you.

Source: NVIDIA

AMD and NVIDIA get into a hairy argument

Subject: General Tech | May 29, 2014 - 07:43 PM |
Tagged: nvidia, gameworks, dirty pool, business as usual, amd

The topic of NVIDIA GameWorks was discussed at great length on last night's PCPer Podcast, and from the live comments as well as the comments on Ryan's original story, this is obviously a topic which draws strong opinions. As it is always best to limit yourself to debating topics of which you know the facts, The Tech Report's article on the aftereffects of the Forbes story is well worth a read. Cyril had a chance to speak with a rep from NVIDIA's driver development team about Hallock's comments pertaining to NVIDIA's GameWorks and the legitimacy of AMD's complaints. As you might expect, there is a lot of denial and finger pointing from both sides; what long time enthusiasts might describe as 'business as usual'. Both companies have vehemently denied ever attempting to undermine each other's business, yet both can point to specific instances in which the other has used questionable methods to get a leg (or hair) up on the competition.


"Earlier today, I spoke with Cem Cebenoyan, Director of Engineering for Developer Technology at Nvidia, who offered a rebuttal to a Forbes story we covered yesterday. In that story, AMD's Robert Hallock alleged that Nvidia's GameWorks program prevents AMD from working with game developers on GPU optimizations."

Here is some more Tech News from around the web:

Tech Talk

Podcast #302 - ASUS PB287Q 4K Monitor, NVIDIA and AMD's fight over GameWorks, Haswell-E Leaks and more!

Subject: General Tech | May 29, 2014 - 02:51 PM |
Tagged: video, podcast, asus, 4k, pb287q, nvidia, amd, gameworks, ubisoft, watch dogs, crucial, mx100, tegra k1, gsync

PC Perspective Podcast #302 - 05/29/2014

Join us this week as we discuss the ASUS PB287Q 4K Monitor, NVIDIA and AMD's fight over GameWorks, Haswell-E Leaks and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom, and Allyn Maleventano

Program length: 1:29:01
  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week:
    1. Allyn: For Josh - the Wenger Giant Knife
  4. Closing/outro

 

Author:
Manufacturer: Various

The AMD Argument

Earlier this week, a story was posted on a Forbes.com blog that dove into the idea of NVIDIA GameWorks and how it was doing a disservice not just to the latest Ubisoft title, Watch_Dogs, but to PC gamers in general. Using quotes from AMD directly, the author claims that NVIDIA is actively engaging in methods to prevent game developers from optimizing games for AMD graphics hardware. This is an incredibly bold statement and one that I hope AMD is not making lightly. Here is a quote from the story:

Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products. . . . Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.

The example cited in the Forbes story is the recently released Watch_Dogs title, which appears to show favoritism towards NVIDIA GPUs, with the performance of the GTX 770 ($369) coming close to the performance of a Radeon R9 290X ($549).

It's evident that Watch Dogs is optimized for Nvidia hardware but it's staggering just how un-optimized it is on AMD hardware.

watch_dogs_ss9_99866.jpg

Watch_Dogs is the latest GameWorks title released this week.

I decided to get in touch with AMD directly to see exactly what stance the company was attempting to take with these kinds of claims. No surprise, AMD was just as forward with me as they appeared to be in the Forbes story originally.

The AMD Stance

Central to AMD’s latest annoyance with the competition is the NVIDIA GameWorks program. First unveiled last October during a press event in Montreal, GameWorks combines several NVIDIA-built engine functions into libraries that game developers can access and integrate to build advanced features into their games. NVIDIA’s website claims that GameWorks is “easy to integrate into games” while also including tutorials and tools to help quickly generate content with the software set. Included in the GameWorks suite are tools like VisualFX, which offers rendering solutions such as HBAO+, TXAA, Depth of Field, FaceWorks, HairWorks, and more. Physics tools include the obvious, like PhysX, while also adding clothing, destruction, particles, and more.

Continue reading our editorial on the verbal battle between AMD and NVIDIA about the GameWorks program!!

NVIDIA Finally Launches GeForce GTX Titan Z Graphics Card

Subject: Graphics Cards | May 28, 2014 - 11:19 AM |
Tagged: titan z, nvidia, gtx, geforce

Though delayed by a month, today marks the official release of NVIDIA's Titan Z graphics card, the dual GK110 beast with the $3000 price tag. The massive card was shown for the first time in March at NVIDIA's GPU Technology Conference, and our own Tim Verry was on the grounds to get the information.

The details remain the same:

Specifically, the GTX TITAN Z is a triple slot graphics card that marries two full GK110 (big Kepler) GPUs for a total of 5,760 CUDA cores, 448 TMUs, and 96 ROPs with 12GB of GDDR5 memory on a 384-bit bus (6GB on a 384-bit bus per GPU). For the truly adventurous, it appears possible to SLI two GTX Titan Z cards using the single SLI connector. Display outputs include two DVI, one HDMI, and one DisplayPort connector.

The difference now, of course, is that all of the clock speeds and pricing are official.

titanzspecs.png

A base clock speed of 705 MHz with a Boost rate of 876 MHz places it well behind the individual GPU performance of a GeForce GTX 780 Ti or GTX Titan Black (rated at 889/980 MHz). The memory clock speed remains the same at 7.0 Gbps and you are still getting a massive 6GB of memory per GPU.
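For reference, those memory numbers work out to the same per-GPU bandwidth as the GTX 780 Ti and Titan Black; a quick sketch from the quoted specs:

    # Memory bandwidth from the quoted specs: 7.0 Gbps effective rate on a 384-bit bus per GPU
    memory_rate_gbps = 7.0   # effective data rate per pin
    bus_width_bits = 384     # memory bus width per GPU

    bandwidth_per_gpu_gbs = memory_rate_gbps * bus_width_bits / 8  # bits -> bytes
    print(f"Per GPU: {bandwidth_per_gpu_gbs:.0f} GB/s, card total: {2 * bandwidth_per_gpu_gbs:.0f} GB/s")
    # Per GPU: 336 GB/s, card total: 672 GB/s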

Maybe most interesting with the release of the GeForce GTX Titan Z is that NVIDIA seems to have completely fixated on non-DIY consumers with the card. We did not receive a sample of the Titan Z (nor did we get one of the Titan Black) and when I inquired as to why, NVIDIA PR stated that they were "only going to CUDA developers and system builders."

geforce-gtx-titan-z-3qtr.png

I think it is more than likely that after the release of AMD's Radeon R9 295X2 dual-GPU graphics card on April 8th, with a price tag of $1500 (half of the Titan Z), the target audience was redirected. NVIDIA already had its eye on the professional markets that weren't willing to dive into the Quadro/Tesla lines (CUDA developers will likely drop $3k at the drop of a hat to get this kind of performance density). But the side benefit of having created the best flagship gaming graphics card on the planet was probably part of the story as well - and that title was promptly taken away by AMD.

geforce-gtx-titan-z-bracket.png

I still believe the Titan Z will be an impressive graphics card to behold, both in terms of look and style and in terms of performance. But it would take the BIGGEST of NVIDIA fans to pass up a pair of Radeon R9 295X2 cards in favor of a single GeForce GTX Titan Z. At least that is our assumption until we can test one for ourselves.

I'm still working to get my hands on one of these for some testing as I think the ultra high end graphics card coverage we offer is incomplete without it. 

Several of NVIDIA's partners are going to be offering the Titan Z, including EVGA, ASUS, MSI, and Zotac. Maybe the most interesting, though, is EVGA's water-cooled option!

evgatitanz.jpg

So, what do you think? Anyone lining up for a Titan Z when they show up for sale?

Acer Announces World's First 4K Display with NVIDIA G-SYNC Technology

Subject: Displays | May 22, 2014 - 11:30 PM |
Tagged: nvidia, monitor, g-sync, acer, 4k

We've been talking about the benefits of 4K for a while, most recently with the Samsung U28D590D, which added single-stream 60Hz support to the mix, but there have certainly been some drawbacks with 4K monitors to date. Between the usually low refresh rates and the general problem of getting smooth images on the screen (not to mention the high price of entry into 4K), there have been some legitimate questions about when to upgrade. Well, an interesting new product announcement from a surprising source might change things.

Acer-logo.jpg

With a logo like that, who needs product photos?

Today, Acer is announcing an interesting alternative: the world’s first 4K monitor with integrated NVIDIA G-SYNC technology.

The XB280HK will be a 28" display, and (provided you have an NVIDIA graphics card and were looking to make the move to 4K) the benefits of G-SYNC - which include minimizing stutter and eliminating tearing - seem ideal for extremely high-res gaming.

We’ll be eagerly awaiting a look at the performance of this new monitor. (Or even a look at it, since Acer did not release a product photo!)

The details are scarce, but Acer says this will be a part of their “XB0” series of gaming monitors. Here are some specs for this 28” 3840x2160 display, which features three proprietary technologies from Acer:

  • “Flicker-less” which Acer says is implemented at the power supply level to reduce screen flicker
  • “Low-dimming” which sounds like an ambient light sensor to dim the monitor in low light
  • “ComfyView” non-glare screen

Of interest, the Acer XB280HK is likely using a TN panel given the claimed "170/170 degree" viewing angle.

The hardware needed for good 4K frame rates is definitely up there, and with G-SYNC on board the XB280HK will probably not be at the low end of the 4K price range, but we shall see!

Source: Acer

NVIDIA Tegra K1 Benchmarks Spotted

Subject: General Tech, Graphics Cards, Mobile | May 22, 2014 - 04:58 PM |
Tagged: tegra k1, nvidia, iris pro, iris, Intel, hd 4000

The Chinese tech site Evolife acquired a few benchmarks for the Tegra K1. We do not know exactly where they got the system from, but we know that it has 4GB of RAM and 12 GB of storage. Of course, this is the version with four ARM Cortex-A15 cores (not the upcoming 64-bit version based on Project Denver). On 3DMark Ice Storm Unlimited, the full system was capable of 25737 points.

nvidia-k1-benchmark.jpg

Image Credit: Evolife.cn

You might remember that our tests with an Intel Core i5-3317U (Ivy Bridge), back in September, achieved a score of 25630 on 3DMark Ice Storm. Of course, that was using the built-in Intel HD 4000 graphics, not a discrete solution, but the fact that the Tegra K1 can keep pace with it for gaming is notable. This makes sense, though: Intel HD 4000 (GT2) graphics has a theoretical performance of 332.8 GFLOPS, while the Tegra K1 is rated at 364.8 GFLOPS. Earlier, we said that its theoretical performance is roughly on par with the GeForce 9600 GT, although the Tegra K1 supports newer APIs.
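If you are wondering where the 364.8 GFLOPS rating comes from, it falls out of the single Kepler SMX in the Tegra K1; a rough sketch, assuming the commonly reported ~950 MHz maximum GPU clock (a figure not quoted in this benchmark leak):

    # Deriving the Tegra K1's rated peak FP32 throughput
    kepler_cores = 192      # one Kepler SMX
    gpu_clock_ghz = 0.950   # commonly reported max GPU clock (assumption)
    flops_per_core = 2      # one fused multiply-add per clock

    peak_gflops = kepler_cores * flops_per_core * gpu_clock_ghz
    print(f"Tegra K1 peak FP32: {peak_gflops:.1f} GFLOPS")  # 364.8 GFLOPS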

Of course, Intel has released better solutions with Haswell. Benchmarks show that Iris Pro is able to play Battlefield 4 on High settings, at 720p, at about 30 FPS. The HD 4000 only gets about 12 FPS with the same configuration (and ~30 FPS on Low). This is not to compare Intel to NVIDIA's mobile part, but rather to compare the Tegra K1 to modern, mainstream laptops and desktops. It is getting fairly close, especially with the first wave of K1 tablets entering at the mid-$200 USD MSRP in China.

As a final note...

There was a time when Tim Sweeney, CEO of Epic Games, said that the difference between high-end and low-end PCs "is something like 100x". Scaling a single game between the two performance tiers would be next to impossible. He noted that ten years earlier, that factor was more like "10x".

Now, an original GeForce Titan is about 12x faster than the Tegra K1, and they support the same feature set. In other words, it is easier to develop a game for the PC and a high-end tablet today than it was to develop a PC game for both high-end and low-end machines back in 2008. PC gaming is, once again, getting healthier.
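That 12x figure lines up with the peak throughput numbers; a rough comparison, taking NVIDIA's ~4.5 TFLOPS rating for the original GTX Titan (a spec not quoted in this article):

    # Rough peak FP32 ratio: original GeForce GTX Titan vs. Tegra K1
    gtx_titan_gflops = 4500   # NVIDIA's rated figure for the original Titan (assumption)
    tegra_k1_gflops = 364.8   # from the derivation above

    print(f"GTX Titan / Tegra K1: ~{gtx_titan_gflops / tegra_k1_gflops:.1f}x")  # ~12.3x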

Source: Evolife.cn