AMD and NVIDIA get into a hairy argument

Subject: General Tech | May 29, 2014 - 04:43 PM |
Tagged: nvidia, gameworks, dirty pool, business as usual, amd

The topic of NVIDIA GameWorks was discussed at great length on last night's PCPer Podcast, and judging by the live comments as well as the comments on Ryan's original story, this is obviously a topic which draws strong opinions. As it is always best to limit yourself to debating topics on which you are familiar with the facts, The Tech Report's article on the aftereffects of the Forbes story is well worth a read. Cyril had a chance to speak with a rep from NVIDIA's driver development team about Hallock's comments pertaining to NVIDIA's GameWorks and the legitimacy of AMD's complaints. As you might expect, there is a lot of denial and finger pointing from both sides; what long-time enthusiasts might describe as 'business as usual'. Both sides of this argument have vehemently denied ever attempting to undermine each other's business, and yet both can point to specific instances in which their competitor used questionable methods to get a leg (or hair) up.

6a00d8341c795b53ef010535c10aa8970b.jpg

"Earlier today, I spoke with Cem Cebenoyan, Director of Engineering for Developer Technology at Nvidia, who offered a rebuttal to a Forbes story we covered yesterday. In that story, AMD's Robert Hallock alleged that Nvidia's GameWorks program prevents AMD from working with game developers on GPU optimizations."

Here is some more Tech News from around the web:

Tech Talk

Podcast #302 - ASUS PB287Q 4K Monitor, NVIDIA and AMD's fight over GameWorks, Haswell-E Leaks and more!

Subject: General Tech | May 29, 2014 - 11:51 AM |
Tagged: video, podcast, asus, 4k, pb287q, nvidia, amd, gameworks, ubisoft, watch dogs, crucial, mx100, tegra k1, gsync

PC Perspective Podcast #302 - 05/29/2014

Join us this week as we discuss the ASUS PB287Q 4K Monitor, NVIDIA and AMD's fight over GameWorks, Haswell-E Leaks and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom, and Allyn Malventano

Program length: 1:29:01
  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week:
    1. Allyn: For Josh - the Wenger Giant Knife
  4. Closing/outro

 

Author:
Manufacturer: Various

The AMD Argument

Earlier this week, a story was posted on a Forbes.com blog that dove into NVIDIA GameWorks and how it is doing a disservice not just to the latest Ubisoft title, Watch_Dogs, but to PC gamers in general. Using quotes directly from AMD, the author claims that NVIDIA is actively engaging in methods to prevent game developers from optimizing games for AMD graphics hardware. This is an incredibly bold statement and one that I hope AMD is not making lightly. Here is a quote from the story:

Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products. . . . Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.

The example cited in the Forbes story is the recently released Watch_Dogs, which appears to show favoritism toward NVIDIA GPUs, with the performance of the GTX 770 ($369) coming close to that of a Radeon R9 290X ($549).

It's evident that Watch Dogs is optimized for Nvidia hardware but it's staggering just how un-optimized it is on AMD hardware.

watch_dogs_ss9_99866.jpg

Watch_Dogs is the latest GameWorks title released this week.

I decided to get in touch with AMD directly to see exactly what stance the company was attempting to take with these kinds of claims. No surprise: AMD was just as forthright with me as it appeared to be in the original Forbes story.

The AMD Stance

Central to AMD’s latest annoyance with the competition is the NVIDIA GameWorks program. First unveiled last October during a press event in Montreal, GameWorks combines several NVIDIA-built engine functions into libraries that game developers can access to build advanced features into their games. NVIDIA’s website claims that GameWorks is “easy to integrate into games” and includes tutorials and tools to help quickly generate content with the software set. The GameWorks suite includes VisualFX, which offers rendering solutions such as HBAO+, TXAA, Depth of Field, FaceWorks, HairWorks and more. Physics tools include the obvious, PhysX, along with clothing, destruction, particles and more.
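
To make that integration model concrete, here is a purely illustrative C++ sketch. Every name in it is hypothetical (this is not the actual GameWorks API), but it captures the pattern the program is built on: an effect ships as a linked, closed library that the engine drives through a small header of exposed tuning parameters.

```cpp
// Purely illustrative sketch (not real GameWorks code): effects like HBAO+
// ship as precompiled libraries that the engine links and drives through a
// small header of tuning parameters. The stub below stands in for the
// vendor's closed library so the integration pattern compiles on its own.
#include <cstdio>

// --- stand-in for the vendor's binary-only middleware ---------------------
struct AoParams { float radius; float bias; float powerExponent; };

struct VendorAoLibrary {
    // In the real library this body is a closed binary blob; the game
    // developer only ever sees a declaration like this one.
    void renderAo(const float *depth, int w, int h, const AoParams &p) {
        std::printf("AO pass: %dx%d, radius=%.2f, first depth=%.2f\n",
                    w, h, p.radius, depth[0]);
    }
};

// --- engine side -----------------------------------------------------------
int main() {
    VendorAoLibrary hbao;                        // linked-in middleware
    const int w = 1920, h = 1080;
    float depth[4] = {0.1f, 0.5f, 0.9f, 1.0f};   // toy stand-in for a G-buffer
    AoParams params{2.0f, 0.1f, 2.0f};           // the exposed tuning knobs
    hbao.renderAo(depth, w, h, params);          // engine tweaks the knobs,
    return 0;                                    // not the effect's internals
}
```

The editorial below digs into why AMD argues that shape matters: a developer can adjust the knobs a library exposes, but suggestions that would change the effect's internals have nowhere to go.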

Continue reading our editorial on the verbal battle between AMD and NVIDIA about the GameWorks program!!

NVIDIA Finally Launches GeForce GTX Titan Z Graphics Card

Subject: Graphics Cards | May 28, 2014 - 08:19 AM |
Tagged: titan z, nvidia, gtx, geforce

Though delayed by a month, NVIDIA's Titan Z graphics card, the dual-GK110 beast with the $3000 price tag, officially releases today. The massive card was shown for the first time in March at NVIDIA's GPU Technology Conference, and our own Tim Verry was on the ground to get the information.

The details remain the same:

Specifically, the GTX TITAN Z is a triple-slot graphics card that marries two full GK110 (big Kepler) GPUs for a total of 5,760 CUDA cores, 480 TMUs, and 96 ROPs with 12GB of GDDR5 memory (6GB per GPU, each on its own 384-bit bus). For the truly adventurous, it appears possible to SLI two GTX Titan Z cards using the single SLI connector. Display outputs include two DVI, one HDMI, and one DisplayPort connector.

The difference now, of course, is that all of the clock speeds and pricing are official.

titanzspecs.png

A base clock speed of 705 MHz with a Boost rate of 876 MHz places it well behind the individual GPU performance of a GeForce GTX 780 Ti or GTX Titan Black (rated at 889/980 MHz). The memory clock speed remains the same at 7.0 Gbps and you are still getting a massive 6GB of memory per GPU.
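
As a quick back-of-the-envelope check on that memory spec, per-GPU bandwidth is just the data rate times the bus width, and it lands on the same 336 GB/s figure as the GTX 780 Ti and Titan Black. A minimal sketch, using only the numbers quoted above:

```cpp
// Per-GPU memory bandwidth: data rate per pin x bus width, bits to bytes.
#include <cstdio>

int main() {
    double dataRateGbps = 7.0;  // effective GDDR5 data rate per pin
    int busWidthBits = 384;     // memory bus width per GPU
    double bandwidthGBs = dataRateGbps * busWidthBits / 8.0;
    std::printf("%.0f GB/s per GPU\n", bandwidthGBs);  // prints: 336 GB/s
    return 0;
}
```

So while the core clocks were dialed back relative to the 780 Ti and Titan Black, each half of the Titan Z keeps full memory speed.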

Perhaps most interesting about the release of the GeForce GTX Titan Z is that NVIDIA seems to have fixated completely on non-DIY consumers with the card. We did not receive a sample of the Titan Z (nor did we get one of the Titan Black) and when I inquired as to why, NVIDIA PR stated that they were "only going to CUDA developers and system builders."

geforce-gtx-titan-z-3qtr.png

I think it is more than likely that after the April 8th release of AMD's Radeon R9 295X2 dual-GPU graphics card, with a price tag of $1500 (half that of the Titan Z), the target audience was redirected. NVIDIA already had its eye on the professional markets that weren't willing to dive into the Quadro/Tesla lines (CUDA developers will likely drop $3k at the drop of a hat to get this kind of performance density). But bragging rights for the best flagship gaming graphics card on the planet were probably part of the story as well - and AMD promptly took them away.

geforce-gtx-titan-z-bracket.png

I still believe the Titan Z will be an impressive graphics card to behold, both in look and style and in performance. But it would take the BIGGEST of NVIDIA fans to pass up a pair of Radeon R9 295X2 cards - the same $3,000 total - for a single GeForce GTX Titan Z. At least that is our assumption until we can test one for ourselves.

I'm still working to get my hands on one of these for some testing, as I think the ultra high-end graphics card coverage we offer is incomplete without it.

Several of NVIDIA's partners are going to be offering the Titan Z, including EVGA, ASUS, MSI and Zotac. Maybe the most interesting, though, is EVGA's water-cooled option!

evgatitanz.jpg

So, what do you think? Anyone lining up for a Titan Z when they show up for sale?

Acer Announces World's First 4K Display with NVIDIA G-SYNC Technology

Subject: Displays | May 22, 2014 - 08:30 PM |
Tagged: nvidia, monitor, g-sync, acer, 4k

We've been talking about the benefits of 4K for a while, most recently with the Samsung U28D590D, which added single-stream 60Hz support to the mix, but there have certainly been drawbacks with 4K monitors to date. Between typically low refresh rates and the general difficulty of getting smooth images on the screen (not to mention the high price of entry into 4K), there have been legitimate questions about when to upgrade. Well, an interesting new product announcement from a surprising source might change things.

Acer-logo.jpg

With a logo like that, who needs product photos?

Today, Acer is announcing an interesting alternative: the world’s first 4K monitor with integrated NVIDIA G-SYNC technology.

The XB280HK will be a 28" display, and (provided you have an NVIDIA graphics card and were looking to make the move to 4K) the benefits of G-SYNC - which include minimizing stutter and eliminating tearing - seem ideal for extremely high-res gaming.

We’ll be eagerly awaiting a look at the performance of this new monitor. (Or even a look at it, since Acer did not release a product photo!)

The details are scarce, but Acer says this will be a part of their “XB0” series of gaming monitors. Here are some specs for this 28” 3840x2160 display, which features three proprietary technologies from Acer:

  • “Flicker-less” which Acer says is implemented at the power supply level to reduce screen flicker
  • “Low-dimming” which sounds like an ambient light sensor to dim the monitor in low light
  • “ComfyView” non-glare screen

Of interest, the Acer XB280HK is likely using a TN panel given the claimed "170/170 degree" viewing angle.

The hardware requirements for good 4K frame rates are definitely up there, and with G-SYNC onboard the XB280HK will probably not be at the low end of the 4K price range, but we shall see!

Source: Acer

NVIDIA Tegra K1 Benchmarks Spotted

Subject: General Tech, Graphics Cards, Mobile | May 22, 2014 - 01:58 PM |
Tagged: tegra k1, nvidia, iris pro, iris, Intel, hd 4000

The Chinese tech site Evolife acquired a few benchmarks for the Tegra K1. We do not know exactly where they got the system, but we know that it has 4GB of RAM and 12 GB of storage. Of course, this is the version with four ARM Cortex-A15 cores (not the upcoming 64-bit version based on Project Denver). On 3DMark Ice Storm Unlimited, it was capable of 25,737 points for the full system.

nvidia-k1-benchmark.jpg

Image Credit: Evolife.cn

You might remember that our tests with an Intel Core i5-3317U (Ivy Bridge), back in September, achieved a score of 25,630 on 3DMark Ice Storm. Of course, that was using the built-in Intel HD 4000 graphics rather than a discrete solution, but it still kept up for gaming. This makes sense, though: Intel HD 4000 (GT2) graphics has a theoretical peak of 332.8 GFLOPs, while the Tegra K1 is rated at 364.8 GFLOPs. Earlier, we said that its theoretical performance is roughly on par with the GeForce 9600 GT, although the Tegra K1 supports newer APIs.
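
Those theoretical numbers are straightforward to reconstruct: peak single-precision throughput is shader lane count x 2 FLOPs (a multiply-add per clock) x clock speed. A minimal sketch, assuming the commonly published clocks and shader configurations for these parts; it also checks the "about 12x" Titan comparison made below:

```cpp
// Reconstructing the theoretical GFLOPs figures quoted in this post.
// Peak single precision = shader lanes x 2 FLOPs (multiply-add) x clock.
#include <cstdio>

double gflops(int lanes, double clockGHz) {
    return lanes * 2.0 * clockGHz;  // 2 FLOPs per lane per clock (MAD/FMA)
}

int main() {
    // HD 4000 (GT2): 16 EUs x 8 MAD lanes each, ~1.3 GHz max turbo
    std::printf("HD 4000:  %.1f GFLOPs\n", gflops(16 * 8, 1.30));  // 332.8
    // Tegra K1: 192 Kepler CUDA cores at its rated ~950 MHz
    std::printf("Tegra K1: %.1f GFLOPs\n", gflops(192, 0.95));     // 364.8
    // Original GeForce Titan: 2688 Kepler cores at 837 MHz base
    double titan = gflops(2688, 0.837);
    std::printf("Titan:    %.0f GFLOPs (%.1fx the K1)\n",
                titan, titan / gflops(192, 0.95));                 // ~12.3x
    return 0;
}
```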

Of course, Intel has since released better solutions with Haswell. Benchmarks show that Iris Pro can play Battlefield 4 at High settings, 720p, at about 30 FPS; the HD 4000 only manages about 12 FPS with the same configuration (and ~30 FPS on Low). The point is not to pit Intel against NVIDIA's mobile part, but rather to compare the Tegra K1 to modern, mainstream laptops and desktops. It is getting fairly close, especially with the first wave of K1 tablets entering at mid-$200 USD MSRPs in China.

As a final note...

There was a time when Tim Sweeney, CEO of Epic Games, said that the difference between high-end and low-end PCs "is something like 100x", making it next to impossible to scale a single game between the two performance tiers. He noted that ten years earlier, that factor was more like "10x".

Now, an original GeForce Titan is about 12x faster than the Tegra K1, and the two support the same feature set. In other words, it is easier to develop a game for both the PC and a high-end tablet today than it was to develop a PC game for both high-end and low-end machines back in 2008. PC gaming is, once again, getting healthier.

Source: Evolife.cn

Xiaomi MiPad Tablet is Tegra K1 Powered

Subject: General Tech, Graphics Cards, Processors, Mobile | May 15, 2014 - 02:02 PM |
Tagged: nvidia, xiaomi, mipad, tegra k1

Tegra K1 is NVIDIA's new mobile processor and the first to implement the Kepler graphics architecture. In other words, it has all of the same graphics functionality as a desktop GPU, with 364.8 GigaFLOPs of performance (a little faster than a GeForce 9600 GT). This is quite fast for a mobile product; that amount of graphics horsepower could max out Unreal Tournament 3 at 2560x1600 and run Crysis at 720p. Being Kepler, it supports OpenGL 4.4, OpenGL ES 3.1, DirectX 11 and 12, and GPU compute languages.

Xiaomi is launching its MiPad in Beijing today, with an 8-inch 2048x1536 screen and the Tegra K1. It will be available in June (in China) starting at $240 USD for the 16GB version and going up to $270 for the 64GB version. Each version has 2GB of RAM, an 8MP rear-facing camera, and a 5MP front camera.

Now, we wait and see if any Tegra K1 devices come to North America and Europe - especially at that price point.

Source: NVIDIA

The Crowbar *WAS* for Half-Life 2 (and Portal) on SHIELD

Subject: General Tech | May 12, 2014 - 06:42 PM |
Tagged: nvidia, shield, half-life 2, Portal

What would Gordon Freeman do? He would tell everyone to... ... oh right.

crowbarhl22.jpg

Well, apparently he is now available on the NVIDIA SHIELD, along with Portal. I am not talking about GameStream: these two games have been ported to Android, but only for the SHIELD. From the screenshots, the mobile versions look pretty good, especially Portal with its look-through portal mechanics.

nvidia-portal-screenshot3.png

As usual, whenever NVIDIA really wants something, they will parachute engineers through your skylights to do it for you. The company revolves around delivering experiences to its customers, which is a good mindset for a company to have; a similar attitude drove Microsoft's DirectX efforts and the resulting success of PC gaming in the late 90's.

If you have an NVIDIA SHIELD, Half-Life 2 and Portal are available now for $9.99 each through TegraZone.

Source: NVIDIA

NVIDIA Titan Z Missed Its Release Date

Subject: General Tech, Graphics Cards | May 12, 2014 - 05:00 PM |
Tagged: titan z, nvidia, gtx titan z, geforce

To a crowd of press and developers at their GTC summit, NVIDIA announced the GeForce GTX Titan Z add-in board (AIB). Each of the two fully unlocked GK110 GPUs has access to 6GB of GDDR5 memory (12GB total). The card was expected to be available on May 8th but has yet to surface, and as NVIDIA has not commented on the situation, many question whether it ever will.

nvidia-titan-z-where.png

And then we get what we think are leaked benchmarks (note: two pictures).

One concern about the Titan Z was its rated 8 TeraFLOPs of compute performance, a fairly sizable reduction from the theoretical maximum of 10.24 TeraFLOPs for two Titan Black processors, and even less than two first-generation Titans (9 TeraFLOPs combined). We assumed this was due to reduced clock rates. What we did not expect was for benchmarks to show the GPUs boosting well above those advertised levels, and even beyond the advertised boost clocks of the Titan Black and the 780 Ti. The card was seen pushing 1058 MHz in some sections, which works out to a theoretical 12.2 TeraFLOPs (6.1 TeraFLOPs per GPU) in single precision. That is a lot.
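
For reference, all of those TeraFLOP figures fall out of the same GK110 arithmetic: 2,880 cores x 2 FLOPs per clock (fused multiply-add) x clock speed, times two GPUs per board. A quick sketch using only the clocks cited in this story:

```cpp
// Checking the TeraFLOP math: GK110 peak single precision is
// cores x 2 FLOPs (FMA) x clock, times the number of GPUs on the board.
#include <cstdio>

double boardTflops(int cores, double clockGHz, int gpus) {
    return cores * 2.0 * clockGHz * gpus / 1000.0;  // GFLOPs -> TFLOPs
}

int main() {
    // Rated spec: the 705 MHz base clock yields the ~8 TFLOPs NVIDIA quotes.
    std::printf("Titan Z @ base:    %.1f TFLOPs\n", boardTflops(2880, 0.705, 2)); // 8.1
    // Two Titan Blacks at their 889 MHz base clock.
    std::printf("2x Titan Black:    %.1f TFLOPs\n", boardTflops(2880, 0.889, 2)); // 10.2
    // The leaked result: both GPUs observed boosting to 1058 MHz.
    std::printf("Titan Z @ 1058MHz: %.1f TFLOPs\n", boardTflops(2880, 1.058, 2)); // 12.2
    return 0;
}
```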

These benchmarks also show that NVIDIA has a slight lead over AMD's R9 295X2 in many games, with the exceptions of Battlefield 4 and Sleeping Dogs (plus 3DMark and Unigine). Of course, these benchmarks measure software-reported frame rates and frame times, which may or may not be indicative of actual performance. I would say the Titan Z appears to have a slight performance lead over the R9 295X2, although a solid argument for an AMD win exists; either way, it does so at double the cost (at its expected $3000 USD price point). That is not up for debate.

Whichever card is faster, AMD's is half the price and available for purchase right now.

So, until NVIDIA says anything, the Titan Z is in limbo. I am sure there are CUDA developers who await its arrival. Personally, I would just get three Titan Blacks, since you are going to need to manually schedule your workloads across multiple processors anyway (or 780 Tis, if 32-bit arithmetic is enough precision). That is, of course, unless you cannot physically fit enough GeForce Titan Blacks in your motherboard and therefore need two GK110 chips per AIB (but still not enough GPUs to bother writing a cluster scheduling application).
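
To illustrate why the board count matters less than it sounds, here is a minimal CUDA C++ sketch of the manual scheduling mentioned above (illustrative only, not tied to any particular application): whether the GK110s arrive as three Titan Blacks or stacked two-per-board on Titan Zs, the runtime exposes them as individual devices that each need to be handed their share of the work explicitly.

```cpp
// Minimal multi-GPU CUDA sketch: enumerate devices, give each a slice of
// the work, then synchronize. Illustrative only - error checking omitted.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void scale(float *data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);   // a Titan Z shows up as two devices

    const int n = 1 << 20;              // elements per GPU (arbitrary size)
    std::vector<float *> buffers(deviceCount, nullptr);

    for (int dev = 0; dev < deviceCount; ++dev) {
        cudaSetDevice(dev);             // subsequent calls target this GPU
        cudaMalloc(&buffers[dev], n * sizeof(float));
        scale<<<(n + 255) / 256, 256>>>(buffers[dev], n, 2.0f);  // async launch
    }
    for (int dev = 0; dev < deviceCount; ++dev) {
        cudaSetDevice(dev);
        cudaDeviceSynchronize();        // wait for this GPU's slice to finish
        cudaFree(buffers[dev]);
        std::printf("device %d finished its share\n", dev);
    }
    return 0;
}
```

This is the crux of the "three Titan Blacks" argument: once the scheduling code exists, the number of boards is mostly a question of physical slots rather than software.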

Source: Unknown

Podcast #299 - ASUS Z97-Deluxe, NCASE M1 Case, AMD's custom ARM Designs and more!

Subject: General Tech | May 8, 2014 - 08:57 AM |
Tagged: podcast, video, asus, z97, Z97-Deluxe, ncase, m1, amd, seattle, arm, nvidia, Portal, shield

PC Perspective Podcast #299 - 05/08/2014

Join us this week as we discuss the ASUS Z97-Deluxe, NCASE M1 Case, AMD's custom ARM Designs and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom, Allyn Malventano, and Morry Teitelman

Program length: 1:27:11
  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week:
  4. Closing/outro

Be sure to subscribe to the PC Perspective YouTube channel!!