DirectX 10 Benchmark Fiasco - For the good of the gamer...
NVIDIA and ATI (now part of AMD) are in this business to make money. Let's clear the air by starting with that very simple, yet often overlooked point of contention between those companies, the media and the enthusiast gamers they cater to. That doesn't mean either company is free of responsibility to the community that pays them; on the contrary, I think you will agree that these companies owe you quite a bit. Some, if not all, of the employees at AMD and NVIDIA agree, and that is why programs like NVIDIA's "The Way It's Meant to be Played" and AMD's "Get in the Game" exist at all. But are they actually doing gamers any good, or do these programs often end up as weapons against the opposing corporation?
In recent weeks, events have unfolded both behind the scenes and in the public eye of the enthusiast community that have raised a lot of questions and offered very few answers. I'll attempt to give you my thoughts on what these programs are actually doing to our community.
First, a VERY recent history of the events alluded to above needs to be explained for those not privy to the emails we have seen here at PC Perspective.
Before the recent launch of AMD’s Radeon HD 2900 XT card, the AMD marketing staff distributed what was basically the first DirectX 10 benchmark most media and readers had ever seen in the form of the "Call of Juarez" demo. It was originally presented "as-is" with a brief synopsis of the title and basic benchmarking directions.
Shortly thereafter, and still before the HD 2900 XT launch, NVIDIA found out about the "Juarez" demo AMD had sent out and claimed that its code was out of date. NVIDIA had already seen this build of the demo and had fixed a part of Techland's (the "Juarez" developer) code that was mishandling anti-aliasing buffers and causing the NVIDIA driver to crash. NVIDIA said that this actually violated the DX10 specification, but that NVIDIA found the bug, fixed it and submitted the fix back to Techland; something that NVIDIA's and AMD's driver teams will often do. Here's the direct quote from NVIDIA:
NVIDIA: "NVIDIA has a long standing relationship with Techland and their publisher Ubisoft. In fact, the original European version Call Of Juarez that was launched in September 2006 is part of the "The Way Its Meant To Be Played" program <http://www.nzone.com/object/nzone_thecallofjuarez_home.html> As a result of the support Techland and Ubisoft receives for being part of the "The Way Its Meant To Be Played" program, NVIDIA discovered that the early build of the patch that was distributed to the press has an application bug that violates DirectX 10 specifications by mishandling MSAA buffers which causes DirectX 10 compliant hardware to crash. Our DevTech team has worked with Techland to fix that and other bugs in the Call of Juarez code.
Benchmark testing for reviews should be performed with the latest build of the game, one that incorporates those DirectX 10 fixes. Previous builds of the game should not be used for benchmarking with or without AA. For details on how to get the patch and for further information please contact Techland or Ubisoft."
Call of Juarez - Techland
NVIDIA told the media that they recommended getting the latest build directly from Techland if they wanted to test the "Juarez" demo in their upcoming HD 2900 XT reviews; of course this is easier said than done, and with only a day or so before the NDA lifted, very few media outlets took the time to do it. In fact, I left the "Juarez" demo completely out of the discussion in the HD 2900 XT article simply because of these accusations of bad code.
AMD later sent an email to the media with the same basic information: they had found a bug in the "Juarez" demo that kept AA from working correctly on NVIDIA cards, and thus they recommended we only test it with AA disabled.
AMD: "The ATI Radeon HD X2900 XT reviewers guide shows an “N/A” score for the Nvidia GeForce 8800 GTS 640MB when testing DX10 Call of Juarez under Balanced Mode. The game’s developer, Techland has found that this was caused by an application issue when MSAA is enabled. The benchmark can be run on both ATI Radeon and NVIDIA hardware with MSAA disabled. This application issue will be fixed in an upcoming patch available in the next couple of weeks. The full DirectX 10 version of Call of Juarez will be available to the public sometime in June with full functionality for all DirectX 10 capable graphics cards."
NVIDIA didn’t even want us to do that with the original build that AMD provided.
Some online media presented various levels of benchmarks using the "Juarez" demo and most of those attached a disclaimer about the above interactions between AMD, NVIDIA and TechLand. The results showed the AMD HD 2900 XT cards outperforming the NVIDIA 8800 GTS 640MB cards by a decent margin even without AA enabled.
Now, just several days ago, NVIDIA came to us (and a bunch of other media outlets I presume) with information on an upcoming release of a demo for the PC version of "Lost Planet" that was going to be released in both DX9 and DX10 versions. NVIDIA provided a way for us to contact the developer’s PR company to get a hold of the demo before it was released to the public (though only a day early) so that we could run some tests and present them when the launch occurred. NVIDIA was calling this the "first public DX10 game code" and the title seems to fit.
NVIDIA: "This week, Capcom will be releasing two versions of its Lost Planet: Extreme Condition demo for the PC. Both versions will contain the same content (i.e., no differences in game play), but one version will be DirectX 9 and the other DX 10. The latter version will be the world’s first downloadable DX 10 game demo. Both demos will weigh in at roughly 350 MB.
The demo also contains a system performance test. The test is an in-engine run-through of two levels of the game, complete with enemies, explosions and 3D meshes. The performance test shows your current FPS, average FPS, relevant system info (CPU and speed, graphics card, resolution and DX version) and, after a complete pass through of both levels, an average FPS for both the “Snow” and “Cave” sections. We think that this tester will be a useful tool for your testing purposes, as well as for your community.
For those who haven’t seen it, Lost Planet has been a huge hit on the Xbox 360 since its release in January, and the PC game is even better. NVIDIA has been working closely with Capcom on the development of the PC game, providing technical expertise and GeForce 8-series graphics cards. As a result, Lost Planet will support resolutions up to 2560x1600, higher resolution special effects, and improved lighting and shadows."
Lost Planet - Capcom
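To make the per-section "average FPS" figure described above a little more concrete, here is a rough sketch of how such a number can be computed from recorded frame times. This is purely illustrative with invented numbers, not Capcom's actual code; the section names mirror the demo's "Snow" and "Cave" runs.

```python
# Hypothetical sketch of a per-section average-FPS calculation.
# All frame-time samples below are made up for illustration.

def average_fps(frame_times_ms):
    """Average FPS over a section: total frames divided by total elapsed time."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Fabricated frame-time samples (milliseconds) for the demo's two sections
sections = {
    "Snow": [22.0, 25.0, 21.0, 24.0],
    "Cave": [30.0, 33.0, 29.0],
}

for name, times in sections.items():
    print(f"{name}: {average_fps(times):.1f} FPS")
```

Note that dividing total frames by total time weights slow frames more heavily than a simple average of instantaneous FPS readings would, which is why brief stutters pull the reported average down noticeably.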
Not even a day later, AMD had a response for us: AMD said they had not had a chance to look at the "Lost Planet" demo, and as such their driver team had not had time to implement any code for the game, so their benchmark results would not be in line with the performance we should see later. AMD even used the term "NVIDIA sponsored title" in their email, in a way that implied the developer might have given preferential treatment to NVIDIA's hardware in its initial code release.
AMD: "Today Nvidia is expected to host new DirectX 10 content on nZone.com in the form of a “Lost Planet” benchmark. Before you begin testing, there are a few points I want to convey about “Lost Planet”. “Lost Planet” is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimize their drivers for. The developer has not made us aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity explore how the benchmark uses our hardware and optimize in a similar fashion. Over the next little while AMD will be looking at this, but in the meantime, please note that whatever performance you see will not be reflective of what gamers will experience in the final build of the game."
Again, because of the credibility problems the benchmark had already acquired before its release, I thought it was better NOT to provide any test results that might confuse readers or misrepresent either NVIDIA's or AMD's hardware. Some media DID run these tests, and the results are what you might expect: the AMD HD 2900 XT loses badly to NVIDIA's 8800 GTS cards.
The Implied Accusations
While neither company would really come out and say it, both NVIDIA and AMD are hoping that the idea will come across that the other "cheated" in each particular DX10 benchmark scenario. AMD kept the "Juarez" demo from performing up to the level it could on NVIDIA hardware and NVIDIA kept the "Lost Planet" demo from running properly on AMD hardware; or at least that’s what the general reader might gather from all of this.
First, let’s discuss why these developer relations programs (TWIMTBP and GITG) exist in the first place. The overall corporate goal is to have better support for their respective hardware in more game titles, and more KEY game titles, than the competition. This provides an opportunity for their own hardware to perform better in benchmarks and evaluations and thus presumably they would sell more cards to gamers and net more income and profit.
Both NVIDIA and AMD have a secondary goal though of appealing to the community and at the very least APPEARING to be looking out for the best interests of the gamers themselves. While this might be true on some levels, the cynical beast in all of us should tell us that most of the time corporations are only doing good by others when it will do good by them. By helping game developers and supporting them with these programs both hardware companies are making the gaming experience better overall by improving performance, increasing image quality and finding and fixing bugs. All of these are key benefits to the gamer of course, but NVIDIA and AMD both have the ulterior motives of selling more hardware.
What we don't know for sure, though, is exactly WHAT changes hands when NVIDIA or AMD partners with any particular game developer. AMD is the more closed-door about its GITG program, while NVIDIA's TWIMTBP is marketed to the media (in presentations and discussions) as an exchange of support for support. The game developer puts the NVIDIA logo on the game box and in a splash screen, and in return NVIDIA provides the developer with testing data from a wide variety of hardware as well as software tools and data sets that can make creating new shaders and optimizing code much easier. I am quite sure these agreements differ greatly from developer to developer, if only because getting the NVIDIA logo into a copy of "Madden 08" is much more valuable than seeing it in "Molly Maid 2". What we don't know is whether money is changing hands; though NVIDIA often states that it isn't, it has not defined that as a rule for its TWIMTBP program, and AMD offers even less information on the subject. When money gets involved, things tend to get sticky.
NVIDIA Shader Performance Application
In reality, I don't think that an exchange of money from either hardware vendor to a developer for the purposes of a partnership oversteps any bounds; what WOULD is an exchange of money or services meant to get a developer to LESSEN its support for the competing hardware vendor. Even if the idea is insinuated rather than explicitly stated, that is the one real downfall these developer relations programs can bring.
To be completely blunt about it: we have no specifics on whether or when this kind of exchange has happened, but the basic possibility of it occurring is what causes anger and distrust in the hardware community. No one wants to think that their hard-earned $500 graphics card purchase is being hindered because the other team spent a little extra on the developer of their favorite title.
What can we do about it?
Unfortunately this debate will probably continue on as long as there are multiple competitors in the field that follow different paths in hardware design. The code from a developer will not run the same on both pieces of hardware, especially considering how much work has to be done by NVIDIA and AMD in the drivers before the code reaches the metal. As such, the performance on the green cards is going to be higher on some titles than the red cards and vice versa, leading some to believe that their favorite team is somehow being cheated.
Even more importantly, as hardware gets more complicated and programming for it becomes more complex and time consuming, developers are actually going to need MORE of this kind of assistance from the likes of NVIDIA and AMD. The technical assistance and shortcuts both teams provide are essential, especially to smaller development teams that lack hundreds of technical staff.
In my view, the only people who can possibly "police" these developer relations programs are the developers themselves; they are the only ones who see everything both sides are offering. But because so many medium and small developers depend on this help from NVIDIA and AMD, they are unlikely to stand up to them individually. It will take one of the larger, more independent companies like Valve, id or even Blizzard to take a stand on the gamers' side and stick up for the benefit of all PC enthusiasts.
For the time being, the faith of both the media and PC gamers has been jarred by the recent street fights AMD and NVIDIA are getting into over the initial batch of DX10 benchmarks. While the titles causing the commotion aren't yet full retail games, it isn't hard to see this kind of debate reaching the finished titles in the coming months. All we want is to see the two developer relations programs actually do what they were intended to do: create the best possible gaming experience for our readers.
This is a very controversial subject and I am eager to get the community's thoughts on it; join the discussion in this thread of our forums!
Be sure to use our price checking engine to find the best prices on the ATI Radeon HD 2900 XT and NVIDIA GeForce 8800 GTS 640MB, and anything else you may want to buy!