AMD on StarCraft II: Anti-aliasing will come when performance is better
Subject: Graphics Cards | July 27, 2010 - 01:32 AM | Ryan Shrout
If you haven't checked out my wicked-awesome StarCraft II Performance Review, you definitely should (and where the hell have you been?). The game is out now, so you might already have it installed, but now you should be even more interested in my results and the news we have for you tonight.
One of the big keys to the SCII article linked above was the issue surrounding AA (anti-aliasing): Blizzard didn't feel the need to include it (and why the hell not is beyond me) but NVIDIA enabled it in their driver using a brute force method that, by the company's own admission, was less efficient than it could be with developer assistance.
[Screenshot comparison: 1920x1200 at 1xAA vs. 1920x1200 at 4xAA]
Here is one example of AA at work in StarCraft II; surely you can see the advantage that enabling it on your system can provide, given appropriate GPU horsepower.
When I wrote the original article I said this:
Anti-aliasing is another bonus for NVIDIA card users who pick up SC2 on launch day, as it is something that only they will have unless ATI's driver team really gets on the ball and can integrate support in the next week or so.
It would appear ATI took some of that to heart, but unfortunately, as we sit here on launch night, ATI users are still going to be without AA support in StarCraft II for some time longer. I got this official note from AMD tonight:
Yes, this is the case; we proved it in our performance review from last week. I also find it interesting that AMD credited Blizzard's goal of "game play for all" rather than AMD's own hardware or software. Honesty is great, but I guess not enough marketing/PR people read this note before it went out.
Yes, it definitely does have an impact on the NVIDIA card performance as we mentioned in our performance article.
Again, also very true, and they may well be referring to our piece. But I think the word "concerns" is the wrong one. We have "concerns" about a great many things in life, but what I had about AA in StarCraft II were "discoveries" that allowed me to recommend options to our readers based on their hardware and our performance results.
Hrrm. So AMD says it is not a technical limit but rather a deliberate decision made based on weighing options.
AMD is basically telling us that enabling AA in the control panel didn't live up to their standards in terms of performance. Fair enough, I guess, but wouldn't it be fairer to let the user (and the press) decide whether it met those "standards"? If I have a Radeon HD 5870 in my system (and I do) and I want to run at 1920x1080 with 4xAA at half the performance (still about 50 FPS), then I would consider that an "acceptable level" for me. So either AMD is being really picky here, or their performance drops were going to be more noticeable than NVIDIA's.
Here is my theory: if AMD's cards had their performance cut in half by enabling anti-aliasing, then ATI's options would have looked even further behind the NVIDIA cards available today. From a marketing standpoint, I can see it being more beneficial to wait and integrate the feature later rather than implement it now and take another "loss" in the record books going into the game's official launch. It's disappointing for all of those gamers with ATI cards who might want to TRY enabling AA and see if they like the experience, but for now all we can do is wait.
Happy gaming and good luck getting to work on time!!
UPDATE 8/2/10 - AMD has released the Catalyst 10.7 beta driver that enables AA support. Check out our updated performance results for more information!