Left 4 Dead 2 Demo AMD and NVIDIA Graphics Performance


More zombies!!!

This week the demo of Left 4 Dead 2 was made available to PC gamers who pre-ordered the game via Steam.  If you aren't aware of what Left 4 Dead 2 (or even the original Left 4 Dead!) is, then you should be ashamed.  In my opinion the original was the best PC game of the year, as it created one of the most unique multiplayer gaming environments I have ever experienced. 

Chances are most of you have played and enjoyed it, and are excited about the fun to come when the second installment arrives in mid-November.  Since we had some early access to the game, I thought I would provide some quick Left 4 Dead 2 performance results with a handful of graphics cards for those interested. 



Yes, there really is a zombie getting a piggy-back ride here.



"First down!"



These are the settings we tested at (oops, we actually ran at 16x AF!) in the benchmarks below.

If you are curious to see what section of the game we used for our testing, I have included a short video here:

Our benchmarking was done on our current graphics test bed that includes an Intel Core i7-965 processor, ASUS P6T6 WS motherboard, 6GB of Corsair DDR3-1333 memory, 160GB Intel X25-M SSD, PC Power and Cooling 1200w power supply and the appropriate graphics card all running on Windows 7 x64 RTM.  The NVIDIA 191.07 driver was used for all NVIDIA cards while for AMD we used the Catalyst 9.10 driver on the HD 5870 and HD 5850 cards and the Juniper beta driver on the HD 5770 and HD 5750 cards. 

One testing note for curious gamers: there seemed to be a bug with AMD's drivers that disabled AA if you changed resolutions inside the game even though it still showed up as being enabled.  It was obvious to me while playing that this was happening - a simple exit and re-entry into the game fixed the AA issue and hopefully AMD will address the bug in their next driver release. 

Obviously the fastest configuration was the pair of Radeon HD 5870 cards in CrossFire mode - they averaged 216 FPS while never dropping below 118 FPS - very impressive.  Next in line was the GeForce GTX 295 dual-GPU graphics card, followed very closely by the Radeon HD 5870.  Notice that the HD 5870 and GTX 295 had the same minimum frame rate, which is equally important.  The HD 5850 was able to top the GTX 285, and the HD 5770 was able to match and beat the GTX 260+ and GTS 250 cards from NVIDIA. 
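For readers curious how the average and minimum figures relate, here is a minimal sketch of how both can be derived from a log of per-frame render times (the kind of data tools like FRAPS record).  The function name and the sample values are illustrative assumptions, not data from our benchmark runs.

```python
def fps_stats(frame_times_ms):
    """Return (average FPS, minimum FPS) from per-frame render times in ms.

    Average FPS = total frames / total elapsed time.
    Minimum FPS = the instantaneous rate of the single slowest frame.
    """
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds
    min_fps = 1000.0 / max(frame_times_ms)  # slowest frame sets the floor
    return avg_fps, min_fps

if __name__ == "__main__":
    # Made-up frame times for illustration only
    sample = [4.6, 4.8, 5.1, 4.7, 8.5]
    avg, low = fps_stats(sample)
    print(f"average: {avg:.1f} FPS, minimum: {low:.1f} FPS")
```

The minimum matters because a run can post a high average while still stuttering; one long frame drags the minimum down even if the average barely moves.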

At 2560x1600 the performance deltas remain mostly the same though the GTX 295 does have a larger edge over the HD 5870 in terms of single graphics card performance.  It is great to see that even the Radeon HD 5850 and GTX 285 are producing fantastic results at this resolution without dropping below the 40 FPS minimum mark. 

Closing Thoughts

This was really just a quick run through of Left 4 Dead 2 with the latest cards from NVIDIA and AMD, so there is not much to evaluate here.  If you are like most people and have a monitor that runs at 1920x1200 or 1920x1080, then you will obviously be able to play L4D2 without any issue using the lowest-end cards tested here, which start at about $130.  If you are one of the lucky few who owns a 30" monitor and can play at 2560x1600, then you'll need to step up a little for the best experience, but you should not HAVE to pay more than $250 or so for the privilege. 

Of course, if you are one of those interested in doing some Eyefinity multi-monitor gaming then you might want to fork out the dough for the Radeon HD 5870.  Speaking of which, I think I have some more "testing" to do...
