XFX GeForce 8600 GTS and GT Review - SLI Tested
New Testing Setup: Vista 64-bit
Graphics card testing has become one of the most hotly debated issues in the hardware enthusiast community recently, and because of that, testing graphics cards is a much more complicated process than it once was. Where before you might have been able to rely on the output of a few synthetic, automated benchmarks to make your video card purchase, that is no longer the case. Video cards now cost up to $500, and we want to make sure we are giving you as much information as we can to aid your purchasing decision. We know we can't run every game or find every bug and error, but we try to do what we can to help you, our reader, and the community as a whole.
With that in mind, all the benchmarks that you will see in this review are from games that we bought off the shelves just like you. Of these games, there are two different styles of benchmarks that need to be described.
The first is the "timedemo-style" benchmark. Many of you may be familiar with this style from games like Quake III: a "demo" is recorded in the game and a set number of frames are saved to a file for playback. When playing back the demo, the game engine renders those frames as quickly as possible, which is why you will often see timedemo-style benchmarks playing back much faster than you would ever actually play the game. In our benchmarks, the FarCry tests were done in this manner: we recorded four custom demos and then played them back on each card at each resolution and quality setting. Why does this matter? Because in tests where timedemos are used, in the line graphs that show the frame rate at each second, each card may not end at precisely the same time: one card plays the demo back faster than the other, so less time passes and the FRAPS application records slightly fewer one-second samples to plot. However, the peaks and valleys and overall performance of each card are still preserved, and we can make a fair comparison of the frame rates and performance.
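To make the timedemo point concrete, here is a minimal sketch (with made-up numbers, not from our testing) of why a faster card produces fewer per-second data points: the frame count in the demo is fixed, so total playback time shrinks as the average frame rate rises.

```python
# Hypothetical illustration: a timedemo renders a fixed number of
# recorded frames, so a faster card finishes sooner and a FRAPS-style
# logger collects fewer one-second frame-rate samples.

DEMO_FRAMES = 4500  # fixed number of recorded frames (made-up value)

def timedemo_summary(avg_fps: float) -> tuple[float, int]:
    """Return (playback seconds, number of one-second FPS samples)."""
    seconds = DEMO_FRAMES / avg_fps
    samples = int(seconds)  # one data point per full elapsed second
    return seconds, samples

fast = timedemo_summary(90.0)   # faster card
slow = timedemo_summary(60.0)   # slower card
print(fast)  # (50.0, 50) -- finishes sooner, fewer points on the line graph
print(slow)  # (75.0, 75)
```

Same demo, same frames, but the faster card's line on the graph simply stops 25 seconds earlier.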
The second type of benchmark you'll see in this article is the manual run-through of a portion of a game. This is where we sit at the game with a mouse in one hand and a keyboard under the other, and play the game to get a benchmark score. This method makes the graphs and data easy to read, but it adds another level of difficulty for the reviewer: making the manual run-throughs repeatable and accurate. I think we've accomplished this by choosing a section of each game that provides a clear-cut path. We take three readings for each card and setting, average the scores, and present those to you. While this means the benchmarks are not exact to the most minute detail, they are damn close, and practicing with this method for many days has made it clear to me that while it is time consuming, it is definitely a viable option for games without timedemo support.
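The scoring step described above is just a straight average of the three passes; a minimal sketch, with hypothetical FPS readings, looks like this:

```python
# Sketch of the manual run-through scoring described above: three FPS
# readings per card/setting, averaged into one benchmark score.
# The readings here are hypothetical, not actual test results.

def run_score(readings: list[float]) -> float:
    """Average the per-run FPS readings into a single score."""
    return sum(readings) / len(readings)

gts_runs = [47.2, 45.8, 46.5]  # three passes through the same game section
print(round(run_score(gts_runs), 1))  # 46.5
```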
The second graph for each test is a bar graph showing the average, maximum, and minimum framerates. The minimum and average are the important numbers here, as we want the minimum to stay high enough not to affect the gaming experience. While each individual gamer will decide the lowest frame rate they will tolerate, comparing the Min FPS to the line graph and seeing how often that minimum occurs should give you a good idea of what your gaming experience will be like with that game, that video card, and that resolution.
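For readers who want to see the connection between the two graphs spelled out, here is a sketch of how the bar-graph numbers could be derived from the per-second line-graph data, plus a simple count of how often the frame rate dips near the minimum. The sample values and the two-FPS dip margin are assumptions for illustration only.

```python
# Derive min/avg/max FPS from per-second samples, and count how many
# seconds sit within a small margin of the minimum -- a rough proxy
# for "how often does the worst case actually happen?"

def summarize(fps_per_second: list[float], dip_margin: float = 2.0):
    lo, hi = min(fps_per_second), max(fps_per_second)
    avg = sum(fps_per_second) / len(fps_per_second)
    # seconds spent within `dip_margin` FPS of the minimum
    dips = sum(1 for f in fps_per_second if f <= lo + dip_margin)
    return lo, avg, hi, dips

samples = [58.0, 61.0, 30.0, 55.0, 60.0, 31.5, 57.0, 59.5]
lo, avg, hi, dips = summarize(samples)
print(lo, hi, dips)  # 30.0 61.0 2
```

A card with a decent average but frequent dips near its minimum will feel worse in play than the bar graph alone suggests, which is exactly why we show the line graph alongside it.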
Our tests are completely based around the second type of benchmark method mentioned above -- the manual run through.
System Setup and Benchmarks
A big area of contention for me with this review was my choice to use Vista 64-bit instead of Windows XP. While most gamers are probably still on Windows XP, with DX10 games right around the corner, the move to Vista is going to pick up quickly among the enthusiast crowd. As such, I thought it was only responsible to start testing games on Vista full time so that my results with it become more reliable and stable; in addition, since we have spent so much time listening to both graphics companies tout their dominance in Vista, we really should be putting it to the test on a regular basis.
In addition to my move to Vista 64-bit for our testing, you'll also notice a DRASTIC change in the game titles used in benchmarking. Only a couple of titles (Prey and FEAR) remain from the previous set, and we have thrown in SIX new games to reflect the gaming habits of our users. I have included FEAR, Prey, HL2: Episode 1, Supreme Commander, Company of Heroes, Oblivion, Battlefield 2142, and Rainbow Six Vegas. This of course added a TON of time to our testing cycle, so a lot more work went into getting the test setups ready for the new onslaught of performance metrics. I think it turned out very well though, and it pushes the hardware and drivers into areas that other reviews might not investigate.
The XFX cards are here on their own today; if you want to see how the 8600 GTS and 8600 GT compare to ATI's lineup, check out our initial reviews of them. For this review I decided to include some SLI testing by pairing each XFX card with a card from a different vendor: BFG for the GTS and EVGA for the GT. I didn't run into any compatibility problems using boards from different vendors -- all went well!
GeForce 8600 GTS / GT 256MB Test System Setup
|Sound Card||Sound Blaster Audigy 2 Value|
|Graphics Drivers||Forceware 158.14 - NVIDIA|
|Power Supply||PC Power and Cooling 1000 watt|
|DirectX Version||DX10 / DX9c|
|Operating System||Windows Vista Ultimate 64-bit|
Benchmarks used:
Company of Heroes
Rainbow Six Vegas
You'll notice that I took both FEAR and HL2: Episode 1 out of the testing suite for this article -- with the big hitching problems we are seeing in the current driver release, benchmarking them is basically pointless. Once the new driver that NVIDIA says is only days away shows up, we'll put them back into the set of games tested. Sorry!