How Games Work

 

Because of the complexity and sheer amount of data we have gathered using our Frame Rating performance methodology, we are breaking it up into several articles that each feature different GPU comparisons.  Here is the schedule:

 

Introduction

The process of testing games and graphics has been evolving even longer than I have been a part of the industry: 14+ years at this point. That transformation in benchmarking has been accelerating for the last 12 months. Typical benchmarks test some hardware against some software and look at the average frame rate that can be achieved. While access to frame time data has been around for nearly the full life of FRAPS, it took an article from Scott Wasson at the Tech Report to really get the ball rolling and investigate how each frame contributes to the actual user experience. I immediately began research into testing actual performance as perceived by the user, including the "microstutter" reported by many in PC gaming, and pondered how we might be able to test for these criteria even more accurately.

The result of that research is being fully unveiled today in what we are calling Frame Rating – a completely new way of measuring and validating gaming performance.

The release of this story for me is like the final stop on a journey that has lasted nearly a complete calendar year.  I began to release bits and pieces of this methodology starting on January 3rd with a video and short article that described our capture hardware and the benefits that directly capturing the output from a graphics card would bring to GPU evaluation.  After returning from CES later in January, I posted another short video and article that showcased some of the captured video and stepped through a recorded file frame by frame to show readers how capture could help us detect and measure stutter and frame time variance.


Finally, during the launch of the NVIDIA GeForce GTX Titan graphics card, I released the first results from our Frame Rating system and discussed how certain card combinations, in this case CrossFire against SLI, could drastically differ in perceived frame rates and performance while giving very similar average frame rates.  This article got a lot more attention than the previous entries and that was expected – this method doesn’t attempt to dismiss other testing options but it is going to be pretty disruptive.  I think the remainder of this article will prove that. 
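That disconnect between average frame rate and perceived smoothness is easy to demonstrate with a quick sketch. The frame times below are made up for illustration (they are not our captured data): two runs that deliver the exact same average FPS can feel completely different once you look at the individual frame times.

```python
# Two hypothetical runs, frame times in milliseconds.
# Both render 100 frames in 2000 ms, so both average exactly 50 FPS.
smooth = [20.0] * 100        # every frame takes 20 ms
stutter = [10.0, 30.0] * 50  # alternates between 10 ms and 30 ms frames

def avg_fps(frame_times_ms):
    """Average frame rate: total frames over total elapsed time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percentile_99(frame_times_ms):
    """99th-percentile frame time: what the worst frames look like."""
    ordered = sorted(frame_times_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

print(avg_fps(smooth), avg_fps(stutter))              # both 50.0 FPS
print(percentile_99(smooth), percentile_99(stutter))  # 20.0 ms vs 30.0 ms
```

A benchmark that only reports the average sees two identical 50 FPS results; a frame-time metric immediately exposes that the second run spends half its time on frames 50% longer than the first run's worst case.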

Today we are finally giving you all the details on Frame Rating; how we do it, what we learned and how you should interpret the results that we are providing.  I warn you up front though that this is not an easy discussion and while I am doing my best to explain things completely, there are going to be more questions going forward and I want to see them all!  There is still much to do regarding graphics performance testing, even after Frame Rating becomes more common. We feel that the continued dialogue with readers, game developers and hardware designers is necessary to get it right.

Below is our full video that features the Frame Rating process, some example results and some discussion on what it all means going forward.  I encourage everyone to watch it but you will definitely need the written portion here to fully understand this transition in testing methods.  Subscribe to our YouTube channel if you haven't already!

Continue reading our analysis of the new Frame Rating performance testing methodology!!

Podcast #240 - GTX TITAN Benchmarks, Frame Rating, Tegra 4 Details and more!

Subject: General Tech | February 28, 2013 - 03:45 PM |
Tagged: video, titan, sli, R5000, podcast, nvidia, H90, H110, gtx titan, frame rating, firepro, crossfire, amd

PC Perspective Podcast #240 - 02/28/2013

Join us this week as we discuss GTX TITAN Benchmarks, Frame Rating, Tegra 4 Details and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano

This Podcast is brought to you by MSI!

Program length: 1:24:28

Podcast topics of discussion:

  1. 0:01:18 PCPer Podcast BINGO!
  2. Week in Reviews:
    1. 0:03:00 GeForce GTX TITAN Performance Review
    2. 0:21:55 Frame Rating Part 3: First Results from the New GPU Performance Tools
    3. 0:38:00 Corsair Hydro Series H90 and H110 140mm Liquid Cooler Review
  3. 0:40:30 This Podcast is brought to you by MSI!
  4. News items of interest:
    1. 0:41:45 New Offices coming for NVIDIA
    2. 0:45:00 Chromebook Pixel brings high-res to high-price
    3. 0:48:00 GPU graphics market updates from JPR
    4. 0:55:45 Tegra 4 graphics details from Mobile World Congress
    5. 1:01:00 Unreal Engine 4 on PS4 has reduced quality
    6. 1:04:10 Micron SAS SSDs
    7. 1:08:25 AMD FirePro R5000 PCoIP Card
  5. Closing:
    1. 1:13:35 Hardware / Software Pick of the Week
      1. Ryan: NOT this 3 port HDMI switch
      2. Jeremy: Taxidermy + PICAXE, why didn't we think of this before?
      3. Josh: Still among my favorite headphones
      4. Allyn: Cyto
    2. 1-888-38-PCPER or podcast@pcper.com
    3. http://pcper.com/podcast
    4. http://twitter.com/ryanshrout and http://twitter.com/pcper
    5. Closing/outro

Be sure to subscribe to the PC Perspective YouTube channel!!

 

3 displays, 1 GPU

Subject: Displays | February 26, 2013 - 06:26 PM |
Tagged: eyefinity, nvidia surround, crossfire, sli

If you are going to set up a multi-monitor display at 5760x1200 or 5040x1050 but only have a single GPU or a pair of low-powered ones, just what kind of performance can you expect?  That is the question Techgage wanted to answer, and to that end they tested frame rates at those resolutions with NVIDIA's GTX 680 and two different 660 Tis in SLI, as well as an HD 7970 and two different 7850s in CrossFire.  As you might expect, the game tested makes a lot of difference in the results, with many seeing the SLI'd 660 Tis in the lead while other memory-hungry games preferred the larger frame buffers of the Radeons.  Check out the individual results of your favourite games in the full article.


"Considering next-gen cards are still months away, we didn't expect to bring any more GPU reviews until the second quarter of 2013. However, we realized there was a gap in our current-gen coverage: triple-monitor gaming. In fact, it's been almost two years since we last stress tested games at resolutions of up to 7680x1600.

We're going to mix things up a little this time. Instead of using each camp's ultra-pricey dual-GPU card (or the new $999 Titan), we're going to see how more affordable Crossfire and SLI setups handle triple-monitor gaming compared to today's single-GPU flagships."

Here are some more Display articles from around the web:

Displays

Source: Techgage

Max Payne 3 at Max resolution

Subject: General Tech | June 6, 2012 - 03:20 PM |
Tagged: max payne 3, crossfire, sli, gtx680, HD 7970, gaming

For the tests they ran, [H]ard|OCP used the latest Catalyst beta, 12.6, and ForceWare 301.42 WHQL, as both drivers proved able to provide proper multi-GPU performance in Max Payne 3.  In AMD's case the driver provided improvements to single-card gaming as well.  The game's graphics options include a nice tool which displays how much VRAM your configuration will require, so you can get an idea of whether your card will be able to handle the settings before you even play the game.  SLI did scale better than CrossFire, but even so, both multi-GPU rigs could handle the max settings at 2560x1600, and the cards used singly could still sit around the 60 fps mark.  Check out the full review here.


"HardOCP is on top of Max Payne 3 to find out what graphics options it supports and how a GTX 680 and a Radeon HD 7970 perform. We also wanted to know if SLI and CrossFireX worked, and how performance scales. In this preview of performance and image quality we take a look at all of this in the first chapter of this game."

Here is some more Tech News from around the web:

Gaming

 

Source: [H]ard|OCP
Manufacturer: XFX

Retail-ready HD 7970

We first showed off the power of the new AMD Radeon HD 7970 3GB graphics card in our reference review posted on December 22nd.  If you haven't read all about the new Southern Islands architecture and the Tahiti chip that powers the HD 7970 then you should already be clicking the link above to my review to get up to speed. Once you have done so, please return here to continue.

...

Welcome back, oh wise one.  Now we are ready to proceed.  By now you already know that the Radeon HD 7970 is the fastest GPU on the planet, besting the NVIDIA GTX 580 by a solid 20-30% in most cases.  For our first retail card review we are going to be looking at the XFX Black Edition Double Dissipation, which raises the GPU and memory clocks slightly and offers a new cooler that promises to be more efficient and quieter.

Let's put XFX to the test!

The XFX Radeon HD 7970 3GB Black Edition Double Dissipation


Because it uses a completely custom cooler, the XFX HD 7970 Black Edition Double Dissipation looks nothing like the reference model we tested last month, though the feature set remains identical.  The silver and black motif works well here.

Continue reading our review of the XFX Radeon HD 7970 3GB Black Edition Double Dissipation!!

Alienware asks why you shouldn't have CrossFire on a laptop

Subject: Mobile | October 18, 2011 - 02:26 PM |
Tagged: alienware, Alienware M18x, 18.4, crossfire, hd 6990m

Why shouldn't you stick a pair of HD 6990Ms in an 18.4" laptop, as long as you don't mind lugging a 12 lb machine from power outlet to power outlet?  Seeing as just a few years ago 12 lbs was not an uncommon weight for a laptop of this class, that does represent great design work on Alienware's part.  The comparison that AnandTech was most interested in was between NVIDIA's GTX 580M and AMD's 6990M, to see who can hold onto the dual-GPU mobile performance crown.  Who shall triumph?  Read on to see.


"In our first run with the Alienware M18x, we sat down and took a look at the notebook itself along with NVIDIA's current top shelf mobile graphics part, the GeForce GTX 580M. We came away from the experience with mixed impressions of the M18x itself, a notebook that is by all means incredibly powerful but also seems to lose a lot of the balance that made the M17x R3 so desirable. On the other hand, the GeForce GTX 580M wound up being the fastest mobile GPU we'd yet tested, made only more formidable through the SLI configuration the M18x enables."

Here are some more Mobile articles from around the web:

Mobile

 

Source: AnandTech

Benchmarking Bulldozer and taking the GPU out of the picture

Subject: Processors | October 17, 2011 - 05:06 PM |
Tagged: bulldozer, fx-8150, crossfire, gaming

One of the questions we have been asking about Bulldozer is how much it affects gameplay performance.  We know that in non-multithreaded applications the FX-8150 falls behind the top Sandy Bridge processors and barely breaks even in heavily multithreaded apps.  That doesn't necessarily mean it will lag behind Sandy Bridge in gaming, as many games do not utilize the CPU enough to make a huge difference, though that premise needs to be proven.  Enter Tweaktown, who have taken the top Bulldozer and Sandy Bridge CPUs along with three Sapphire HD 6970 video cards and placed them in a Maximus IV Extreme-Z and a Crosshair V Formula motherboard respectively.  With that much graphical power, it is possible to isolate the effect that the CPU and the motherboard chipset have on performance.  Read on to see how Bulldozer fared.


"We've already provided a fair bit of coverage on the new FX-8150 CPU from AMD and it hasn't all been favorable for the team over at AMD. If you haven't looked yet, I highly recommend you check out our other pieces that cover the VGA testing side of things and my editorial Shi**y Marketing Killed the Bulldozer Star which has really gained traction over the last few days.

Today we test the video card side of things a bit more and see what goes on when we start to make use of CrossFireX on the 990FX platform. The 990FX chipset shows some good potential and it's going to be interesting to see what happens when we start to make use of all those PCIe lanes that are on offer."

Here are some more Processor articles from around the web:

Processors

 

Source: Tweaktown

Silent CrossFire?

Subject: Graphics Cards | September 19, 2011 - 02:00 PM |
Tagged: gigabyte, HD6770 Silent Cell, crossfire, hd6770

The GIGABYTE HD6770 Silent Cell is a different take on your normal HD6770: it has no fans but does sport a double-wide heatsink.  Sporting all of GIGABYTE's Ultra Durable components, this card can be picked up for a hair under $140; or, more importantly for [H]ard|OCP's review, you can pick up two for about the same price as an HD6950 or a GTX 570 if you can find one on sale.  Their testing showed that at a resolution of 1920x1200 the two HD6770s could hold their own, but they quickly fell behind the competition when the resolution was raised beyond that point, which would include Eyefinity.  If your case has the space for the coolers and you have an overwhelming urge to play games at 1080p on a silent machine, then these cards are for you.  Apart from that goal, you are better served by a single, more powerful GPU.


"Today we'll find out if two of GIGABYTE's HD6770 Silent Cell cards can match the performance of a Radeon HD 6950. We were impressed with how this no-fan silent configuration CrossFireX setup performed. However, is it worth $280 against the falling prices of the Radeon HD 6950?"

Here are some more Graphics Card articles from around the web:

Graphics Cards

 

Source: [H]ard|OCP
Subject: Editorial
Manufacturer: AMD

The Dirty Laggard

 

It may seem odd, but sometimes reviewers are some of the last folks to implement new technology.  This has been the case for me many a time.  Yes, we get some of the latest and greatest components, but often we review them and then keep them on the shelf for comparative purposes, all the while our personal systems run last-generation parts that we will never need to re-integrate into a test rig again.  In other cases, big-money parts, like the one 30” 2560x1600 LCD that I own, are always tied up on the testbed and never actually used for things like browsing, gaming, or other personal activities.  Don’t get me wrong, this is not a “woe-is-me” rant about the hardships of being a reviewer, but rather just an interesting side effect not often attributed to folks who do this type of work.  Yes, we get the latest to play with and review, but we don’t often actually use these new parts in our everyday lives.

One of the technologies that I had only ever seen at trade shows is Eyefinity.  It was released back in the Fall of 2009 and really gained some momentum in 2010.  Initially it was incompatible with CrossFire technology, which limited it to a great degree.  A single HD 5970 card could push 3 x 1920x1080 monitors in most games, but usually only with details turned down and no AA enabled.  Once AMD had worked a bit more on the drivers, we were able to see CrossFire setups working in Eyefinity, which allowed users to play games at higher fidelity with the other little niceties enabled.  The release of the HD 6900 series of cards also proved to be a boon to Eyefinity, as these new chips had much better CrossFire scaling and were also significantly faster than the earlier HD 5800 series at those price points.


Continue on to the rest of the story for more on my experiences with AMD Eyefinity.

Video Perspective: AMD A-series APU Dual Graphics Technology Performance

Subject: Graphics Cards, Processors | July 13, 2011 - 02:13 PM |
Tagged: llano, dual graphics, crossfire, APU, amd, a8-3850, 3850

Last week we posted a short video about the performance of AMD's Llano core A-series of APUs for gaming and the response was so positive that we have decided to continue on with some other short looks at features and technologies with the processor.  For this video we decided to investigate the advantages and performance of the Dual Graphics technology - the AMD APU's ability to combine the performance of a discrete GPU with the Radeon HD 6550D graphics integrated on the A8-3850 APU.

For this test we set our A8-3850 budget gaming rig to the default clock speeds and settings and used an AMD Radeon HD 6570 1GB as our discrete card of choice.  With a price hovering around $70, the HD 6570 would be a modest purchase for a user that wants to add some graphical performance to their low-cost system but doesn't stretch into the market of the enthusiast.

The test parameters were simple: we knew the GPU on the Radeon HD 6570 was a bit better than that of the A8-3850 APU so we compared performance of the discrete graphics card ALONE to the performance of the system when enabling CrossFire, aka Dual Graphics technology.  The results are pretty impressive:

You may notice that these scaling percentages are higher than those we found in our first article about Llano on launch day.  The reason is that we used the Radeon HD 6670 there and found that, while it is a compatible pairing by AMD's guidelines, the HD 6670 overpowers the HD 6550D GPU on the APU, so the relative performance delta it provides is smaller by comparison.
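As a quick illustration of how such scaling percentages are derived (the FPS numbers below are hypothetical, not results from these tests): scaling compares the Dual Graphics result against the discrete card running alone, and a weaker companion GPU contributes proportionally less to a stronger card.

```python
def scaling_percent(single_fps, dual_fps):
    """Percent frame-rate gain of the Dual Graphics (CrossFire) result
    over the discrete card running by itself."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Hypothetical numbers for illustration only:
# an HD 6570-class card alone at 30 FPS, Dual Graphics at 42 FPS
print(scaling_percent(30.0, 42.0))  # roughly 40% scaling
```

The same formula explains the HD 6670 result: the faster the discrete card is on its own, the smaller the percentage gain the APU's GPU can add.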

So, just as we said in our APU overclocking video, while adding a discrete card like the HD 6570 won't turn your PC into a gaming machine centered on a $300 graphics card, it will definitely help performance by worthwhile amounts without anyone feeling like they are wasting the silicon on the A8-3850.

Source: AMD