NVIDIA Multi-Frame Sampled Anti-Aliasing (MFAA) Tested on GTX 980

Manufacturer: NVIDIA

MFAA Technology Recap

In mid-September NVIDIA took the wraps off the GeForce GTX 980 and GTX 970, the first products based on the GM204 GPU and the Maxwell architecture. Our review of the chip, the cards, and the overall package NVIDIA put together was glowing: not only was performance impressive, but that performance came with power efficiency besting anything else on the market.

Of course, along with the new GPU came a set of new product features. Two of the most impressive were Dynamic Super Resolution (DSR) and Multi-Frame Sampled AA (MFAA), but only one was available at launch: DSR. With it, you could take advantage of the extreme power of the GTX 980/970 with older games, render at a higher resolution than your panel, and have the result filtered down to match your screen in post. The results were great. But NVIDIA spent just as much time talking about MFAA (not mother-fu**ing AA, as it turned out) during the product briefings, so I was surprised to learn the feature wouldn't be ready to test or included with the launch driver.


That changes today with the release of NVIDIA's 344.75 driver, the first to implement support for the new and potentially important anti-aliasing method.

Before we dive into the results of our testing, both in performance and image quality, let's get a quick recap on what exactly MFAA is and how it works.

Here is what I wrote back in September in our initial review:

While most of the deep, architectural changes in GM204 are based around power and area efficiency, there are still some interesting feature additions NVIDIA has made to these cards that depend on some specific hardware implementations.  First up is a new antialiasing method called MFAA, or Multi-Frame Sampled AA. This new method alternates the AA sample pattern, which is now programmable via software, in both temporal and spatial directions.


The goal is to change the AA sample pattern in a way to produce near 4xMSAA quality at the effective cost of 2x MSAA (in terms of performance). NVIDIA showed a couple of demos of this in action during the press meetings but the only gameplay we saw was in a static scene. I do have some questions about how this temporal addition is affected by fast motion on the screen, though NVIDIA asserts that MFAA will very rarely ever fall below the image quality of standard 2x MSAA.


That information is still correct but we do have a little bit more detail on how this works than we did before. For reasons pertaining to patents NVIDIA seems a bit less interested in sharing exact details than I would like to see, but we'll work with what we have.
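NVIDIA hasn't published its sample positions, but the arithmetic behind the "near 4x MSAA quality at 2x MSAA cost" claim is easy to sketch. The sub-pixel offsets below are invented purely for illustration - they are not NVIDIA's actual patterns:

```python
# Toy sketch of MFAA's spatial/temporal idea: alternate a 2-sample
# pattern between frames so that any two consecutive frames together
# cover four distinct sub-pixel positions, while each individual
# frame only pays the rasterization cost of 2 samples.

# Hypothetical sub-pixel sample offsets within a pixel (0..1 range).
PATTERN_A = [(0.25, 0.25), (0.75, 0.75)]  # even frames
PATTERN_B = [(0.75, 0.25), (0.25, 0.75)]  # odd frames

def pattern_for_frame(frame_index):
    """Return the 2-sample pattern used on a given frame."""
    return PATTERN_A if frame_index % 2 == 0 else PATTERN_B

def effective_positions(num_frames):
    """Union of sample positions seen across consecutive frames."""
    positions = set()
    for f in range(num_frames):
        positions.update(pattern_for_frame(f))
    return positions

# Each frame rasterizes only 2 samples...
assert len(pattern_for_frame(0)) == 2
# ...but two consecutive frames together cover 4 distinct positions.
assert len(effective_positions(2)) == 4
```

The union of positions is what a resolve filter can exploit; the per-frame cost stays at two samples, which is where the claimed 2x MSAA performance profile comes from.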


Previous-generation GPUs include fixed sample patterns for anti-aliasing (AA) that are stored in Read Only Memory (ROM). When gamers selected 2x or 4x MSAA for example, fixed sample patterns were used. With Maxwell, we have introduced programmable sample positions for rasterization that are stored on Random Access Memory (RAM), creating opportunities for new, more flexible, more inventive AA techniques that uniquely address the challenges of modern game engines, such as the increased performance cost of high-quality anti-aliasing. Maxwell's new RAM-based sample position technology can still be programmed with standard MSAA and TXAA patterns, but now the driver or application may also load the RAM with custom positions that are free to vary from frame to frame, or even within a frame.

While temporal anti-aliasing has certainly been done before, the key point appears to be storing the sample positions in graphics memory and giving the driver or game developer the flexibility to specify AA sample points on demand. MFAA takes advantage of this implementation by alternating sample patterns on a frame-to-frame basis, offering potential image quality and performance advantages.
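The temporal side of this - and the fast-motion question raised earlier - can be illustrated with a toy resolve. NVIDIA has not disclosed its actual reconstruction filter, so the blend below is an assumption about the general shape such a filter might take, nothing more:

```python
# Hypothetical temporal resolve: combine this frame's 2x-sampled result
# with the previous frame's resolved result (which used a different
# sample pattern). The 50/50 blend weight is an assumption.

def resolve_pixel(current_samples, previous_resolved, blend=0.5):
    """Average this frame's samples, then blend with last frame's result."""
    current = sum(current_samples) / len(current_samples)
    if previous_resolved is None:  # first frame has no history
        return current
    return blend * current + (1 - blend) * previous_resolved

# Static edge covering 3 of 4 hypothetical sample positions: pattern A
# sees 1 of its 2 samples inside the geometry (0.5), pattern B sees
# both (1.0).
frame_a = resolve_pixel([0.0, 1.0], None)     # -> 0.5
frame_b = resolve_pixel([1.0, 1.0], frame_a)  # -> 0.75, matching what a
                                              # true 4x resolve (3 of 4
                                              # samples) would produce
```

With a static scene the blended result converges on what a 4x resolve would produce; with fast motion the previous frame's result is stale, which is exactly why quality can in principle fall back toward plain 2x MSAA, as NVIDIA's "rarely below 2x MSAA" claim implies.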

Interestingly, taking screenshots of MFAA doesn't work as it does with most other AA implementations. Screenshots taken with Fraps or similar capture utilities will not show the final image sent to the display; the same thing occurs with DSR-enabled games. Instead, you need hardware capture to see the results of MFAA, and we are already well-versed in that thanks to our Frame Rating performance capture capabilities.

There are some troubling limitations to MFAA today as well, the most significant of which is the whitelist of games that support it with this driver:

  • Assassin's Creed IV Black Flag
  • Assassin's Creed: Unity
  • Battlefield 4
  • Civilization V
  • Civilization: Beyond Earth
  • Crysis 3
  • DiRT 3
  • DiRT Showdown
  • F1 2013
  • F1 2014
  • Far Cry 3
  • Far Cry: Blood Dragon
  • GRID 2
  • GRID Autosport
  • Hitman: Absolution
  • Just Cause 2
  • Saints Row IV
  • Splinter Cell: Blacklist
  • Titanfall
  • Wargame: European Escalation

That's right: only 20 games can take advantage of MFAA as of today. When initially discussed, NVIDIA stated pretty directly that this technology would be compatible with any game and any engine, but that seems not to have worked out. Quite a few NVIDIA partner games are missing: the Borderlands series, Far Cry 4, Call of Duty, etc. Though NVIDIA won't talk directly about it, it seems likely they ran into image quality issues during long-term testing of MFAA in real-world gaming scenarios and decided to go with the whitelist approach until everything can be ironed out.

Speaking of the whitelist, another disappointing aspect of this approach is that it is silent. If you enable MFAA in the control panel but start a game that doesn't support it, you receive no warning and no message telling you that you won't be getting the improvements associated with the feature. (You won't see any performance penalty either.) Instead, you'll have to research on your own whether MFAA is at work to decide whether you want to enable 2x or 4x MSAA in the game's settings. NVIDIA tells us that GeForce Experience support for MFAA is coming soon, so the feature will only be enabled for games on the approved list.

Users who purchased multiple GTX 980 or GTX 970 cards will be disappointed to find that MFAA is not yet supported with SLI; that will apparently have to wait for a future driver update. (Though how far in the future hasn't been said.)

Hiccups aside, enabling MFAA in your control panel is simple.


A new line item shows up in the control panel if you have supported hardware, and the options are simply On and Off. Over-crowding of the NVIDIA Control Panel in recent months aside, leaving MFAA on will likely be an easy decision for GTX 980 and GTX 970 owners.

November 18, 2014 | 08:51 PM - Posted by schulmaster

Of course those AA methods offer superior image quality. A less divine aspect is that they also bring dramatic frametime increases. No one is saying MFAA is explicitly superior to FSAA, or even MSAA; they do say, however, that MFAA's frametime-to-image-quality ratio is very appealing. If I had had a Titan Black back when I had a GTX 580, I'd be supersampling my heart out in my "Yes I can run Crysis" T-shirt. If you want to clean up a Frostbite 3 frame, or load a Ubisoft game's title screen, efficient aliasing elimination is valuable, and therefore worthy of NVIDIA's attention.

November 19, 2014 | 04:51 AM - Posted by Master Chen (not verified)

MFAA is trash simply because it's actually worse than MSAA (which is already a piece of crap to begin with).
MFAA soaps up the image like hell, just like that FXAA garbage did. And on a PC, any soap at all is a very big no-no. Leave soap to consoles.
It does NOT matter if MFAA has "better" consumption rates compared to MSAA or anything else. Soapy trash is still soapy trash, no matter how much you try to justify its stillborn "existence" or how hard you defend it against truly GODLIKE methods such as FSAA/SSAA.
A turd doesn't become any less of a turd if you make it out of gold and cover it with gems - it's still a turd in the end.

As for DSR...I've already said it many times before and I will never get tired of repeating this:
DSR effing sucks, seriously.
It tries to recreate FSAA/SSAA with a smaller performance hit than the real thing, but fails because it soaps up details rendered further away and produces glitches like the ones AMD's CFAA had. Don't get me wrong: it's only utterly useless trash in comparison to full-blown FSAA/SSAA; by itself (if you ignore that FSAA/SSAA exists) it's not all that bad. It's just ugly when you compare it to pure FSAA/SSAA, not when you compare it to other methods. To put it simply: there is still no technology better than FSAA/SSAA; it remains the best method, full stop. DSR is useless, and just outright sucks, in that regard.
To clarify: it's fine by itself. It's only truly bad when full-blown FSAA/SSAA is available (like in The Witcher 2, for example), because in those cases DSR's ugliness becomes extremely apparent and easily noticeable. As for the glitches caused by DSR - go Google "AMD CFAA glitches"; DSR has pretty much the same exact problems.

November 20, 2014 | 07:55 AM - Posted by Spunjji

The first 2 Google results for your search terms are your posts about DSR... The rest are not really useful.

November 20, 2014 | 11:55 AM - Posted by Master Chen (not verified)

Stop producing autism.

December 7, 2014 | 03:58 AM - Posted by Daniel W (not verified)

SSAA/FSAA doesn't work properly with deferred lighting. UE3 is the worst offender...

If I use 4xSSAA @ 2560x1080, there's lighting jaggies everywhere and the framerate goes down to ~20.

With 4xDSR the engine literally multi-samples everything, including the deferred lighting passes. While it doesn't result in the complete elimination of sparkling for UE3, it does heavily reduce it compared to SSAA.

The frame rate is also about 3x better (averaging 58 vs 20 in Mass Effect 3). Also, now with MFAA you can use 2x MSAA on top of DSR for free.

With engines that don't heavily rely on deferred lighting, SSAA works great too.

December 7, 2014 | 04:27 AM - Posted by Daniel W (not verified)

Okay, disregard that. It looks like for ages I hadn't been setting up SGSSAA or TRSSAA properly, so it wasn't handling the deferred lighting. It appears it actually does.

November 30, 2014 | 04:28 PM - Posted by Anonymous (not verified)

Wtf :D
Do you even know what DSR technology actually does?
I suppose not, but if you are open-minded - like every rational person - learn it!
Google: "GeDoSaTo Tool".
Super-sampling is something else than just some shitty API to smooth edges.
And I didn't even mention before that SSAA & FSAA are JUST "golden shits with diamonds", because they're using the supersampling method to begin with.

January 1, 2015 | 11:59 AM - Posted by Anonymous (not verified)

You have no clue what you are talking about. MFAA does not soften the image at all; it sounds like you are confusing it with DSR.

November 18, 2014 | 10:43 AM - Posted by Shortwave (not verified)

Again, isn't "DSR" something I've been doing for 2+ years now?
Yes, I have to actually ask that question!

Still can't for the life of me figure out if they are doing anything unique here..

Consistently, for months before the 900 series launched...
I would usually run older/lower-end games at 2x+ my screen's native resolution by forcing custom resolutions through the NVIDIA CP, then setting my resolution back to normal in Windows; in-game, the new higher resolutions become selectable, and on returning to the desktop it goes back to normal. Really easy.

But since then all I've been able to do is 1.5x.
I have a dual-link DVI cable on its way to see if the single-link cable I found lying around is what's limiting it to 1.5x.

But if I discover that it's not the cable locking it to 1.5..
Well.. That's some sad and shady business right there.

I quite miss 2X+ resolution.

Am I the only person who ever did this or what?
I can use some insight.

November 18, 2014 | 11:48 AM - Posted by Jesse (not verified)

I did the same until DSR was enabled. It's the 13-tap Gaussian filter, adjustable via the Sharpness/Smoothness slider, that makes it superior to custom resolutions.
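The difference Jesse describes can be sketched. NVIDIA hasn't published DSR's exact filter weights, so the kernel below is a generic 13-tap Gaussian (the tap count matching what's mentioned above); the point is only that filtering before decimation spreads a hard edge across several output pixels instead of keeping the aliased step that naive decimation would:

```python
import math

def gaussian_kernel(taps=13, sigma=2.0):
    """Normalized 1-D Gaussian weights centered on the middle tap."""
    mid = taps // 2
    w = [math.exp(-((i - mid) ** 2) / (2 * sigma ** 2)) for i in range(taps)]
    total = sum(w)
    return [x / total for x in w]

def downsample_row(row, factor=2, taps=13, sigma=2.0):
    """Filter with the Gaussian, then keep every `factor`-th sample."""
    k = gaussian_kernel(taps, sigma)
    mid = taps // 2
    out = []
    for center in range(0, len(row), factor):
        acc = 0.0
        for j, weight in enumerate(k):
            idx = min(max(center + j - mid, 0), len(row) - 1)  # clamp edges
            acc += weight * row[idx]
        out.append(acc)
    return out

row = [0.0] * 8 + [1.0] * 8   # hard edge in a supersampled row
smooth = downsample_row(row)  # gradual ramp across the edge
naive = row[::2]              # plain decimation keeps the hard step
```

A basic box or nearest-neighbor filter, as older custom-resolution downsampling used, preserves far more of that step and its shimmer; the Gaussian trades a little sharpness for much smoother edges, which is presumably what the slider tunes.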

November 19, 2014 | 11:45 AM - Posted by Shortwave (not verified)

Cool, thanks man. I hadn't had time to research it more, and when I did I didn't find any answers.

I'll look into that.

Though even with 2/4x DSR and its awesome AA effect, I'd find myself still using 2/4x AA to really make it pop.

Perhaps that dithering can sort of replace the need for 2x AA.

November 18, 2014 | 10:52 AM - Posted by godrilla (not verified)

So NVIDIA doesn't even add support for their own partner games from 2014? Watch Dogs and Far Cry 4 aren't on the list? Assassin's Creed Unity is the only one I noticed.

November 18, 2014 | 10:54 AM - Posted by Ryan Shrout

You are correct - these are pretty glaring omissions. Hopefully NVIDIA can ramp up the white list pretty quickly.

November 18, 2014 | 11:02 AM - Posted by godrilla (not verified)

Thanks. I'm currently playing Watch Dogs with either 4x TXAA or 4x MSAA; both are just horrible experiences at 1080p on a new GTX 980 G1 @ 1500/7700 with everything maxed. So I'm in need of less taxing anti-aliasing that doesn't have my frames jumping all over the place - 70 to 50 to 70 and back to 50 causes screen tearing on my 120Hz monitor.

November 18, 2014 | 10:56 AM - Posted by godrilla (not verified)

As well as redux

November 18, 2014 | 10:53 AM - Posted by InsidiousBoot (not verified)

MFAA shows promise but I do not own a 900 series card so I'll stick with other methods for now.

Regarding DSR, I was aware of downsampling before, but DSR adds an additional special filter which improves image quality.

November 18, 2014 | 10:55 AM - Posted by Ryan Shrout

Correct. Previous iterations of downsampling used very basic filters that were not optimized for the purpose they were being used for.

November 19, 2014 | 11:46 AM - Posted by Shortwave (not verified)

I wonder if it's possible to create our own custom filters and just inject them into DX and, again, use any card.

Or even rip their method entirely.

November 18, 2014 | 11:22 AM - Posted by Ti133700N (not verified)

So how does this work in terms of performance optimization? Why does MFAA in Crysis 3 perform that much better than in GRID 2? I'm guessing none of those games have MFAA optimizations from the actual game developers, so the implementation for each game has been done in the driver only? Does this mean NVIDIA spent more time optimizing it for Crysis 3, and can we expect that performance impact in future games whose developers take the time to optimize MFAA for their specific engine? Who has access to the MFAA API - is NVIDIA contacting specific game developers to implement the feature in collaboration with them?

November 18, 2014 | 11:50 AM - Posted by Orjon (not verified)

As a gamer, AA is the last option I turn on in any game, after I have every other setting at max. With BF4, for example, I could not stand MSAA. I loved resolution scaling, however. To me, quality comes first, then performance. MFAA is a cheap man's MSAA. If you don't have the GPU power for 4x, then 2x won't be much better either.

July 21, 2015 | 02:21 AM - Posted by Branthog

July 2015 - MFAA still not available for SLI users.
