
Frame Rating: High End GPUs Benchmarked at 4K Resolutions

Author: Ryan Shrout
Manufacturer: Various

Our 4K Testing Methods

You may have recently seen a story and video on PC Perspective about a new TV that made its way into the office.  Of particular interest is the fact that the SEIKI SE50UY04 50-in TV is a 4K television; it has a native resolution of 3840x2160.  For those unfamiliar with the upcoming TV and display standards, 3840x2160 is exactly four times the resolution of current 1080p TVs and displays.  Oh, and this TV only cost us $1300.


In that short preview we validated that both NVIDIA and AMD current-generation graphics cards support output to this TV at 3840x2160 using an HDMI cable.  You might be surprised to find that HDMI 1.4 can support 4K resolutions, but only at 30 Hz (60 Hz 4K TVs most likely won't be available until 2014), half the 60 Hz refresh rate of most TVs and monitors.  That doesn't mean we are limited to 30 FPS of performance though, far from it.  As you'll see in our testing on the coming pages, we were able to push out much higher frame rates using some very high-end graphics solutions.
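A quick back-of-the-envelope sketch of why that 30 Hz limit exists: an HDMI 1.4 link tops out at roughly a 340 MHz pixel clock, and the clock a mode needs is just the total scanned pixels (active plus blanking) times the refresh rate.  The 4400x2250 total timing below is the standard figure for 3840x2160, assumed here rather than measured from this TV.

    # Minimal sketch: why HDMI 1.4 handles 3840x2160 at 30 Hz but not 60 Hz.
    # The 4400x2250 "total" timing (active + blanking) is an assumed standard figure.
    HDMI_14_MAX_PIXEL_CLOCK_MHZ = 340

    def required_pixel_clock_mhz(h_total, v_total, refresh_hz):
        # Pixel clock needed to scan out the full frame, including blanking.
        return h_total * v_total * refresh_hz / 1e6

    for hz in (30, 60):
        clock = required_pixel_clock_mhz(4400, 2250, hz)
        verdict = "fits" if clock <= HDMI_14_MAX_PIXEL_CLOCK_MHZ else "exceeds"
        print(f"3840x2160 @ {hz} Hz needs ~{clock:.0f} MHz ({verdict} HDMI 1.4)")

At 30 Hz the mode needs roughly 297 MHz and fits; at 60 Hz it would need roughly double that, which is why 60 Hz 4K has to wait for newer connection standards.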

I should point out that I am not a TV reviewer and I don't claim to be one, so I'll leave the technical merits of the display itself to others.  Instead I will only report on my experiences with it while using Windows and playing games - it's pretty freaking awesome.  The only downside I have found in my time with the TV as a gaming monitor thus far is the combination of the 30 Hz refresh rate and running with Vsync disabled.  Because you are seeing fewer screen refreshes over the same amount of time than you would with a 60 Hz panel, all else being equal, you are getting twice as many "frames" of the game pushed to the monitor each refresh cycle.  This means that the horizontal tearing associated with disabling Vsync will likely be more apparent than it would otherwise be.


Image from Digital Trends

I would likely recommend enabling Vsync for a tear-free experience on this TV once you are happy with performance levels, but obviously for our testing we wanted to keep it off to gauge performance of these graphics cards.

Continue reading our results from testing 4K (3840x2160) gaming on high-end graphics cards!

 

Throughout the story I'll have videos of our 4K footage on YouTube as well as available for native download.  The videos include our Frame Rating overlay but are otherwise simple H.264 3840x2160 videos at nearly 100 Mbps.


If you just want some screenshots, I have put together a ZIP file of them that you can download right here. 

Download 4K Game Screenshots

 

Also worth noting is our continued use of Frame Rating, our capture-based performance testing.  As far as I know, no other outlet or company (including AMD or NVIDIA) has figured out how to capture video reliably at 3840x2160 @ 30 Hz.  The current maximum supported by the FCAT-ready reviewers was 2560x1440 @ 60 Hz.  Here is a comparison of the capture throughput each mode requires:

  • 1920x1080 @ 60 Hz - 124.4 Mpix/s
  • 2560x1440 @ 60 Hz - 221.2 Mpix/s
  • 2560x1600 @ 60 Hz - 245.8 Mpix/s
  • 3840x2160 @ 30 Hz - 248.8 Mpix/s
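Those throughput numbers are simply the active pixels per frame multiplied by the refresh rate; a quick sketch if you want to reproduce them:

    # Capture throughput = active pixels per frame x refresh rate.
    modes = [(1920, 1080, 60), (2560, 1440, 60), (2560, 1600, 60), (3840, 2160, 30)]
    for w, h, hz in modes:
        mpix_per_s = w * h * hz / 1e6
        print(f"{w}x{h} @ {hz} Hz - {mpix_per_s:.1f} Mpix/s")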

So for the time being at least, we think we are the only ones providing you with accurate, capture-based performance testing results for high-end graphics cards at 4K resolutions.  I hope you find the results informative!

Test System Setup
  • CPU: Intel Core i7-3960X Sandy Bridge-E
  • Motherboard: ASUS P9X79 Deluxe
  • Memory: Corsair Dominator DDR3-1600 16GB
  • Hard Drive: OCZ Agility 4 256GB SSD
  • Sound Card: On-board
  • Graphics Cards: AMD Radeon HD 7990 6GB, AMD Radeon HD 7970 GHz Edition 3GB, NVIDIA GeForce GTX TITAN 6GB, NVIDIA GeForce GTX 690 4GB, EVGA GeForce GTX 680 4GB, NVIDIA GeForce GTX 680 2GB
  • Graphics Drivers: AMD 13.5 beta; AMD Frame Pacing Prototype 2 (HD 7990); NVIDIA 320.00
  • Power Supply: Corsair AX1200i
  • Operating System: Windows 8 Pro x64

What you should be watching for

  1. HD 7970 vs GTX 680 vs GTX Titan - We have combined the top-end single-GPU solutions in a single graph to see how they stack up, and we included the Titan even though it is twice the price of the HD 7970 and GTX 680.
  2. GTX 680 2GB vs GTX 680 4GB - Our friends at EVGA were kind enough to send us some 4GB versions of the GTX 680 so we could test how much the additional frame buffer affects performance, and potentially stutter, in SLI configurations.
  3. HD 7970 CrossFire vs GTX 680 SLI vs GTX Titan SLI - In reality we should only be comparing the HD 7970s in CrossFire against the GTX 680s in SLI, but we tossed in the Titan as well just to mix things up.
  4. HD 7990 vs GTX 690 vs GTX Titan - For this test, which we are calling the "$999 Level", I wanted to see how all the currently available thousand-dollar graphics cards held up against one another, regardless of their single or dual-GPU status.
  5. HD 7990 13.5 beta vs HD 7990 Prototype 2 - Even though we know the frame pacing problems with the HD 7990 will continue at 4K resolutions, I have included results using the very early prototype driver from AMD with the HD 7990 as well as the currently available 13.5 beta driver.

 

April 30, 2013 | 12:56 PM - Posted by ervinshiznit (not verified)

How come you're running these tests with AA enabled? We use AA at standard resolutions because the pixels are "too big", which makes edges look jagged. At 4K I don't think it's necessary. Yes, a 55" screen at 4K has a lower ppi than 2560x1440 at 27". But you're going to be sitting much further back from the TV so you can actually see enough of the screen, making the pixels take up fewer radians of what you see and reducing the perceived pixel size. If we get a 4K monitor at 27" then I don't see AA being of any use at all.

April 30, 2013 | 12:59 PM - Posted by Ryan Shrout

I could still tell a difference between AA enabled and disabled in terms of quality.

April 30, 2013 | 01:53 PM - Posted by Randomoneh

Anti-aliasing will never become obsolete, no matter how high the [angular] resolution is, because Moiré patterns will appear otherwise.

May 4, 2013 | 06:53 AM - Posted by Wendigo (not verified)

Yes, but only SSAA can deal with this defect (moiré patterns); MSAA doesn't work on the internal areas of the triangles.

September 8, 2014 | 01:14 PM - Posted by Anonymous (not verified)

MSAA does the entire fucking screen, idiot.

April 30, 2013 | 01:22 PM - Posted by Anonymous (not verified)

Aliasing(in whatever form it comes, e.g. shader aliasing) is still an issue on any resolution, no matter how high it gets.

Playing without AA is a no-go for me.
First I check whether I can enable 4x or 8x SGSSAA and whether performance is okay. That works great if you find the right compatibility bits to force it (DX9 games) or if the game offers in-game MSAA to enhance via the driver (mostly DX10 and DX11 games).

If that fails, I try to downsample via a custom resolution in the NVIDIA Control Panel (3840x2160 or 2880x1620 down to 1080p, which is essentially OGSSAA), maybe adding post-processing AA too if it doesn't blur the textures too much.

Using post-processing AA (FXAA, MLAA, SMAA, TXAA) alone is the last resort.
Though recently some games (for example Max Payne 3's FXAA, or Crysis 3's 4x SMAA mode which includes an MSAA component) have had some good implementations where not much sharpness was lost to blur.
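If it helps, the effective ordered-grid supersampling factor from that kind of downsampling is just the ratio of rendered pixels to output pixels; a rough sketch using the resolutions mentioned above as examples:

    # Effective OGSSAA factor when downsampling a higher render resolution to 1080p.
    def ogssaa_factor(render_w, render_h, out_w=1920, out_h=1080):
        return (render_w * render_h) / (out_w * out_h)

    for w, h in [(3840, 2160), (2880, 1620)]:
        print(f"{w}x{h} -> 1080p: ~{ogssaa_factor(w, h):.2f}x supersampling")
    # 3840x2160 works out to 4.00x, 2880x1620 to 2.25x.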

April 30, 2013 | 01:35 PM - Posted by DeadOfKnight

These "you can't tell the difference" comments gotta go. If you can't tell the difference then just say "I can't tell the difference" so we can all point at you and laugh, and then frown at our empty wallets.

April 30, 2013 | 01:10 PM - Posted by RuffeDK

Great article! I'm excited about the upcoming 4K displays :)

April 30, 2013 | 01:29 PM - Posted by rpdmatt (not verified)

Very awesome. I'm looking forward to seeing results from newer cards at these resolutions. Are you guys ever going to do a less impromptu review of that TV, or is the unboxing all we get?

April 30, 2013 | 02:08 PM - Posted by Ryan Shrout

Honestly, that's probably going to be it for us.  We are not TV reviewers and I'll leave that to those that can do much better.  Check out HD Nation's information: http://revision3.com/hdnation/seiki-cheap-4k-tv

April 30, 2013 | 02:00 PM - Posted by w_km (not verified)

Can the current non-$999 GPUs run Windows or surf the web smoothly at 4K (i.e. upscale 720p or 1080p, display website text smoothly while scrolling, etc.)? If not at 30 or 60 FPS, when would you expect to see sub-$500 cards capable of running office/non-gaming tasks at 4K?

May 1, 2013 | 05:35 PM - Posted by Anonymous (not verified)

Office apps and other 2D graphics take practically no effort at all for a graphics card. That's already been true for years. Even a basic graphics card can do it easily. Maybe movies in full-screen might be a load, but if you're not gaming, that's about the only thing that would be.

April 30, 2013 | 03:42 PM - Posted by SetiroN

Sorry to be that guy, but 3840x2160 is not 4K. Having pointed out both 1080p and 2K in the diagram, you should know the difference; this standards confusion is going to be problematic for the customer and the press isn't helping.

Anyway, 30 Hz is absolutely a no-no. Too bad DisplayPort isn't common on TVs. Or in general.

April 30, 2013 | 04:30 PM - Posted by Randomoneh

If I remember correctly, a number of TV manufacturers changed "4K" to QFHD (Quad Full HD) for fear of possible lawsuits, because a similar thing happened before: manufacturers were sued for false advertising over size (for example, selling a 26.7 in panel as "27 in").

Anyway, I prefer "2160p".

May 2, 2013 | 03:42 AM - Posted by Ryrynz (not verified)

LOL. SetiroN, why did you say that? If you're not 100% sure, don't say anything. You sir got OWNED.
WIKIPEDIA
BOOM!

July 25, 2013 | 08:08 AM - Posted by Dan (not verified)

Your Wikipedia link refers to the 4K UHD marketing term, not the display standard. The corresponding standard for that resolution (3840×2160) is QFHD. The 4K standard is 4096×2160.

April 30, 2013 | 04:35 PM - Posted by Nick Ben (not verified)

Excellent article Ryan!

Have you had any success using a "dual mode" DisplayPort to HDMI adapter to get 4K?

I know there's a new Type 2 connector that will allow it in the future; I wasn't sure if you had any success with any of these cards getting 4K by conversion from DisplayPort to HDMI?

April 30, 2013 | 05:54 PM - Posted by Ryan Shrout

I have only tried a direct HDMI connection so far.  Why would you want a DP connection exactly for this implementation of 4K?  You won't be able to drive more than 30 Hz to it anyway.

April 30, 2013 | 06:09 PM - Posted by Nick Ben (not verified)

All recent NVIDIA Kepler Quadro cards and laptops only have DisplayPort on them, so while not for gaming, they would make sense for 4K content creation.

April 30, 2013 | 06:12 PM - Posted by Nick Ben (not verified)

The whole reason for DisplayPort was supporting 10-bit color for color grading at 1 billion colors, but HDMI supports it now anyway, so that's why it's useful to convert from Type 2 dual-mode DisplayPort to HDMI at 1 billion colors for content creation.

April 30, 2013 | 06:00 PM - Posted by Anonymous (not verified)

How do CrossFire setups perform at 4K resolutions? I have also seen that running at higher resolutions with some configurations can alleviate micro-stuttering; can you test this?

Also, why are there no CrossFire tests in Far Cry 3 when you use 2-way/3-way SLI?

April 30, 2013 | 07:48 PM - Posted by Ryan Shrout

We have HD 7970 CrossFire results for each game...

April 30, 2013 | 06:16 PM - Posted by traumadisaster

"I think the quad-HD resolution of 4K is a sight to behold." This is your last comment, I take that as you LOVE 4k gaming? My friend watched the preview and said he didn't see anybody jumping up and down. The first HD broadcast nfl game I saw, I was shocked. The first time in crisis seeing the jungle, my jaw dropped. The first 1440p game I was like I love this. My first dvd, then Blu-ray, etc.

I interpreted your non-verbal behavior as this being a jaw-dropping experience for you guys, but since there was little verbal praise I'm wondering: is this awesome? I explained to him that old, crusty, grizzled, seen-it-all, never-give-praise vets do love it.

April 30, 2013 | 07:49 PM - Posted by Ryan Shrout

The first impressions weren't as good as the extended impressions, because the TV had to be tuned a bit.  Turned down sharpness, turned off de-noise, etc.

April 30, 2013 | 06:25 PM - Posted by misschief (not verified)

Now call me daft (or stupid), but if the settings in the game only show 1080p (or 1200p), surely that means the game itself is being rendered by the PC at that resolution and then upscaled by the display? I'm probably missing something by only looking at the pretty pictures, but until I see the actual resolution being played at a 1:1 ratio, I won't fully believe that the game is being 'played' at 4K rather than just displayed at 4K. I'm happy to be corrected though!

April 30, 2013 | 06:32 PM - Posted by traumadisaster

Good question. I also wonder if the TV does automatic upscaling or if there are options to pick 1440p or other resolutions. I wouldn't mind just using it as a big 1440p monitor since my 580 won't be able to keep up anyway.

April 30, 2013 | 07:47 PM - Posted by Randomoneh

I believe it really is 3840x2160. I've downloaded the video (horribly compressed though) and captured frames for close inspection. You can also take a look at the full-sized images that come with the article. There is AA, so I can't be 100% sure, but I think it's rendered at 2160p.

April 30, 2013 | 07:50 PM - Posted by Ryan Shrout

The settings screenshots are just there to show you the image quality settings, not the resolution.  All tests were run at 3840x2160.

April 30, 2013 | 08:16 PM - Posted by Mangix

You guys should try bumping up the refresh rate to something like 48 Hz. 60 Hz is most likely out of the question, but with driver patching, going above a 400 MHz pixel clock should be possible. See:

http://www.monitortests.com/forum/Thread-NVIDIA-Pixel-Clock-Patcher
http://www.monitortests.com/forum/Thread-AMD-ATI-Pixel-Clock-Patcher

I believe EVGA Precision also allows bumping up the pixel clock to get higher refresh rates. Otherwise, using something like CRU (found in the links above) will help with making custom timing parameters, which can lower the required pixel clock further.
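To get a feel for what a given pixel clock buys you: the refresh rate is just the pixel clock divided by the total scanned pixels per frame (active plus blanking). The timing totals below are illustrative assumptions, not values tested on this TV; tighter blanking is exactly what a tool like CRU lets you experiment with.

    # Refresh rate = pixel clock / (total horizontal pixels x total vertical lines).
    # Timing totals here are illustrative assumptions, not verified on this display.
    def refresh_hz(pixel_clock_mhz, h_total, v_total):
        return pixel_clock_mhz * 1e6 / (h_total * v_total)

    print(f"{refresh_hz(400, 4400, 2250):.1f} Hz")  # standard-style blanking: ~40 Hz
    print(f"{refresh_hz(400, 3920, 2200):.1f} Hz")  # reduced blanking: ~46 Hz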
