PCPer Live! Frame Rating and FCAT - Your Questions Answered!

Subject: Editorial, Graphics Cards | May 8, 2013 - 11:37 PM |
Tagged: video, nvidia, live, frame rating, fcat

Update: Did you miss the live stream?  Watch the on-demand replay below and learn all about the Frame Rating system, FCAT, input latency and more!!

I know, based solely on the amount of traffic and forum discussion, that our readers have really embraced our Frame Rating graphics testing methodology.  Built around direct capture of GPU output via an external system and a high end capture card, our new testing has helped users see GPU performance in a more "real-world" light than previous benchmarks allowed.

I also know that there are lots of questions about the process, the technology and the results we have shown.  To address these questions and to encourage new ideas from the community, we are hosting a PC Perspective Live Stream on Thursday afternoon.

Joining me will be NVIDIA's Tom Petersen, a favorite of the community, to talk about NVIDIA's stance on FCAT and Frame Rating, as well as just talk about the science of animation and input. 

fr-2.png

The primary part of this live stream will be about education - not about bashing one particular product line or talking up another.  And part of that education is your ability to interact with us live, ask questions and give feedback.  During the stream we'll be monitoring the chat room embedded on http://pcper.com/live and I'll be watching my Twitter feed for questions from the audience.  The easiest way to get your question addressed, though, will be to leave a comment or question below in this post.  It doesn't require registration, and it will allow us to think about the questions beforehand, giving them a better chance of being answered during the stream.

Frame Rating and FCAT Live Stream

11am PT / 2pm ET - May 9th

PC Perspective Live! Page

So, stop by at 2pm ET on Thursday, May 9th to discuss the future of graphics performance and benchmarking!

Zotac Announces Overclocked GTX TITAN AMP! Edition Graphics Card

Subject: Graphics Cards | May 1, 2013 - 05:41 AM |
Tagged: zotac, nvidia, gtx titan, AMP!

Zotac has announced a new GTX TITAN graphics card that will fall under the company’s AMP! Edition branding. This new Titan graphics card will feature factory overclocks on both the GPU and GDDR5 memory. However, due to NVIDIA’s restrictions, the Zotac GeForce GTX TITAN AMP! Edition does not feature a custom cooler or PCB.

The Zotac TITAN AMP! Edition card features a single GK110 GPU with 2,688 CUDA cores clocked at 902 MHz base and 954 MHz boost. That is a healthy bump over the reference TITAN’s 836 MHz base and 876 MHz boost clock speeds. Further, while Zotac’s take on the TITAN keeps the reference specification’s 6GB of GDDR5 memory, it is impressively overclocked to 6,608 MHz (especially since Zotac has not changed the cooler). Many reference cards may be able to match these GPU clocks with manual overclocking, though. For example, Ryan managed to get his card up to 992 MHz boost in his review of the NVIDIA GTX TITAN.

Zotac GeForce GTX Titan AMP! Edition.jpg

The card has two DL-DVI, one HDMI, and one DisplayPort video outputs. The cooler, PCB, and PCI-E power specifications are still the same as the reference design. You can find more details on the heatsink in the TITAN review. Not allowing vendors to use custom coolers is disappointing and likely limits the factory GPU overclocks that they are able or willing to offer and support, but within that restriction the Zotac AMP! Edition looks to be a decent card so long as the (not yet announced) price premium over the $999 NVIDIA reference card is minimal.

Source: Zotac
Author:
Manufacturer: Various

Our 4K Testing Methods

You may have recently seen a story and video on PC Perspective about a new TV that made its way into the office.  Of particular interest is the fact that the SEIKI SE50UY04 50-in TV is a 4K television; it has a native resolution of 3840x2160.  For those unfamiliar with the upcoming TV and display standards, 3840x2160 is exactly four times the resolution of current 1080p TVs and displays.  Oh, and this TV only cost us $1300.

seiki5.jpg

In that short preview we validated that current generation graphics cards from both NVIDIA and AMD support output to this TV at 3840x2160 over an HDMI cable.  You might be surprised to find that HDMI 1.4 can support 4K resolutions, but only at 30 Hz (60 Hz 4K TVs most likely won't be available until 2014), half the 60 Hz refresh rate of most TVs and monitors.  That doesn't mean we are limited to 30 FPS of performance though, far from it.  As you'll see in our testing on the coming pages, we were able to push out much higher frame rates using some very high end graphics solutions.
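To put a rough number on that HDMI 1.4 limit, here is a quick back-of-the-envelope sketch (my own illustration, not something from the preview). It assumes the standard 4K timing totals of 4400x2250 pixels including blanking and an HDMI 1.4 pixel clock ceiling of roughly 340 MHz; with those assumptions, 3840x2160 fits at 30 Hz but not at 60 Hz.

```python
# Back-of-the-envelope check of why HDMI 1.4 tops out at 30 Hz for 3840x2160.
# Assumptions (not from the article): standard 4K timing totals of 4400x2250
# pixels including blanking, and a ~340 MHz pixel clock ceiling for HDMI 1.4.

HDMI_1_4_MAX_PIXEL_CLOCK_MHZ = 340.0

def required_pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock needed to scan out the full frame (active + blanking) each refresh."""
    return h_total * v_total * refresh_hz / 1e6

for hz in (30, 60):
    clk = required_pixel_clock_mhz(h_total=4400, v_total=2250, refresh_hz=hz)
    verdict = "fits within" if clk <= HDMI_1_4_MAX_PIXEL_CLOCK_MHZ else "exceeds"
    print(f"3840x2160 @ {hz} Hz needs a ~{clk:.0f} MHz pixel clock, which {verdict} the HDMI 1.4 limit")
```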

I should point out that I am not a TV reviewer and don't claim to be one, so I'll leave the technical merits of the display itself to others.  Instead I will only report on my experiences with it while using Windows and playing games - it's pretty freaking awesome.  The only downside I have found in my time with the TV as a gaming monitor thus far is running at the 30 Hz refresh rate with Vsync disabled.  Because you are seeing fewer screen refreshes over the same amount of time than you would with a 60 Hz panel, all else being equal, twice as many "frames" of the game are being pushed to the monitor each refresh cycle.  This means that the horizontal tearing associated with disabling Vsync will likely be more apparent than it would be otherwise.

4ksizes.png

Image from Digital Trends

I would likely recommend enabling Vsync for a tear-free experience on this TV once you are happy with performance levels, but obviously for our testing we wanted to keep it off to gauge performance of these graphics cards.
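For readers curious about the math behind that tradeoff, here is a tiny illustrative sketch (my own, not part of our Frame Rating tools): with Vsync off, every additional frame the GPU delivers within a single refresh shows up as a tear line, so halving the refresh rate roughly doubles the tearing you see at a given render rate.

```python
# Illustrative only: why tearing is more visible on a 30 Hz panel than a 60 Hz
# panel at the same render rate. With Vsync off, each frame that arrives after
# the first within a refresh cycle produces a visible tear line.

def frames_per_refresh(render_fps: float, refresh_hz: float) -> float:
    """Average number of game frames delivered during one screen refresh."""
    return render_fps / refresh_hz

RENDER_FPS = 80.0  # an assumed render rate for the example

for refresh_hz in (60.0, 30.0):
    fpr = frames_per_refresh(RENDER_FPS, refresh_hz)
    tears = max(fpr - 1.0, 0.0)  # extra frames per refresh ~= tear lines per refresh
    print(f"{refresh_hz:>4.0f} Hz panel: {fpr:.2f} frames/refresh, roughly {tears:.2f} tears per refresh")
```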

Continue reading our results from testing 4K 3840x2160 gaming on high end graphics cards!!

Podcast #248 - AMD HD 7990, CrossFire Frame Rating improvements, 4K TVs and more!

Subject: General Tech | April 25, 2013 - 02:13 PM |
Tagged: video, Xe, seiki, raidr, podcast, nvidia, Never Settle, hd 7990, GA-Z77N-WiFi, frame rating, crossfire, amd, 4k

PC Perspective Podcast #248 - 04/25/2013

Join us this week as we discuss AMD HD 7990, CrossFire Frame Rating improvements, 4K TVs and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Program length: 1:16:34

  1. 0:01:20 Update on Indiegogo: You guys rock!
  2. Week in Review:
  3. News items of interest:
    1. Ryan: Seiki 4K TV - more support from enthusiasts! and wet puppies
    2. Jeremy: This is not news people, NFC is a feature but if you are paranoid you can check with this app
    3. Allyn: Put your bits on an ioSafe. Put your 'papers' here.
    4. Tim: BT Sync, it's in public alpha now so go grab it!
  4. 1-888-38-PCPER or podcast@pcper.com
  5. Closing/outro

 

NVIDIA's plans for Tegra and Tesla

Subject: General Tech | April 24, 2013 - 01:38 PM |
Tagged: Steve Scott, nvidia, HPC, tesla, logan, tegra

The Register had a chance to sit down with Steve Scott, once CTO of Cray and now CTO of NVIDIA's Tesla projects, to discuss the future of their add-in cards as well as that of x86 in the server room.  They discussed Tegra and why it is not receiving the same amount of attention at NVIDIA as Tesla is, as well as some of the fundamental differences between the chips both currently and going forward.  NVIDIA plans to unite GPU and CPU onto both families of chips, likely with a custom interface as opposed to placing them on the same die, though both will continue to be designed for very different functions.  A lot of the article focuses on Tegra, its memory bandwidth and, most importantly, its networking capabilities, as it seems NVIDIA is focused on the server room and providing hundreds or thousands of interconnected Tegra processors to compete directly with x86 offerings.  Read on for the full interview.

ELreg_nvidia_echelon_system.jpg

"Jen-Hsun Huang, co-founder and CEO of Nvidia has been perfectly honest about the fact that the graphics chip maker didn't intend to get into the supercomputing business. Rather, it was founded by a bunch of gamers who wanted better graphics cards to play 3D games. Fast forward two decades, though, and the Nvidia Tesla GPU coprocessor and the CUDA programming environment have taken the supercomputer world by storm."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register
Author:
Manufacturer: Various

A very early look at the future of Catalyst

Today is a very interesting day for AMD.  It marks both the release of the reference design of the Radeon HD 7990 graphics card, a dual-GPU Tahiti behemoth, and the first sample of a change to the CrossFire technology that will improve animation performance across the board.  Both stories are incredibly interesting and as it turns out both feed off of each other in a very important way: the HD 7990 depends on CrossFire and CrossFire depends on this driver. 

If you already read our review (or any review that is using the FCAT / frame capture system) of the Radeon HD 7990, you likely came away somewhat unimpressed.  The combination of two AMD Tahiti GPUs on a single PCB with 6GB of frame buffer SHOULD have been an incredibly exciting release for us and would likely have become the single fastest graphics card on the planet.  That didn't happen, though, and our results clearly show why: AMD CrossFire technology has some serious issues with animation smoothness, runt frames and giving users what they are promised. 

Our first results using our Frame Rating performance analysis method were shown during the release of the NVIDIA GeForce GTX Titan card in February.  Since then we have been in constant talks with the folks at AMD to figure out what was wrong, how they could fix it, and what it would mean to gamers to implement frame metering technology.  We followed that story up with several more that showed the current state of performance on the GPU market using Frame Rating, and they painted CrossFire in a very negative light.  Even though some outlets accused us of being biased, or claimed that AMD wasn't doing anything incorrectly, we stuck by our results and, as it turns out, so does AMD. 

Today's preview of a very early prototype driver shows that the company is serious about fixing the problems we discovered. 

If you are just catching up on the story, you really need some background information.  The best place to start is our article published in late March that goes into detail about how game engines work, how our completely new testing methods work and the problems with AMD CrossFire technology very specifically.  From that piece:

It will become painfully apparent as we dive through the benchmark results on the following pages, but I feel that addressing the issues that CrossFire and Eyefinity are creating up front will make the results easier to understand.  As we showed you for the first time in Frame Rating Part 3, AMD CrossFire configurations have a tendency to produce a lot of runt frames, and in many cases in a nearly perfect alternating pattern.  Not only does this mean that frame time variance will be high, but it also tells me that the performance gained by adding a second GPU is completely useless in this case.  Obviously the story then becomes, “In Battlefield 3, does it even make sense to use a CrossFire configuration?”  My answer, based on the graph below, would be no.

runt.jpg

An example of a runt frame in a CrossFire configuration
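To make the runt concept concrete, here is a minimal, hypothetical sketch of the kind of filtering a frame-capture analysis can apply. It assumes per-frame on-screen heights (in scanlines) have already been extracted from the captured overlay colors, and it uses an arbitrary 21-scanline cutoff purely for illustration; the exact threshold used by any given tool may differ.

```python
# Hypothetical runt-frame filter over FCAT-style extracted data.
# Input: how many scanlines each captured frame occupied on screen.
# The 21-scanline cutoff is an assumption for illustration only.

RUNT_THRESHOLD_SCANLINES = 21

def classify_frames(scanlines_per_frame):
    """Split frames into 'full' and 'runt' based on on-screen height."""
    full, runts = [], []
    for height in scanlines_per_frame:
        (runts if height < RUNT_THRESHOLD_SCANLINES else full).append(height)
    return full, runts

# An alternating full/runt pattern, similar to what CrossFire shows in Battlefield 3.
sample_heights = [1052, 7, 1063, 4, 1049, 9, 1071, 5]
full, runts = classify_frames(sample_heights)
print(f"Frames counted by a software FPS counter: {len(sample_heights)}")
print(f"Frames that actually contribute to the animation: {len(full)} (runts: {len(runts)})")
```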

NVIDIA's solution for getting around this potential problem with SLI was to integrate frame metering, a technology that balances frame presentation to the user and to the game engine in a way that enables smoother, more consistent frame times and thus smoother animation on the screen.  For GeForce cards, frame metering began as a software solution but was later integrated as a hardware function in the Kepler design, taking some load off of the driver.
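As a rough illustration of the frame metering concept (my own sketch, not NVIDIA's implementation), imagine holding each finished frame until a pacing interval has elapsed since the previous present. The offline version below simply paces toward the average frame time; a real driver or GPU has to estimate that target on the fly.

```python
# Conceptual frame metering sketch, not NVIDIA's actual algorithm.
# Idea: don't flip a finished frame immediately; hold it until a pacing
# interval has passed since the previous present, evening out delivery.

def meter_presents(completion_times):
    """Given raw frame completion timestamps (seconds), return metered present times."""
    intervals = [b - a for a, b in zip(completion_times, completion_times[1:])]
    target = sum(intervals) / len(intervals)  # offline pacing target: the average frame time
    presents = [completion_times[0]]
    for done in completion_times[1:]:
        # Present no earlier than the pacing target allows, and never before the frame is finished.
        presents.append(max(done, presents[-1] + target))
    return presents

# Alternating fast/slow completions (classic AFR micro-stutter) come out evenly spaced.
raw = [0.000, 0.005, 0.033, 0.038, 0.066, 0.071, 0.099]
metered = meter_presents(raw)
print("raw intervals:    ", [round(b - a, 3) for a, b in zip(raw, raw[1:])])
print("metered intervals:", [round(b - a, 3) for a, b in zip(metered, metered[1:])])
```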

Continue reading our article on the new prototype driver from AMD to address frame pacing issues in CrossFire!!

New GeForce Game-Ready Drivers Just in Time for 'Dead Island: Riptide,' 'Star Trek', 'Neverwinter'; Boost Performance up to 20%

Subject: Graphics Cards | April 23, 2013 - 03:53 PM |
Tagged: nvidia, graphics drivers, geforce, 320.00 beta

NVIDIA rolled out a new set of beta drivers that provide up to 20% faster performance and have been optimized for a handful of new titles, including Dead Island: Riptide, Star Trek, and Neverwinter.

GeForce 320.00 beta drivers are now available for automatic download and installation using GeForce Experience, the easiest way to keep your drivers up to date.

With a single click in GeForce Experience, gamers can also optimize the image quality of top new games like Dead Island: Riptide and have it instantly tuned to take full advantage of their PC’s hardware.

Here are examples of the performance increases in GeForce 320.00 drivers (measured with GeForce GTX 660):

  • Up to 20% in Dirt: Showdown
  • Up to 18% in Tomb Raider
  • Up to 8% in StarCraft II
  • Up to 6% in other top games like Far Cry 3

For more details, refer to the release highlights on the driver download pages and read the GeForce driver article on GeForce.com.

Enjoy the new GeForce Game Ready drivers and let us know what you think.

nvidia-geforce-320-00-beta-drivers-gtx-680-performance.png

Windows Vista/Windows 7 Fixed Issues

The Windows 7 Magnifier window flickers. [1058231]

Games default to stereoscopic 3D mode after installing the driver. [1261633]

[GeForce 330M][Notebook]: The display goes blank when rebooting the notebook after installing the driver. [1239252]

[Crysis 3]: There are black artifacts in the game. [1251495]

[Dirt 3]: When ambient occlusion is enabled, there is rendering corruption in the game while in split-screen mode. [1253727]

[3DTV Play][Mass Effect]: The NVIDIA Control Panel “override antialiasing” setting does not work when stereoscopic 3D is enabled. [1220312]

[Microsoft Flight Simulator]: Level D Simulations add-on aircraft gauges are not drawn correctly. [899771]

[GeForce 500 series][Stereoscopic 3D][Two Worlds 2]: The application crashes when switching to windowed mode with stereoscopic 3D enabled. [909749]

[GeForce 660 Ti][All Points Bulletin (APB) Reloaded]: The game crashes occasionally, followed by a black/grey/red screen. [1042342]

[GeForce GTX 680][Red Orchestra 2 Heroes of Stalingrad]: Red-screen crash occurs after exiting the game. [1021046]

[GeForce 6 series][Final Fantasy XI]: TDR crash occurs in the game when using the Smite of Rage ability. [1037744]

[SLI][Surround][GeForce GTX Titan][Tomb Raider]: There is corruption in the game and the system hangs when played at high resolution and Ultra or Ultimate settings. [1254359]

[3D Surround, SLI], GeForce 500 Series: With Surround enabled, all displays may not be activated when selecting Activate All Displays from the NVIDIA Control Panel -> Set SLI Configuration page. [905544]

[SLI][Starcraft II][3D Vision]: The game crashes when run with 3D Vision enabled. [1253206]

[SLI][GeForce GTX 680][Tomb Raider (2013)]: The game crashes and TDR occurs while running the game at Ultra settings. [1251578]

[SLI][Call of Duty: Black Ops 2]: The player emblems are not drawn correctly.

Source: NVIDIA

NVIDIA Bundles Metro: Last Light with GTX 660 and Higher

Subject: Graphics Cards | April 16, 2013 - 10:24 PM |
Tagged: nvidia, metro last light, Metro

Late this evening we got word from NVIDIA about an update to its game bundle program for GeForce GTX 600 series cards.  Replacing the previously running Free to Play bundle, which included $50 in credit for each of World of Tanks, Hawken and Planetside 2, NVIDIA is moving back to a AAA title with Metro: Last Light.

metro.jpg

Metro: Last Light is the sequel to the surprise hit from 2010, Metro 2033, and I am personally really looking forward to the game and to seeing how it can stress PC hardware like the first one did. 

metro2.jpg

This bundle is only good for GTX 660 cards and above with the GTX 650 Ti sticking with the Free to Play $75 credit offer.

NVIDIA today announced that gamers who purchase an NVIDIA GeForce GTX 660 or above will also receive a copy of the highly anticipated Metro: Last Light, which is published by Deep Silver and is the sequel to the multi award winning Metro 2033. Metro: Last Light will be available May 14, 2013 in the US and May 17, 2013 across Europe.

The deal is already up and running on Newegg.com but with the release date of Metro: Last Light set at May 14th, you'll have just about a month to wait before you can get your hands on it.

How do you think this compares to AMD's currently running bundle with Bioshock Infinite and more?  Did NVIDIA step up its game this time around?

Source: NVIDIA

NVIDIA Rumored To Release 700-Series GeForce Cards At Computex 2013

Subject: Graphics Cards | April 15, 2013 - 03:34 PM |
Tagged: rumor, nvidia, kepler, gtx 700, geforce 700, computex

Recent rumors seem to suggest that NVIDIA will release its desktop-class GeForce 700 series of graphics cards later this year. The new cards will reportedly be faster than the currently available GTX 600 series, but will likely remain based on the company's Kepler architecture.

NVIDIA GeForce Logo.jpg

According to the information presented during NVIDIA's GTC keynote, its Kepler architecture will dominate 2012 and 2013, with Maxwell-based cards following in 2014. Notably absent from the slides are product names, meaning the publicly available information at least leaves open the possibility of a refreshed Kepler GTX 700 lineup in 2013.

Fudzilla further reports that NVIDIA could release the cards as early as May 2013, with an official launch as soon as Computex. Having actual cards available for sale by Computex is a bit unlikely, but a summer launch could be possible if the new 700 series is merely a tweaked Kepler-based design with higher clocks and/or lower power usage. The company is rumored to be accelerating the launch of the GTX 700 series in the desktop space in response to AMD's heavy game-bundle marketing, which seems to be working well at persuading gamers to choose the red team.

What do you make of this rumor? Do you think a refreshed Kepler is coming this year?

Source: Fudzilla

A New Gigabyte WindForce 450W GPU Cooler May Be Coming to a GTX 680 and Titan Near You

Subject: Graphics Cards | April 14, 2013 - 07:59 PM |
Tagged: windforce, nvidia, gtx titan, gtx 680, gpu cooler, gigabyte

Earlier this week, PC component manufacturer Gigabyte showed off its new graphics card cooler at its New Idea Tech Tour event in Berlin, Germany. The new triple-slot cooler is built for this generation's highest-end graphics cards. It is capable of cooling cards with up to 450W TDPs while keeping them cooler and quieter than reference heatsinks.

The Gigabyte WindForce 450W cooler is a triple-slot design that combines a large heatsink with three 80mm fans. The heatsink features two aluminum fin arrays connected to the GPU block by three 10mm copper heatpipes. Gigabyte stated during the cooler's reveal that it keeps an NVIDIA GTX 680 graphics card 2°C cooler and 23.3 dB quieter during a Furmark benchmark run. Further, the cooler will allow high end cards, like the GTX Titan, to achieve higher (stable) boost clocks.

Gigabyte WindForce 450W GPU Cooler for NVIDIA GTX Titan and GTX 680 Graphics Cards.jpg

ComputerBase.de was on hand at Gigabyte's event in Berlin to snap shots of the upcoming GPU cooler.

The company has not announced which graphics cards will use the new cooler or when it will be available, but a Gigabyte GTX 680 and a custom-cooled Titan seem to be likely candidates considering these cards were mentioned in the examples given in the presentation. Note that NVIDIA has prohibited AIB partners from putting custom coolers on the Titan thus far, but other rumored Titan graphics cards with custom coolers seem to suggest that the company will allow custom-cooled Titans to be sold at retail at some point. In addition to using it for the top-end NVIDIA cards, I think a GTX 670 or GTX 660 Ti GPU using this cooler would also be great, as it would likely be one of the quieter running options available (because you could spin the three 80mm fans much slower than the single reference fan and still get the same temps).

What do you think about Gigabyte's new 450W GPU cooler? You can find more photos over at Computer Base (computerbase.de).