Fermi, Kepler, Maxwell, and Pascal Comparison Benchmarks

Subject: Graphics Cards | June 21, 2016 - 05:22 PM |
Tagged: nvidia, fermi, kepler, maxwell, pascal, gf100, gf110, GK104, gk110, GM204, gm200, GP104

Techspot published an article that compared eight GPUs spanning six high-end dies across NVIDIA's last four architectures, Fermi through Pascal. Average frame rates were listed for nine games, each measured at three resolutions: 1366x768 (~720p HD), 1920x1080 (1080p FHD), and 2560x1600 (~1440p QHD).


The results are interesting. Comparing GP104 to GF100, mainstream Pascal is typically on the order of four times faster than big Fermi. Over that time, we've had three full generational leaps in fabrication technology, leading to over twice the number of transistors packed into a die that is almost half the size. It does, however, show that prices have remained relatively constant, except that the GTX 1080 is sort-of priced in the x80 Ti category despite the die size placing it in the non-Ti class. (They list the 1080 at $600, but you can't really find anything outside the $650-700 USD range).

It would be interesting to see this data set compared against AMD. It's informative for an NVIDIA-only article, though.

Source: Techspot


June 21, 2016 | 05:37 PM - Posted by Murica (not verified)

Nice, they even did cheap laptop resolution!
My 980 is still good enough for 1440p.

June 21, 2016 | 05:52 PM - Posted by Topinio

Yeah, the article is based on a nice idea but doesn't really execute well; properly comparing the big and medium chips of each generation would've included GF104 and maybe GF114.

Don't like that it has the GTX 580 as a GF100 card either, or two GK110 cards (GTX 780 and 780 Ti) yet no GTX 770 -- covering either the best or the first card built on each chip would've been better IMO.

June 21, 2016 | 05:58 PM - Posted by Scott Michaud

Yeah, I noticed those issues, too. Still thought it was interesting.

June 21, 2016 | 07:36 PM - Posted by Anonymous (not verified)

How about taking the same Nvidia cards and looking at their MSRPs at product launch? For historical reference, measure the retail premium that retailers charged when the cards were first introduced, and how long it actually took for pricing to come back down to MSRP or lower. Then do the same for AMD's line of SKUs going back as far as GCN 1.0, and see which GPU maker's reference SKUs and partner/custom SKUs stayed above MSRP, for how long, and by how much (in dollars).

Those retail channel premiums/outright price gouging, once figured into the overall pricing equation, make things much more expensive than the MSRP figures suggest. I'd like to see what the main causes of these supply chain shortages and price hikes have been across the last 4 GPU generations from both Nvidia and AMD.

June 22, 2016 | 12:26 AM - Posted by Michael McAllister

I wonder why they disabled AA in their testing. Odd.

June 22, 2016 | 07:40 PM - Posted by extide

Yeah, the GTX 580 was GF110.

June 21, 2016 | 07:30 PM - Posted by Anonymous Nvidia User (not verified)

They are simply using the upper mainstream **80 class. Nice article by Techspot. All Nvidia I can dig it.

June 22, 2016 | 01:24 AM - Posted by Anonymous (not verified)

"All Nvidia I can dig it."

No one is surprised.

June 22, 2016 | 02:21 AM - Posted by Anonymous (not verified)

So you are more interested in paying for Nvidia's brand than you are in gaming. It's a trophy thing with you, bragging about spending your money on Nvidia's brand rather than on any GPU price/performance metric. Why not put a Nvidia 1080 Founders Edition on a gold chain and wear it around your neck? It's a jewelry purchase for you.

June 22, 2016 | 03:07 AM - Posted by Anonymous (not verified)

He's using a GTX 760 to power a 4k monitor.

So it's not even a trophy thing. He is a true Nvidia fanboy. Spends hours upon hours on comment threads trashing AMD and touting Nvidia, making arguments that completely contradict each other in the same threads, deliberately lying, misinterpreting data shown to him that proves he's lying, and - my favorite - accusing people he's decided are AMD fanboys of using old-gen AMD cards while hyping AMD's new-gen cards, while ignoring the fact that he's using an old-gen Nvidia card while hyping Nvidia's new-gen cards.

He's a hypocrite and a liar.

June 22, 2016 | 03:08 AM - Posted by Anonymous (not verified)

See some of his comments in the GTX 1080 Review VBIOS article - and the barrage of smackdowns he earned himself.

June 22, 2016 | 03:57 AM - Posted by Anonymous (not verified)

The Witcher 3 results are so fake. 73 fps at 1080p on the 980? Maxed out (without HairWorks)? Hahahaa

They tested in cities. I finished the game twice, and I assure you that in deep woods the fps drops below 60 on the 980.

All gaming fps results need to be posted with some info about what game area was tested.
