GeForce GTX 1080 and 1070 3-Way and 4-Way SLI will not be enabled for games

Subject: Graphics Cards | June 8, 2016 - 08:44 PM |
Tagged: sli, pascal, nvidia, GTX 1080, GP104, geforce, 4-way sli, 3-way sli

IMPORTANT UPDATE: After writing this story, but before publication, we went to NVIDIA for comment. As we were getting ready to publish, the company updated me with a shift in its stance on multi-GPU configurations. NVIDIA will no longer require an "enthusiast key" to enable SLI on more than two GPUs. However, NVIDIA will also only be enabling 3-Way and 4-Way SLI for a select few applications. More details are at the bottom of the story!

You'll likely recall that during our initial review of the GeForce GTX 1080 Founders Edition graphics card, we mentioned that NVIDIA was going to be moving people towards the idea that "only 2-Way SLI will be supported" and promoted. There would still be a path for users that wanted 3 and 4 GPU configurations anyway, and it would be called the Enthusiast Key.

As it turns out, after returning from an AMD event focused on its upcoming Polaris GPUs, I happen to have amassed a total of four GeForce GTX 1080 cards.


Courtesy of some friends at EVGA and two readers that were awesome enough to let me open up their brand new hardware for a day or so, I was able to go through the 3-Way and 4-Way SLI configuration process. Once all four were installed (and I must point out how great it is that each card requires only a single 8-pin power connector), I installed the latest NVIDIA driver I had on hand, 368.19.


Knowing about the need for the Enthusiast Key, and also knowing that I did not yet have one and that the website that was supposed to be live to enable me to get one is still not live, I thought I might have stumbled upon some magic. The driver appeared to let me enable SLI anyway. 


Enthusiasts will note however that the green marker under the four GPUs with the "SLI" text is clearly only pointing at two of the GTX 1080s, leaving the remaining two...unused. Crap.

At this point, if you have purchased more than two GeForce GTX 1080 cards, you are simply out of luck, waiting on NVIDIA to make good on its promise to allow 3-Way and 4-Way configurations via the Enthusiast Key. Or some other way. It's far too late now to simply say "we aren't supporting it at all." 


While I wait...what is there for a gamer with four GeForce GTX 1080 cards to do? Well, you could run Ashes of the Singularity. Its multi-GPU mode uses MDA (multi-display adapter) mode, which means the game engine itself accesses each GPU on its own, without the need for the driver to handle any GPU load balancing. Unfortunately, Ashes only supports two GPUs today. You could instead run an OpenCL-based benchmark like LuxMark, which accesses all the GPUs independently as well.
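Under MDA, the load-balancing responsibility moves out of the driver and into the application itself: the engine enumerates the GPUs and decides how much work each one gets. A purely illustrative sketch of that idea (the `split_work` function and the speed weights are hypothetical, not actual LuxMark or Ashes code):

```python
# Hypothetical sketch of application-side multi-GPU load balancing,
# as an explicit multi-adapter app or an independent-device OpenCL
# benchmark might do it. No driver involvement: the app divides the
# workload across devices in proportion to their relative speed.

def split_work(total_samples, device_speeds):
    """Return a per-device sample count proportional to each device's
    relative speed, with any rounding remainder handed to the fastest
    device so every sample is assigned exactly once."""
    total_speed = sum(device_speeds)
    shares = [total_samples * s // total_speed for s in device_speeds]
    remainder = total_samples - sum(shares)
    shares[device_speeds.index(max(device_speeds))] += remainder
    return shares

# Four identical GTX 1080-class devices: the work splits evenly.
print(split_work(1_000_000, [1, 1, 1, 1]))  # [250000, 250000, 250000, 250000]
```

The point is simply that nothing here depends on an SLI profile; the application could hand work to two, three, or four GPUs on its own, which is why LuxMark happily used all four cards below.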


I did so, and the result is an impressive score of 17,127!!


How does that compare to some other products?


The four GTX 1080 cards produce a score that is 2.57x the result provided by the AMD Radeon Pro Duo and 2.29x the score of SLI GeForce GTX 980 Ti cards. Nice!


So there you go! We are just as eager to get our hands on the ability to test 3-Way and 4-Way SLI with new Pascal GPUs as some of the most extreme and dedicated enthusiasts out there are. With any luck, NVIDIA will finally figure out a way to allow it - no matter how it finally takes place.

IMPORTANT UPDATE: Before going to press with this story I asked NVIDIA for comment directly: when was the community finally going to get the Enthusiast Key website to unlock 3-Way and 4-Way SLI for those people crazy enough to have purchased that many GTX 1080s? The answer was quite surprising: NVIDIA is backing away from the idea of an "Enthusiast Key" and will no longer require it for enabling 3-Way and 4-Way SLI. 

Here is the official NVIDIA statement given to PC Perspective on the subject:

With the GeForce 10-series we’re investing heavily in 2-way SLI with our new High Bandwidth bridge (which doubles the SLI bandwidth for faster, smoother gaming at ultra-high resolutions and refresh rates) and NVIDIA Game Ready Driver SLI profiles.  To ensure the best possible gaming experience on our GeForce 10-series GPUs, we’re focusing our efforts on 2-way SLI only and will continue to include 2-way SLI profiles in our Game Ready Drivers.
DX12 and NVIDIA VR Works SLI technology also allows developers to directly implement and control multi-GPU support within their games.  If a developer chooses to use these technologies then their game will not need SLI profiles.  Some developers may also decide to support more than 2 GPUs in their games. We continue to work with all developers creating games and VR applications that take advantage of 2 or more GPUs to make sure they’ll work great on GeForce 10-series GPUs.
For our overclocking community, our Game Ready Drivers will also include SLI profiles for 3- and 4-way configurations for specific OC applications only, including Fire Strike, Unigine and Catzilla.

NVIDIA clearly wants to reiterate that only 2-Way SLI will get the attention that we have come to expect from the GeForce driver dev team. As the DX12 and Vulkan next-generation APIs become more prolific, game developers will still have the ability to directly access more than two GeForce GTX 10-series GPUs, though I expect that to be a very narrow window of games, simply due to development costs and time.

NVIDIA will enable support for three and four card configurations in future drivers (without a key) for specific overclocking/benchmarking tools only, as a way to make sure the GeForce brand doesn't fall off the 3DMark charts. Only those specific applications will be able to operate in the 3-Way and 4-Way SLI configurations that you have come to know. There are no profiles to change manually, and even the rare games that might have "just worked" with three or four GPUs will not take advantage of more than two GTX 10-series cards. It's fair to say at this point that, except for the benchmarking crowd, NVIDIA 3-Way and 4-Way SLI is over.

We expect the "benchmark only" mode of 3-Way and 4-Way SLI to be ready for consumers with the next "Game Ready" driver release. If you happened to get your hands on more than two GTX 1080s but aren't into benchmarking, then find those receipts and send a couple back.

So there you have it. Honestly, this is what I was expecting from NVIDIA with the initial launch of Pascal and the GeForce GTX 1080/1070, and I was surprised when I first heard about the idea of the "enthusiast key." It took a bit longer than expected, and NVIDIA will get more flak for the drawn-out dismissal of this very niche, but still pretty cool, technology. In the end, this won't have much impact on the company's bottom line, as the number of users buying 3+ GTX GPUs for a single system was understandably small.


June 8, 2016 | 09:04 PM - Posted by Anonymous (not verified)

So it appears to us that the screen disparity in Ashes' snow rendering was caused by NV driver 368.19. Driver 368.25 should fix it.

Fun FACT of the day: Async Compute is NOT enabled on the driver-side with public Game Ready Drivers. You need app-side + driver-side!

June 8, 2016 | 09:17 PM - Posted by Red (not verified)

Wow, just when I planned on keeping nvidia. Well perhaps AMD this Gen.

June 9, 2016 | 02:13 AM - Posted by renz (not verified)

did you use 3 way or 4 way CF?

June 9, 2016 | 09:45 AM - Posted by yup (not verified)

This is such a stupid response. But that doesn't really surprise me.

June 9, 2016 | 10:34 AM - Posted by PhelanPKell (not verified)

You do realize that the reason this isn't being promoted moving forward is likely a combination of it being a niche market and its complete lack of necessity.
With each new generation of cards the PCIe bus is getting handled better and better per card, making 3 and 4-way configurations less efficient or useful.

AMD would suffer from the same problem if their cards were as powerful, but cheap cards coupled with generally weak performance makes multi-GPU AMD solutions almost a necessity.

June 9, 2016 | 11:42 AM - Posted by xthetenth (not verified)

No, the problem is NV still relying on an archaic set of fingers for inter-card communication rather than using some of that giant wellspring of bandwidth that is the modern PCIe bus.

It's really obvious that you started with your conclusion and then tried to justify it, because it's flagrantly untrue.

They needed more bandwidth to catch up to modern CF, and they decided to use both fingers for it. It's really simple, and pretty excusable considering the niche that 3/4 card setups are, but they also for some reason felt like cementing a reputation for dishonesty.

June 9, 2016 | 01:08 PM - Posted by Anonymous (not verified)

Total lack of knowledge here. You, sir, should not make comments. You embarrass your family back to the dark ages with your stupidity.

June 9, 2016 | 02:12 PM - Posted by Anonymous (not verified)

He evaluated the previous poster's statements, disagreed, described why, and gave an alternate theory that better fit his perspective on the situation.

You, on the other hand, dismissed his entire post outright, provided no dialogue as to what you disagreed with or justification as to why, provided no alternate perspective, and proceeded to insult him as if adding in a few insults would somehow support your position.

He may in fact be wrong, but nothing about your post supports that point of view. It would seem your post applies better to you than him.

June 9, 2016 | 05:15 PM - Posted by Titan_Y (not verified)

Exactly. SLI bridges are a kludge no longer required by modern multiGPU solutions.

But hey... it is something else you can buy with the Nvidia logo on it.

June 10, 2016 | 02:36 PM - Posted by Allyn Malventano

When running at resolutions exceeding what the bridge can handle, NV cards *are* using PCIe bandwidth. The catch is that you can't stream a frame over an interruptable bus (halting the feed half way through the frame would be bad news), so you have to finish sending the frame before you can present it, meaning judder and frame pacing issues, which is why NV favors the bridge.

June 9, 2016 | 10:57 AM - Posted by Anonymous (not verified)

Damn, those fanboys just rage because you decided to go to AMD..

June 9, 2016 | 10:57 AM - Posted by Anon (not verified)

From someone who tossed a 390 CF (2 cards) out of his system because of all the issues: "Good luck."

June 9, 2016 | 08:59 PM - Posted by Zyxtomatic (not verified)

That was my thought. I originally had a pair of 290X gpus (I had always been a bit of an AMD fan), and game compatibility seemed to be subpar. I was always having graphical glitches or other oddities with all sorts of AAA games, not to mention the lack of quick driver updates with new Crossfire profiles for new games. I've since moved to a pair of GTX 980s, and I've had much better luck. I'm still not opposed to AMD at all, but I'd want to hear from folks with modern Crossfire experience before moving back to that platform.

June 8, 2016 | 09:27 PM - Posted by Zurv (not verified)

As part of the "small" group that uses 3 and 4 way SLI - this is pretty F'd up.

I already have a system with 3 cards and a system with 3/4 1080 cards.

Nvidia documents clearly said that it was going to be unlocked and the limits were going to be for DX12 (not might... but will, "at launch"). Hell, even that video you guys did with the NVidia guy only talked about DX12. There is no reason why 3-4 way can't work with DX11 games.
Sure, there was no promise that future games would get NVidia-crafted profiles... but that is why there is nvidiaInspector.

Soooooo Nvidia .. will you give me money for these 4 extra waterblocked 1080s?

grumble.... i was under no illusion that 3/4 was going to be great.. it never was.. but.. blah

June 9, 2016 | 01:47 AM - Posted by Silent Scone (not verified)

Lol you idiot!

June 9, 2016 | 08:50 AM - Posted by John H (not verified)

Which games were you planning on using them for? Star Citizen seems like the only one that might want 4 today, and then only at 4K?

June 9, 2016 | 12:16 PM - Posted by Beagle (not verified)

I will buy one lol

June 9, 2016 | 02:30 PM - Posted by Anonymous (not verified)

The statement was made in the context of Pascal cards, but the statement seemed to reference nVidia GPUs in general. I assume this means Tri-SLI 980Ti setups have just lost their value going forward. Can someone confirm?

Didn't think I'd need to worry about upgrades for a good while. Seems the end date just moved forward a bit.

June 8, 2016 | 09:34 PM - Posted by Anonymous (not verified)

Where is the outrage?

June 9, 2016 | 08:09 AM - Posted by Bob Dole (not verified)

I'm outraged.

June 8, 2016 | 09:41 PM - Posted by D T (not verified)

Nvidia, the Apple style company in the GPU world!

June 8, 2016 | 09:46 PM - Posted by Keven Harvey (not verified)

Of the few people that actually run 3 and 4 way, I've heard at least some turn it off for gaming, and only use it for benchmarking, as the microstutter is a bigger issue than the few extra frames they get. Still, kind of a dick move to change things after launch.

June 8, 2016 | 10:53 PM - Posted by J Nevins (not verified)

SLI is bad enough to get working; anyone who goes with 3 or 4 cards, even with "full" support, is kidding themselves due to the rapidly diminishing returns and bug city central.

June 8, 2016 | 11:21 PM - Posted by axiumone (not verified)

Ridiculous and underhanded way to deal with your customers. Backpedaling on an announcement? That's plain unacceptable, especially when you constantly tout the billions of dollars you've spent on R&D and that you have armies of engineers working around the clock.

They're more focused on selling us their shield tablets and tv's, quadro cards to rendering farms and putting mobile chips into our next car. Games just aren't important to them any longer.

June 9, 2016 | 05:35 AM - Posted by Martin Trautvetter

"They're more focused on selling us their shield tablets and tv's, quadro cards to rendering farms and putting mobile chips into our next car. Games just aren't important to them any longer."

That'd be strange, as most of the product lines you just listed are now defunct or, in case of the automotive stuff, going nowhere fast.

June 9, 2016 | 01:07 AM - Posted by Orthello (not verified)

Jeez, the early adopter tax (e.g. the Founders Edition bollocks) and now this. No 3-4 way gaming SLI support; you have to wonder what the 2-way will be like with that stance.

I wonder if TX and 980ti setups Tri and Quad will now get nerfed ?? I seriously think that could happen now. If i was a Tri sli owner in a maxwell gen card i'd be worried right now. Can't have 1080 SLI shown up with older cards.

That's on top of the great history NV has of supporting its aging product stack. I wonder how long before a 390x is as fast as a 980ti ...

Multiple nails in the coffin .. AMD bring Vega soon please.

June 9, 2016 | 05:50 AM - Posted by Martin Trautvetter

"I wonder if TX and 980ti setups Tri and Quad will now get nerfed ?? I seriously think that could happen now. If i was a Tri sli owner in a maxwell gen card i'd be worried right now. Can't have 1080 SLI shown up with older cards."

Nvidia have decided for this round that it was enough to have reviewers ignore their previously vaunted partner 980 Ti cards and only test the nerfed 980 Ti reference version to make the 1080 look better.

Those sites who had the nerves to include partner 980 Tis with higher clocks found themselves curiously lacking 1070s to review.

But I'm sure that was an honest oversight.

June 9, 2016 | 08:10 AM - Posted by Orthello (not verified)

Yeah, it's just so obvious with some sites. Following Nvidia's review guide to a T. 1070 being a 980ti/titan x killer and all, lolz.

Claiming great overclocking - when results are generally around 10-14% increase . This would be considered poor for Maxwell. My own TX overclocks about 35% above reference , putting it well above 1070 OC (22 % ahead in AOTS 4k) and above GTX 1080 stock (6% ahead in AOTS 4k). Meaning i've had this level of performance for over a year now , with 4 gigs more vram. Suddenly the TX seems like a good investment - particularly against the 1080 FE.

Trying to justify Founders Edition cards as "premium" PCBs lol... give me a break, bog standard PCBs with a bog standard cooler slapped on. The kind of thing AMD was lambasted for not so long ago. Even when it makes more noise than said AMD cooler, it's apparently not an issue these days.

I just hope us Maxwell owners aren't about to be keplared in performance.

June 9, 2016 | 01:28 AM - Posted by quest4glory

I mean, if you want triple and quad SLI just for benchmarking, you can still do that, but what a waste of cash. Why don't you just send your cash directly to Nvidia as a paycheck deduction and they can send you a file with your weekly benchmark numbers. Would save everyone a lot of time.

June 9, 2016 | 01:54 AM - Posted by deadviny (not verified)

But, what about rendering, video editing software? Will those programs still be limited at two cards only?

June 9, 2016 | 02:19 AM - Posted by renz (not verified)

read the statement.

June 9, 2016 | 02:57 AM - Posted by Knutgrus (not verified)

Is this comment correct?

"...and 2.29x the score of a single GeForce GTX 980 Ti..." Under LuxMark results. Is it not 2.29 x the score of two 980 Ti in SLI?

June 9, 2016 | 11:41 AM - Posted by Ryan Shrout

Doh, yeah, thanks for catching that.

June 9, 2016 | 05:28 AM - Posted by Martin Trautvetter

I feel like benchmark vendors should disallow results from 3- and 4-way SLI machines manipulated in this way, as they don't correctly represent the capabilities of the hardware, and Nvidia are basically resorting to benchmark-specific drivers to gain an advantage not available outside of very specific scenarios.

June 9, 2016 | 08:52 AM - Posted by Zak Killian (not verified)

I agree, but doesn't this also apply to the Ashes benchmark AMD was showing off at Computex? Ashes is essentially the only game to use DX12 async compute OR explicit multi-adapter mode, which means it performs better on AMD hardware than basically any other game.

I mean, it's a little different, since Ashes is at least a GAME and not just a synthetic benchmark (isn't it?), but it still misrepresents the capabilities of the hardware pretty grossly.

June 9, 2016 | 12:10 PM - Posted by Martin Trautvetter

I'm fine with Ashes because, as you note, it's a glorified tech demo -errrr- "game" that someone, supposedly, might choose to "play."

These multi-GPU benchmark cheats don't even rise to that tentative level of legitimacy.

June 9, 2016 | 07:02 AM - Posted by Anonymous (not verified)

AMD take note; This is how you run a business. Don't help your customers, ride them like $2 hookers.

June 9, 2016 | 01:58 PM - Posted by Titan_Y (not verified)


June 9, 2016 | 07:40 AM - Posted by Anonymous (not verified)

"two readers that were awesome enough to let me open up their brand new hardware for a day or so"

Made my day. Always nice to see this kind of generosity in the community.

June 9, 2016 | 10:34 AM - Posted by Anonymous (not verified)

Do we know roughly what percentage of gaming computers are single GPU, 2, 3, and 4 Way SLI? My assumption is that single GPU PCs are most common followed by 2-Way SLI. The percentage of PCs that use 3/4-Way SLI has to be very low. From a business standpoint Nvidia had to be losing money keeping the drivers up-to-date for such a small fraction of their consumer base. From a consumer standpoint, Nvidia's decision to support single GPU and 2-way SLI drivers makes me happy, since all that time previously spent on 3/4-way configurations will now be utilized to make drivers better and more stable for the majority of gamers. This is not a bad business move, contrary to many on this thread.

June 9, 2016 | 11:01 AM - Posted by xthetenth (not verified)

Cutting support isn't a bad move at all, especially considering it'd be a different tech from what 2 cards use. It's the decision to cement their reputation as inherently untrustworthy that's a mistake.

June 9, 2016 | 11:33 AM - Posted by Anonymous (not verified)

I get that, companies sometimes need to be more transparent, but are fearful for whatever reason. Being so close guarded sometimes can have the opposite effect from what they were hoping.

June 9, 2016 | 10:46 AM - Posted by Anonymous (not verified)

Ridiculous decision, nice way to make yourselves irrelevant. Still, AMD are about to make Nvidia even more irrelevant with the 480X anyway.

June 9, 2016 | 10:56 AM - Posted by Anonymous (not verified)

How does this make them irrelevant? How many people out there were planning on using 3/4-Way SLI in their gaming rigs?

For the gamers using 1 or 2 GPUs, this decision has to mean better drivers. So for the vast majority of gamers out there, better support...isn't that the opposite of irrelevant?

June 9, 2016 | 11:06 AM - Posted by Anonymous (not verified)

See reply at 3:05pm

June 9, 2016 | 02:03 PM - Posted by Titan_Y (not verified)

Wait... better support of DX12/Vulkan? No full hardware support of async compute on the Nvidia side. And the Crimson drivers on the AMD side are quite excellent these days, despite the fallacious claims of green zealots.

Why would anyone buy a 980 for even $250 when the RX480 is less expensive and has full async compute support?

June 9, 2016 | 10:59 AM - Posted by xthetenth (not verified)

So is NV just allergic to honesty or something? This is such a stupid thing to lie over. So few people actually use it that it doesn't make sense to do the work for, so just actually say it and be done with it. But no, they have to come out say they support more cards. NV just flat out doesn't deserve trust any more.

June 9, 2016 | 11:08 AM - Posted by Anonymous (not verified)

Well said, but you better beware the Nvidia fanboys/trolls trying to justify every stupid decision they make.

June 10, 2016 | 09:08 AM - Posted by Anonymous (not verified)

The only thing I'll concede that I'm a fanboy of is Nintendo, but I'm hoping to break that curse. For everything else, I typically buy the best product available. Maybe I'm a PC fanboy, but I'll never buy a Mac because of the game selection. I have an iPhone and probably won't go Android because my wife loves her iPhone. I'm not an apologist for companies because I know they all have their faults. I'm a realist, which is why I understand NVidia's stance on the driver support.

February 16, 2018 | 09:59 PM - Posted by Travis (not verified)

nVidia has a more underhanded way of doing business for a while now. I would never support their business practices for just a marginal difference in performance, as often is the case.

June 9, 2016 | 11:05 AM - Posted by Anonymous (not verified)

They've cut out the whole high end market where all the revenue is! They'll know how many of them there are when their sales figures drop!
For gamers using 1-2 GPUs it doesn't mean crap. They are now encouraged to only buy 1 more powerful GPU instead of going SLI or tri/quad SLI. Nvidia have shot themselves in the foot with this one!

June 9, 2016 | 11:22 AM - Posted by quest4glory

I think you mean "profit" and not revenue, and I also think you need to look at what they charge for their GPUs and chips, and then rethink pretty much everything you just said.

June 9, 2016 | 10:52 PM - Posted by Anonymous (not verified)

What a surprise. In a thread critical of Nvidia their fanboys are out in force doing their usual BS mental gymnastics routine to rationalise and defend their arrogance and greed.

Look, if you're not being paid to defend their arrogance and greed you certainly should be.

Nvidia, sign this "quest4glory" tosser up!

June 9, 2016 | 11:31 AM - Posted by Anonymous (not verified)

Let's be generous and say that 10% of their consumer base uses 3 or 4-Way SLI. Then let's say a total of 20% of cards are being used in these setups (more cards sold per user). That means 80% of cards sold are being used in single or 2-way GPU setups. To keep drivers up to date, 25% of their time is spent on each configuration. When 50% of your efforts are being spent on updating drivers for 10-20% of your consumers, that's wasted effort which is detracting from supporting the majority. If Nvidia dedicates all this time and effort to optimizing drivers for single and 2-way GPU setups, that means better support for the majority of their consumer base, which will translate into more sales. It's better for everyone except the enthusiast. While that sucks for those individuals, it's better for most and likely a good business decision.

Another selling point they are trying to make, you know so people buy their cards, is that you don't need more than 2 GPUs to do 4K gaming or VR.

Note: Being generous on those percentages, I actually assume the percentage of gamers using 3/4-way SLI is likely 1% or less.

June 9, 2016 | 01:03 PM - Posted by Jeremy Hellstrom

Less than 0.5% according to any of the stats we can find.

June 9, 2016 | 02:22 PM - Posted by Anonymous (not verified)

What about just using the multi-adapter support in the DX12/Vulkan APIs anyway, and having it not in the Nvidia/AMD drivers but as part of the standard graphics APIs? That way it's up to the game developers to use as many GPUs as needed, and the graphics API developers to offer a more standardized way of using more than one GPU from any GPU maker for graphics/compute workloads.

The multi-GPU adapter support really should have been implemented in the OS/graphics API in the first place, as it is the OS's/graphics API's responsibility to offer up any and all of the hardware plugged into a computer for use, with the GPU/other processors' drivers simply allowing the graphics API/OS to command the GPU, or any other processor, to do the needed work.

June 9, 2016 | 10:58 PM - Posted by Anonymous (not verified)

Great, another paid Nvidia tosser doing the mental gymnastics routine.

"To keep drivers up to date, 25% of their time is spent on each configuration."
"When 50% of your efforts are being spent on updating drivers for 10-20% of your consumers"

Look, when you can justify any of the percentages you've used, then maybe people will listen; but if you're just going to pull figures from you know where, well, you can go and stick them in you know what.

June 10, 2016 | 08:58 AM - Posted by Anonymous (not verified)

Actually, I'll buy the new AMD GPU if it's worth it. Can't wait for the reviews. I've never spent more than $220 on a GPU.

I did make up those numbers, but I used logic and common sense regarding the amount of effort updating drivers. You tell me what you think, what percentage of PC gamers out there are rocking 3 or 4-way SLI?

June 9, 2016 | 01:47 PM - Posted by Titan_Y (not verified)

So a gaming site uses a non-gaming benchmark to show off the prowess of a GPU? Seems legit.

June 10, 2016 | 12:51 AM - Posted by Anonymous (not verified)

Hey, that's good for folks doing rendering. Now I'd like to see what the results will be for 4 RX 480s, and there should be more non-gaming graphics benchmarks run for Blender 3D also, because a lot of open source graphics software users need the information. I'm looking at building a Linux-based Blender rendering system and I'm really interested in seeing what the RX 480 will do for Blender rendering across 2 or 4 RX 480s.

Most of the graphics oriented websites do not get any review samples like the gaming websites do, so just to help out the independent games developers the gaming websites need to do more Graphics Rendering benchmarks! Those fancy 3d gaming models do not grow on trees, nor do the high resolution Textures baked onto the 3d meshes. So any gaming websites do a great service to the independent games makers by doing more rendering benchmarks on the lower cost hardware. Those pro level graphics cards are out of reach for most folks developing independent games, so doing more non gaming graphics benchmarks for the independent games developers helps all of gaming.

June 10, 2016 | 02:54 PM - Posted by Allyn Malventano

We're a gaming site? Huh. Who knew.

June 10, 2016 | 03:39 PM - Posted by Titan_Y (not verified)

The ads on the site certainly seem to think you are. Take away the gaming hardware and how well do you think you'd do?

June 11, 2016 | 12:37 PM - Posted by Dvon-E (not verified)

Well, I only read it for the storage, but it's hard not to notice the time spent on gaming rigs, gaming parties, Pick of the Week almost always including something gaming related, not to mention Frame Rating being oh so useful for PowerPoint and Excel.

Seriously, I agree that hardware is your thing, and games are the most widely available software that regularly pushes the limits of the hardware, so it's understandable that a lot of gaming references creep into the text. Would you agree that an outside observer might be forgiven for considering this site of interest to gamers?

June 11, 2016 | 09:43 PM - Posted by Allyn Malventano

No I get it, but our articles aren't about games or how they play, we focus on the hardware :)

June 9, 2016 | 06:38 PM - Posted by Anonymous (not verified)

Well, we will see what happens when people start buying 3 or 4 RX 480s for 600 or 800 dollars, and everyone will just start looking at the benchmarking programs as they're now even less relevant.

June 10, 2016 | 01:38 AM - Posted by Anonymous (not verified)

Most people don't give two craps about it. More than 2 Way SLI has always been terrible scaling anyway and most of those who buy more than 2 cards are there to show off their e-peen or to run Heaven Benchmark and 3DMark.

June 10, 2016 | 09:10 PM - Posted by Steve_H

Here's my shit guys. I watch your podcasts on youtube everyweek. I come here and read Jeremy having a blastasticle time reeming idoit mfgrs, when writely do they SO deserve.. And that creepy guy with every kbow case to mankind. Sebastian. Then there's the other tru nuts, Ryan, and Allyn. Shit I forgot the sex machine of the frozen wasteland. Josh.

Thing is, I love reading your shizz on site. I watch eveery podcast. I so suck!

June 11, 2016 | 01:43 PM - Posted by Anonymous (not verified)

I wonder what meat you are having with that Chianti and Fava beans?

June 10, 2016 | 09:13 PM - Posted by Steve_H

opps..forgot to mention Scott.. you asshole, I hate you. I'll follow you everywhere in this world to tell everone who bad you your cool beans too..bitch

October 25, 2016 | 04:52 PM - Posted by WYSIWYG (not verified)

So by limiting this, there won't be any way to run any game on 16k resolution (4k with DSR)? What rebate will I get for paying for 4-SLI support on my motherboard?

February 25, 2017 | 04:28 AM - Posted by Anonymous (not verified)

hmmm... i see... makes me wonder though. as i am an idiot, and lack the barest of knowledge beyond basic concepts... what is the limit of their restriction compared to the legal limit of firmware programming... its fine and dandy to say "you can't do this because the software won't let you." but fairly sure that legally there isn't much to it. you can't alter the Nvidia code, or create a bypass program (aka hacking it) but there should still be a way dependant on method.

but hey, im the guy who hates selfish rules to the point that i refuse to accept any "concept" that i can't find any viable argument in. EMF reading indicates paranormal activity? idk, but it is true that emf can spike and drop for absolutely no reason. (fact) Nvidia won't let me run a x4 1080 config? mmk, how exactly does it plan to stop me, and what skills would i need to tell you if it can actually do that? (concept lacking fact outside the realm of "they said so.) after all, some crazy idiot could go the hard way and rig a full custom on a linux kernel, and code in his own hardware and firmware configs. it literally boils down to depth. unless Nvidia makes it so that the cards implode using more than 2, there's going to be a way. the only real argument is legal or not. and even saying legal methods only, even an idiot like me knows that there are too many routes to cover, and many more than i can't even comprehend

July 15, 2017 | 02:32 PM - Posted by Anonymous_99 (not verified)

Here is the problem with all the latest new tech GPUs, CPUs, MBs etc. None of the manufacturers spend the time to get it right. They change/add tech and tend to lose functionality like x3/x4 SLI. The end result is buggy systems in general, including x2 SLI. It is the year 2017, correct? You would think high end tech would get better, faster, simpler, higher quality etc. Although one could argue some of these things happen in certain areas, in general this is not my experience. As a long time high end enthusiast that used to do x4 SLI, dual CPUs, sub-zero cooling etc., I am crying uncle. The manufacturers have ruined my hobby to the point I cannot stand it. Unfortunately my day job requires the use of semi high end computers to develop software. On a high note, PCIe/NVMe RAID 0 seemed like a bit of fresh air, but the BIOS issues and complexities early on were just another example of EPIC fail.

August 3, 2017 | 03:05 PM - Posted by pg (not verified)

I wish I read this before investing in 3 1070s... oh well. Guess I can always mine lol.
