
Did Nvidia purposely gimp the performance of 50xx series cards with drivers?

Except the claim is not about compute performance. Do try to keep up.
It doesn't matter; performance is performance. Also, read what's written in the link I posted before commenting: this had to do with generic graphics performance as well.
 
I remember generations back, people suspected Nvidia gimped the previous gen in drivers so they'd buy the new gen. It doesn't make sense why Nvidia would gimp their new cards on purpose, because what is the gain?
In my honest opinion, I don't think Nvidia has as many good chips, which is why they are dumping them off now and will badge the new ones as Super, charging more.
 
It doesn't matter; performance is performance. Also, read what's written in the link I posted before commenting: this had to do with generic graphics performance as well.
Interesting that there are no benchmarks of that "generic graphics performance" there, then.
 
It's a good question actually.

I believe my 4070 Super is nerfed, but strictly by power limits. At 220 W, that card is capable of so much more. (Can't speak to the experience of a 5070/Ti/Super, but anything... is possible.)
That's a fine position to take; obviously, pumping more power usually gets a performance uplift. I suppose NV's reasoning is that the lower in the stack, the higher the probability of the user having a sub-par or lower-wattage PSU, or maybe wanting to use the card in a more restricted environment like SFF. It's a "nerf" technically, but it makes some sense at least. What the article suggests does not, on the other hand.

Interesting that there are no benchmarks of that "generic graphics performance" there, then.
What, you’re unaware that compute-wise “unnerfed” Quadros and now RTX A’s slap the hell out of the puny consumer cards in games and every real gamer should run one? Dirty casual.

I remember generations back, people suspected Nvidia gimped the previous gen in drivers so they'd buy the new gen. It doesn't make sense why Nvidia would gimp their new cards on purpose, because what is the gain?
That was Kepler times. Turns out, there was nothing sinister going on, just Kepler REALLY fell off a cliff with the new generation of consoles and the changed way of doing things that came with it. The architecture just basically became rapidly outdated and insufficient, while GCN fared better for obvious reasons.
 
Interesting that there are no benchmarks of that "generic graphics performance" there, then.
They are testing 3D modeling software; what is that if not "graphics performance"? It's not just generic compute performance. The SPECapc suite they use is a benchmark specifically aimed at graphics tasks. Again, read what's actually written in that link.
 
Except the claim is not about compute performance. Do try to keep up.
It doesn't matter if it's compute performance. Nvidia has been known to nerf performance for years, and it wouldn't surprise me if Nvidia purposely limited cards to upsell people who want more performance to the 5090.
Yeah, "sweeping it under the rug" by acknowledging the problem and replacing the defective units. You AMD fanboys will make up whatever shit you want whenever you want to justify your irrational hatred.
That isn't a full recall; only enthusiasts who know what GPU-Z is would even know they have missing ROPs. You Nvidia fanboys will defend Nvidia for anything to justify your favorite brand.
 
giphy.gif


Lol. This one is almost as good as ziomario's tablet adventure.
 
8800 GTS 320/640, 8800 GTX, GTX 260 192/216, GTX 970... I am sure I am missing a few, but those are the ones off the top of my head where NV tried to pull fast ones.

But that guy.. talking about how he saw the code, unlocked something.. I could see that. Too bad he was so abrasive.
 
That's a fine position to take; obviously, pumping more power usually gets a performance uplift. I suppose NV's reasoning is that the lower in the stack, the higher the probability of the user having a sub-par or lower-wattage PSU, or maybe wanting to use the card in a more restricted environment like SFF. It's a "nerf" technically, but it makes some sense at least. What the article suggests does not, on the other hand.
If it was as simple as a driver mod, then where's the evidence of that? How is this accomplished? Where's the "how to"?
There isn't one. Driver modifications don't typically exceed about a 5% performance uptick. That's been a known thing since ATI was still around (Omega drivers).
If making a third-party driver were that easy today, we'd still see only single-digit uplifts (in all-around performance).
The 4070 Super has more CUDA cores (than a standard 5070), more ROPs, all that fancy talk, yet is slower only due to clock frequency.

On that note: because the 4000 and 5000 series use the same driver, all the 4000-series users should be able to obtain the same "performance unlock".
 
The guy sounds like Terry Davis but without the actual software-genius part, and some of you still think he's right and Nvidia slashed 40% off the 5080's performance. One step away from taking pitchforks and trying to burn papa Jensen at the stake for witchcraft.
 
It would explain everything if Nvidia gimped the RTX 5000.

Nvidia said that the 5070 is on par with the 4090, but in TPU's testing the 5070 only achieves 62% of the 4090, which contradicts what Nvidia said. With 40% more performance, the 5070 would be closer (86-87%) to the 4090.

Imagine the 5090: with 40% more perf it would be 89% more powerful than my 4090. If that's true, then I don't want the 5090, I need it.

1742937898978.png
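For what it's worth, the arithmetic in the post above does add up if you take the claimed 40% "unlock" at face value. Note the ~1.35 baseline for the 5090 below is my own assumption (roughly its lead over the 4090 in TPU's 4K averages), not something stated in the post:

$$0.62 \times 1.40 \approx 0.87 \qquad\qquad 1.35 \times 1.40 \approx 1.89$$

That's where the "86-87%" and "89% more powerful" figures come from; whether any such unlock actually exists is another question entirely.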
 
Nvidia said that the 5070 is on par with the 4090, but in TPU's testing the 5070 only achieves 62% of the 4090, which contradicts what Nvidia said. With 40% more performance, the 5070 would be closer (86-87%) to the 4090.
While that was a massive fucking lie on NVidia's part, they sort of covered their asses since that figure was (whether it was stated upfront or implied) taking into account the new MFG that is the, uh, selling feature (?) for Blackwell, so they can go "uhhh akshually" on this point. I am sure that there is an absurd scenario where, in some game with MFG enabled, the 5070 is teeeeeeechnically getting the same reading on the FPS counter as the 4090 without any FG. Still a massive lie, but… you know. Legally sort of not.
 
It would explain everything if Nvidia gimped the RTX 5000.

Nvidia said that the 5070 is on par with the 4090, but in TPU's testing the 5070 only achieves 62% of the 4090, which contradicts what Nvidia said. With 40% more performance, the 5070 would be closer (86-87%) to the 4090.

Imagine the 5090: with 40% more perf it would be 89% more powerful than my 4090. If that's true, then I don't want the 5090, I need it.

View attachment 391602
Jensen, right after saying that, added that it's thanks to the power of AI. Meaning it's because you can enable MFG on the 5070 and the FPS counter will go brrrr.
Yeah, it was a bullshit marketing statement, but my god, don't believe some random guy without any proof that the 5070 could actually be as fast as the 4090 if only Nvidia didn't block it.
The hardware is simply not there, neither in the 5070 nor the 5080.
Or are you saying that Nvidia made such groundbreaking improvements to their architecture that suddenly much smaller chips can beat the really big one?
But that goes against the "fact" that "Nvidia doesn't change anything since RTX 20".
 
Read this thread. There are some posts questioning why Nvidia would do this. Someone claims to have fixed a file that allows the 50xx series to perform seriously better.

That thread has already been deleted, because it's stupid FUD.

Reddit, despite being a cesspool of questionable intellect and morals, is still capable of weeding out truly stupid bullshit.
 
I reverse-engineered the RTX 5080’s architectural lock (sm_120 / Blackwell) that NVIDIA buried in libcuda.so, and after manually patching the driver, I achieved benchmark scores above the RTX 5090 using the same workloads you’re citing.

Definitely makes sense from a business perspective to use all that extra silicon for no reason whatsoever.
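As an aside, "sm_120" is just the compute-capability target that consumer Blackwell parts report; you don't need to patch libcuda.so to see it. A minimal sketch using the standard CUDA runtime API (device index 0 and the printed fields are purely illustrative; assumes a recent CUDA toolkit is installed):

Code:
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    // Ask the driver/runtime what it reports for the first GPU in the system.
    cudaError_t err = cudaGetDeviceProperties(&prop, 0);
    if (err != cudaSuccess) {
        std::printf("cudaGetDeviceProperties failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    // A consumer Blackwell card should report compute capability 12.x,
    // which is what the "sm_120" target name above refers to.
    std::printf("%s: compute capability %d.%d, %d SMs\n",
                prop.name, prop.major, prop.minor, prop.multiProcessorCount);
    return 0;
}

Nothing here proves or disproves the "lock" claim, of course; it only shows what the unmodified driver exposes.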
 

Did Nvidia purposely gimp the performance of 50xx series cards with drivers?

Purposely, I doubt, but given the myriad of other things that have gone wrong with the 50-series launch, I have to assume it's possible they buggered up something else too.

Having said that, I won't be holding my breath for a 30% uplift on the 5080.

Yeah, "sweeping it under the rug" by acknowledging the problem and replacing the defective units. You AMD fanboys will make up whatever shit you want whenever you want to justify your irrational hatred.
The expectations here from that crowd are sky-high as usual: they want a public teary apology, a full recall of any models that could possibly be affected, and a pay-out for emotional damage that extends primarily to those who didn't buy the cards but are offended that this even occurred :laugh:
 
Hyperbole aside, yes, a full recall of all models possibly affected is how a company that cares about its customers should operate, because not even Nvidia knew how many models were potentially defective until people found the missing ROPs. But I'm not even going to argue with someone so high on the team-green copium that they bought a 5080.

Anyway, it doesn't make much sense that Nvidia would lock away part of the chip, but it seems like the guy was spreading BS here and all over Reddit.
 
Hyperbole aside, yes, a full recall of all models possibly affected is how a company that cares about its customers should operate, because not even Nvidia knew how many models were potentially defective until people found the missing ROPs. But I'm not even going to argue with someone so high on the team-green copium that they bought a 5080.
It's not a safety issue, it's a product performance issue affecting a small %, and you get your RMA if affected. This does not justify a recall.

So nice of you to refer directly to me without quoting/tagging me, but I didn't buy a 5080 at all; I was bought one as a gift. I was waiting for the 5070 Ti and 9070 XT to launch, and would have bought a 9070 XT for myself. It feels like a dog act to sell a gift now just to pocket the difference and take a downgrade.

And high on "team green copium" about what exactly? I readily admit the launch was bad with multiple issues. You can be so ridiculous :slap:
 
It's not a safety issue, it's a product performance issue affecting a small %, and you get your RMA if affected. This does not justify a recall.

So nice of you to refer directly to me without quoting/tagging me, but I didn't buy a 5080 at all; I was bought one as a gift. I was waiting for the 5070 Ti and 9070 XT to launch, and would have bought a 9070 XT for myself. It feels like a dog act to sell a gift now just to pocket the difference and take a downgrade.

And high on "team green copium" about what exactly? I readily admit the launch was bad with multiple issues. You can be so ridiculous :slap:

Stop sniffing that green dust, bro, it's bad for you. :roll::toast:


A free GPU is awesome no matter which one it is; people are just jello... I like how people will trash us for not liking AMD GPUs even though most are rocking AMD CPUs, which clearly means that if they just made a half-competent product we'd jump on it like a fly on shit.
 
Screenshot from 2025-03-26 05-45-08.png

Frustrated World Cup GIF



Seriously, if you read this guy's replies and don't immediately notice the ChatCrappyTea-style bullshit, you need to replace your brain with one. :laugh:

I mean...
Screenshot from 2025-03-26 05-59-57.png Screenshot from 2025-03-26 05-59-48.png
Screenshot from 2025-03-26 06-01-38.png
Screenshot from 2025-03-26 06-03-33.png

Nvidia has gimped compute performance on cards in the past via software and then unlocked it after some time. That's not even speculation, it's a matter of fact, so this wouldn't be surprising. They did it back when Vega was released so they could beat AMD in professional applications: https://techgage.com/article/quick-...-performance-boosting-385-12-titan-xp-driver/
Gimping application-specific performance in order to promote your own, more expensive products that are marketed for said applications makes sense (from a cynical PoV).
Gimping performance in the same applications you're marketing the cards for, without having any other product you're trying to boost or gaining anything whatsoever, is... Well...

CAD/CAE viewports may use similar APIs to your typical video game, but the renderers are not designed the same way, nor do they work the same way. Plus, GPU vendors do provide program-specific optimisations (the so-called "profiles"), and said programs are obviously classified into "games" and "stuff you'd expect a professional working for a $$$-loaded engineering firm to be using."
 
View attachment 391633

Frustrated World Cup GIF



Seriously, if you read this guy's replies and don't immediately notice the ChatCrappyTea-style bullshit, you need to replace your brain with one. :laugh:

I mean...


Gimping application-specific performance in order to promote your own, more expensive products that are marketed for said applications makes sense (from a cynical PoV).
Gimping performance in the same applications you're marketing the cards for, without having any other product you're trying to boost or gaining anything whatsoever, is... Well...

CAD/CAE viewports may use similar APIs to your typical video game, but the renderers are not designed the same way, nor do they work the same way. Plus, GPU vendors do provide program-specific optimisations (the so-called "profiles"), and said programs are obviously classified into "games" and "stuff you'd expect a professional working for a $$$-loaded engineering firm to be using."
But bruh, he reverse-engineered the NV drivers by himself and figured out that enabling SM-120 will give you free FPS and 4 extra inches of e-peen; my guy even had a Teams call with high-up NV execs... why the hate :laugh:

In other news, a pig was spotted flying over the White House in DC as Biden stormed it, kung-fu kicked Trump back to the '90s, and retook his presidency :laugh:
 
Gimping performance in the same applications you're marketing the cards for, without having any other product you're trying to boost or gaining anything whatsoever, is... Well...
It's not really about the why; it's about whether or not they can gimp performance on purpose. And the answer is yes; not only that, but they have done it in the past.
 
It's not really about the why; it's about whether or not they can gimp performance. And the answer is yes; not only that, but they have done it in the past.
Yes. And they can give 1,000,000 lucky people free RTX 5080s. They have done something that looks similar, if you completely overlooked key differences, in the past.

Geohot suspected some time ago that tensor core performance is artificially limited on AD102 chips as well, and that guy is definitely not some random idiot who doesn't know what he's talking about.
Dunno about that, but nevertheless, if you're talking about the gaming RTX-bound chips, then that's the same as the Quadro/GTX viewport performance disparity case. Tensor cores are not used in games. For what uses them, Nvidia has other products they sell at a much greater profit margin.
 
if you completely overlooked key differences, in the past.
There is nothing to overlook: they release a product that's artificially limited, wait to see how the competition performs, and then either unlock the performance or leave it as is so the next generation appears to have a larger performance uplift. That's what they did when Vega launched. It doesn't matter if it's compute or graphics; the principle is the same.
 