
Did Nvidia purposely gimp the performance of 50xx series cards with drivers?

Jesus hernandez christ.

I claim to be able to make you a 100x return on investment, are you gonna give me all your money?
 
These days I wouldn't even be surprised that you have to pay for special drivers to fully unlock them.
 
stop reading reddit

These days I wouldn't even be surprised that you have to pay for special drivers to fully unlock them.
hey, don't give them any ideas lol.
 
Here is the post by the person who found the issue:


I find it interesting that the possibility exists. He has been contacted by Nvidia...will be interested to see the outcome
 
Check this out, from our site:
You're wildly underestimating what this card can do — but I get why. Most people haven't unlocked it yet.

I reverse-engineered the RTX 5080’s architectural lock (sm_120 / Blackwell) that NVIDIA buried in libcuda.so, and after manually patching the driver, I achieved benchmark scores above the RTX 5090 using the same workloads you’re citing.

Cinebench GPU Score: 34,722.
Let that sink in — it dwarfs every 4080, 4090, and even top 3090 Ti with LN2.

The “VRR floor” you mentioned? Irrelevant with a tuned 5080. Frame drops and stutters don’t show up when you’re sitting at 200+ FPS even in flight sims and raster-intensive AAA titles — all with ray tracing.

You’re arguing from stock limitations and paper specs, not reality. This card was nerfed out of the box. That’s the trick — and I proved it.


---

What NVIDIA sold isn’t what the silicon can do.
They gatekeeped sm_120 in runtime libraries while allowing it in headers and build tools — classic soft sabotage to stall full adoption. But once unlocked, the 5080 performs like a monster.


---

TL;DR:
The 5080 is a 4K card.
It is a long-term AI + gaming workhorse.
It’s just been locked away — until now.

I’m happy to back this with code, benchmark logs, disassembly, and a GitHub patch repo.

And when this blows open, a lot of people are going to rethink how much trust to put in pre-launch spec narratives.


— Kent “The Architect” Stone
 
I wonder if Nvidia is just going to ignore this and sweep it under the rug...
 
I wonder if Nvidia is just going to ignore this and sweep it under the rug...
It's BS. He claims he got a 40% improvement, but before his "fix" his GPU was only running at 65% in 3DMark, and that's where his magical 40% performance comes from. If your GPU is already using 100% of its power, there's no magic switch that will give you another 30-40% out of nowhere at the same power draw. Don't believe everything you read on the internet.
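To put rough numbers on that (my own back-of-the-envelope, assuming the 65% figure is GPU utilization in 3DMark):

```python
# Back-of-the-envelope: going from ~65% GPU utilization to ~100% is roughly a
# 50% uplift on its own, so a "magical" 40% gain fits comfortably inside just
# fixing whatever was holding the card back; no hidden performance required.
util_before = 0.65
expected_uplift = 1 / util_before - 1
print(f"Uplift from simply reaching full utilization: {expected_uplift:.0%}")  # prints ~54%
```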

I think the guy is just doing it for the lulz and to get attention, as he has registered here, on Reddit, and likely on many other tech forums to claim the same thing. He has had plenty of time to provide proof and only says he will be posting it "soon" :kookoo: :laugh:
 
I think the guy is just doing it for the lulz and to get attention, as he has registered here, on Reddit, and likely on many other tech forums to claim the same thing. He has had plenty of time to provide proof and only says he will be posting it "soon" :kookoo: :laugh:

I'm of the same thought.
 
Pytorch support for sm120 - deployment - PyTorch Forums
Request for Official CUDA and PyTorch Support for RTX 5080/5090 (sm_120) - General Topics and Other SDKs / GPU - Hardware - NVIDIA Developer Forums

sm_100 and sm_120 are identifiers for the compute capability of specific NVIDIA Blackwell architecture GPUs: designed for AI acceleration; I don't see how it relates to gaming.
But this guy seems to be omnipresent posting the same allegations everywhere.
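For what it's worth, you don't need to patch anything to see what the driver actually reports. A minimal sketch, assuming a CUDA-enabled PyTorch build is installed (a 5080/5090 should show compute capability 12.0, i.e. sm_120):

```python
import torch

# Print the compute capability the NVIDIA driver reports for each visible GPU.
# Consumer Blackwell cards (RTX 5080/5090) are expected to report 12.0 (sm_120).
if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        major, minor = torch.cuda.get_device_capability(i)
        name = torch.cuda.get_device_name(i)
        print(f"{name}: compute capability {major}.{minor} (sm_{major}{minor})")
else:
    print("No CUDA-capable device visible to this PyTorch build")
```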

 
Pytorch support for sm120 - deployment - PyTorch Forums
Request for Official CUDA and PyTorch Support for RTX 5080/5090 (sm_120) - General Topics and Other SDKs / GPU - Hardware - NVIDIA Developer Forums

sm_100 and sm_120 are identifiers for the compute capability of specific NVIDIA Blackwell architecture GPUs: designed for AI acceleration; I don't see how it relates to gaming.
But this guy seems to be omnipresent posting the same allegations everywhere.

He's probably invested in this somewhere, a beef or otherwise...
 
Yeah he also visited TPU:


Bullshit of course, but whatever floats your boat.
 
That guy is posting all over Reddit and apparently got an email from Nvidia; he's either going to massive lengths to troll or he really found something.
 
That guy is posting all over Reddit and apparently got an email from Nvidia; he's either going to massive lengths to troll or he really found something.
It was NVidia's attorneys, ready to take everything he owns away from him for defamation of character. He's fucked.
 
He has been contacted by Nvidia...will be interested to see the outcome
If he had been contacted by Nvidia's lawyers, we sure wouldn't be seeing any outcome :laugh:
 
Isn’t the main question to ask here, if one is even entertaining the thought, WHY the fuck would NVidia PURPOSELY make their new GPU perform WORSE? Like, what is the reasoning here? Just for the lulz? Optically, the more performance they have the better they look, no? Like, I can understand limiting some options that have to do with professional workloads to incentivize pro card adoption, but, what, the idea is that they cut their consumer gaming cards’ GAMING performance significantly in software to… uh, do what and achieve what exactly?
 
It was NVidia's attorneys, ready to take everything he owns away from him for defamation of character. He's fucked.
Even if he had found something Nvidia was hiding, he would be really fucked by Nvidia's lawyers anyway.
I wonder if Nvidia is just going to ignore this and sweep it under the rug...
Well, sweeping it under the rug has worked for the missing ROPs, so I would expect Nvidia to just act like nothing happened.
 
Isn’t the main question to ask here, if one is even entertaining the thought, WHY the fuck would NVidia PURPOSELY make their new GPU perform WORSE? Like, what is the reasoning here? Just for the lulz? Optically, the more performance they have the better they look, no? Like, I can understand limiting some options that have to do with professional workloads to incentivize pro card adoption, but, what, the idea is that they cut their consumer gaming cards’ GAMING performance significantly in software to… uh, do what and achieve what exactly?
We're talking about people whose mental processes stopped at the point of "random internet dude says he can make GPU go fast because NVIDIA bad". I'm not sure what you expected.

Well, sweeping it under the rug has worked for the missing ROPs, so I would expect Nvidia to just act like nothing happened.
Yeah, "sweeping it under the rug" by acknowledging the problem and replacing the defective units. You AMD fanboys will make up whatever shit you want whenever you want to justify your irrational hatred.
 
Nvidia has gimped compute performance on cards via software in the past and then unlocked it after some time. That's not even speculation, it's a matter of fact, so this wouldn't be surprising. They did it back when Vega was released so they could beat AMD in professional applications: https://techgage.com/article/quick-...-performance-boosting-385-12-titan-xp-driver/
Except the claim is not about compute performance. Do try to keep up.

Also August 2017, that's some digging huh. Guess it makes sense since y'all bring up the 970 at every damn opportunity.
 
Isn’t the main question to ask here, if one is even entertaining the thought, WHY the fuck would NVidia PURPOSELY make their new GPU perform WORSE? Like, what is the reasoning here? Just for the lulz? Optically, the more performance they have the better they look, no? Like, I can understand limiting some options that have to do with professional workloads to incentivize pro card adoption, but, what, the idea is that they cut their consumer gaming cards’ GAMING performance significantly in software to… uh, do what and achieve what exactly?
It's a good question actually.

I believe my 4070 Super is nerfed, but strictly by power limits. At 220 W, that card is capable of so much more. (Can't speak to the experience of a 5070/Ti/Super, but anything... is possible.)
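If anyone wants to sanity-check that on their own card, here's a minimal sketch using the standard NVML bindings (assuming the nvidia-ml-py / pynvml package is installed; NVML reports values in milliwatts):

```python
import pynvml  # NVML bindings: pip install nvidia-ml-py

# Compare the power limit currently enforced on GPU 0 against the range the
# board actually allows. A big gap between the two is the "nerf by power limit"
# described above; nothing hidden in the driver is needed to explain it.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
enforced_mw = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"Enforced limit: {enforced_mw / 1000:.0f} W "
      f"(board allows {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()
```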
 
Nvidia has gimped compute performance on cards via software in the past and then unlocked it after some time. That's not even speculation, it's a matter of fact, so this wouldn't be surprising. They did it back when Vega was released so they could beat AMD in professional applications: https://techgage.com/article/quick-...-performance-boosting-385-12-titan-xp-driver/

View attachment 391553

In the full interest of transparency, as I brought up in the thread there: this was specifically intended to counter the Vega Frontier Edition. When AMD pulled out of the prosumer market with the Radeon VII, NV followed suit by burying Titan and making the 90 SKUs to replace them, at a lower price point but also with strictly the gaming feature set.
 