Tuesday, March 1st 2022

NVIDIA DLSS Source Code Leaked

The mother of all cyberattacks hit NVIDIA over the weekend, exposing critical driver source code, the means to disable the LHR limiter for mining, and even insights into future NVIDIA hardware, such as the Blackwell architecture. An anonymous tipster sent us this screenshot showing a list of files they claim is the source code of DLSS.

The list, which looks credible enough, includes the C++ files, headers, and assets that make up DLSS. There is also a super-convenient "Programming Guide" document to help developers make sense of the code and build it correctly. The tipster who sent this screenshot is examining the code to understand the inner workings of DLSS, and whether there's any secret sauce. Do note that this is DLSS version 2.2, a reasonably recent release. This leak could hold the key for the open-source Linux driver community to bring DLSS to the platform, or even let AMD and Intel learn from its design. Stealing intellectual property is a big deal, of course, and NVIDIA's lawyers will probably be busy picking apart every new innovation from its competitors, but infringement will ultimately be hard to prove in a court of law.

83 Comments on NVIDIA DLSS Source Code Leaked

#26
erek
birdieI've checked the sources and indeed DLSS uses Tensor Cores.

You could probably run this logic on shaders, but the shaders are already busy with shading work, so any additional load would slow down your game; moving this logic from the otherwise-idle Tensor Cores to shaders could result in a performance loss rather than a gain.

The driver source code is also there. It's a treasure trove of highly confidential info. NVIDIA might as well make DLSS open source (not necessarily under a license like the GPL, but something proprietary, e.g. like Epic or Crytek do with their engines).

I've downloaded the entire archive and I'm now repacking it, because it was not optimally compressed. I expect it to come in at less than 8 GB, versus the 18 GB leaked by the hackers.
Do the drivers build?
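
For context on birdie's Tensor Core point: Tensor Cores are driven through dedicated matrix instructions (e.g. CUDA's wmma intrinsics) and execute alongside the regular shader/CUDA-core pipelines, so matrix math run there is effectively "free" while the shaders keep shading. A minimal, hypothetical sketch of a single 16x16x16 FP16 tile multiply on Tensor Cores (illustration only, not code from the leak):

    // One 16x16x16 FP16 tile GEMM issued to the Tensor Cores via wmma.
    // The shader/CUDA cores remain free for regular shading work.
    #include <mma.h>
    using namespace nvcuda;

    __global__ void tile_gemm(const half *a, const half *b, float *c)
    {
        wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> fa;
        wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> fb;
        wmma::fragment<wmma::accumulator, 16, 16, 16, float> fc;

        wmma::fill_fragment(fc, 0.0f);           // C = 0
        wmma::load_matrix_sync(fa, a, 16);       // load A tile, stride 16
        wmma::load_matrix_sync(fb, b, 16);       // load B tile, stride 16
        wmma::mma_sync(fc, fa, fb, fc);          // C += A * B on Tensor Cores
        wmma::store_matrix_sync(c, fc, 16, wmma::mem_row_major);
    }

Doing the same multiply with per-thread multiply-adds would occupy the very ALUs the game's shaders need, which is the performance-loss scenario birdie describes.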
Posted on Reply
#27
human_error
Much as I dislike how locked down Nvidia keeps a lot of their features, this is not something to be celebrated. I work in the software industry, and this won't help any competitors - they all have to make sure their engineers don't get contaminated by source code they can't use, as even just reading some of it, or seeing the overall approach, can influence how they'd tackle a problem in an IP-infringing way. Engineers who work on projects where they can see code the wider business can't use are blacklisted from some projects for years, so that no claims of IP theft can be made.

The community won't benefit from this, as AMD and Intel engineers won't touch it, and any other groups who do will be sued into the ground if it looks like they've based their work on Nvidia code - even if they haven't, since they'd need to pay for a defense against the lawsuits even when they're in the right.

Nvidia have spent money developing these technologies, and frustrating as it is that many are locked to the Nvidia platform, it has spurred open competition - G-Sync led to FreeSync and the VESA Adaptive-Sync (VRR) spec, Nvidia raytracing led to it being added to DX12 with AMD following suit, and DLSS led to FSR. Some of these may have happened anyway, but Nvidia may well have accelerated their release and pushed competitors to do more, which is good for everyone.
Posted on Reply
#28
Dr. Dro
Never asked you guys anything... kindly leak the BIOS signature keys for Pascal, Turing and Ampere... we need BIOS editors to have sane settings and fix power limits for our GPUs... pretty please ;):D

ZekeSulastinThen AMD would have to answer as to why they came out with FSR instead of an actual DLSS competitor vOv

Y’all (including the author) realize that the leaked code is radioactive to the nouveau devs and the other companies, right? Or did you all forget how much of a pain in the neck the Windows XP source leak made things for Wine and ReactOS?
Using an NV GPU on Linux is bordering on madness right now. Getting my RTX 3090 to work on Fedora 35 basically required the proprietary drivers (because nouveau does not work and the system defaults to llvmpipe, which could not be slower if it tried), plus forging a covenant with an elder god at the blood price of time I will never get back.

Year of the Linux Desktop... right?
Posted on Reply
#29
bobalazs
Yes, this sort of demands a glass of champagne. Good riddance. They should have followed AMD's example and made everything open source.
Linux users prefer AMD because of NVIDIA's greed.
Posted on Reply
#30
sith'ari
RidgeIf not it should be.
WOW, this person created a new account, and his :eek: 1st post @techpowerup was... :eek: to embrace theft!!!
Well buddy, when one day you create your own programs/software after years of hard work, only to find out that someone hacked and stole your life's work and sold it online to profit from your labour, then you can come back and say what you said today.
I'm betting that if that day comes and such things happen to your own life's work, you'll start to evaluate things a little differently...
Posted on Reply
#31
95Viper
Stop the Nvidia versus AMD versus others...
Stop the insults.
Discuss the topic, not the person posting.

Thank You.
Posted on Reply
#32
Xuper
You can use any of the ideas in your own project even though you can't use their tech/source code - change everything about the implementation, and the result is still the same as NVIDIA's.
Posted on Reply
#33
mouacyk
Dr. DroNever asked you guys anything... kindly leak the BIOS signature keys for Pascal, Turing and Ampere... we need BIOS editors to have sane settings and fix power limits for our GPUs... pretty please ;):D

Using an NV GPU on Linux is bordering on madness right now. Getting my RTX 3090 to work on Fedora 35 basically required the proprietary drivers (because nouveau does not work and the system defaults to llvmpipe, which could not be slower if it tried), plus forging a covenant with an elder god at the blood price of time I will never get back.

Year of the Linux Desktop... right?
I'm sitting on the edge of my seat, contemplating whether I should shunt-mod my RTX 3080 or wait... got all the resistors here with soldering equipment ready to go.
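
For anyone unfamiliar with what a shunt mod actually does, here's a back-of-the-envelope sketch (all values hypothetical, not measured from any real card): the board senses current as the voltage drop across a low-value shunt resistor, so soldering a second resistor in parallel lowers the effective resistance, makes the controller under-read power, and raises the real power limit proportionally:

    #include <cstdio>

    int main()
    {
        // Hypothetical values for illustration only.
        const double r_stock = 0.005;   // stock shunt, 5 milliohms
        const double r_added = 0.005;   // resistor soldered in parallel
        const double limit_w = 370.0;   // firmware power limit, watts

        // Two resistors in parallel: R = R1*R2 / (R1 + R2).
        const double r_mod = (r_stock * r_added) / (r_stock + r_added);

        // The controller still assumes r_stock, so it under-reads current
        // by the factor r_mod / r_stock; the real limit scales inversely.
        const double real_limit = limit_w * (r_stock / r_mod);

        std::printf("effective shunt: %.4f ohm, real limit: ~%.0f W\n",
                    r_mod, real_limit);   // ~740 W with equal resistors
        return 0;
    }

Stacking an equal-value resistor doubles the limit; a larger parallel value gives a gentler bump - and either way the VRM and cooler now run well past what the card was validated for.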
Posted on Reply
#34
qlum
human_errorMuch as I dislike how locked down Nvidia keeps a lot of their features, this is not something to be celebrated. I work in the software industry, and this won't help any competitors - they all have to make sure their engineers don't get contaminated by source code they can't use, as even just reading some of it, or seeing the overall approach, can influence how they'd tackle a problem in an IP-infringing way. Engineers who work on projects where they can see code the wider business can't use are blacklisted from some projects for years, so that no claims of IP theft can be made.

The community won't benefit from this, as AMD and Intel engineers won't touch it, and any other groups who do will be sued into the ground if it looks like they've based their work on Nvidia code - even if they haven't, since they'd need to pay for a defense against the lawsuits even when they're in the right.

Nvidia have spent money developing these technologies, and frustrating as it is that many are locked to the Nvidia platform, it has spurred open competition - G-Sync led to FreeSync and the VESA Adaptive-Sync (VRR) spec, Nvidia raytracing led to it being added to DX12 with AMD following suit, and DLSS led to FSR. Some of these may have happened anyway, but Nvidia may well have accelerated their release and pushed competitors to do more, which is good for everyone.
While I agree with most of this, I would put a caveat on ray tracing.
While Nvidia was two years ahead of AMD here, I don't think it really influenced the RT in RDNA 2.
It would have been too late to really change the chip design at that stage, and Microsoft / Sony were already pushing for it to be included in their next-gen game consoles.
Sure, it may have accelerated the implementation in DX / Vulkan, but that would be about it.
Posted on Reply
#35
Steevo
FluffmeisterIs IP theft celebrated here?
I am not celebrating the theft, but I am excited about the possibilities it gives the community to see the inner workings of certain closed boxes that have presented issues for competitors. If it allows shady things to be exposed, then so be it; if it was merely a D move for profit, so be it; if it turns out they implemented a truly unique piece of hardware, so be it.

I don't know who would actually profit from this beyond Nvidia, if they get the hiveminds at coding sites to improve things or spot things they didn't. Intel might be able to, but then again, I'm sure those drivers have already been run through test benches that spit out machine code for them.

It's like we get to peek into the diary more than anything.
Posted on Reply
#36
mouacyk
Is there a potential concern for hostile/harmful takeover of remote GPUs, via either the drivers or GFE?
Posted on Reply
#37
Legacy-ZA
human_errorMuch as I dislike how locked down Nvidia keeps a lot of their features, this is not something to be celebrated. I work in the software industry, and this won't help any competitors - they all have to make sure their engineers don't get contaminated by source code they can't use, as even just reading some of it, or seeing the overall approach, can influence how they'd tackle a problem in an IP-infringing way. Engineers who work on projects where they can see code the wider business can't use are blacklisted from some projects for years, so that no claims of IP theft can be made.

The community won't benefit from this, as AMD and Intel engineers won't touch it, and any other groups who do will be sued into the ground if it looks like they've based their work on Nvidia code - even if they haven't, since they'd need to pay for a defense against the lawsuits even when they're in the right.

Nvidia have spent money developing these technologies, and frustrating as it is that many are locked to the Nvidia platform, it has spurred open competition - G-Sync led to FreeSync and the VESA Adaptive-Sync (VRR) spec, Nvidia raytracing led to it being added to DX12 with AMD following suit, and DLSS led to FSR. Some of these may have happened anyway, but Nvidia may well have accelerated their release and pushed competitors to do more, which is good for everyone.
*sigh*

Why did you have to go ahead and make sense? I really wanted to continue hating nVidia. :roll:
Posted on Reply
#38
James7787
Couldn't happen to a better company. This is karma for supporting, or at least not doing anything about, miners and scalpers. I have a feeling AMD's RDNA 4 will all of a sudden make a big leap in the AI department.
Posted on Reply
#39
Chrispy_
Legacy-ZA
human_errorNvidia have spent money developing these technologies, and frustrating as it is that many are locked to the Nvidia platform, it has spurred open competition - G-Sync led to FreeSync and the VESA Adaptive-Sync (VRR) spec, Nvidia raytracing led to it being added to DX12 with AMD following suit, and DLSS led to FSR. Some of these may have happened anyway, but Nvidia may well have accelerated their release and pushed competitors to do more, which is good for everyone.
*sigh*

Why did you have to go ahead and make sense? I really wanted to continue hating nVidia. :roll:
Just so we're clear on this: Nvidia announced G-Sync, and AMD said, "Oh, we've already been doing this for years in laptops. Here's a VRR laptop from six years ago that uses it for power saving. Why the fuck have you re-invented this proprietary bullshit that we released to market half a decade ago?"

Raytracing is still very much up for debate on whether it's worth the cost. DLSS is just an expensive GPGPU motion-vector-based scaler that utilises otherwise unused hardware in the silicon. The tensor cores are otherwise completely useless die area in each and every Turing and Ampere GPU, designed for compute rather than gaming. You are paying for DLSS in silicon when you could have just had a faster GPU with more shaders in the first place.
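
To unpack "motion-vector-based scaler": the backbone of any temporal upscaler is reprojection - follow each pixel's motion vector back into the previous frame and blend that history with the new sample. A deliberately simplified, hypothetical sketch (not DLSS code; DLSS replaces the fixed blend weight below with a learned model evaluated on the tensor cores):

    // Hypothetical sketch of motion-vector reprojection, the backbone of
    // temporal upscalers. All names and the 0.1f blend weight are invented.
    struct Vec3 { float r, g, b; };

    Vec3 lerp(Vec3 a, Vec3 b, float t)
    {
        return { a.r + (b.r - a.r) * t,
                 a.g + (b.g - a.g) * t,
                 a.b + (b.b - a.b) * t };
    }

    Vec3 resolve_pixel(int x, int y,
                       const Vec3 *current, const Vec3 *history,
                       const float *motion_x, const float *motion_y,
                       int width, int height)
    {
        int i = y * width + x;

        // Follow the motion vector back to this pixel's previous position.
        int px = x - static_cast<int>(motion_x[i]);
        int py = y - static_cast<int>(motion_y[i]);

        // Off-screen history: fall back to the current (aliased) sample.
        if (px < 0 || px >= width || py < 0 || py >= height)
            return current[i];

        // Accumulate: mostly history, a little new sample. A real upscaler
        // clamps/validates the history here to avoid exactly the ghosting
        // artifacts being complained about.
        return lerp(history[py * width + px], current[i], 0.1f);
    }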

Nvidia are like Apple: keen to claim they invented stuff, but frequently just monetising and locking down someone else's ideas. I sit here on my fourth RTX card without raytracing enabled, enjoying the artifact-free native resolution of DLSS-off, RTX-off gaming at vastly improved framerates.
Posted on Reply
#40
Dr. Dro
mouacykI'm sitting on the edge of my seat, contemplating whether I should shunt-mod my RTX 3080 or wait... got all the resistors here with soldering equipment ready to go.
Oh man, I just don't trust myself to take a soldering iron to my exquisitely expensive GPU... motor coordination issues :nutkick:
Posted on Reply
#41
Chrispy_
mouacykIs there a potential concern for hostile/harmful takeover of remote GPU's, via either drivers or GFE?
Asking the real questions.
I'm sure some new vulnerabilities will be discovered from this, but chances are high that you'll need admin rights or device-manager/service privileges on the machine to exploit them, at which point the machine is already 100% compromised in every way that matters and a GPU vulnerability is the least of your problems.
Posted on Reply
#42
DarthJedi
human_errorMuch as I dislike how locked down Nvidia keeps a lot of their features, this is not something to be celebrated. I work in the software industry, and this won't help any competitors - they all have to make sure their engineers don't get contaminated by source code they can't use, as even just reading some of it, or seeing the overall approach, can influence how they'd tackle a problem in an IP-infringing way. Engineers who work on projects where they can see code the wider business can't use are blacklisted from some projects for years, so that no claims of IP theft can be made.

The community won't benefit from this, as AMD and Intel engineers won't touch it, and any other groups who do will be sued into the ground if it looks like they've based their work on Nvidia code - even if they haven't, since they'd need to pay for a defense against the lawsuits even when they're in the right.

Nvidia have spent money developing these technologies, and frustrating as it is that many are locked to the Nvidia platform, it has spurred open competition - G-Sync led to FreeSync and the VESA Adaptive-Sync (VRR) spec, Nvidia raytracing led to it being added to DX12 with AMD following suit, and DLSS led to FSR. Some of these may have happened anyway, but Nvidia may well have accelerated their release and pushed competitors to do more, which is good for everyone.
It doesn't work that way. There is no "making sure engineers avoid contamination with source code". Nothing you described works like that.
Posted on Reply
#43
Camm
I'm personally hoping we might get a SRIOV unlock for consumer Ampere. Would remove any need to dual boot Linux for gaming for me.
Posted on Reply
#44
mouacyk
Wow, they're demanding NVIDIA open-source its drivers from now on, or they'll unleash chipset secrets... oooh. Open source shouldn't be tainted that way.
Posted on Reply
#45
human_error
naxeemIt doesn't work that way. There is no "making sure engineers avoid contamination with source code". Nothing you described works like that.
Yeah, don't try and bullshit this - 13 years in big enterprise software says otherwise. You can 100% get code contamination if engineers view code from another source that has restrictive licenses, or IP that is not allowed to be used elsewhere in a company, usually because it could lead to IP-infringement cases being brought if they later work on code in a similar space to what they viewed.

So if they work on a project and see restricted code from a 3rd party as needed in part of that project, they are then contaminated and can't work on a new project that competes with the 3rd-party code for some time, at least in the big-enterprise environments that AMD/Intel operate in. Same if they view protected code leaked from a competitor - it can lead them to implement things in a way they would not have done had they not seen the restricted code, thus getting value from seeing it.
Posted on Reply
#46
Vayra86
Open source drivers wouldn't hurt tbh. Please keep me safe from the Geforce ExperiencezzxXZXZxz junk

Maybe we can mod the UI of that ancient Nvidia config panel one day, huh?
Posted on Reply
#47
Camm
human_errorYeah, don't try and bullshit this - 13 years in big enterprise software says otherwise. You can 100% get code contamination if engineers view code from another source that has restrictive licenses, or IP that is not allowed to be used elsewhere in a company, usually because it could lead to IP-infringement cases being brought if they later work on code in a similar space to what they viewed.

So if they work on a project and see restricted code from a 3rd party as needed in part of that project, they are then contaminated and can't work on a new project that competes with the 3rd-party code for some time, at least in the big-enterprise environments that AMD/Intel operate in. Same if they view protected code leaked from a competitor - it can lead them to implement things in a way they would not have done had they not seen the restricted code, thus getting value from seeing it.
This isn't quite true: seeing code from another vendor doesn't automatically make reverse engineering illegal. However, the defence that none of the competing code made it into your product becomes much harder to mount, which is why, when this happens, there are often two teams - one that documents the functions, and another that writes the code.
Posted on Reply
#48
Chrispy_
Vayra86Open source drivers wouldn't hurt tbh. Please keep me safe from the Geforce ExperiencezzxXZXZxz junk

Maybe we can mod the UI of that ancient Nvidia config panel one day, huh?
Oh, that Windows XP-themed monstrosity that lacks the two most important things Windows 10/11's own graphics settings also can't do: controlling your hardware clocks/fans/voltages, and combining display outputs into a single virtual display (aka Eyefinity)?

Yeah. It sucks so hard. People defend it for some reason, which is even weirder.
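
The monitoring half of that wishlist is at least already exposed through NVML, the library behind nvidia-smi. A minimal, hypothetical sketch (assumes nvml.h from the CUDA toolkit and linking against libnvidia-ml; actually writing clocks or fan curves is a far more restricted story):

    // Query the GPU core clock and fan speed that the control panel
    // won't show you. Link against libnvidia-ml / nvml.lib.
    #include <cstdio>
    #include <nvml.h>

    int main()
    {
        if (nvmlInit() != NVML_SUCCESS)
            return 1;

        nvmlDevice_t dev;
        if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
            unsigned int core_mhz = 0, fan_pct = 0;
            nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &core_mhz);
            nvmlDeviceGetFanSpeed(dev, &fan_pct);
            std::printf("core: %u MHz, fan: %u%%\n", core_mhz, fan_pct);
        }

        nvmlShutdown();
        return 0;
    }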
Posted on Reply
#49
wolf
Performance Enthusiast
Chrispy_You are paying for DLSS in silicon when you could have just had a faster GPU with more shaders in the first place.
On Turing, at least, the RT + Tensor cores combined account for at most 8-10% of die area - not exactly robbing most people of a bucketload of extra performance that could have been CUDA cores. I'm all for improving graphics and adding innovative features instead of strictly more/better/faster of the same.
Chrispy_enjoying the artifact-free native resolution of DLSS-off
Native isn't perfection all the time either, far from it; in fact, DLSS can often improve on aspects of native rendering, especially when paired with meh TAA.

Use your graphics card however you'd like, naturally, but from where I'm sitting it's 8-10% die area put to great use.
Posted on Reply
#50
R-T-B
heni87Got'em xD

VBIOS and driver signer next? Is that too much to ask?
This is what I want. I want VBIOS modding to return.
Posted on Reply