
NVIDIA DLSS Source Code Leaked

It would be super ultra illegal if some programmer somewhere made an open-source version of DLSS that runs on shader cores. Imagine a DLSS that runs on AMD and <=Pascal GPUs; it would probably be slow AF, but still pretty interesting.

I'm guessing it would run at a similar relative perf to the RT content on Pascal, about 1/8th of the perf.
 
I've checked the sources and indeed DLSS uses Tensor Cores.

You can probably run this logic on shaders, but the shaders are already busy doing the shading work, so any additional load will slow down your game. Moving this logic from otherwise-idle Tensor Cores to shaders could therefore result in a performance loss rather than a performance gain.
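
For a rough sense of that trade-off, here's a back-of-the-envelope sketch in Python. Every number in it is an assumption picked for illustration (Turing-ish throughput figures and a pure guess at the per-frame network cost), not anything measured from DLSS:

```python
# Back-of-the-envelope: running a DLSS-style network on shader ALUs
# instead of Tensor Cores. All figures are illustrative assumptions,
# not measured DLSS numbers.
TENSOR_TFLOPS = 52.0        # assumed FP16 Tensor Core throughput (Turing-class)
SHADER_TFLOPS = 7.0         # assumed FP32 shader throughput, same GPU
NET_TFLOP_PER_FRAME = 0.05  # assumed network cost per upscaled frame (guess)

ms_on_tensor = NET_TFLOP_PER_FRAME / TENSOR_TFLOPS * 1000
ms_on_shader = NET_TFLOP_PER_FRAME / SHADER_TFLOPS * 1000

print(f"Tensor Cores: ~{ms_on_tensor:.2f} ms/frame, alongside shading")
print(f"Shader ALUs:  ~{ms_on_shader:.2f} ms/frame, taken from shading")
# On shaders the cost also displaces rendering work, so the net
# frame-time hit is the shader figure plus whatever it pushed out.
```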

Driver source code is also there. It's a treasure trove of highly confidential info. NVIDIA might as well make DLSS open source (not necessarily under licenses like the GPL, but something proprietary, e.g. like Epic or Crytek do with their engines).

I've downloaded the entire archive and I'm now repacking it because it was not optimally compressed. I expect it to come out at less than 8 GB vs the 18 GB leaked by the hackers.
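
For the curious, repacking like that is only a few lines in most languages. A minimal Python sketch, assuming a hypothetical extracted directory and using xz at a high preset (slow, but much denser than a typical dump):

```python
# Repack a directory with stronger (xz/LZMA) compression.
# "dump_dir/" is a hypothetical path, not the actual archive layout.
import tarfile

with tarfile.open("repacked.tar.xz", "w:xz", preset=9) as tar:
    tar.add("dump_dir/", arcname=".")
```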
 
Do the drivers build?
 
Much as I dislike how locked down Nvidia keeps a lot of their features, this is not something to be celebrated. I work in the software industry and this won't help any competitors - they all have to make sure that engineers don't get contaminated with source code they can't use, as even just reading some of it, or seeing the overall approach, can influence how they would approach a problem in an IP-infringing way. Engineers who work on projects where they can see code the wider business can't use are blacklisted from some projects for years so that no claims of IP theft can be made.

The community won't benefit from this as AMD and Intel engineers won't touch it, and any other groups who do will be sued into the ground if it looks like they've based their work on Nvidia code, even if they haven't, as they'd need to pay for a defense against lawsuits even if they're in the right.

Nvidia have spent money developing these technologies, and frustrating as it is that many are locked to the Nvidia platform, it has spurred open competition - G-Sync led to FreeSync and the VESA VRR spec, Nvidia raytracing led to it being added to DX12 with AMD following suit, and DLSS led to FSR. Some of these may have happened anyway, but Nvidia may well have accelerated their release and pushed competitors to do more, which is good for everyone.
 
Never asked you guys anything... kindly leak the BIOS signature keys for Pascal, Turing and Ampere... we need BIOS editors to have sane settings and fix power limits for our GPUs... pretty please ;):D

Cat GIF


Then AMD would have to answer as to why they came out with FSR instead of an actual DLSS competitor vOv

Y’all (including the author) realize that the leaked code is radioactive to the nouveau devs and the other companies, right? Or did you all forget how much of a pain in the neck the Windows XP source leak made things for Wine and ReactOS?

Using an NV GPU on Linux is bordering on madness right now. Getting my RTX 3090 to work on Fedora 35 basically required the proprietary drivers (because nouveau does not work and it defaults to llvmpipe, which could not be slower if it tried) and me to forge a covenant with an elder god at the blood price of time I will never get back.

Year of the Linux Desktop... right?
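
For anyone else trying to work out whether they've silently fallen back to llvmpipe, here's a quick diagnostic sketch (assumes a Linux box with lsmod available and glxinfo from mesa-utils installed):

```python
# Check which GPU kernel module is loaded and what GL renderer is active.
import subprocess

def loaded_gpu_modules():
    out = subprocess.run(["lsmod"], capture_output=True, text=True).stdout
    return [m for m in ("nvidia", "nouveau")
            if any(line.split()[0] == m for line in out.splitlines()[1:])]

def gl_renderer():
    out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "OpenGL renderer string" in line:
            return line.split(":", 1)[1].strip()
    return "unknown"

print("GPU kernel modules:", loaded_gpu_modules() or "none")
renderer = gl_renderer()
print("GL renderer:", renderer)
if "llvmpipe" in renderer:
    print("Software rendering is active -- the GPU driver isn't being used.")
```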
 
Yes, this sort of demands a glass of champagne. Good riddance. They should have followed AMD's example and made everything open source.
Linux users prefer AMD because of NVidia's greed.
 
If not, it should be.

WOW, this person created a new account, and their :eek: 1st post @techpowerup was... :eek: to embrace theft!!!
Well buddy, when one day you create your own programs/software after years of hard work, only to find out that someone hacked and stole your life's work and sold it online in order to profit from your hard work, then you can come back and say what you just said today.
I'm betting that if that day comes and such things happen to your own life's work, you'll start to evaluate things a little differently...
 
You can use any idea in your own project even though you can't use their tech/source code; you can change everything and still arrive at the same result as Nvidia.
 
I'm sitting on the edge of my seat, contemplating whether I should shunt-mod my RTX 3080 or wait... got all the resistors here with soldering equipment ready to go.
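
For anyone unfamiliar with what a shunt mod actually does to the numbers: the card measures current from the voltage drop across a tiny shunt resistor, and soldering another resistor in parallel makes it under-report. A quick Python sketch with assumed, typical-looking values (not specs for any real 3080 board):

```python
# Shunt mod arithmetic: parallel resistance lowers the sensed voltage
# drop, so the card under-reports power. All values are assumptions.
R_SHUNT = 0.005      # assumed stock shunt, 5 mOhm
R_MOD   = 0.015      # assumed resistor soldered in parallel, 15 mOhm

r_eff = (R_SHUNT * R_MOD) / (R_SHUNT + R_MOD)  # parallel combination
scale = R_SHUNT / r_eff                        # under-reporting factor

BOARD_LIMIT_W = 370  # assumed firmware power limit
print(f"Effective shunt: {r_eff * 1000:.2f} mOhm")
print(f"Under-reporting factor: {scale:.2f}x")
print(f"Real draw when the card thinks it's at {BOARD_LIMIT_W} W: "
      f"~{BOARD_LIMIT_W * scale:.0f} W")
```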
 
While I agree with most of this, I would put a caveat on ray tracing.
While Nvidia was two years ahead of AMD here, I don't think it really influenced the RT in RDNA 2.
It would have been too late to really change the chip design at that stage, and Microsoft/Sony were already pushing for it to be included in their next-gen game consoles.
Sure, it may have accelerated the implementation in DX/Vulkan, but that would be about it.
 
Is IP theft celebrated here?

I am not celebrating the theft, but I am excited at the possibilities it gives the community to see the inner workings of certain closed boxes that have presented issues for competitors. If it allows shady things to be exposed, then so be it; if it was merely a D move for profit, so be it; if it turns out they implemented a truly unique piece of hardware, so be it.

I don't know who would actually profit from this beyond Nvidia, if they get the hiveminds at coding sites to improve things or spot things they missed. Intel might be able to, but then again I'm sure those drivers have been run on test benches that have already spit out machine code for them.

It's like we get to peek into the diary, more than anything.
 
Is there a potential concern for hostile/harmful takeover of remote GPUs, via either the drivers or GFE?
 

*sigh*

Why did you have to go ahead and make sense, I really wanted to continue hating nVidia. :roll:
 
Couldn't have happened to a better company. This is karma for supporting/not doing anything about miners and scalpers. I have a feeling AMD's RDNA 4 will all of a sudden make a big leap in the AI department.
 
Just so we're clear on this, Nvidia announced G-Sync and AMD said, "oh, we've already been doing this for years in laptops. Here's a VRR laptop from six years ago that uses VRR for power saving. Why the fuck have you re-invented this proprietary bullshit that we released to market half a decade ago?"

RTX raytracing is still very much up for debate on whether it's worth the cost. DLSS is just an expensive GPGPU motion-vector-based scaler that utilises otherwise unused hardware in their silicon. The tensor cores are otherwise completely useless die area in each and every Turing and Ampere GPU, designed for compute and not gaming. You are paying for DLSS in silicon when you could have just had a faster GPU with more shaders in the first place.
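
To illustrate what "motion-vector-based scaler" means in general, here's a toy numpy sketch of the temporal-upscaling idea that TAA-style upscalers share: warp last frame's output along per-pixel motion vectors, then blend in the new samples. This is the generic technique, not NVIDIA's actual DLSS network:

```python
import numpy as np

def reproject(prev, motion):
    """Fetch each pixel from where it was last frame (nearest-neighbor)."""
    h, w = prev.shape
    ys, xs = np.indices((h, w))
    src_y = np.clip(np.rint(ys - motion[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip(np.rint(xs - motion[..., 0]).astype(int), 0, w - 1)
    return prev[src_y, src_x]

def temporal_upscale(low_res, prev_out, motion, alpha=0.1):
    # naive 2x upsample of the current low-res frame (nearest-neighbor)
    current = np.repeat(np.repeat(low_res, 2, axis=0), 2, axis=1)
    history = reproject(prev_out, motion)
    # accumulate: mostly reprojected history, refreshed by new samples
    return alpha * current + (1 - alpha) * history

# usage with random data: 2x upscale of a 4x4 frame to 8x8
low = np.random.rand(4, 4)
prev = np.random.rand(8, 8)
mv = np.zeros((8, 8, 2))      # assumed per-pixel motion vectors (static scene)
print(temporal_upscale(low, prev, mv).shape)  # (8, 8)
```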

Nvidia are like Apple, keen to claim they invented stuff but frequently just monetising and locking down someone else's ideas. I sit here on my fourth RTX card without raytracing enabled, and enjoying the artifact-free native resolution of DLSS-off, RTX-off gaming at vastly improved framerates.
 
Oh man, I just don't trust myself with taking a soldering iron to my exquisitely expensive GPU... motor coordination issues :nutkick:
 
Asking the real questions.
I'm sure there will be some new vulnerabilities discovered from this but chances are high that you'll need admin rights, device manager/service privileges to the machine to exploit it, at which point the machine is already 100% compromised in every way that matters and a GPU vulnerability is the least of your problems.
 
Much as I dislike how locked down Nvidia keeps a lot of their features this is not something to be celebrated. I work in the software industry and this won't help any competitors - they all have to make sure that engineers don't get contaminated with source code they can't use as even just reading some of it or the overall approach can influence how they would approach a problem in an IP infringing way. Engineers who work on some projects where they can see code the wider business can't use are blacklisted from some projects for years so that no claims of IP theft can be made.

The community won't benefit from this as AMD and Intel engineers won't touch it, and any other groups who do will be sued into the ground if it looks like they've based their work on Nvidia code, even if they haven't, as they'd need to pay for a defense against lawsuits even if they're in the right.

Nvidia have spent money developing these technologies, and frustrating as it is that many are locked to the Nvidia platform it has spurred open competition - G-sync led to Freesync and VRR VISA spec, Nvidia Raytracing led to it being added to DX12 and AMD following suit, DLSS leading to FSR. Some of these may have happened anyway, but Nvidia may well have accelerated their release and pushed competitors to do more which is good for everyone.
It doesn't work that way. There is no "making sure engineers avoid contamination with source code". Nothing you described works like that.
 
I'm personally hoping we might get an SR-IOV unlock for consumer Ampere. It would remove any need for me to dual-boot out of Linux for gaming.
 
Wow, they're demanding NVidia open-source their drivers from now on, or they'll unleash chipset secrets... oooh. Open source shouldn't be tainted that way.
 

Yeah, don't try and bullshit this - 13 years in big enterprise software says otherwise. You can 100% get code contamination with engineers if they view code from another source that has restrictive licenses or IP that is not allowed to be used elsewhere in a company, usually because it could lead to IP-infringement cases being brought if they work on code in a similar space to what they viewed.

So if they work on a project and see restricted code from a 3rd party as needed in part of that project, they are then contaminated and can't work on a new project that competes with the 3rd-party code for some time, at least in big enterprise, which is the AMD/Intel operating environment. Same as if they viewed protected code that was leaked from a competitor - it can lead them to implement things in a way they would not have done if they had not seen the restricted code, thus getting value from seeing it.
 
Open source drivers wouldn't hurt tbh. Please keep me safe from the Geforce ExperiencezzxXZXZxz junk

Maybe we can mod the UI of that ancient Nvidia config panel one day huh?
 

This isn't quite true: seeing code from another vendor doesn't make reverse engineering illegal. However, the claim that no competing code made it into your product becomes much harder to defend, which is why when this happens there are often two teams - one that documents the functions, and another that writes the code.
 
Oh, that Windows XP-themed monstrosity that lacks the two most important things Windows 10/11's own graphics settings also can't do: control your hardware clocks/fans/voltages, and combine display outputs into a single virtual display, aka Eyefinity?

Yeah. It sucks so hard. People defend it for some reason, which is even weirder.
 