
FSR4 running on RDNA3 in Linux via hack.

A Valve employee managed to get FSR4 working on RDNA3 through FP8 emulation in FP16. While it’s not fully usable, it’s impressive that one person was able to pull off such a feat: https://themaister.net/blog/2025/05...pixels-on-linux-through-maniacal-persistence/

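For anyone wondering what the FP8 emulation actually involves: FSR4's weights use the 8-bit E4M3 float format, which RDNA3 has no native instructions for, so every value has to be unpacked in software before the FP16 math can run. Here's a rough sketch of the decode step in C (my own illustration of the E4M3 format, not the actual code from the blog post):

Code:
#include <stdint.h>
#include <math.h>

/* Decode one FP8 E4M3 value (1 sign, 4 exponent, 3 mantissa bits,
 * exponent bias 7) to a regular float. E4M3 has no infinities;
 * the all-ones exponent+mantissa pattern encodes NaN. */
float fp8_e4m3_to_float(uint8_t v)
{
    int sign = (v >> 7) & 1;
    int exp  = (v >> 3) & 0xF;
    int man  = v & 0x7;
    float result;

    if (exp == 0xF && man == 0x7)
        result = NAN;                    /* the only NaN encoding      */
    else if (exp == 0)
        result = ldexpf((float)man, -9); /* subnormal: man * 2^(-6-3)  */
    else
        result = ldexpf((float)(8 + man), exp - 10); /* (8+man)*2^(exp-7-3) */

    return sign ? -result : result;
}

Doing that unpack for every weight adds up fast, which is part of why the emulated path takes the performance hit the article describes.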
 
Skimming some of the article, it seems like unless AMD updates FSR4 to use FP16, the 7000 series isn't going to get it. The question then is: is it possible to port the code without the massive performance hit the article points out (though they're doing emulation), or would recoding it to work on the 7000 series not yield the same quality?
 
The question then is: is it possible to port the code without the massive performance hit the article points out (though they're doing emulation), or would recoding it to work on the 7000 series not yield the same quality?
Like everything, it's probably going to be a brutal performance hit. Remember, some of this stuff has hardware assistance in silicon to do the math. Same with ray tracing: ray tracing isn't Nvidia tech, you can literally write the math on a piece of paper. Some of these instructions just take forever.
 
There are rumours of AMD working on an FSR4-lite implementation for RDNA3. There will likely be some compromises in terms of performance and quality but it won't be hard to beat FSR 3, as XeSS has shown.
 
IMHO, I'd like AMD to work 10,000% harder on spreading FSR to more titles first. BTW: AMD releases updated FidelityFX SDK featuring FSR 3.1.4 with reduced upscaler ghosting

I just realised I had a typo in my post. That was supposed to be for FSR4-lite for RDNA 3. Either way, I agree. AMD needs to push FSR harder. I think we can help by letting game studios know that we want this too. Adopting FSR 3.1 is a no-brainer for game studios. It works on all GPUs, and gamers can upgrade to newer versions without requiring a game update.
 
I just realised I had a typo in my post. That was supposed to be for FSR4-lite for RDNA 3. Either way, I agree. AMD needs to push FSR harder. I think we can help by letting game studios know that we want this too. Adopting FSR 3.1 is a no-brainer for game studios. It works on all GPUs, and gamers can upgrade to newer versions without requiring a game update.
Thing is, is future FSR going to work on all GPUs, or is 4+ going to be locked? That's going to change things, since it kind of makes XeSS more of a no-brainer at the moment given the unknown future of FSR.
 
A Valve employee managed to get FSR4 working on RDNA3 through FP8 emulation in FP16. While it’s not fully usable, it’s impressive that one person was able to pull off such a feat: https://themaister.net/blog/2025/05...pixels-on-linux-through-maniacal-persistence/


Lovely technical write-up. AMD code quality is utter sewage as usual. I knew I was in for a horror show when I read:

FSR 4 is not actually released yet to developers for integration in engines, but exists as a driver toggle that can be enabled for certain FSR 3.1 games. The FSR 3.1 implementation had the foresight to let the implementation peek into the driver DLLs and fish out a replacement implementation. This is of course a horrible mess from any compatibility perspective, but we just have to make it work.

Undocumented, buggy, broken mess. It's amazing they even got this hack to be usable, let alone ship it.
 
It's relatively easy to port and mod anything related to graphics because by design everything has to go through the driver, so all the commands/instructions can be interpreted.
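For example, in rough terms: every API call the game makes goes through a dispatch table that the loader/driver controls, so any entry point can be wrapped. A made-up sketch in C of the general shape (not real Vulkan loader internals):

Code:
#include <stdio.h>

/* Hypothetical driver entry point the application ends up calling. */
typedef int (*draw_fn)(int vertex_count);

static int real_draw(int vertex_count)
{
    printf("driver: drawing %d vertices\n", vertex_count);
    return 0;
}

/* The dispatch table slot the application actually calls through. */
static draw_fn dispatch_draw = real_draw;

/* A "layer" wraps the real call: it can log, rewrite arguments,
 * or substitute a different implementation entirely. */
static int layered_draw(int vertex_count)
{
    printf("layer: intercepted draw(%d)\n", vertex_count);
    return real_draw(vertex_count); /* forward to the real driver */
}

int main(void)
{
    dispatch_draw(3);             /* goes straight to the driver    */
    dispatch_draw = layered_draw; /* install the interception layer */
    dispatch_draw(3);             /* now routed through the layer   */
    return 0;
}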
 
Thing is, is future FSR going to work on all GPUs, or is 4+ going to be locked? That's going to change things, since it kind of makes XeSS more of a no-brainer at the moment given the unknown future of FSR.

So far all the games that support FSR 4 also support a 3.1 fallback for older hardware. AMD would have to be extremely foolish to remove the 3.1 fallback as it would affect their own customers. I don't know if the rumoured lite version of FSR 4 will work on non-AMD hardware. If it does, it might compete with XeSS DP4A version. Generally, once a game is set up to provide inputs to a temporal upscaler, it is relatively easy to support them all.

XeSS is pretty good. I usually prefer XeSS image quality over FSR 3.1 but the performance is worse, so I'd rather have FSR 4 instead.

This is how I'd rank them overall: FSR 1 < FSR 2.2-3.1 < XeSS 1.3-2.0 < DLSS 2 < DLSS 3 < FSR 4 < DLSS 4
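To illustrate the point about shared inputs, here's roughly what an engine-side abstraction looks like. This is a hypothetical sketch in C; the struct and function names are mine, not from any vendor SDK:

Code:
#include <stdint.h>
#include <stdio.h>

/* The per-frame inputs every modern temporal upscaler consumes.
 * Handles are opaque GPU texture IDs here; the whole struct is my
 * own illustration, not any vendor's actual API. */
typedef struct {
    uint64_t color;              /* jittered, aliased render-res color */
    uint64_t depth;              /* render-res depth buffer            */
    uint64_t motion_vectors;     /* per-pixel screen-space motion      */
    uint64_t output;             /* display-res target to write into   */
    float    jitter_x, jitter_y; /* subpixel camera jitter this frame  */
    uint32_t render_w, render_h; /* input (render) resolution          */
    uint32_t output_w, output_h; /* output (display) resolution        */
    int      reset_history;      /* camera cut: discard accumulation   */
} upscaler_inputs;

typedef enum { UPSCALER_FSR, UPSCALER_XESS, UPSCALER_DLSS } upscaler_kind;

/* Stand-in backends; a real integration would translate the same
 * struct into each SDK's own dispatch call. */
static void fsr_evaluate(const upscaler_inputs *in)
{
    printf("FSR  %ux%u -> %ux%u\n", in->render_w, in->render_h,
           in->output_w, in->output_h);
}
static void xess_evaluate(const upscaler_inputs *in)
{
    printf("XeSS %ux%u -> %ux%u\n", in->render_w, in->render_h,
           in->output_w, in->output_h);
}
static void dlss_evaluate(const upscaler_inputs *in)
{
    printf("DLSS %ux%u -> %ux%u\n", in->render_w, in->render_h,
           in->output_w, in->output_h);
}

/* The game-side code is a single switch once the inputs exist. */
void upscale(upscaler_kind kind, const upscaler_inputs *in)
{
    switch (kind) {
    case UPSCALER_FSR:  fsr_evaluate(in);  break;
    case UPSCALER_XESS: xess_evaluate(in); break;
    case UPSCALER_DLSS: dlss_evaluate(in); break;
    }
}

The hard part for a studio is producing correct motion vectors and jitter in the first place; once those exist, adding another backend is mostly plumbing.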
 
So far all the games that support FSR 4 also support a 3.1 fallback for older hardware. AMD would have to be extremely foolish to remove the 3.1 fallback as it would affect their own customers. I don't know if the rumoured lite version of FSR 4 will work on non-AMD hardware. If it does, it might compete with XeSS DP4A version. Generally, once a game is set up to provide inputs to a temporal upscaler, it is relatively easy to support them all.

XeSS is pretty good. I usually prefer XeSS image quality over FSR 3.1 but the performance is worse, so I'd rather have FSR 4 instead.

This is how I'd rank them overall: FSR 1 < FSR 2.2-3.1 < XeSS 1.3-2.0 < DLSS 2 < DLSS 3 < FSR 4 < DLSS 4

It's the other way around. There are no games at all with a native FSR 4 implementation, and an SDK does not exist; as of today it's completely undocumented on GPUOpen, the libraries are completely proprietary, and no shims for Linux deployment are provided. The Windows driver intercepts the FSR 3 calls and replaces the functions with the FSR 4 DLL, their nasty habit of doing hackish injections again.

I was too nice when I called AMD code quality sewage.
 
The Windows driver intercepts the FSR 3 calls and replaces the functions with the FSR 4 DLL, their nasty habit of doing hackish injections again.

I was too nice when I called AMD code quality sewage.
I don't understand why you're calling it sewage. You can intercept calls on any graphics driver, that's how they all work; is Nvidia sewage? You can change DLLs too, and nothing stops anything from intercepting function calls and executing whatever it wants.
 
I don't understand why you're calling it sewage. You can intercept calls on any graphics driver, that's how they all work; is Nvidia sewage? You can change DLLs too, and nothing stops anything from intercepting function calls and executing whatever it wants.
He will also claim that he has no anti-AMD sentiment. This user looks for AMD threads to add his opinion. Nothing of real substance.
 
He will also claim that he has no anti-AMD sentiment. This user looks for AMD threads to add his opinion. Nothing of real substance.

"This user", "nothing of real substance". The reason you're not seeing me direct the same criticism at Nvidia is because they provide an API with an SDK, programming documentation, work together with engine vendors (Unity, Unreal, Unigine, etc.) and game developers for a clean implementation. Reading the post in the OP also helps, and it's no shame if you don't understand it.

Sounds more like a case of "this guy offended my feelings" than "nothing of real substance".

I don't understand why you're calling it sewage. You can intercept calls on any graphics driver, that's how they all work; is Nvidia sewage? You can change DLLs too, and nothing stops anything from intercepting function calls and executing whatever it wants.

Because sewage is cleaner than intercepting DLL calls and replacing them on the fly for a code swap at runtime. If there is one dirty programming malpractice that needs to be eliminated, it's this one. And for a company that once took pride in an open, documented implementation, it sure is getting cut a lot of slack. Have they learned absolutely nothing from the Anti-Lag+/VAC incident?
 
Because sewage is cleaner than intercepting DLL calls and replacing them on the fly for a code swap at runtime. If there is one dirty programming malpractice that needs to be eliminated, it's this one.
That'd break a lot of mods... Can you give a good reason why?

Have they learned absolutely nothing from the Anti-Lag+/VAC incident?
That wasn't a case of simple DLL substitution/interception. AMD was actually modifying running game code, IIRC. That is far more egregious.
 
So far all the games that support FSR 4 also support a 3.1 fallback for older hardware. AMD would have to be extremely foolish to remove the 3.1 fallback as it would affect their own customers. I don't know if the rumoured lite version of FSR 4 will work on non-AMD hardware. If it does, it might compete with XeSS DP4A version. Generally, once a game is set up to provide inputs to a temporal upscaler, it is relatively easy to support them all.

XeSS is pretty good. I usually prefer XeSS image quality over FSR 3.1 but the performance is worse, so I'd rather have FSR 4 instead.

This is how I'd rank them overall: FSR 1 < FSR 2.2-3.1 < XeSS 1.3-2.0 < DLSS 2 < DLSS 3 < FSR 4 < DLSS 4

I don't know about that; they drop driver support sooner than Nvidia. However, they did start putting out the occasional update every so often.
 
That'd break a lot of mods... Can you give a good reason why?

It would break a lot of mods... but unfortunately, the keyword is precisely that: mods. What I'm advocating for here is code sanity, a basic security and stability principle. A DLL is essentially a "sub-program" or code snippet that directly references memory via pointers during execution, with generally unrestricted access to that program's memory space. A graphics driver, which operates in kernel mode, should never be resorting to petty hacks, much less modifying userland/non-privileged code on the fly. This is why standardized APIs exist, after all.

Once you understand that, you'll basically know exactly what I'm trying to get at. By intercepting a DLL and redirecting or replacing a code function in real time, you're essentially modifying the expected behavior of an application on the fly. This can (and likely will) cause deadlocks, race conditions, memory leaks, crashes, data corruption, fun stuff. This is different from properly loading a DLL (e.g. via a LoadLibrary call, or injecting it at the correct point during context creation, at a known-safe load address and in the correct order of execution), which in practice at least allows exceptions to be handled more gracefully. Microsoft's DLL best-practices guide explicitly states that you should never create a process or use managed code from within DllMain.

Of course, this primarily concerns Windows. It's interesting to note that this is a genuinely benign use of DLL interception, yet mechanically it's no different from malicious hijacking (a term I intentionally avoided using) of a DLL. I'm no programmer and even I know this.
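To show the contrast, the "proper" explicit loading path I mentioned looks something like this on Windows, where the application stays in control and a missing or bad DLL can be handled gracefully (the DLL and export names below are made up for illustration):

Code:
#include <windows.h>
#include <stdio.h>

/* Hypothetical upscaler entry point exported by a vendor DLL. */
typedef int (*evaluate_fn)(void *inputs);

int main(void)
{
    /* Explicit runtime linking: the application decides what to load
     * and from where, and can recover when something is missing. */
    HMODULE dll = LoadLibraryA("vendor_upscaler.dll"); /* made-up name */
    if (!dll) {
        fprintf(stderr, "upscaler DLL not found, using fallback\n");
        return 1;
    }

    evaluate_fn evaluate =
        (evaluate_fn)GetProcAddress(dll, "Evaluate");  /* made-up export */
    if (!evaluate) {
        fprintf(stderr, "export missing, using fallback\n");
        FreeLibrary(dll);
        return 1;
    }

    evaluate(NULL); /* call through the explicitly resolved pointer */
    FreeLibrary(dll);
    return 0;
}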
 
That's all fair, but the reality is: by outright removing that functionality, you'd break a lot more than you'd fix, IMO.
 
That's all fair, but the reality is: by outright removing that functionality, you'd break a lot more than you'd fix, IMO.

Yeah, but I'm not asking for its removal. I'm simply asking: will a billion-dollar corporation stop shipping code made by interns in their free time? Or at least dedicate some budget towards actually having a decent implementation of an advertised major feature they're using as a product seller :shadedshu:
 
Yeah, but I'm not asking for its removal. I'm simply asking: will a billion-dollar corporation stop shipping code made by interns in their free time? Or at least dedicate some budget towards actually having a decent implementation of an advertised major feature they're using as a product seller :shadedshu:
That I think is a fair ask, sure.
 
Because sewage is cleaner than intercepting DLL calls and replacing them on the fly for a code swap at runtime. If there is one dirty programming malpractice that needs to be eliminated, it's this one.
What does it matter if it's intercepting DLL calls? How do you think driver updates work when they fix bugs and optimize games? They intercept calls/shaders from the application and replace/modify the code; it makes no difference whether it comes from a DLL or an executable.

You can replace DLLs in anything, btw. Whether you intercept the calls or simply replace the DLL makes no difference to the application.
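The game-optimization case, for instance, conceptually boils down to a lookup keyed on the incoming shader. A toy sketch (entirely made up; real drivers match hashes of compiled bytecode, not source strings):

Code:
#include <stdio.h>
#include <stddef.h>

/* Toy version of a driver-side shader replacement table: the driver
 * hashes whatever shader the game submits and, on a match, compiles
 * its own optimized variant instead. Hashes and sources are made up. */
typedef struct {
    unsigned long game_shader_hash;
    const char   *replacement_source;
} shader_override;

static const shader_override overrides[] = {
    { 0xdeadbeefUL, "/* driver-optimized variant of shader A */" },
    { 0xcafef00dUL, "/* driver-optimized variant of shader B */" },
};

/* djb2 string hash, standing in for a real bytecode hash. */
static unsigned long hash_shader(const char *src)
{
    unsigned long h = 5381;
    while (*src)
        h = h * 33 + (unsigned char)*src++;
    return h;
}

const char *driver_compile(const char *game_source)
{
    unsigned long h = hash_shader(game_source);
    for (size_t i = 0; i < sizeof overrides / sizeof overrides[0]; i++)
        if (overrides[i].game_shader_hash == h)
            return overrides[i].replacement_source; /* swap it out */
    return game_source; /* no override: compile what the game sent */
}

int main(void)
{
    puts(driver_compile("void main() { /* game shader */ }"));
    return 0;
}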
 