
Alan Wake 2: FSR 2.2 vs. DLSS 3.5 Comparison

I don't think in any scenario, regardless of the hardware used, people will be happy if their vendor of choice loses... People root for these companies like sports teams, after all.
Rooting for AMD like a sports team is what I imagine being a Man Utd fan is nowadays: every time the comeback looks to be in sight, stupid fucking mistakes occur. Glad to be a Man City fan
 
Rooting for AMD like a sports team is what I imagine being a Man Utd fan is nowadays: every time the comeback looks to be in sight, stupid fucking mistakes occur. Glad to be a Man City fan
I'm a Dallas Cowboys fan so I know that feeling exactly lol.
 
Can we really be sure that Nvidia is not doing anything funny in their drivers, knowing that most sites will just use one graphics card for testing both DLSS and FSR?
FSR looks exactly the same no matter what GPU you run it on. If you have proof otherwise, show it.
Is that the new method to officially fuck AMD cards?
How so? FSR 2 being run at native resolution is barely more demanding than classic TAA. Same applies to DLSS 2 at native (aka DLAA).
 
FSR has a flickering fence,

DLSS has a warping ermm crowd gate thing
 
FSR looks exactly the same no matter what GPU you run it on.
Prove it. And prove that this is the case in EVERY GAME.

Throwing out a one-line opinion and then asking the other person to prove otherwise, it's easy, isn't it?
 
Prove it. And prove that this is the case in EVERY GAME.

Throwing out a one-line opinion and then asking the other person to prove otherwise, it's easy, isn't it?
As far as I'm concerned, you made your claim without evidence; therefore your claim is dismissed without evidence too.

Unless you want to put in some evidence yourself and try again?
 
Very true. I remember recently enough checking out FSR in The Witcher 3 on a 3060 Ti, and it looked insanely bad.
Happened to see it on a 5700 XT after that, and it looked fine.
The Witcher 3's FSR on my 4090, when I tried it, looked so busted that I couldn't believe they would even release it like that, so I can totally believe this. Maybe they've fixed it since then; they are always updating them.

Starfield FSR looks great, no issues.
 
It's still amazing, in a bad way, how an overly expensive video card is not able to properly run a game natively with MSAA, without any of the FSR/DLSS garbage. I mean, seriously, the graphics are nice, but not nice enough to justify the ridiculous performance requirements.
They should take a cue from the latest Source engine on how to properly create a good one.
 
It's still amazing, in a bad way, how an overly expensive video card is not able to properly run a game natively with MSAA, without any of the FSR/DLSS garbage. I mean, seriously, the graphics are nice, but not nice enough to justify the ridiculous performance requirements.
They should take a cue from the latest Source engine on how to properly create a good one.
I have the same view on that one. Barely playable at 4K with all the DLSS/FSR whatever on. I don't like where this is going, especially when you have to pay a crapload for graphics cards and then you're stuck with underwhelming performance that even an upscaler can't make OK.
 
Prove it. And prove that this is the case in EVERY GAME.

Throwing out a one-line opinion and then asking the other person to prove otherwise, it's easy, isn't it?
That's, like, failing Logic 101. You made a claim, so you have to back it up; it's not up to everyone else to disprove your claim.
 
Prove it. And prove that this is the case in EVERY GAME.

Throwing out a one-line opinion and then asking the other person to prove otherwise, it's easy, isn't it?
From AMD FidelityFX™ Super Resolution | AMD:
  • Based on industry standards and is fully open source.
  • Does not require specialized graphics hardware.
AMD states that FSR does not require specialized graphics hardware, and as far as I know there have not been any reports of FSR looking different on different GPUs in any game.

The FSR algorithm is the same regardless of GPU: https://github.com/GPUOpen-Effects/FidelityFX-FSR2
 
Prove it. And prove that this is the case in EVERY GAME.
It's you who has to prove your conspiracy theory, not the other way around.
Yes, but Nvidia is sabotaging FSR in their drivers. As if it needs Nvidia's help to look like crap
I've played many games with FSR 2 enabled on the RX 6800 XT, which was my main GPU for a long time until recently. I've seen typical FSR 2 anomalies (ghosting, flickering, other types of temporal instability) in almost all of them. Still, it looked good enough for me, but the question is: are AMD also sabotaging FSR 2 in their drivers? :p
 
I know that, but empirical evidence goes a long way to shut an argument down. Too many of these performance reviews play to subjectivity. The stills are pointless as they don't show the FSR shimmering issues. Frankly, in most DLSS/FSR stills, FSR looks better/less blurry. I appreciate people have differing views but the stills aren't helping the conclusions. TBH, I have shimmering with my 4070ti in Cyberpunk.

Quoting myself as I went and tested out whether this was actually a DLSS artefact. Different game, but it is about DLSS.

On my 4070ti, when I disabled DLSS, I still see little 'scratches' appear and disappear on certain surfaces (very flat low polygon metal doors or beams - you know, the Cyberpunk staple interior surface). Hardly noticeable but there. On straight lines as well. Rare, but visible. Enabling DLSS amplifies them a lot. At first I thought it might be my cheapy 4070ti (got the cheapest I could get). But the DLSS on quality mode makes it far more noticeable. Regardless, I can forgive the shimmer of DLSS on a few surfaces to play the game at a decent frame rate. I imagine AMD users feel the same as me. No biggie.
 
It's you who has to prove your conspiracy theory, not the other way around.
It's not a conspiracy theory. We can't be absolutely sure that Nvidia's drivers and Nvidia's architecture work exactly the same as AMD's. In fact, your whole argument is based on another theory and a speculation: the theory that AMD programmers have inside knowledge of Nvidia hardware and drivers and can predict probable visual bugs of FSR when running on Nvidia hardware, and the speculation that if they notice those bugs, they will spend time and resources to fix them.
In my original post I asked how certain we can be, and suggested that maybe someone should verify it before using words like "broken". You and others were so upset about my question that you rushed in here to post about conspiracy theories. Why? What are you afraid of?

AMD states FSR does not require specialized graphics hardware and as far as I know there have not been any reports of FSR looking differently on different GPUs in any game.
What I said above, at least the part that is relevant to your post. AMD can declare that FSR is built on standards and that the code is intended to run the same on any architecture. The same way games and drivers are intended to run great, only to end up with a hundred bugs that need fixing. Why would we rush to assume that FSR's code is perfect code that is immune to hardware differences? Did AMD programmers become miracle workers all of a sudden, when everyone is accusing them of bad drivers and subpar features? Double standards?
There are posts in here from people saying they've seen a difference.
As far as I'm concerned, you made your claim without evidence; therefore your claim is dismissed without evidence too.

Unless you want to put in some evidence yourself and try again?
Thank you for your contribution to this thread.
Now, is there anything of value you would like to post instead of pretending to be a lawyer?
 
Quoting myself as I went and tested out whether this was actually a DLSS artefact. Different game, but it is about DLSS.

On my 4070ti, when I disabled DLSS, I still see little 'scratches' appear and disappear on certain surfaces (very flat low polygon metal doors or beams - you know, the Cyberpunk staple interior surface). Hardly noticeable but there. On straight lines as well. Rare, but visible. Enabling DLSS amplifies them a lot. At first I thought it might be my cheapy 4070ti (got the cheapest I could get). But the DLSS on quality mode makes it far more noticeable. Regardless, I can forgive the shimmer of DLSS on a few surfaces to play the game at a decent frame rate. I imagine AMD users feel the same as me. No biggie.

Try replacing the stock DLSS DLL with the latest version 3.5.10 hosted by TPU; it improves image quality by quite a bit
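For the curious, the swap itself is just a copy with a backup. A minimal sketch, not an official procedure; the paths here are hypothetical examples, and the stock nvngx_dlss.dll normally sits next to the game's executable:

```python
# Sketch: back up a game's stock DLSS DLL and drop in a newer nvngx_dlss.dll.
# The directory paths are placeholders; always keep a backup of the original.
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: str, new_dll: str) -> Path:
    """Back up the stock nvngx_dlss.dll, then copy the newer DLL over it."""
    target = Path(game_dir) / "nvngx_dlss.dll"
    backup = target.with_name(target.name + ".bak")
    if not backup.exists():           # never clobber the original backup
        shutil.copy2(target, backup)
    shutil.copy2(new_dll, target)     # replace the stock DLL
    return backup
```

Reverting is just copying the `.bak` file back over `nvngx_dlss.dll`; some games also verify files on launch, which restores the stock DLL.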
 
What I said above. At least the part that is relevant to your post. AMD can declare that FSR is made based on some standards and that the code is intended to run the same on any architecture. The same way games and drivers are intended to run great, only to end up with a 100 bugs that need fixing. Why would we rush to speculate that FSR code is a perfect code that doesn't understand of hardware changes? Did AMD programmers became miracle makers all of a sudden when everyone is accusing them for bad drivers and subpar features? Double standards?
FSR uses only standard graphics APIs.
  • If there is a bug in FSR, the same bug will be rendered on AMD, Nvidia & Intel GPUs.
  • If there is a driver bug that affects FSR, it means there is a bug in the graphics API implementation, and it will affect everything that uses that API, not just FSR. These kinds of bugs do happen, and they are not limited to FSR or to one specific company. Do you have proof that AMD, Nvidia and Intel render all shaders in all games 100% correctly?
  • If Nvidia's driver detects FSR's code and renders it incorrectly on purpose, that is malicious intent. I would be really interested to see such evidence, and so would every HW news site.
There are posts in here from people saying they've seen a difference.
They claim they saw a difference, but at different times, which means it could also be due to a different version of the game, different graphics settings or a different resolution. This can easily be verified by anyone who has access to both AMD & Nvidia GPUs.
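The verification could be as simple as capturing the same frame (same game build, settings, resolution, FSR mode) on both GPUs and diffing the screenshots. A rough sketch of the idea, using plain nested lists of RGB tuples instead of real image files (loading actual PNGs would need a library like Pillow):

```python
# Sketch: compare two screenshots pixel by pixel. Images are nested lists of
# (R, G, B) tuples; `tolerance` absorbs tiny per-GPU floating-point variance.
def diff_screenshots(img_a, img_b, tolerance=0):
    """Return the fraction of pixels whose channels differ by more than `tolerance`."""
    assert len(img_a) == len(img_b) and len(img_a[0]) == len(img_b[0])
    mismatched = 0
    total = len(img_a) * len(img_a[0])
    for row_a, row_b in zip(img_a, img_b):
        for px_a, px_b in zip(row_a, row_b):
            if any(abs(a - b) > tolerance for a, b in zip(px_a, px_b)):
                mismatched += 1
    return mismatched / total
```

A result near zero (within a small tolerance) would support the "FSR renders the same everywhere" claim; a large fraction would be worth reporting.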
 
FSR uses only standard graphics APIs.
  • If there is a bug in FSR, the same bug will be rendered on AMD, Nvidia & Intel GPUs.
  • If there is a driver bug that affects FSR, it means there is a bug in the graphics API implementation, and it will affect everything that uses that API, not just FSR. These kinds of bugs do happen, and they are not limited to FSR or to one specific company. Do you have proof that AMD, Nvidia and Intel render all shaders in all games 100% correctly?
  • If Nvidia's driver detects FSR's code and renders it incorrectly on purpose, that is malicious intent. I would be really interested to see such evidence, and so would every HW news site.

They claim they saw a difference, but at different times, which means it could also be due to a different version of the game, different graphics settings or a different resolution. This can easily be verified by anyone who has access to both AMD & Nvidia GPUs.
Is a comparison of FSR on RTX and RX cards such a bad idea that it needs this much argumentation for why it shouldn't happen?
 
And again, the issue of shimmering or blurring... is minor.
I saw some footage in forests where the foliage was ass to look at with FSR, but in towns/most places, frankly it's a non-issue at 4K. I can play the game just fine with FSR 2.2 quality and not care.
 
How did I not know you guys have a youtube channel!
It’s a fairly new project; we’re just messing around with it and not promoting it too much. Our main site is the big money maker; YouTube just costs money at this stage, but we’re all willing to learn and improve.
I don’t see myself going in front of the camera, though. Why stop doing what I love and am good at, to be miserable instead? I still enjoy being involved with data collection, prep, ideas for presentation style, etc. I also produce B-roll for some videos, and I’m terrible at it but improving.
 
I am still having problems digesting words like "broken" for FSR, when the card used is an Nvidia one. Can we really be sure that Nvidia is not doing anything funny in their drivers, knowing that most sites will just use one graphics card for testing both DLSS and FSR?
Playing AW2 currently on my 7900 XT with FSR 2... looks spectacular.

I did notice, when watching a video comparison from a techtuber, that if I can't turn off FSR on my all-AMD system, I'm not playing this on it at all...
Looks fantastic on my 7900 XT with FSR 2.2 quality. Also using an Odyssey G8 OLED. Really blown away by the quality of the game and graphics.
 