
Deathloop: DLSS vs. DLAA vs. FSR Comparison

Originally, Deathloop on PC shipped with support only for AMD's FidelityFX Super Resolution (FSR) upscaling technology. It has since been updated to also support NVIDIA's Deep Learning Super Sampling (DLSS), including the option to run DLSS at native resolution (DLAA). In this mini-review, we compare the image quality and performance offered by these solutions.

 
Nice work.
Nice to see how much detail DLAA can reconstruct from native 1080p.
DLAA has almost none of the TXAA blur.
In the video, when the player moves at 1080p, I observe less shadow flickering/aliasing around the near upper-left window than with the other options, which is nice.
Edge, semi-transparency, and shadow flickering and related aliasing seem to get minimized really well by DLAA.
It's just sad that you need a 600 USD card to use it.
 
DLAA looks the best, no questions asked. DLSS is slightly better than FSR, but there is barely anything in it. Overall I have to say that I still don't like using any "features" that reduce my image quality!

I feel like I would only use it if I were running a GTX 1050 or something and couldn't get a stable 30 fps even at 1080p. In those scenarios DLSS or FSR is reasonable, because you don't care about quality at that point; you just want playable framerates!
 
I agree.
If I need to lower the resolution in-game, say to 900p in Cyberpunk because of my GTX 1650, lowering the render scale to a 900p equivalent while staying at 1080p looks worse than setting it to 900p natively.
So plain scaling vs. native can do a bad job sometimes.
More modern scaling features like FSR or DLSS at their highest quality settings do a good job, but the differences in sharpness/crispness vs. flicker/aliasing reduction can be quite big sometimes; see the quick numbers sketch below.
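
To put rough numbers on the render-scale comparison, here is a small sketch of my own (not from the review). The per-axis scale factors are the commonly cited FSR 1.0 / DLSS 2.x preset values, included purely as assumptions for illustration:

```python
# Rough illustration (not from the review): internal render resolution
# for a given output resolution and per-axis scale factor.

# Commonly cited per-axis scale factors for FSR 1.0 / DLSS 2.x presets
# (assumed values for illustration):
PRESETS = {
    "Native / DLAA":      1.0,
    "FSR Ultra Quality":  1 / 1.3,  # ~0.77
    "Quality":            1 / 1.5,  # ~0.67 (FSR and DLSS)
    "Balanced":           1 / 1.7,  # ~0.59
    "Performance":        0.5,
}

def internal_resolution(out_w, out_h, scale):
    """Internal (pre-upscale) resolution for a per-axis render scale."""
    return round(out_w * scale), round(out_h * scale)

for name, scale in PRESETS.items():
    w, h = internal_resolution(1920, 1080, scale)
    print(f"{name:20s} -> {w}x{h} internal at a 1080p output")

# A manual ~83% render scale on a 1080p output lands at roughly 1600x900,
# i.e. the "900p equivalent" mentioned above.
print(internal_resolution(1920, 1080, 0.83))  # (1594, 896)
```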

I would personally call DLSS "DL Source Scaling", because there are no SSAA-like benefits in reducing the input resolution to get to the marketed output.
DLSS at 4K is nowhere near where a real 2x SSAA 4K would be, let alone a 4x SSAA 4K; the rough sample count comparison below shows why.

Compare the naming to DLAA: the one that uses the full source every time is the one that should be called supersampling.
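
To illustrate the "source scaling vs. supersampling" point, here is my own rough sample count arithmetic (using one common way of counting SSAA samples; nothing here is measured from the review):

```python
# Shaded samples per output pixel at a 3840x2160 output (illustrative
# arithmetic only; SSAA counted as samples per pixel).
OUT_PIXELS = 3840 * 2160

cases = {
    "DLSS/FSR Performance (1920x1080 internal)": 1920 * 1080,
    "DLSS/FSR Quality (2560x1440 internal)":     2560 * 1440,
    "Native 4K / DLAA":                          3840 * 2160,
    "2x SSAA (2 samples per pixel)":             3840 * 2160 * 2,
    "4x SSAA (4 samples per pixel)":             3840 * 2160 * 4,
}

for name, shaded in cases.items():
    print(f"{name:44s} {shaded / OUT_PIXELS:.2f} samples per output pixel")

# Upscalers shade well under 1.0 sample per output pixel and reconstruct
# the rest, while SSAA shades more than 1.0 -- which is the point about
# DLSS being "source scaling" rather than supersampling.
```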
 
FSR needs to have the sharpening added without question. Even in 4K I wasn't impressed with FSR (without the usual sharpening). 1080p with FSR is a nonstarter with or without sharpening.
 
DLSS looks overly sharp and aliased in these shots relative to FSR, which is strange since no sharpening was added; it's weird because in other examples FSR looks sharper than DLSS. It's fascinating how, with both DLSS and FSR, the jaggies decrease as you reduce the resolution to 1440p and 1080p. That seems counter-intuitive, since increasing the resolution bumps up the relative DPI, but it's complicated by post-processing: the same post-process pass is cheaper at lower resolutions, and how well it can clean up scene jaggies matters too.
 
You people are blind; FSR is way softer and the textures look half the resolution. 1080p was very poor for FSR. Did they screw up and use FSR ultra-performance instead of ultra-quality?
 
The shimmer issue in the video seems like it's related to running FSR on NVIDIA hardware; I didn't notice it happening in a Radeon video of the game with FSR.
 
A very strong showing here for DLSS and DLAA again. Unfortunately for FSR, the native image has issues that it just exacerbates, and the lack of sharpening doesn't help either. I wonder what TAA looks like compared to FSR if we run at a lower render scale and compare at roughly equal performance, if that is even possible to test in this title; a rough way to pick that render scale is sketched below.
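
For the roughly-equal-performance idea, here is a crude estimate of my own (placeholder fps numbers, not measurements from this review), assuming the game is GPU-bound and frame time scales linearly with shaded pixel count:

```python
# Back-of-the-envelope: pick a TAA render scale that should land at
# roughly the same fps as an upscaler, assuming frame time scales
# linearly with shaded pixel count (ignores CPU limits and fixed costs).

def equal_perf_render_scale(native_fps, upscaled_fps):
    """Per-axis render scale whose pixel count matches the fps ratio."""
    pixel_budget = native_fps / upscaled_fps  # fraction of native pixels affordable
    return pixel_budget ** 0.5                # convert area ratio to per-axis scale

# Placeholder numbers, not measurements from this review:
scale = equal_perf_render_scale(native_fps=75, upscaled_fps=95)
print(f"~{scale:.2f} per-axis render scale (~{scale * 100:.0f}%) "
      f"for a TAA run at roughly the same fps")
```

It only gives a starting point for whatever render-scale control the game exposes, if any.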
 

DLAA looks almost as sharp as native with zero AA, but with SSAA-like smoothness applied.

DLSS version 2.3 (which ships with Deathloop) also has higher image quality compared to earlier versions; it seems like more temporal data is being used.
 
The pictures are clearly mislabeled, as your 1440p DLSS Quality image is much clearer and shows far fewer jaggies, especially noticeable on the power lines, than your 4K DLSS Quality image. It is not even close, so someone would have to be blind not to notice it.
 
The article said FSR did not use its normal sharpening and leaves it up to the user to apply the sharpening slider, or something along those lines.
 
On the first image, comparing DLSS Performance vs FSR Performance (both at 100 fps), the difference is quite striking. I didn't expect the former to look as good as native TXAA (75 fps).
 
To me, DLSS Quality is barely different from DLAA at 4K, which means I would take the framerate boost. FSR looks bad across the board here, IMHO; without the sharpening it is pretty clear that the game is running sub-4K.

If you don't have a DLSS-capable card I can see the point of FSR, but with Intel also including tensor cores in its lineup, my guess is that AMD gets on board the ML train soon as well. I seem to remember NVIDIA saying a long time ago that DLSS could be utilized by AMD if they had the right hardware (I have tried to find this article but have been unsuccessful, so if anyone knows what I'm talking about it would be great to know my memory isn't that bad). This leads me to think that DLSS will be automatically supported by Intel's cards, since they essentially have tensor cores, so only the necessary driver should be needed for DLSS.

If that is the case, it would leave AMD as the only one without an ML upscaling solution, and it would make DLSS or XeSS essential for just about any PC game going forward. That would be a bad position for AMD cards, as DLSS Quality is clearly close enough that most gamers will turn it on in demanding games.
 