
Redfall: XeSS 1.0 vs. FSR 2.1 vs. DLSS 3 Comparison

Redfall is out now, with support for NVIDIA's DLSS Super Resolution (DLSS 2.3), NVIDIA's DLSS Frame Generation (also known as DLSS 3), NVIDIA's Deep Learning Anti-Aliasing (DLAA), Intel's Xe Super Sampling 1.0 (XeSS 1.0) and AMD's FidelityFX Super Resolution 2.1 (FSR 2.1). In this mini-review we compare the image quality and performance gains offered by these technologies.

 
They all look like shit; this is too much attention for a bottom-of-the-barrel title.

A blue glowy shotgun. I mean. What
 
DLSS 3 seems like a get-out-of-jail-free card for devs with poor CPU optimisation, which might bring more CPU-intensive titles in future.
 
No DLAA+FG? That's the setting I use.
"you can use DLAA and DLSS Frame Generation without any issues if you want to maximize your image quality."
 
DLSS 3 seems like a get-out-of-jail-free card for devs with poor CPU optimisation, which might bring more CPU-intensive titles in future.

If devs release a game that CPU bottlenecks at around 30 FPS, that'll feel bad at native and unplayable for most with DLSS 3 FG due to the latency increase with FG enabled.

I cannot imagine PC gamers would take kindly to games feeling extremely clunky with high latency, regardless of whether it looks smooth. That, IMO, is far worse than simply requiring a more expensive GPU for higher graphics settings.
 
Tried this on Game Pass 2 days ago. Ran like shit on my 3090/5800X3D with DLSS on, mostly 90+ fps but it would drop to 30-odd for no reason at all. Wasn't fun at all. The first enemy vampire I took down was clipping through walls. The start with the boat and the ocean was pretty cool though. Uninstalled after about an hour of playing.
 
I appreciate the effort made to create all those comparative slides - hours of work! But if the scene is NOT EXACTLY THE SAME, and the lighting is different, and the framing and positioning are slightly different, and the "time of day" is different, then the comparatives are entirely subjective and prone to misinterpretation. There needs to be a better way.
 
Native TAA is such a blurry mess; look at the grass and the bricks on the wall. Practically every scaling option is better.
 
Setting r.TemporalAA.Algorithm to 1 makes massive improvements to the native TAA. You can do that by editing Engine.ini or by using the Universal Unreal Engine Unlocker to get at the command console.
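For reference, here's a minimal sketch of the Engine.ini route (assuming a typical UE4 layout, where the user config usually sits in the game's Saved\Config\WindowsNoEditor folder under your user profile - the exact path for Redfall may differ):

[SystemSettings]
; 0 = default Gen 4 temporal AA, 1 = the newer Gen 5 algorithm
r.TemporalAA.Algorithm=1

With the Universal Unreal Engine Unlocker you can instead type r.TemporalAA.Algorithm 1 into the console at runtime to compare before and after without restarting.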
 
While I very much appreciate these kinds of tech reviews, it's a shame it's on this game. Unfortunately Bethesda have flogged the "Arkane" horse a bit too hard. Since Dishonored, games from that studio (Arkane) have been getting progressively worse, with Redfall being widely panned. While it looks pretty, it's nothing more than an MP game with a weak storyline. I may get it on sale when it has dropped below the £15 mark, as by then a number of patches will have been released and they might have fixed some of the glaring gameplay issues.
 
Another dumpster-fire of a launch from a developer who hasn't finished the game yet. DF's review says it's competing against Jedi Survivor for the crown of worst state at launch.

The common theme for all of these terrible launches appears to be Unreal Engine 4.

Callisto Protocol
Dead Space Remake
TLoU Part 1 Re-Re-Reremaster
Jedi Survivor
Redfall

Hogwarts Legacy seemed to be almost ready when it launched, as did Atomic Heart, so the problem isn't UE4, just that devs using it seem to fail harder than most. The in-house engines for things like Forspoken or the RE4 remake were actually pretty smooth launches from those devs, IIRC.
 
Well, UE4 multicore support is kind of meh, especially in older versions. If the studio is not very experienced with the engine, it can be quite a disaster, especially in open-world games with many assets. There was the same issue in Ark: one core doing all the work; I don't know how it is now. UE4 shouldn't really be used for open-world games.
 
Shit game. Shit upscaling tech.
Give us good old max quality, uncompromised textures and leave us alone.
 
strangely (or as usual ... for me) FSR looks slightly more pleasing than DLSS ... plus FSR is hardware agnostic ...

on a sidenote...
i remember reading somewhere that AMD always had more "precision" in graphics quality and that was what made them less performant basically :laugh:
i owned enough cards from both... and as i had more pleasure with the red side (even when it was ATI) than green ... i might be an oddball ... :oops:

However, another game where DLSS shows its superiority.
Those who buy "green" are not really idiots.
that's relative :laugh: 3 fps more at 4K while looking a little worse is not superiority, but well, it's also a subjective opinion from me :p

at 1440p and 1080p there's no superiority at all tho ;)
 
on a sidenote...
i remember reading somewhere that AMD always had more "precision" in graphics quality and that was what made them less performant basically :laugh:
i owned enough cards from both... and as i had more pleasure with the red side (even when it was ATI) than green ... i might be an oddball ... :oops:
ATI's anisotropic filtering was better than Nvidia's up until GeForce 7. But that's old news and hasn't been relevant in a long time.
 
However, another game where DLSS shows its superiority.
Those who buy "green" are not really idiots.

Shill a little harder; JHH might not be able to hear you. :rolleyes:
 
ATI's anisotropic filtering was better than Nvidia's up until GeForce 7. But that's old news and hasn't been relevant in a long time.
AMD's aniso was better than Nvidia's up until GTX 600 (Kepler). HD 5000 or 6000 basically perfected AF, and Nvidia was nowhere close with GTX 400 and 500.

Another dumpster-fire of a launch from a developer who hasn't finished the game yet. DF's review says it's competing against Jedi Survivor for the crown of worst state at launch.

The common theme for all of these terrible launches appears to be Unreal Engine 4.

Callisto Protocol
Dead Space Remake
TLoU Part 1 Re-Re-Reremaster
Jedi Survivor
Redfall

Hogwarts Legacy seemed to be almost ready when it launched, as did Atomic Heart, so the problem isn't UE4, just that devs using it seem to fail harder than most. The in-house engines for things like Forspoken or the RE4 remake were actually pretty smooth launches from those devs, IIRC.
Neither Dead Space nor TLoU Part 1 is Unreal Engine 4, lol.
 
ATI's anisotropic filtering was better than Nvidia's up until GeForce 7. But that's old news and hasn't been relevant in a long time.
i still saw that post-GeForce 7 gen ... but i meant in general ;)

might be a placebo,

although the last time i saw that "pattern" ... my R9 290 (under an Aquatuning WB ) had a little something more, dunno more crisp ... than the subsequent card i owned (which was a GTX 980 Poseidon) even tho she was slightly slower ... and it, indeed, did seem that the perf drop was because of the graphical quality difference :oops:

the comparison would be like, between seeing the same game in high settings and mid settings (while being set on max on both cards) the performance hit was also giving me that impression ... oh well as i said ... i'm an oddball, don't mind me ... too much.


that might also be due to the fact that i always had driver issues (mostly, but other issues too) with Nv (most recent was the GTX 1070) and literally no issues with the current card since i bought it (luckily i skipped the RX 7X00 driver that was causing issues for the 6X00/6X50 series)
 
i still saw that post-GeForce 7 gen ... but i meant in general ;)

might be a placebo,

although the last time i saw that "pattern" ... my R9 290 (under an Aquatuning WB ) had a little something more, dunno more crisp ... than the subsequent card i owned (which was a GTX 980 Poseidon) even tho she was slightly slower ... and it, indeed, did seem that the perf drop was because of the graphical quality difference :oops:

the comparison would be like, between seeing the same game in high settings and mid settings (while being set on max on both cards) the performance hit was also giving me that impression ... oh well as i said ... i'm an oddball, don't mind me ... too much.


that might also be due to the fact that i always had driver issues (mostly, but other issues too) with Nv (most recent was the GTX 1070) and literally no issues with the current card since i bought it (luckily i skipped the RX 7X00 driver that was causing issues for the 6X00/6X50 series)
Image quality comparisons on the same settings and same display would be so interesting to read about if done right!
I wonder if there's any way to objectively test this other than just having people blind test it side by side.
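One possible angle beyond blind testing: capture the same frame (fixed camera, same time of day) at each setting and score it against a native-resolution reference with standard metrics like PSNR and SSIM. A minimal Python sketch using scikit-image; the file names are placeholders, and it assumes the screenshots are pixel-aligned and the same resolution:

from skimage.io import imread
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Placeholder file names: a native-res reference shot and an upscaled shot of the same frame
ref = imread("native_4k.png")[..., :3]            # drop alpha channel if present
test = imread("dlss_quality_4k.png")[..., :3]

# PSNR in dB: higher means closer to the reference
psnr = peak_signal_noise_ratio(ref, test, data_range=255)

# SSIM: 1.0 means structurally identical (channel_axis needs scikit-image >= 0.19)
ssim = structural_similarity(ref, test, channel_axis=-1, data_range=255)

print(f"PSNR: {psnr:.2f} dB, SSIM: {ssim:.4f}")

Metrics like these only measure similarity to native, not absolute quality (a sharpened image can score worse while looking better), so they'd complement side-by-side viewing rather than replace it.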
 
The time of day is likely changing while they are changing settings. The game world doesn't pause when you go into menus. Technically the game is "online" even when you are playing single-player.
 
"Speaking of performance, Redfall is a very CPU intensive game, as the CPU usage is mostly single-threaded on PC due to a very poor implementation of Unreal Engine 4 DirectX 12."

all I can say is

(image attachment)
 