
What DLSS/FSR Upscaling Mode do you use?

  • Native: 13,024 votes (44.5%)
  • Quality: 11,341 votes (38.8%)
  • Balanced: 2,593 votes (8.9%)
  • Performance: 1,376 votes (4.7%)
  • Ultra Performance: 930 votes (3.2%)

  Total voters: 29,264
  Poll closed.
Reread and comprehend better.... There is no contradiction in my post.

You’re the one who should reread what you stated, and reword. You literally said it hurts IQ in your second sentence, then claim it doesn’t destroy IQ in the second paragraph.

What you should’ve said is: both destroy IQ but DLSS does a better job.
 
You’re the one who should reread what you stated, and reword. You literally said it hurts IQ in your second sentence, then claim it doesn’t destroy IQ in the second paragraph.

What you should’ve said is: both destroy IQ but DLSS does a better job.
No, I said DLSS is such a small hit that it doesn't affect IQ much, while FSR does. You're the one showing you don't understand what is plainly written.
 
DLSS Quality, whenever available. It barely affects image quality compared to native, and lets me crank up other IQ settings that make for better overall image quality.

You can tell the AMD users here by the "native" whiners who say upscaling destroys image quality. FSR does, but not DLSS, which is far superior.
You sort of made a fool of yourself there. For one, you sound like a stereotypical Nvidia fanboy. Second, it depends on the implementation in the game.
 
I will typically use FSR Performance for my 980ti, and avoid upscaling entirely for my 6800xt.

With the 980ti it's usually a question of just trying to run something it can't at the native res, like a newer UE4 game or something. I'm willing to take a hit on image quality if it's the difference between a game being playable and not.

My 6800xt hasn't really encountered anything it can't run well at 1440p/144hz, but that's likely because I mostly play stuff that's a couple years old. As soon as I run into something that makes it sweat, then I'll slowly make my way from FSR quality to performance until I get the FPS I want.
 
Only made it a few sentences before contradicting yourself: upscaling will always be worse than native. You literally cannot create more texture information from a lower input resolution to match a higher base native resolution.
Of course you can. We have been doing it for ages, e.g. photo restoration.

If that was the case, then e.g. 1440p DLSS Q would look worse than native 1080p, but it doesn't. How does that happen?
 
Indiana Jones does need some help at 4K
 
Circus method.

Screen: 1920x1080.
Virtual super resolution: 3200x1800.
XeSS: usually Quality, sometimes Balanced.
FSR: prefer not to but if no other way, Quality.
DLSS: unavailable on my machine but Performance would do just fine.
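
If anyone wants to sanity-check what the circus method actually renders internally, here's a rough Python sketch of the pixel maths. The 1.5x / 1.7x / 2.0x per-axis ratios for XeSS Quality / Balanced / Performance are just the commonly quoted preset values and can differ per game, so treat the numbers as an approximation rather than anything exact.

Code:
# Rough sketch of the "circus method": set VSR above the panel resolution,
# let XeSS upscale to the VSR target, then the driver downscales to the panel.
# Per-axis scale factors below are the commonly quoted XeSS presets and are
# assumptions; individual games can use different ratios.
PANEL = (1920, 1080)
VSR_TARGET = (3200, 1800)
XESS_SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

for preset, scale in XESS_SCALE.items():
    internal = (round(VSR_TARGET[0] / scale), round(VSR_TARGET[1] / scale))
    # If the internal render is still at or above the physical panel, the
    # final downscale behaves like supersampling stacked on the upscaler.
    above_panel = internal[0] >= PANEL[0] and internal[1] >= PANEL[1]
    print(f"XeSS {preset}: internal {internal[0]}x{internal[1]}, "
          f"at/above 1080p panel: {above_panel}")

With those assumed ratios, Quality works out to roughly 2133x1200 (still above the 1080p panel) while Performance drops to 1600x900, which lines up with Quality being the usual pick here.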
 
Only made it a few sentences before contradicting yourself: upscaling will always be worse than native. You literally cannot create more texture information from a lower input resolution to match a higher base native resolution.


DLSS, FSR and XeSS can actually make things appear that wouldn't at native resolution. This is true for most temporal upscalers; it wouldn't be true for a spatial upscaler, which can only use the current frame as information to upscale.

"Modern" temporal upscalers apply a hidden jitter between frames. Each frame is rendered slightly offset from the centre, and the offsets rotate through a grid to capture information that wouldn't appear at native because it doesn't sit on the native pixel grid. Note that this works best at higher resolutions with less aggressive upscaling.

The same setup can also work at native resolution with DLAA. It helps greatly to find edges, remove aliasing, and add detail that wouldn't show up on the pixel grid.

Game textures are much bigger than the space they normally occupy on screen, so you lose texture information even at native.

All of these upscalers are a huge step up from older techniques like spatial upscalers or checkerboarding, but they still have a way to go. I think the next step will be done in the engine itself rather than by a GPU vendor.
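
To make the jitter part concrete, here's a minimal sketch in Python of the idea (not any vendor's actual code): a low-discrepancy Halton sequence generates sub-pixel offsets, and each frame's projection is nudged by one of them so successive frames sample positions that don't sit on the render grid. The cycle length and the choice of bases 2 and 3 are just common TAA conventions here, not anything pulled from DLSS, FSR or XeSS themselves.

Code:
# Minimal sketch of TAA/DLSS-style camera jitter, not any vendor's actual code.
# A Halton sequence (bases 2 and 3) gives well-spread sub-pixel offsets; each
# frame the projection matrix would be nudged by one offset so samples land
# between the pixel centres of the render grid, and the accumulation/upscale
# pass recombines the history into a higher-resolution result.

def halton(index: int, base: int) -> float:
    """Halton sequence value in [0, 1) for a 1-based index and integer base."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def jitter_offsets(cycle_length: int = 8):
    """Sub-pixel (x, y) offsets in [-0.5, 0.5), one per frame in the cycle."""
    return [(halton(i + 1, 2) - 0.5, halton(i + 1, 3) - 0.5)
            for i in range(cycle_length)]

for frame, (jx, jy) in enumerate(jitter_offsets()):
    # In a real renderer these would be scaled by 2/render_width and
    # 2/render_height and added to the projection matrix for that frame.
    print(f"frame {frame}: jitter ({jx:+.3f}, {jy:+.3f}) px")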
 
DLSS, FSR and XeSS can actually make things appear that wouldn't at native resolution. This is true for most temporal upscalers; it wouldn't be true for a spatial upscaler, which can only use the current frame as information to upscale.

"Modern" temporal upscalers apply a hidden jitter between frames. Each frame is rendered slightly offset from the centre, and the offsets rotate through a grid to capture information that wouldn't appear at native because it doesn't sit on the native pixel grid. Note that this works best at higher resolutions with less aggressive upscaling.

The same setup can also work at native resolution with DLAA. It helps greatly to find edges, remove aliasing, and add detail that wouldn't show up on the pixel grid.

Game textures are much bigger than the space they normally occupy on screen, so you lose texture information even at native.

All of these upscalers are a huge step up from older techniques like spatial upscalers or checkerboarding, but they still have a way to go. I think the next step will be done in the engine itself rather than by a GPU vendor.
I don't see how losing information that's not on the screen to begin with is a bad thing.

Edit: Besides, do you have a source? I don't think that's how upscaling works, but I wouldn't mind reading up on it.
 
I go native wherever possible. FSR for some games where I don't mind the blurriness when standing still (happens a lot in RE2R; it's better in RE4R); if not FSR, I use DLSS. XeSS isn't in any games I have, so I can't test it, as a matter of fact.

I usually try to aim for Balanced or Quality, depending on the game. For any competitive title that supports it, I pick whatever keeps important things visible enough without blurring them too much.
 
I don't see how losing information that's not on the screen to begin with is a bad thing.

Edit: Besides, do you have a source? I don't think that's how upscaling works, but I wouldn't mind reading up on it.
Think about it: since 1440p DLSS Q (so 960p internal res) looks better than native 1080p, something must be very wrong with native, right? You can push this even further and compare 4K DLSS Ultra Performance vs 1080p native.
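
For reference, the 960p figure falls out of the usual per-axis scale factors (Quality roughly 1.5x, Ultra Performance 3x); assuming those standard preset ratios, the arithmetic looks like this:

Code:
# Internal render resolution behind the examples above, assuming the commonly
# quoted DLSS per-axis preset ratios (games can and do override these).
DLSS_SCALE = {"Quality": 1.5, "Balanced": 1.72, "Performance": 2.0,
              "Ultra Performance": 3.0}

def internal_res(out_w: int, out_h: int, preset: str):
    s = DLSS_SCALE[preset]
    return round(out_w / s), round(out_h / s)

print(internal_res(2560, 1440, "Quality"))            # (1707, 960): the "960p" case
print(internal_res(3840, 2160, "Ultra Performance"))  # (1280, 720) vs 1080p native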
 
Think about it: since 1440p DLSS Q (so 960p internal res) looks better than native 1080p, something must be very wrong with native, right? You can push this even further and compare 4K DLSS Ultra Performance vs 1080p native.
That's not the comparison I like to make. No one plays at 1080p on a 4K screen. The comparison I make is 4K native vs 4K DLSS, 1440p native vs 1440p DLSS and 1080p native vs 1080p DLSS. As long as you don't use some crappy TAA, native wins every time.
 
That's not the comparison I like to make. No one plays at 1080p on a 4K screen. The comparison I make is 4K native vs 4K DLSS, 1440p native vs 1440p DLSS and 1080p native vs 1080p DLSS. As long as you don't use some crappy TAA, native wins every time.
That's not the point I'm making at all. The point is that since DLSS can look better while using a lower internal res than native, something must be wrong with native.
 
That's not the point I'm making at all. The point is that since DLSS can look better while using a lower internal res than native, something must be wrong with native.
That's highly dependent on the game and its definition of "native". Some of them force some crappy TAA on you by default which could make DLSS look like the better option. But I'd say, that's not native at all.
 
Native. If a game won't run at that, then I simply won't buy it. I'm not opposed to upscaling from a technological point of view, but what I refuse to support / normalize is the ongoing bait & switch ensh*tification where upscaling ends up a crutch / excuse for not optimising at all, rather than an enhancement in addition to it.
 
Native in 99% of cases, unless I'm playing with benchmarks just to see how different settings perform, i.e. FPS vs IQ. If it looks shit but performs well, it's a no; if it looks good but performs a little worse, it's a maybe I might use it, etc.
 
4K full RT with DLAA first, FG second, DLSS Q third. Also, RT isn't the best looking to me; that goes to Hellblade II.
 
I don't see how losing information that's not on the screen to begin with is a bad thing.

Edit: Besides, do you have a source? I don't think that's how upscaling works, but I wouldn't mind reading up on it.
The whole tech is well explained in this video. The part about camera jitter is in the first few minutes so no need to watch the full thing if you don't want to.
 
Half the time I don't even use native rendering; I use supersampling. I have a 1440p monitor, and I often play games at 4K.
I have an RX 7800 XT, and I play a lot of older games that don't really need a GPU that powerful in order to give good performance at native 1440p. Compared to the game's built-in MSAA implementation, 4K scaled down to 1440p often looks just as good IMO, and runs better.
If I had an Nvidia GPU, and played games that support DLSS, I might use it, but I don't, so no.
I really hate jaggies and other aliasing artefacts. I also hate blur and temporal artefacts. I want a high-detail image, which looks as close to reality as possible.
DLAA is (at least in theory, assuming a good implementation, though that also applies to any other AA method) the gold standard of image quality and performance available at the moment, but none of the cinematic games I regularly play support it, so I stick to supersampling.
 
I don't use upscaling at all. Not a fan of the way it looks.
Same here.

I just wish you strength to endure all the comments bluntly stating that you're objectively wrong (in your subjective opinion) despite 44% of voters thinking the same.
 
I don't use upscaling at all. Not a fan of the way it looks

Same here.
I also prefer no upscaling, playing video games at native resolution via the GPU like our great ancestors did in the late 90's

(attached image)
 
I also prefer no upscaling, playing video games at native resolution via the GPU like our great ancestors did in the late 90's

(attached image)
What do you mean our ancestors? The late '90s - early '00s was my best era of video gaming! I got my first PC and first video game in '97. Good old times! :rolleyes:

Or is that me in your picture? Definitely looks familiar. :wtf:
 
What do you mean our ancestors? The late '90s - early '00s was my best era of video gaming
I actually started playing PC games in the '80s, so I'm probably just older than you.
 