
What DLSS/FSR Upscaling Mode do you use?

  • Native — 13,024 votes (44.5%)
  • Quality — 11,341 votes (38.8%)
  • Balanced — 2,593 votes (8.9%)
  • Performance — 1,376 votes (4.7%)
  • Ultra Performance — 930 votes (3.2%)

  Total voters: 29,264 (poll closed)
I don't want to get too deep into this, even though I could go technical, but as a gamer since 286 PCs (well, on the 8-bit Nintendo even before that), and with games of that era being good too, I don't really care that much if there are some graphical glitches. They still look better than any game from 20 years ago, for example. It wouldn't be a dealbreaker for me if Witcher 4 looked like Witcher 1...

I do love good graphics, so of course I use ultra settings and DLAA if possible, but I still care less about the graphics than about the immersion and everything else. For example, sound matters more to me than a few graphical glitches. That being said, I can just as easily drop down to high quality and DLSS Performance. I'd rather take 90 fps with a few shimmers than better-looking 60 fps (possibly with random stuttering).
 
I chose "native" because there was no option for supersampling.
I have a 1440p monitor. I play at 4K in games that my GPU can handle 4K at and where the UI scaling doesn't get ruined. At least in the games I play, this looks just as good or better than MSAA x8, and runs much faster. I mostly play old or indie games which don't even support DLSS or FSR.
I don't think FSR/DLSS are bad technologies, but when they're available, I want to use them for antialiasing, not for higher frame rates. If I want higher frame rates, I'll turn down the lighting and fog settings which don't make any noticeable difference to graphics but still tank fps.
I hate jaggies.
 

Ultra Performance​

Because what else is there to use when your game lags because your two-year-old $700 video card isn't good enough even for games from two years ago? Usually Performance, but newer games tend to want to go even further. I'd love to use native, but you know... slideshows aren't fun.
 
I chose "native" because there was no option for supersampling.
I have a 1440p monitor. I play at 4K in games that my GPU can handle 4K at and where the UI scaling doesn't get ruined. At least in the games I play, this looks just as good or better than MSAA x8, and runs much faster. I mostly play old or indie games which don't even support DLSS or FSR.
I don't think FSR/DLSS are bad technologies, but when they're available, I want to use them for antialiasing, not for higher frame rates. If I want higher frame rates, I'll turn down the lighting and fog settings which don't make any noticeable difference to graphics but still tank fps.
I hate jaggies.


You have the same options in Nvidia's control panel, so you can override most games that way.
 

Ultra Performance​

Because what else is there to use when your game lags because your two-year-old $700 video card isn't good enough even for games from two years ago? Usually Performance, but newer games tend to want to go even further. I'd love to use native, but you know... slideshows aren't fun.
There's only one game that won't run right at native ultra on a 4070 Super/4070 Ti, and that's Black Myth: Wukong. Even then, you don't need Ultra Performance to make that game playable.
 
With the DLSS Transformer model coming out, I bet the percentage of people using Balanced/Performance mode will go up tremendously
 
With the DLSS Transformer model coming out, I bet the percentage of people using Balanced/Performance mode will go up tremendously
I'm a holdout, but if the quality is there, I'd use it. Right now DLSS just isn't quite there compared to a native render.
 

You have the same options in Nvidia's control panel, so you can override most games that way.
Sorry to go off topic a little, but I've been wondering this for a while: does AMD's version of downsampling rival DLDSR? I heard it's closer to the original DSR, which isn't as good. Honestly, because I was unsure, that was part of the reason I got a 4090 rather than an AMD card. (I was pissed at Nvidia at the time, and at myself, for getting a 3070 with only 8 GB, only for that to start becoming an issue a few months later, so I was primed for a change.) But the downsampling uncertainty plus the pricing difference is what made me hold my nose and go Nvidia again (AMD prices were way higher than US MSRP at the time, yet the 4090 was under US MSRP; guess I got lucky on that one).

Anyway, have you used both? How would you compare them?

-------

Oh yeah, I tried using the transformer model in FF7 Rebirth and either it didn't work or I couldn't tell the difference. DLSS is done weirdly in that game, so maybe that's why; I find I can't really use it at all or the hair looks weird as heck. Guess I'll just have to wait.
 
Both brands have virtual super resolution and you can use it the same way.
I'm currently playing AC Mirage.
On an RTX 4080 Super I can do DSR 2.25x + DLSS Quality; DLSS on its own is not good enough in this game and changes some objects.
On the 7900 XTX, VSR + driver frame generation; FG, if you look closely, can be seen on some objects as well.


So either way there are compromises.
With both cards I can play with enough FPS at my current 2560x1080 resolution without these add-ons.
The AMD way works practically anywhere (DX11, DX12, Vulkan) and the card is cheaper; the Nvidia way only where it's implemented in the game.
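For reference, the DSR 2.25x + DLSS Quality combination roughly cancels out in terms of internal render resolution. Here's a back-of-the-envelope sketch of the arithmetic; the ~2/3 per-axis scale for DLSS Quality is the commonly cited figure, not something stated in this thread:

```python
import math

def dsr_target(width, height, factor):
    # A DSR/VSR factor multiplies the total pixel count,
    # so each axis scales by sqrt(factor).
    s = math.sqrt(factor)
    return round(width * s), round(height * s)

def dlss_internal(width, height, axis_scale):
    # DLSS renders internally at axis_scale per axis
    # (Quality mode is commonly cited as ~2/3).
    return round(width * axis_scale), round(height * axis_scale)

# 2560x1080 ultrawide with DSR 2.25x -> 1.5x per axis
tw, th = dsr_target(2560, 1080, 2.25)   # (3840, 1620)
# DLSS Quality then renders at ~2/3 of the DSR target,
# which lands right back at the native 2560x1080
iw, ih = dlss_internal(tw, th, 2 / 3)   # (2560, 1080)
print(tw, th, iw, ih)
```

So the GPU renders at roughly native resolution, but the image still gets the benefit of DLSS reconstruction plus the downsampling pass.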
 
Everyone who has been involved in this discussion should watch the following.
Jeff got a lot right.
@CraftComputing
Yes, enjoyed every moment of this one. You discussed a lot of important info and clarified much of it.
While I disagree on a couple points you made, those points are subjective and very much personal preference, not going to waste anyone's time with them.

Well done Jeff!
 
With both cards I can play with enough FPS on my current 2560x1080p resolution without these addons.

- You're missing something essential: you lack a lot of pixels. 2560x1440 (WQHD) is for sure more demanding.

Why do you need upscaling at a resolution far below 1440p with a "4080 Super"?

Just for information:

That resolution really does exist; a random screen I found:

LG 34WQ650-W 34 Inch 21:9 UltraWide Full HD (2560 x 1080) 100Hz IPS Monitor, 100Hz Refresh Rate with RGB 99% Color Gamut, VESA DisplayHDR 400, USB Type-C, AMD FreeSync, Tilt/Height Adjustable Stand
 
Both brands have virtual super resolution and you can use it the same way.
I'm currently playing AC Mirage.
On an RTX 4080 Super I can do DSR 2.25x + DLSS Quality; DLSS on its own is not good enough in this game and changes some objects.
On the 7900 XTX, VSR + driver frame generation; FG, if you look closely, can be seen on some objects as well.


So either way there are compromises.
With both cards I can play with enough FPS at my current 2560x1080 resolution without these add-ons.
The AMD way works practically anywhere (DX11, DX12, Vulkan) and the card is cheaper; the Nvidia way only where it's implemented in the game.
Right, yeah, I don't doubt AMD's features or anything; in fact their software has always looked superior to Nvidia's, tbh. I was just curious about the downsampling. Nvidia claims DLDSR 2.25x has the same quality as DSR 4x, and it's kinda true. Not only that, it doesn't mess up the desktop as badly as DSR does; in other words, it's easier to run Windows itself at 4K on a 1440p monitor than at 5120x2880, which would mean constantly changing resolutions every time you tab out, which is a pain.

How would you say the downsampling picture quality compares on its own: is it as good as DLDSR, or is it closer to DSR? I really wish I had an AMD card to test on, because this point might very well sway my future buying decisions.

More than anything I hate aliasing, and DLDSR is my best tool against it. Sure, I might miss DLSS, but I can live without it, especially if XeSS comes to more games. But I need good downsampling, and 4x isn't always realistic in new games; with the old method, that's mathematically the best factor for making the pixels converge evenly. But Nvidia claims to have overcome this with DLDSR, and it's hard to argue with the results; they are good.
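The "converge evenly" point is about integer per-axis scaling: since a DSR/VSR factor multiplies the total pixel count, only a factor whose square root is a whole number maps each display pixel to an exact block of rendered pixels. A quick illustration of the arithmetic (the factor values are the common Nvidia presets; this is just the math, not vendor code):

```python
import math

# Common DSR/DLDSR factors and their per-axis scales
for factor in (1.78, 2.25, 4.0):
    axis = math.sqrt(factor)
    if axis.is_integer():
        note = "integer: each display pixel averages an exact 2x2 block"
    else:
        note = "fractional: downsampling needs filtering across pixel boundaries"
    print(f"{factor}x total -> {axis:.2f}x per axis ({note})")
```

That's why 4x (2x per axis) was the classic "clean" DSR factor, and why DLDSR's claim that 2.25x (1.5x per axis) matches it relies on smarter filtering rather than even pixel convergence.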
 
That resolution really does exist; a random screen I found:
I'll write it again: I use Virtual Super Resolution, which creates resolutions bigger than the monitor's default. You can create them through the driver or a third-party program (2K, 4K UW, etc.); if your monitor supports that resolution/refresh rate, it will work.

Right, yeah, I don't doubt AMD's features or anything; in fact their software has always looked superior to Nvidia's, tbh. I was just curious about the downsampling. Nvidia claims DLDSR 2.25x has the same quality as DSR 4x, and it's kinda true. Not only that, it doesn't mess up the desktop as badly as DSR does; in other words, it's easier to run Windows itself at 4K on a 1440p monitor than at 5120x2880, which would mean constantly changing resolutions every time you tab out, which is a pain.

How would you say the downsampling picture quality compares on its own: is it as good as DLDSR, or is it closer to DSR? I really wish I had an AMD card to test on, because this point might very well sway my future buying decisions.

More than anything I hate aliasing, and DLDSR is my best tool against it. Sure, I might miss DLSS, but I can live without it, especially if XeSS comes to more games. But I need good downsampling, and 4x isn't always realistic in new games; with the old method, that's mathematically the best factor for making the pixels converge evenly. But Nvidia claims to have overcome this with DLDSR, and it's hard to argue with the results; they are good.
It's simple: for alt-tab I can set the desktop resolution to match the game, and when I'm done playing I just set it back to default.

DSR (Dynamic Super Resolution) and VSR (Virtual Super Resolution) do the same thing.

In this situation I don't need to downsample; I just play at a higher resolution, and FG doubles the framerate, so I'm actually playing at that higher resolution with playable FPS.
 
It depends on the resolution... some games at 8K, FSR Quality to Ultra Performance.
FSR 3.1 + FG works great in Sony titles at 8K, such as Horizon Forbidden West and Spider-Man...
 
Everyone who has been involved in this discussion should watch the following.
[YouTube video: QKv9KzFnujE]
Jeff got a lot right.
@CraftComputing
Yes, enjoyed every moment of this one. You discussed a lot of important info and clarified much of it.
While I disagree on a couple points you made, those points are subjective and very much personal preference, not going to waste anyone's time with them.

Well done Jeff!
I lost interest within the first 5 seconds and stopped watching at the 30-second mark. Maybe it's informative, but the intro comes off as an ad and nothing more.
 
My poor slow Dell laptop doesn't have a dGPU, so I always pick Ultra Performance in CS2.
 