
Can someone explain to me how Super Resolution Fidelity works?

I'm confused about all this. So I just go from 2560x1440 to 1920x1080 and then enable Super Resolution? And it's supposed to look like it's at 2560x1440? But I get the FPS boost from going down to 1920x1080? Is that what this is supposed to do? I have a 6900 XT. I tried this in Total War: Warhammer 3 and I get about 40 FPS more. The game looks the same to me clarity-wise, but everything did get larger. If I alt-tab, the game blurs out, then when I tab back in it sharpens back up, so am I seeing Super Resolution turning off and on from the alt-tab? The AMD notification says 1080 to 1440 when the game launches. But I will say the icons and menus in the game do get larger, i.e. like being at 1080p versus 1440p. So what is the honest point of this? Everything still gets bigger from dropping the resolution. Is it just supposed to make things look "sharper", like they would at 1440p?

I watched the videos and read some tidbits, but all they do is show image-to-image comparisons, and they look near identical to me minus some small discrepancies. From what I can see, the game did drop down to 1080p, hence why everything got larger, but is it just supposed to make everything "sharper", like it would be at 1440p? EDIT: I tried 1080p with it off, and yeah, I see what it does. 1080p looks like poop compared to 1440p, and with Fidelity on it sharpens up and looks near identical to 1440p in clarity and detail. However, everything is still at 1080p, hence why stuff looks "zoomed in". I guess I can see the point of this: if you're getting lower FPS than you want, all you have to do is drop the resolution to get way more FPS while keeping visuals that stay sharp and close to 1440p.

I have a 6900 XT.
 
So I don't have to drop my resolution in game anymore? The description in AMD's software says I do, and that it will only be enabled when I set the resolution below native. If I have the game at my monitor's native resolution (1440p), the AMD pop-up when I launch the game is yellow, like it's on standby; when I drop the in-game resolution to 1920x1080 and restart the game, the AMD pop-up says 1080 to 1440 with a green icon. So what do you mean?
 
In principle, the game engine reads the monitor model and the computer's characteristics and offers a profile with appropriate settings. FSR then works automatically; there is no need to manually select the upscale resolution. Fumbling with settings without knowing what they do usually does not improve the results.
 
I'm confused about all this. So I just go from 2560x1440 to 1920x1080 and then enable Super Resolution? And it's supposed to look like it's at 2560x1440?

No, you stay at 2560x1440 and enable FSR. The game will render internally at a lower resolution and be upscaled; you get the best results if you enable this while at your native resolution.
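
To make the "renders internally at a lower resolution" part concrete, here's a rough sketch using the commonly quoted FSR 1.0 per-axis scale factors (exact presets vary by game and FSR version; the function and output are my own illustration, not AMD's API):

```python
# Rough sketch only: commonly quoted FSR 1.0 per-axis scale factors.
# Exact presets vary by game and FSR version.
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(native_w: int, native_h: int, scale: float):
    """Resolution the game renders at internally before FSR upscales to native."""
    return round(native_w / scale), round(native_h / scale)

for mode, scale in FSR_MODES.items():
    w, h = internal_resolution(2560, 1440, scale)
    print(f"{mode:13}: renders ~{w}x{h}, output stays 2560x1440")
```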
 
That's because RSR is supposed to work in 99% of games without needing a per-game implementation like FSR. FSR renders UI elements like text, icons and other HUD elements at native resolution but upscales everything else. RSR is "dumb" and upscales the UI along with everything else, which is why the icons and text look bigger. If the game has no FSR support, RSR is the next best thing, but it has this downside.
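
A purely conceptual sketch of that difference (made-up function names standing in for the render pipeline, not any real API), showing where the HUD gets composited relative to the upscale pass:

```python
# Conceptual sketch only: strings stand in for images.
def upscale(image: str) -> str:
    return f"upscaled({image})"

def composite(scene: str, hud: str) -> str:
    return f"{scene} + {hud}"

# In-game FSR: the 3D scene is upscaled first, then the HUD is drawn on top
# at native resolution, so text and icons stay sharp and normal-sized.
fsr_frame = composite(upscale("scene@1920x1080"), "hud@2560x1440")

# Driver-level RSR: the game has already drawn its HUD at the lower
# resolution, so the HUD is upscaled (enlarged and softened) with the rest.
rsr_frame = upscale(composite("scene@1920x1080", "hud@1920x1080"))

print(fsr_frame)  # upscaled(scene@1920x1080) + hud@2560x1440
print(rsr_frame)  # upscaled(scene@1920x1080 + hud@1920x1080)
```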
 
I think we're getting some things mixed up here.

By "Super resolution Fidelity" I am assuming OP actually means "Fidelity Super resolution" which is FSR, RSR (Radeon Super Resolution) works independently from FSR at the driver level.
 
The only option in the AMD drivers is called "Fidelity Super Resolution". It says I have to manually turn the game resolution down.

Like Hawk said. Besides everything getting larger due to the lower resolution, I also noticed that the jaggies that were better hidden at the higher resolution appear much more noticeable now that the resolution has been lowered.

So in a way this Super Resolution sucks. Yeah, I get it, the textures or pixels or whatever get upscaled, but everything is horribly jagged, at least more than normal, unless I turn on TAA, which blurs everything out in games that use it.

As for the other option you're referring to, I think that one has to be enabled in game, which Total War: Warhammer doesn't support. So I would have to use the other Super Resolution thing where I have to manually turn the game resolution down while the video card upscales to my native resolution.
 
It's not easy to explain how something works to someone with a basic lack of knowledge. :(
Separately, there may be problems with this technology in some games or with game mods.
 
I'm confused about all this. So I just go from 2560x1440 to 1920x1080 and then enable Super Resolution? [...] I have a 6900 XT.
Where are your System specs?
 
Is this "FidelityFX™ Super Resolution" aka FSR, or something else?


With FSR you don't change your resolution; the quality settings in FSR-enabled games lower the internal render resolution automatically for you.
It's designed to leave some elements alone while lowering others. In some games it's great, in others it's terrible.

The entire point is to get a performance boost; it's not meant to make things look better.

VSR, aka Virtual Super Resolution, does the opposite by rendering at a higher resolution and then downscaling the image, but it comes with a massive performance cost instead, and neither AMD's nor Nvidia's implementations of that tech are great (Nvidia calls it DSR, DLDSR or DLAA).
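
As a rough sketch of how those supersampling factors work (my own illustration; DSR/DLDSR/VSR factors are normally quoted as total-pixel multipliers, so the per-axis scale is the square root, and the driver's rounding of the final resolution can differ slightly):

```python
import math

# Sketch: a "2.25x" factor means 2.25x the pixels, i.e. 1.5x per axis.
def supersampled_resolution(native_w: int, native_h: int, pixel_factor: float):
    per_axis = math.sqrt(pixel_factor)
    return round(native_w * per_axis), round(native_h * per_axis)

for factor in (2.25, 4.0):
    w, h = supersampled_resolution(2560, 1440, factor)
    print(f"{factor}x: render {w}x{h}, then downscale back to 2560x1440")
# 2.25x: render 3840x2160, then downscale back to 2560x1440
# 4.0x:  render 5120x2880, then downscale back to 2560x1440
```
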
Excuse me? I already figured it out by just reading more. If you said that to me in person I'd kick your teeth in. That was rude af.
Threaten anyone on the forum, and your account will be suspended.
 
Please, if only the threats against me are considered, let the OP's punishment be revoked or reduced. I am not affected by those words; proof is that I didn't report the post.

As a rule, TPU does not tolerate abusive content, whether or not the other person takes offence. Behaviour that involves a direct threat toward others is removed and points are given. Other factors also come into play that are not always visible to the general TPU user base. Where members show contrition, or at least provide a reasonable excuse for 'transient' behaviour, the mod team can reconsider the infractions given.
 
Please, if only the threats against me are considered, let the OP's punishment be revoked or reduced. I am not affected by those words; proof is that I didn't report the post.
There were multiple comments to multiple people, including some aimed at me.
Regardless of whether someone is offended or not, the things he said simply weren't acceptable here, and when asked to stop he doubled down.

What's left visible is the stuff that's deemed fine to leave behind - so if you can see it, it wasn't the problem.
 
Sounds like a child attempting to troll.
Normally a thread would be locked by now and probably should be, but if it's educational it's educational.

Some people want help - but they also cannot stand to be corrected or made to look foolish.
And that's very hard to avoid if those same people refuse to understand that they are asking strangers for help, for free, and should use the required manners and etiquette while doing so.
 
Another good resource for quality comparisons is the Digital Foundry YouTube channel.
Big +1 to DF being a fantastic source for a highly technical look at, among other things, upscaling technology.

Their first video covered FSR 1.0, but there is more recent content which is highly informative too, covering competing tech like DLSS, XeSS and the newer FSR 2.0; it can give you a great understanding of how it all works and what you can expect from it. RSR is essentially FSR 1.0 applied at the driver level to any game (or program?) you choose, but it upscales the HUD and UI elements too, which FSR implemented in a game will not do.
 
TL;DR: They don't always just render at a higher res and scale it down; sometimes it feels like they're rendering, say, four different displays at once, one frame at a time, drastically increasing render/input latency.
The newer versions of this tech (DLAA and DLDSR vs the old DSR) deal with this a lot better, which likely explains their math choices (2.25x vs 4x, etc.).
With Nvidia this is easy to measure, since the GeForce Experience overlay shows you render latency and system latency.

All these up- and downscalers are basically the same technology, just using different compression methods. It's like YouTube serving 4K video but outputting it on your 1080p screen, and somehow it still looks better.

You must not confuse render latency with frametimes. Frametimes are just 1000/FPS; render times are how long it took the GPU to render the frame after the CPU requested it.
If render times are higher than frametimes, you get input lag because you're reacting to an image from the past.
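
A minimal arithmetic sketch of that distinction (my own illustration; the 9.0 ms latency value is hypothetical, not a reading from the screenshots below):

```python
# Sketch: frametime is the time budget per frame; render latency is how stale
# the displayed frame is. Lag shows up when latency exceeds the frametime.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

def frames_behind(render_latency_ms: float, fps: float) -> float:
    return render_latency_ms / frametime_ms(fps)

fps = 101
render_latency = 9.0  # ms, hypothetical overlay reading
print(f"frametime:     {frametime_ms(fps):.1f} ms")                 # ~9.9 ms
print(f"frames behind: {frames_behind(render_latency, fps):.2f}")   # <1 -> no felt lag
```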


I just went and played with the nvidia version of this, DLDSR

It's rendering at a higher resolution, then downscaling, more or less like an AA setting that shows you a resolution that isn't actually in use, like a reverse of the high-DPI scaling in Windows.
At 4K I run 150% zoom (which gives me an effective 2D resolution of 2560x1440), and using DLDSR pushes that 2D back to 4K.

VSR, DSR, DLDSR and DLAA all do the same thing with different mathematical magic to make the most of the pixels. One example I had was in DRG (UE4 engine): at 4K, a monitor screen in the loading area had readable text, while at 1440p the text could not be read.

With DLAA (and, interestingly, DLSS at the max quality setting) that text became legible.


The downside? They require extra rendering work and add a lot to render latency.
In StarCraft II, an easy-to-render DX9 title, I can get as low as 3ms render latency at native resolution.


These values are taken *while paused* so that there is no CPU or GPU variation; the only difference is the resolution chosen.
The options:
[screenshot: the available resolution options]


Other relevant settings: DX9, in-game Vsync off, Nvidia Fast Vsync forced on, Ultra Low Latency mode forced on.
Without this combination I was seeing render latency start at 50ms and spike higher, which was about as fun as 20 FPS.

2.25x resolution:
Math-wise this is acceptable: 1000 divided by 101 FPS is roughly 9.9ms per frame, and I'm getting render latency below what is needed for that frame rate, so it feels perfectly fine.
[screenshot: 2.25x latency readings]


Then 1.78x:
1000/122 = 8.2ms
So the render latency is making this frame rate feel slower than it actually is, but it's still fast enough not to notice.
On a 60Hz display with 16.6ms per frame this is invisible, but on a 165Hz display at 6.0ms per frame, it could feel slightly slower than normal even with high FPS values.
[screenshot: 1.78x latency readings]



Now stock 4K.
We're seeing 130 FPS at native drop to 101 FPS at 2.25x, so Nvidia is right: this scaling gives *amazing* performance for what it visually achieves.
But we're also down to 5.3ms, more than low enough not to feel any lag whatsoever, even on a 165Hz display.
[screenshot: native 4K latency readings]


Dropping to 1080p didn't lower the render times much at all here, so it's not about the resolution itself.
If you really wanted 240Hz or something for esports this might matter, but clearly DLDSR is the cause of the latency here, not the higher resolution.
[screenshot: 1080p latency readings]



Summary here: since SC2 drops its frame rate below 60 a lot, you might as well enjoy the visual quality, as long as the render latencies don't go too high. High latency here makes it damned hard to click where you want in combat, but keyboard shortcuts would alleviate that a lot.


I'll show stock DLAA and DLSS next with DRG, and combine them if the game lets me. The posts will probably auto-merge when I do.

DRG:
Notice this is a different value, total system latency; it includes the time the monitor takes to display the image at the current refresh rate. It cannot be directly compared with the SC2 results above.
When I alt-tab in and out, the render latency reading briefly appears, but it's erratic due to the tabbing.

I run a 120 FPS cap here, so the FPS should be the same for every test, showing just the changes from the rendering techniques.
Obviously, with uncapped FPS my latencies get lower, but once you reach the point where your GPU is always 100% loaded, latency gets WORSE, not better.

I'm running two 4K 60Hz monitors overclocked to 65Hz. It's uncommon, but combined with keeping latencies low, I get the fantastic visuals and the low input latency of my 165Hz display, just with the slower visual refresh leading to a little motion blurring. Since my 4K 60Hz and 1440p 165Hz monitors use the same 4ms VA panel type, visually they're damned near impossible to tell apart in person.

All results are with:
Fast Vsync on (doesn't work in DX12)
In game Vsync off
FPS Limiter enabled
Reflex enabled

Native 4K, 65 FPS cap
My 65Hz display maths out to 15.38ms per frame, so this is pretty much bang on "perfect".
[screenshot: native 4K, 65 FPS cap latency readings]


Native 4K, 120 FPS cap (in-game limiter)
Now, while my display can only update every 15.38ms, the frames are rendered in 13ms, meaning what I see is about 2ms newer. A small change, but it's the benefit of a higher-refresh display, on a lower-refresh display.
[screenshot: native 4K, 120 FPS cap latency readings]


4K, DLSS Quality
12.5ms maths out to 80Hz/FPS here.
Far faster than my 65Hz display can show, so again it feels really fast and responsive.
[screenshot: 4K DLSS Quality latency readings]


This game supports FidelityFX Super Resolution 1.0 and 2.0, so here they are for direct comparison. 1.0 does well, with some artifacting; 2.0 looks like ass at present.
(In this title, 2.0 looks and runs worse)
[screenshots: FSR 1.0 (left) vs FSR 2.0 (right)]




Now we bring in the DLDSR resolutions.
It gets ugly.

2.25x
5760x3240
65 fps cap
[screenshot: 2.25x, 65 FPS cap latency readings]

Despite being at 65 FPS, that latency maths out to around 40Hz, so this feels slow.
Not unusably slow, but it definitely throws aim off, making you miss shots and mis-time jumps.



2.25x
5760x3240
120 FPS cap - the GPU can't reach it, so this might as well be unlimited
[screenshot: 2.25x, 120 FPS cap latency readings]

92 FPS should be around 10.9ms per frame, but it's four times that, so you're roughly four frames behind where the game is.
Because the GPU fell behind, the CPU had frames ready and waiting, so by the time the GPU can use them they're old.
This feels like using a laggy Bluetooth mouse and is impossible to game with.
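
A quick sketch of that math (my own illustration; the 43.5ms reading is hypothetical, and the 20.3ms value is simply back-calculated from the ~49 FPS "feel" mentioned further down):

```python
# Sketch: when render latency balloons, the "feel" tracks the latency,
# not the frame counter.
def lag_summary(fps: float, render_latency_ms: float) -> str:
    frametime = 1000.0 / fps
    behind = render_latency_ms / frametime      # frames of lag queued up
    feels_like = 1000.0 / render_latency_ms     # equivalent responsiveness
    return (f"{fps:.0f} FPS at {render_latency_ms:.1f} ms latency -> "
            f"~{behind:.1f} frames behind, feels like ~{feels_like:.0f} FPS")

print(lag_summary(92, 43.5))   # ~4.0 frames behind, feels like ~23 FPS
print(lag_summary(194, 20.3))  # ~3.9 frames behind, feels like ~49 FPS
```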

Combining this with DLSS Quality to remove some of the GPU load still leaves it in laggy, unusable territory.
[screenshot: 2.25x + DLSS Quality latency readings]



Then to top it off, we use DLAA, which does much the same as the example above, combining a higher render resolution with DLSS to drop back to native resolution.

[screenshot: DLAA latency readings]

And it's 1/3 the latency! What a win! DLAA for life, am I right?!


The key here is that if GPU usage hits 100% (such as when I uncap the FPS), DLAA goes to shit.
[screenshot: DLAA with uncapped FPS latency readings]

4k

Whatever resolution this maths out to with DLAA, we're seeing 194 FPS (about 5.15ms per frame) combined with roughly four times that in latency.
This ~200 FPS experience has the same latency and feel as running at a regular old 49.2 FPS.




Summary:

If you can use this tech without your GPU hitting 100%, you can have a good experience.
If you hit 100% usage, even sporadically during gameplay: no. God no.


To show the worst possible hell, let's try that 2.25x res with DLAA.
[screenshot: 2.25x + DLAA latency readings]

That cinematic 19Hz feel really sells it, ya know?
 