
AMD Radeon Super Resolution RSR Quality & Performance

Same… at 3440x1440 there’s nothing I could downscale to that would work right. I do play quite a few games with FSR, which has been great for the most part, but I’ve seen some bad implementations. Hopefully FSR 2.0 will make it into the games already using it.
Games adhering to accepted standard resolutions, combined with the mess that is ultrawide aspect ratios, are likely to blame here. I mean, there are tons of standard 16:9 resolutions, and they all match each other's proportions. Ultrawide ("21:9", though it rarely is) has ... three? Four? And most are really high? And none of them are actually the same aspect ratio? You have 2560x1080 on the low end (2.370370370:1, 21.33333:9), 3440x1440 as the most common option (2.3888889:1, 21.5:9), 3840x1600 as the premium/38" variant (2.4:1, 21.6:9), and then you have the ultra-high-end "5k" 5120x2160 (the same as 2560x1080, as it's 2x the pixels in each direction). So, scaling between any of these resolutions will either result in a stretched or cropped image, or in black bars somewhere. I can understand why nobody would want to implement a system that allows for that kind of scaling without tuning it per application, as the risk of distortions and things looking bad is quite significant.
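To make the mismatch concrete, here's a quick sketch (my own numbers and illustration, nothing from AMD's driver) of what an aspect-preserving upscale from the 2560x1080 flavour of "21:9" to a 3440x1440 panel actually leaves you with:

    #include <algorithm>
    #include <cstdio>

    int main() {
        const double srcW = 2560, srcH = 1080;  // 2.370:1 ("21.33:9")
        const double dstW = 3440, dstH = 1440;  // 2.389:1 ("21.5:9")

        // Scale uniformly so nothing is stretched or cropped.
        double scale = std::min(dstW / srcW, dstH / srcH);
        double imgW = srcW * scale, imgH = srcH * scale;

        std::printf("scaled image:  %.0f x %.0f\n", imgW, imgH);
        std::printf("leftover bars: %.0f px horizontal, %.0f px vertical\n",
                    dstW - imgW, dstH - imgH);
    }

You end up with roughly 3413x1440 of image and about 27 px of pillarboxing in total, or you stretch by ~1% and nothing is quite the right shape, which is exactly the per-application tuning problem described above.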
 
RSR is a copy of NIS in the way you apply it, and both are spatial upscalers, but NIS is Lanczos plus their own brand of sharpening pass, whereas RSR/FSR uses Edge-Adaptive Spatial Upsampling plus Contrast Adaptive Sharpening.

The confusion started when a prominent YouTuber found that FSR had a Lanczos algorithm in it, but it's way more than just that. FSR focuses on cleaning up the edges it can detect to make them smoother. Also, Lanczos can introduce ringing artefacts, and the way each algorithm removes them is different.
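For anyone curious about the ringing remark: the Lanczos kernel has negative lobes, and those negative weights are what overshoot around hard edges. A minimal sketch (my own, not NIS or FSR code) that just prints the kernel values:

    #include <cmath>
    #include <cstdio>

    // Lanczos kernel with a = 2 (two lobes on each side of the sample).
    double lanczos2(double x) {
        if (x == 0.0) return 1.0;
        if (std::fabs(x) >= 2.0) return 0.0;
        const double pi = 3.14159265358979;
        return (std::sin(pi * x) / (pi * x)) *
               (std::sin(pi * x / 2.0) / (pi * x / 2.0));
    }

    int main() {
        for (double x = 0.0; x <= 2.0; x += 0.25)
            std::printf("L(%.2f) = %+.4f\n", x, lanczos2(x));
    }

Everything between x = 1 and x = 2 comes out negative (roughly -0.09 at its worst), which is where the overshoot/ringing comes from, and as said above, each upscaler deals with that differently.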

Technically, FSR/RSR is more advanced than NIS. And since it's open source, Nvidia could just use it. Not going to happen, but if they wanted to, they could.
 
Same… at 3440x1440 there’s nothing I could downscale to that would work right. I do play quite a few games with FSR, which has been great for the most part, but I’ve seen some bad implementations. Hopefully FSR 2.0 will make it into the games already using it.
I used 1920x800 in the game but I wasn't impressed by the graphics quality.
 
I used 1920x800 in the game but I wasn't impressed by the graphics quality.
That's a pretty high scaling factor though - nearly 1.8x, and a rather weird ratio to boot. Still, was your quality comparable to the 1080p-to-4k examples here (2x scaling factor)? And how did it look compared to running that resolution natively? Upscaling will always be some kind of tradeoff between performance and visual quality (until we get ubiquitous AI-quality upscaling, at least), so judging it only against the higher resolution is only one side of the comparison - there should also be a comparison against the lower resolution without upscaling. The closer to the former and the further from the latter the upscaled quality is, the better the result.
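For reference, the arithmetic behind "nearly 1.8x", assuming the target was a 3440x1440 panel (the thread context suggests it, but the post doesn't say):

    #include <cstdio>

    int main() {
        const double srcW = 1920, srcH = 800, dstW = 3440, dstH = 1440;
        std::printf("per-axis scale: %.2fx horizontal, %.2fx vertical\n",
                    dstW / srcW, dstH / srcH);   // ~1.79x and 1.80x
        std::printf("aspect ratios:  %.3f:1 source vs %.3f:1 target\n",
                    srcW / srcH, dstW / dstH);   // 2.400:1 vs 2.389:1
    }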
 
Games adhering to accepted standard resolutions, combined with the mess that is ultrawide aspect ratios, are likely to blame here. I mean, there are tons of standard 16:9 resolutions, and they all match each other's proportions. Ultrawide ("21:9", though it rarely is) has ... three? Four? And most are really high? And none of them are actually the same aspect ratio? You have 2560x1080 on the low end (2.370370370:1, 21.33333:9), 3440x1440 as the most common option (2.3888889:1, 21.5:9), 3840x1600 as the premium/38" variant (2.4:1, 21.6:9), and then you have the ultra-high-end "5k" 5120x2160 (the same as 2560x1080, as it's 2x the pixels in each direction). So, scaling between any of these resolutions will either result in a stretched or cropped image, or in black bars somewhere. I can understand why nobody would want to implement a system that allows for that kind of scaling without tuning it per application, as the risk of distortions and things looking bad is quite significant.
Totally agree, and I knew going in there would be issues at some point (tweaks, hacks, etc.), but I’m also a “pure native” guy, so while sure, I feel a little left out, this really isn’t a feature I would use, unlike FSR. So far I’ve only had to “hack” two games (Plague’s Tale, where I don’t like the “solution” because it uses Cheat Engine, and basically anything TellTale, though there’s a one-click “patcher” for any of them). The new Far Lone Sails only goes up to 1440, so there’s nothing I can do about that; hopefully the devs sort it. I stayed away from UW as long as I could, but this was always going to be my “final monitor” and I just happened to have extra cash, so I went for it. This is just one of those “compromises” I knew I’d have to make.
 
I'm loving the idea of RSR, specifically for demanding games that don't run well at 4K, but lack FSR/DLSS support.

Something like two-thirds of my Steam library could benefit from it if I still had an AMD GPU for gaming, but I'm more interested in how a planned Ryzen 6800U laptop will benefit from 1080p > 4K upscaling with RSR on the RDNA2 iGPU. The "Radeon 5000-series" requirement means it's unlikely to work on my Renoir's Vega 8 graphics.

Games adhering to accepted standard resolutions, combined with the mess that is ultrawide aspect ratios, are likely to blame here. I mean, there are tons of standard 16:9 resolutions, and they all match each other's proportions. Ultrawide ("21:9", though it rarely is) has ... three? Four? And most are really high? And none of them are actually the same aspect ratio? You have 2560x1080 on the low end (2.370370370:1, 21.33333:9), 3440x1440 as the most common option (2.3888889:1, 21.5:9), 3840x1600 as the premium/38" variant (2.4:1, 21.6:9), and then you have the ultra-high-end "5k" 5120x2160 (the same as 2560x1080, as it's 2x the pixels in each direction). So, scaling between any of these resolutions will either result in a stretched or cropped image, or in black bars somewhere. I can understand why nobody would want to implement a system that allows for that kind of scaling without tuning it per application, as the risk of distortions and things looking bad is quite significant.
Ultrawide "standards" were such a mess that after an 18-month experiment half a decade ago trying to daily-drive "21:9" I just gave up and bought another 32" 16:9 display instead. It's big enough that I can just run it with black bars top and bottom if I really really need the wider aspect ratio. True cinema DCP "2.35:1" aspect ratio isn't even actually 2.35:1 and at 4096 x 1716 it's a super-odd resolution that doesn't fit panel manufacturers neatly. 3840x1600 can't handle it properly, and if you're not getting ultrawide for cinematic content, what is the point?!
 
Something like two-thirds of my Steam library could benefit from it if I still had an AMD GPU for gaming, but I'm more interested in how a planned Ryzen 6800U laptop will benefit from 1080p > 4K upscaling with RSR on the RDNA2 iGPU. The "Radeon 5000-series" requirement means it's unlikely to work on my Renoir's Vega 8 graphics.
Sadly, that’s a huge complaint right now. Because of the Vega iGPU you can’t use RSR, which is a very poor effort considering how much it would benefit laptops…
 
Sadly, that’s a huge complaint right now. Because of the Vega iGPU you can’t use RSR, which is a very poor effort considering how much it would benefit laptops…
Vega is ancient though, and I'm not even convinced that any of the Vega IGPs can really game at > 720p in anything other than decade-old titles. I'm lucky in that my laptop has a 100Hz panel that I can run at 90, 85, and 75Hz too - so 720p50 (half vsync) actually looks reasonable for FPS games and 1080p42 or 38 (half 85/75Hz) works for sightseeing games that don't need fast aiming or reflexes.
 
Vega is ancient though, and I'm not even convinced that any of the Vega IGPs can really game at > 720p in anything other than decade-old titles.

New products using it are still being released though...

No need for decade-old titles either, just nothing recent. And emulation would also benefit a lot (although that's not a targetable use case).
 
Vega is ancient though, and I'm not even convinced that any of the Vega IGPs can really game at > 720p in anything other than decade-old titles. I'm lucky in that my laptop has a 100Hz panel that I can run at 90, 85, and 75Hz too - so 720p50 (half vsync) actually looks reasonable for FPS games and 1080p42 or 38 (half 85/75Hz) works for sightseeing games that don't need fast aiming or reflexes.
Yeah, a friend of mine has the ASUS 5900H/6800M. He had some luck with basic old stuff like Portal etc. on it, but the fact that it’s basically “blocking” the compatible dGPU is not cool…
 
You must have had some particularly bad luck. As someone who's never owned an Nvidia GPU, I've obviously seen my share of driver bugs and issues too, but nothing I would describe as particularly bad. (I never owned a 5700 XT though; that situation is still pretty bad, but seems to me like a hardware bug of some kind.) In my experience AMD drivers for the past half decade or so have been frequently updated, with most bugfixes being rapid and well executed. I'm having a pretty bad time with Elden Ring still, but ... that's Elden Ring, and not a GPU driver issue. Beyond that, and some crash-to-desktop bug with Alien: Isolation a long, long time ago, I can't remember when I last ran into anything particularly annoying.
Bad luck is all I have, but when I have bad experiences, I don't bother with whatever has been giving me those bad experiences anymore.
 
Really neat overall, definitely looks better than I expected. Although as it stands right now, for me and my current hardware (R7 5800X, 6700 XT, 1440p 144hz display), I can maintain a constant 144fps or higher at high (if not mostly maxed) quality settings, and it's only the very newest of games (which often aren't the most well-optimized, for the first several months at least) that I have to settle for a lower constant fps at higher quality, or lower quality settings if I want to maintain a constant high fps. Halo Infinite was a good example, I had to make use of the in-game dynamic resolution feature to have a great experience (mainly for fps consistency). Would be interesting to see how RSR handles it now. Raytracing is also very low on my priority list currently, so even if I'm playing a game that supports it, like Doom Eternal, I'd much rather play with it off to be able to max everything out and have extremely consistent + high fps. That game in particular is so fast paced I think it makes very little difference overall, so even with RSR available I doubt I'd use it much. And we all know that AMD's raytracing tech isn't quite there yet, especially on something like the 6700 XT, so I'm generally of the mindset that it's just not something I'm expecting to use despite the existence of RSR.

This is all to say, I think it's a good feature and I'm very glad it exists, but for me personally I doubt I'll be using it much until I start consistently playing games that push my hardware to its limits, which I imagine won't be the case for a while - hopefully. I agree with the common sentiment that this feature would be much more useful for older GPU architectures. At the very least Vega should have been included. But I'm not quite up in arms yet; I also agree with the notion that AMD could very well be planning on releasing the feature for older GPUs down the road, and just needs more time to tune the feature for those architectures. Idk.
 
I think it looks like an improvement, and better than the initial FSR by a fair bit. The only thing I wish AMD would do with it is make the render pipeline flexible in regard to which pipeline blocks are processed in which order. With performance FSR or quality FSR you might want tone mapping, post-processing, and HUD considerations and trade-offs handled before or after the FSR upscale via a toggle option. There are real performance and image quality considerations for all three being applied before or after FSR; applying them before is easier on GPU resources.

Tone mapping is actually low enough in resource usage (provided it isn't using LUT texture filter scaling) that you can apply it after FSR without noticeably or measurably impacting GPU resource usage in a way that actually matters, but it will change the tone mapping results a bit, since it processes a few fewer or more sub-pixel details depending on whether it sits before or after the scaling in the render pipeline.

For both post-processing and the HUD there are performance and image quality reasons why you might want them before or after, and that can also vary between FSR performance and quality modes. With FSR performance you have more leeway to process them after the FSR upscale without it impacting GPU resource overhead as much, because it's a lower upscale quality in the first place. With quality FSR, however, you might still want to put preference towards image quality regardless of the added performance impact.

If AMD has the ability and option to place toggles that swap and invert the order in which these render methods are processed in the pipeline, it would be a good thing to consider for the reasons I tried to highlight. If it can be made configurable for the different elements involved, within reason, I feel it should be.
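In case the wish isn't clear, here's a rough sketch of the kind of toggle being described (entirely hypothetical, not an AMD API): letting the pipeline decide whether tone mapping and the HUD are composited before or after the FSR upscale pass.

    #include <cstdio>

    struct Frame { int w, h; };

    Frame renderScene(int w, int h)              { return {w, h}; }
    void  toneMap(Frame&)                        { std::puts("tone map"); }
    Frame fsrUpscale(const Frame&, int W, int H) { std::puts("FSR upscale"); return {W, H}; }
    void  drawHud(Frame& f)                      { std::printf("HUD at %dx%d\n", f.w, f.h); }

    int main() {
        bool hudAfterUpscale     = true;   // HUD stays sharp at output resolution
        bool toneMapAfterUpscale = false;  // cheaper: tone map fewer pixels first

        Frame f = renderScene(2293, 960);  // hypothetical "Quality" input for 3440x1440
        if (!toneMapAfterUpscale) toneMap(f);
        if (!hudAfterUpscale)     drawHud(f);

        f = fsrUpscale(f, 3440, 1440);

        if (toneMapAfterUpscale)  toneMap(f);
        if (hudAfterUpscale)      drawHud(f);
    }

Doing the HUD after the upscale keeps it sharp at output resolution but costs a bit more; doing it before is cheaper, which is the trade-off described above.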
 
I think it looks like an improvement, and better than the initial FSR by a fair bit. The only thing I wish AMD would do with it is make the render pipeline flexible in regard to the HUD and FSR, so you can swap and interchange the order in which those two portions are processed. That would help for games where the HUD elements don't look so good without scaling them after the FSR pass, as opposed to before it, which is how it's set up now (though the previous FSR 1.0 processed the HUD after the FSR pass). Having a toggle for that would be really nice. I can see why you might want to process the HUD before FSR for performance reasons, of course, but at the same time I see cases where doing it after would be preferable. If AMD can make that happen, they should strongly consider it.
What you're describing here is literally the difference between RSR and FSR, and is why FSR requires a per-game implementation. You can't insert a new rendering pass before the HUD is applied on a driver level, you can only apply post-processing to the finalized image after the game is done rendering it. And you certainly can't convince the game to then render the UI at the upscaled output resolution without developer input. It would be fantastic if this was doable, but it isn't.
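A small sketch of that distinction (my own pseudo-structure, not real driver or FidelityFX code): with in-game FSR the developer inserts the upscale before the UI pass, so the HUD is drawn at output resolution; driver-level RSR only ever sees the finished frame the game presents, with the HUD already baked in at render resolution.

    #include <cstdio>

    struct Frame { int w, h; };

    Frame upscale(Frame f, int W, int H) { f.w = W; f.h = H; return f; }
    void  drawHud(const Frame& f)        { std::printf("HUD drawn at %dx%d\n", f.w, f.h); }

    int main() {
        // In-engine FSR 1.0: the developer inserts the pass, then the UI renders on top.
        Frame fsr{1920, 1080};              // 3D scene at render resolution
        fsr = upscale(fsr, 3840, 2160);     // FSR pass
        drawHud(fsr);                       // HUD at full 3840x2160

        // Driver-level RSR: the game presents a finished 1080p frame...
        Frame presented{1920, 1080};
        drawHud(presented);                         // ...HUD baked in at 1920x1080...
        presented = upscale(presented, 3840, 2160); // ...and RSR scales HUD and all.
    }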
 
It's a shame the DirectX API couldn't create symbolic links, placed into the gamelauncher.exe folder, to general-purpose post-process injection configurations that just get created automatically and live in one assigned folder for multipurpose post-process configuration. If the OS/API could just detect when a game is launched and place the symbolic link into the gamelauncher.exe folder, it would be nice. The actual physical copying of the shader folder with ReShade from game folder to game folder is a nuisance, and since they are in separate folders it makes configurations harder to manage: if you edit one for one game, then play another game using the same shader, the edit hasn't been copied over.

There could certainly be cases where you might not want the same configuration for every game, but in other cases a tone mapping configuration that works well in one game could, and should, work quite well in virtually any other game, and it could also be toggled on or off, or substituted for another generic one that's slightly different, with a lighter or darker strength emphasis for example. I wonder if the ReShade developers could even set something like that up in tandem with Microsoft; it would be a big improvement to have games detected automatically by the OS and render API itself.
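You can already approximate the shared-folder idea by hand with a directory symlink instead of copying reshade-shaders into every game folder. A minimal sketch (paths are placeholders; on Windows, creating symlinks may require admin rights or Developer Mode):

    #include <filesystem>
    #include <cstdio>

    namespace fs = std::filesystem;

    int main() {
        fs::path shared = "D:/PostFX/reshade-shaders";          // one central copy
        fs::path game   = "D:/Games/SomeGame/reshade-shaders";  // per-game link

        std::error_code ec;
        if (!fs::exists(game))
            fs::create_directory_symlink(shared, game, ec);

        if (ec)
            std::printf("could not create link: %s\n", ec.message().c_str());
        else
            std::puts("game now sees the shared shader folder; edits apply everywhere");
    }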
 
Looks like the 5500XT might be better value for money, with this

Edit: brain not working, the card everyone half hated with the 4x PCI-E lanes
 
I know that "resolution purists" will skip over this thread but for somebody with an good ole card like GTX 1070 upscaling makes games like God of War, Forza Horizon 5 and Halo look and perform great upscaled to 4k.
 
That's a pretty high scaling factor though - nearly 1.8x, and a rather weird ratio to boot. Still, was your quality comparable to the 1080p-to-4k examples here (2x scaling factor)? And how did it look compared to running that resolution natively? Upscaling will always be some kind of tradeoff between performance and visual quality (until we get ubiquitous AI-quality upscaling, at least), so judging it only against the higher resolution is only one side of the comparison - there should also be a comparison against the lower resolution without upscaling. The closer to the former and the further from the latter the upscaled quality is, the better the result.
Someone told me to create 2100x900; this works better and the visuals are good.
Of course, with an OC 6800XT I don't need it for the games that I play, but the gain is not bad.
 
fking stupid
slower and looks worse than native lol
 
fking stupid
slower and looks worse than native lol
... I think you're misunderstanding something here if that's your takeaway. It clearly looks worse than native 2160p in most cases (which is expected for upscaling - the question is by how much, and if it is a problem), but slower? Not even remotely. Remember, the "[resolution] RSR" examples here are listing the resolution being upscaled from, not the target resolution. "1080p RSR" is a 1080p image upscaled with RSR to 2160p; 1440p RSR is a 1440p image upscaled to 2160p. All of these vastly outperform native 2160p. They don't measure up visually, but they mostly provide obviously better image quality than natively rendering at those resolutions - especially if that isn't the native resolution of your monitor! - while nearly matching the performance of those lower resolutions.
 
Surely, this is the upscaling mode we'd want in Minecraft RTX if it needs to be better optimized on high-end AMD cards, giving us both ray tracing and upscaling, and on the 9th-generation consoles as well (it's compatible with consoles, you know, and Microsoft is buying into it).
 