
AMD's Elusive FidelityFX Super Resolution Coming This June?

btarunr

Editor & Senior Moderator
AMD FidelityFX Super Resolution (FSR), the company's elusive rival to the NVIDIA DLSS technology, could be arriving next month (June 2021), according to a Coreteks report. The report claims that the technology is already at an advanced stage of development, in the hands of game developers, and offers certain advantages over DLSS. For starters, it doesn't require training from a generative adversarial network (GAN), and doesn't rely on ground truth data. It is a low-overhead algorithmic upscaler (not unlike the various MadVR upscalers you're used to). It is implemented early in the pipeline. Here's the real kicker—the technology is compatible with NVIDIA GPUs, so studios developing with FSR would be serving both AMD Radeon and NVIDIA GeForce gamers.
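
To give a rough sense of what a "low-overhead algorithmic upscaler" means in practice, here is a minimal, purely illustrative Python/NumPy sketch of a single-pass spatial upscale followed by a cheap sharpen. This is not AMD's actual FSR algorithm (its internals have not been published at the time of writing); the function names and filter choices are invented for illustration, and a real implementation would run as a GPU shader pass inside the game's post-processing pipeline.

```python
# Illustrative only: a naive spatial upscaler (bilinear resample + cheap
# unsharp mask). NOT AMD's FSR algorithm; names and filters are invented
# for this sketch. A real implementation runs as a GPU shader pass.
import numpy as np

def bilinear_upscale(img: np.ndarray, scale: float) -> np.ndarray:
    """Upscale an HxWxC float image by `scale` using bilinear filtering."""
    h, w = img.shape[:2]
    out_h, out_w = int(h * scale), int(w * scale)
    # Map every destination pixel centre back to source coordinates.
    ys = np.clip((np.arange(out_h) + 0.5) / scale - 0.5, 0, h - 1)
    xs = np.clip((np.arange(out_w) + 0.5) / scale - 0.5, 0, w - 1)
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x0 + 1] * wx
    bot = img[y0 + 1][:, x0] * (1 - wx) + img[y0 + 1][:, x0 + 1] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img: np.ndarray, amount: float = 0.25) -> np.ndarray:
    """Cheap unsharp mask: push the image away from a blurred copy."""
    blur = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
            np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

# Example: take a 960x540 render target and produce a 1920x1080 frame.
frame_540p = np.random.rand(540, 960, 3).astype(np.float32)
frame_1080p = sharpen(bilinear_upscale(frame_540p, 2.0))
print(frame_1080p.shape)  # (1080, 1920, 3)
```

The only point of the sketch is that nothing in it needs per-game network training or ground-truth data, which is exactly the advantage the report attributes to FSR over DLSS.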



 
Compatible with Nvidia GPUs... yeah, this is going to be one of those virtual resolution things again that makes games look like crap. Hard pass. I'll still read the reviews, don't get me wrong, but I don't feel confident this will compete equally with DLSS 2.0 or DLSS 2.1.

That being said, I'm still happy I have my RX 6800 GPU.
 
If it's half as good as DLSS 2.0 but is super easy to put in all new titles, it'll be a winner; more choices are always good. I wouldn't be expecting results as good as DLSS 2.0 though.

Not confident on the timeline though: two months ago they had no idea how it would even work, and now it launches in a month? Let's see.
 
If DLSS was over-engineered, then it's quite possible that an alternative approach can prove good enough that there's no perceived difference during gameplay.

Besides, naysayers: if you're not a software or hardware engineer, or a data scientist with knowledge of machine learning, then I struggle to see how you can have a valid opinion beyond manufacturer hating or just plain moaning.
 
Nvidia jumped the gun: they released the utter-crap 2000 series, uber expensive with pathetic performance, and had to sell it basically on "features", so they forced ray tracing way too early! They knew it was shit, they knew it tanks performance insanely, so they decided to basically run games at 720p, upscale them to 4K with garbage graphics, and call it DLSS.

So you enable ray tracing to have *supposedly* better graphics, but it runs like crap, so you enable DLSS and ruin your graphics quality. You get NONE of the benefits! You essentially have worse graphics running at worse fps, all in order to have mildly better shadows, which actually become worse through DLSS upscaling.

And again, there is a very clear visual difference between native and DLSS. It does look significantly worse than native; even in Control, the best example Nvidia has, it still shows shimmering and interlacing-like artifacts.

What DLSS essentially is, is automated graphics-settings reduction. It reduces graphics settings, and thus graphical quality, for better performance; that is what it is. You get worse-quality graphics at better performance, and I get that in ALL games by lowering quality in the settings. Going from Ultra to Medium in most cases provides a 50-60% uplift in performance for small visual downgrades; in fact, I'd argue that if you tinker with the graphics settings you can have better visual quality and faster performance than with DLSS enabled.
 
Besides, naysayers: if you're not a software or hardware engineer, or a data scientist with knowledge of machine learning, then I struggle to see how you can have a valid opinion beyond manufacturer hating or just plain moaning.
So we're not allowed to have an opinion when we see the tech in action and make up our own minds? It doesn't exist yet, so there are many unknowns, but we're also allowed to spitball and share opinions on what has been shared so far.
And again, there is a very clear visual difference between native and DLSS. It does look significantly worse than native; even in Control, the best example Nvidia has, it still shows shimmering and interlacing-like artifacts.

What DLSS essentially is, is automated graphics-settings reduction. It reduces graphics settings, and thus graphical quality, for better performance; that is what it is. You get worse-quality graphics at better performance, and I get that in ALL games by lowering quality in the settings. Going from Ultra to Medium in most cases provides a 50-60% uplift in performance for small visual downgrades; in fact, I'd argue that if you tinker with the graphics settings you can have better visual quality and faster performance than with DLSS enabled.
So misrepresented it's almost comical to read.

Yeah, for years you've been able to just run games at lower resolutions, surely that's not a revelation, but DLSS is different. It is not the same as just running 1080p internally and displaying it on a 2160p monitor; it looks disproportionately better than that because of the technology in play.

Go on then: in Control or Metro Exodus Enhanced Edition, leave DLSS off and tinker with other settings to get better visual quality and performance than with DLSS on. I'll wait.
 
So we're not allowed to have an opinion when we see the tech in action and make up our own minds? It doesn't exist yet, so there are many unknowns, but we're also allowed to spitball and share opinions on what has been shared so far.
Of course you can have an opinion, it's just unlikely to be an educated one - and therefore not particularly useful outside of being annoying.

It's about as useful as me saying a Porsche 911 is a crap car without actually being able to drive.
 
Of course you can have an opinion, it's just unlikely to be an educated one - and therefore not particularly useful outside of being annoying.
Well given it's an opinion, it really only serves to represent personal findings and preferences, but I agree to an extent that I don't really value opinions of people who haven't had the experience themselves - first hand.

Like if you owned a Porsche 911 and said it was crap, that's more valid than someone who read the spec sheet, saw maybe a review or 2 and decided they think it's crap based on that extremely limited and possibly misleading or inaccurate information.

I started a thread about DLSS 2.0 for that exact reason, lots of people who've never seen it with their own eyes / own an RTX card ready and willing to crap all over it, I am interested in what people who actually see it/game with it on think about it.
 
Well given it's an opinion, it really only serves to represent personal findings and preferences, but I agree to an extent that I don't really value opinions of people who haven't had the experience themselves - first hand.

Like if you owned a Porsche 911 and said it was crap, that's more valid than someone who read the spec sheet, saw maybe a review or 2 and decided they think it's crap based on that extremely limited and possibly misleading or inaccurate information.

I started a thread about DLSS 2.0 for that exact reason, lots of people who've never seen it with their own eyes / own an RTX card ready and willing to crap all over it, I am interested in what people who actually see it/game with it on think about it.
I understand where you are coming from, but to play devil's advocate, threads like that can easily become an echo chamber.
Given how relatively expensive RTX cards are even before the mining boom, people will tend to defend their purchase more than usual.
 
Nvidia jumped the gun: they released the utter-crap 2000 series, uber expensive with pathetic performance, and had to sell it basically on "features", so they forced ray tracing way too early! They knew it was shit, they knew it tanks performance insanely, so they decided to basically run games at 720p, upscale them to 4K with garbage graphics, and call it DLSS.

So you enable ray tracing to have *supposedly* better graphics, but it runs like crap, so you enable DLSS and ruin your graphics quality. You get NONE of the benefits! You essentially have worse graphics running at worse fps, all in order to have mildly better shadows, which actually become worse through DLSS upscaling.

And again, there is a very clear visual difference between native and DLSS. It does look significantly worse than native; even in Control, the best example Nvidia has, it still shows shimmering and interlacing-like artifacts.

What DLSS essentially is, is automated graphics-settings reduction. It reduces graphics settings, and thus graphical quality, for better performance; that is what it is. You get worse-quality graphics at better performance, and I get that in ALL games by lowering quality in the settings. Going from Ultra to Medium in most cases provides a 50-60% uplift in performance for small visual downgrades; in fact, I'd argue that if you tinker with the graphics settings you can have better visual quality and faster performance than with DLSS enabled.
Two options: either you're telling a joke, or you are very misinformed. Games like Control or Death Stranding look better with DLSS than at native resolution...
 
I understand where you are coming from, but to play devil's advocate, threads like that can easily become an echo chamber.
It's funny, because I found some forums, and this one in particular, to be an anti-DLSS (among other things) echo chamber, and a theme was that the vast majority, if not all, of the users 'hating' don't own or use RTX cards.

So I can see where the distinct chance for the opposite exists, with people just raving about it with no constructive criticism whatsoever, but so far it actually seems more balanced than the free-for-all, which I find overwhelmingly negative when everyone who has read a review, seen a compressed YouTube video, or some cherry-picked side-by-sides comes to the party with their bias and just dumps on it.

I SO want AMD to succeed with FSR, it's only going to be good for us all if they do, but as it stands I have serious doubts.
 
If it doesn't work with all games it will be as useless as DLSS.

DLSS is far from useless; I've used it in many games so far, especially when I output to my 4K OLED at 120 Hz with G-Sync, and it looks and runs awesome. Death Stranding looked better than native with DLSS enabled using the Quality preset. Text was sharper. Textures looked better. DLSS is very good for eliminating jaggies.

DLSS support will explode over time; native support in the most popular game engines is going to happen, already confirmed for Unreal Engine and Unity. Also, DLSS 3.0 should allow all games that support TAA to force DLSS instead, which is hundreds of titles, even older games.

DLSS is the true magic of the RTX series. It allows for a huge fps boost, or RT without a huge fps drop. Ray tracing is a joke without DLSS, but even without DLSS, Nvidia's 3000 series beats AMD's 6000 series with ease in RT scenarios. Ray tracing can be great in some titles, single-player ones. In multiplayer it's all about performance for me though.

If DLSS is implemented well (most DLSS 2.x games), the tech is insanely good. Free performance and pretty much identical image quality, sometimes better; there are several videos and tests with side-by-side comparisons. Why say no to 50-75% more fps _and_ improved visuals? Just never use motion blur with DLSS (who uses motion blur anyway... sigh, motion blur is only something you use when you try and mask a low framerate, aka consoles and low end PCs).

DLSS 1.0 pretty much sucked, blurred crap, but DLSS 2.x is nothing like 1.0; some people still think DLSS means blur though, haha.

It's funny how people with AMD GPUs or GTX cards always seem to think DLSS is useless :laugh: I wonder why.
 
I'm wondering if it can be used to render the game at 1600x900 and upscale with this feature to 1920x1080, or 1080p to 1440p and display it on a 1080p monitor. Just curious.
 
motion blur is only something you use when you try and mask a low framerate, aka consoles and low end PCs
I agree with most of the rest of your post, but I also think motion blur really depends. I can't argue with the reason you gave, but depending on the implementation I like motion blur even at high framerates, notably the best type: per-object motion blur. Poorly done full-scene (camera) motion blur can be BARF.

Some people just flat-out hate any blur whatsoever, I get that, but per-object motion blur intends to mimic the way we actually see objects in motion (try waving your hand back and forth in front of your eyes: do you see a blurred hand, or frame-by-frame snaps of a sharp hand?). I find that with it turned off, games can look juddery, even at 100+ fps now.
 
DLSS is far from useless; I've used it in many games so far, especially when I output to my 4K OLED at 120 Hz with G-Sync, and it looks and runs awesome. Death Stranding looked better than native with DLSS enabled using the Quality preset. Text was sharper. Textures looked better. DLSS is very good for eliminating jaggies.

DLSS support will explode over time; native support in the most popular game engines is going to happen, already confirmed for Unreal Engine and Unity. Also, DLSS 3.0 should allow all games that support TAA to force DLSS instead, which is hundreds of titles, even older games.

DLSS is the true magic of the RTX series. It allows for a huge fps boost, or RT without a huge fps drop. Ray tracing is a joke without DLSS, but even without DLSS, Nvidia's 3000 series beats AMD's 6000 series with ease in RT scenarios. Ray tracing can be great in some titles, single-player ones. In multiplayer it's all about performance for me though.

If DLSS is implemented well (most DLSS 2.x games), the tech is insanely good. Free performance and pretty much identical image quality, sometimes better; there are several videos and tests with side-by-side comparisons. Why say no to 50-75% more fps _and_ improved visuals? Just never use motion blur with DLSS (who uses motion blur anyway... sigh, motion blur is only something you use when you try and mask a low framerate, aka consoles and low end PCs).

DLSS 1.0 pretty much sucked, blurred crap, but DLSS 2.x is nothing like 1.0; some people still think DLSS means blur though, haha.

It's funny how people with AMD GPUs or GTX cards always seem to think DLSS is useless :laugh: I wonder why.
I have doubts that DLSS support will explode over time. I don't disagree that DLSS is a great technology, but besides Unreal and Unity, it seems like other game developers are still training DLSS at a game level and not at the engine level. DLSS has been out for quite a long time now, and while it took the second iteration to fix the issues with DLSS 1.0, I am still not seeing a lot of uptake. There are many game engines on the market, and Unreal is not the only one. Unity is mostly applicable to mobile games in my opinion, and its support is only coming by the end of the year.

The problem for game developers is this: if they want to push for next-gen/high-end graphics, i.e. RT, the existing hardware just can't run the game smoothly, i.e. at a high framerate. So with no alternatives to DLSS at this point, and the vast majority of gamers using Nvidia hardware, it made sense to spend resources integrating DLSS to improve the user experience. If there is a good alternative that is hardware agnostic, works to the tune of being 75% as good as DLSS, yet is easier to optimize for, I would expect DLSS to go down like the G-Sync vs FreeSync comparison. For game developers, time to market is crucial, and any time not wasted optimizing for proprietary technology while still achieving good-enough results is a win for them. AMD is clearly aware of this, purposely making this technology hardware agnostic and calling out that they are working with game developers on the solution. Just my opinion, and I am looking forward to seeing how FSR performs when it's released.
 
I agree with most of the rest of your post, but I also think motion blur really depends. I can't argue with the reason you gave, but depending on the implementation I like motion blur even at high framerates, notably the best type: per-object motion blur. Poorly done full-scene (camera) motion blur can be BARF.

Some people just flat-out hate any blur whatsoever, I get that, but per-object motion blur intends to mimic the way we actually see objects in motion (try waving your hand back and forth in front of your eyes: do you see a blurred hand, or frame-by-frame snaps of a sharp hand?). I find that with it turned off, games can look juddery, even at 100+ fps now.

Yeah, I agree implementation always matters; it's just that when you use motion blur with DLSS you can experience (even more) smearing.
 
Of course you can have an opinion, it's just unlikely to be an educated one - and therefore not particularly useful outside of being annoying.

You do realize the irony in your claim, right? If you're not an engineer yourself, then you don't even get to tell us that our opinion is probably wrong, because how would you know that, right?
 
I have doubts that DLSS support will explode over time. I don't disagree that DLSS is a great technology, but besides Unreal and Unity, it seems like other game developers are still training DLSS at a game level and not at the engine level. DLSS has been out for quite a long time now, and while it took the second iteration to fix the issues with DLSS 1.0, I am still not seeing a lot of uptake. There are many game engines on the market, and Unreal is not the only one. Unity is mostly applicable to mobile games in my opinion, and its support is only coming by the end of the year. The problem for game developers is this: if they want to push for high-end graphics, i.e. RT, the existing hardware just can't run the game smoothly, i.e. at a high framerate. So with no alternatives to DLSS at this point, and the vast majority of gamers using Nvidia hardware, it made sense to spend resources integrating DLSS to improve the user experience. If there is a good alternative that is hardware agnostic, works to the tune of being 75% as good as DLSS, yet is easier to optimize for, I would expect DLSS to go down like the G-Sync vs FreeSync comparison. For game developers, time to market is crucial, and any time not wasted optimizing for proprietary technology while still achieving good-enough results is a win for them. Just my opinion, and I am looking forward to seeing how FSR performs when it's released.

Well yeah, but there are a lot of popular PC games using Unity, like Valheim, Rust, Escape from Tarkov, etc.

DLSS support in Unreal Engine, though, is far more relevant for most PC gamers, and that is already natively integrated; this engine is widely used for AAA games, including very demanding ones.
 
Whoa. So it is coming this June? Not bad. I really am curious how this FSR will stand against DLSS 2.0 and 2.1, if it actually is any good. But the idea of FSR working on both AMD and NV cards and being easier to implement in any game is a damn killer, if this feature turns out to be good or at least close to what DLSS 2.0 offers. A damn killer, I tell y'all.
 
Two options: either you're telling a joke, or you are very misinformed. Games like Control or Death Stranding look better with DLSS than at native resolution...
Are you trying to say that Control or Death Stranding look better at 720p upscaled to 1440p with DLSS 2.0 (Performance mode) than with no DLSS at native 1440p? :confused:
 
Are you trying to say that Control or Death Stranding look better at 720p upscaled to 1440p with DLSS 2.0 (Performance mode) than with no DLSS at native 1440p? :confused:
Generally when this is seen/said they're referring to quality mode, not performance mode.

1440p DLSS Quality mode would be upscaled from 960p, Balanced from 835p, Performance from 720p, and Ultra Performance from 480p.
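
For anyone who wants to sanity-check those numbers, here's a quick sketch using the per-axis render-scale factors commonly cited for the DLSS 2.x presets (roughly 67%, 58%, 50%, and 33%); treat the factors as approximations rather than official figures.

```python
# Approximate per-axis render-scale factors commonly cited for DLSS 2.x presets.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and preset."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(mode, internal_resolution(2560, 1440, mode))
# Quality (1707, 960), Balanced (1485, 835),
# Performance (1280, 720), Ultra Performance (853, 480)
```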

And yeah, Control and Death Stranding tend to resolve more fine detail in DLSS Quality mode. There's more to the image than that, of course, but that aspect is certainly fantastic.
 
You do realize the irony in your claim, right? If you're not an engineer yourself, then you don't even get to tell us that our opinion is probably wrong, because how would you know that, right?
Luckily I am a software engineer, but I wouldn't make claims about anything I didn't fully understand, hadn't tried myself, or felt otherwise unqualified to speak about.

I prefer to try and encourage people to have a more balanced and fair outlook on things, rather than just dismissing something as garbage before it's even released.
 
Are you trying to say that Control or Death Stranding look better at 720p upscaled to 1440p with DLSS 2.0 (Performance mode) than with no DLSS at native 1440p? :confused:

Yes, check out Digital Foundry's reviews of DLSS in those two games. I don't understand what kind of magic it is, but... it's what I see.

Generally when this is seen/said they're referring to quality mode, not performance mode.

1440p DLSS Quality mode would be upscaled from 960p, Balanced from 835p, Performance from 720p, and Ultra Performance from 480p.

And yeah, Control and Death Stranding tend to resolve more fine detail in DLSS Quality mode. There's more to the image than that, of course, but that aspect is certainly fantastic.
Of course, quality mode FTW! :)
 
I maintain that DLSS marketing is fake. DLSS 4K isn't 4K; it's 1080p (depending on the setting) upscaled, so comparing 4K vs 4K DLSS is just nonsense. It should instead be about comparing 1080p vs 1080p upscaled with DLSS, and then talking about the image quality improvements (just like how any form of AA improves image quality).

Apart from that, I think stuff like DLSS should be about what it's meant for: making ray tracing easier to do. So IMO, DLSS or an equivalent should reduce and then upscale the resolution of the ray-traced effects only. Take ray-traced reflections, for example: render them at quarter resolution so it's easier on the hardware, then use DLSS etc. to upscale that so it does not look like crap, and do the same for shadows etc.
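
In rough terms, that suggestion amounts to something like the sketch below: compute only the ray-traced effect at reduced resolution, upscale that buffer, and composite it over the full-resolution frame. This is purely a conceptual NumPy illustration of the idea in the post above, with made-up buffer names and a crude nearest-neighbour upscale standing in for a proper filter (or DLSS/FSR); a real engine would do all of this on the GPU inside the render pipeline, and several games already render RT effects at reduced resolution in a similar spirit.

```python
# Conceptual sketch: ray-traced reflections computed at quarter resolution
# (half width x half height), upscaled, then blended into a full-resolution
# rasterised frame. Buffer names are invented for illustration.
import numpy as np

H, W = 1080, 1920
raster_frame = np.random.rand(H, W, 3).astype(np.float32)              # full-res raster pass
rt_reflections = np.random.rand(H // 2, W // 2, 3).astype(np.float32)  # quarter-res RT pass
reflectivity = np.random.rand(H, W, 1).astype(np.float32)              # per-pixel blend mask

# Upscale the quarter-res RT buffer back to full resolution (nearest-neighbour).
rt_full = rt_reflections.repeat(2, axis=0).repeat(2, axis=1)

# Composite: show reflections only where surfaces are reflective.
final_frame = raster_frame * (1 - reflectivity) + rt_full * reflectivity
print(final_frame.shape)  # (1080, 1920, 3)
```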
 