
AMD FSR 2.0 Quality & Performance

Only "ray-traced" reflections... though.
Nope. All RT effects. Reflections are what devs USUALLY use, but not always.
For Metro it's RTGI; for other games it's shadows or RTAO.
 
I hope both technologies are here to stay, and hopefully they become an industry standard where all games ship with them out of the box.
 
Only "ray-traced" reflections... though.
Huh? They support literally anything you can do with RT, just like AMD, NVIDIA, or Intel Arc PC cards can.
The reason they might seem more "limited" is simply a performance issue; consoles aren't top-end PC hardware, where the GPU alone can consume more power than the whole console.
 
Some games are starting to add a sharpening slider for DLSS, and honestly this is what DLSS has sorely needed. At least in Deathloop, FSR 2.0 + sharpness is the best output here by a wide margin.

I think FSR 2.0 is close to DLSS 2.3 in terms of output quality; it doesn't really matter if there are minor differences, because the effectiveness of FSR and DLSS varies from game to game and from scene to scene. The basic mechanics of FSR 2.0 now much more closely match DLSS: temporal sampling of jittered camera positions, with a couple of features designed to combat the two worst drawbacks of this technique (thin-feature shimmer and motion-trail artifacts).
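
If anyone's wondering what "temporal sampling of jittered camera positions" actually looks like, here's a minimal C++ sketch (my own illustration, not AMD's code; if I remember the GPUOpen docs right, FSR 2.0 uses a Halton(2,3) sequence like this, with a phase count that scales with the upscale ratio):

#include <cstdio>

// Halton low-discrepancy sequence, the usual source of sub-pixel camera
// jitter in temporal upscalers (illustrative, not AMD's actual code).
static float Halton(int index, int base) {
    float f = 1.0f, result = 0.0f;
    while (index > 0) {
        f /= base;
        result += f * (index % base);
        index /= base;
    }
    return result;
}

int main() {
    const int phases = 8; // hypothetical phase count for a given upscale ratio
    for (int frame = 0; frame < phases; ++frame) {
        // Offsets in [-0.5, 0.5) pixels, applied to the projection matrix so
        // each frame samples different sub-pixel positions of the same scene.
        float jx = Halton(frame + 1, 2) - 0.5f;
        float jy = Halton(frame + 1, 3) - 0.5f;
        printf("frame %d: jitter (%+.3f, %+.3f)\n", frame, jx, jy);
    }
    return 0;
}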

Where I think FSR is vastly superior to DLSS is the adaptive resolution. Finally you can run something demanding at a target framerate and not have to pause, sacrifice some graphics options, potentially restart the game to apply them, and then wait until the next big firefight and hope it's enough. You'll (maybe) notice it getting a bit blurry in the heat of the moment, but it's only temporary and you don't have to sacrifice those higher quality settings or resolution for 99% of the gameplay just to cover those 1% peak demands.
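
For anyone wondering how that works under the hood, the control loop behind dynamic resolution is conceptually tiny. A hypothetical C++ sketch (gains and limits made up; real engines smooth this over many frames):

#include <algorithm>
#include <cstdio>

// Hypothetical dynamic-resolution controller: nudge the internal render
// scale each frame so GPU frame time converges on the target budget.
struct DynResController {
    float scale = 1.0f; // fraction of output resolution per axis
    float Update(float gpuFrameMs, float targetMs) {
        float error = (targetMs - gpuFrameMs) / targetMs; // + headroom, - over budget
        scale = std::clamp(scale + 0.1f * error, 0.5f, 1.0f); // small gain, illustrative limits
        return scale; // the temporal upscaler stretches this back to output res
    }
};

int main() {
    DynResController drs;
    // Simulate a heavy scene: 20 ms GPU frames against a 16.7 ms (60 FPS) budget.
    for (int i = 0; i < 5; ++i)
        printf("render scale = %.2f\n", drs.Update(20.0f, 16.7f));
    return 0;
}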
 
I really enjoyed the introduction to the technology. It was a great write-up.

It's too bad that the minimum requirements are so high. I could really use this with my 1060, but it's below min spec for 1080p upscaling. Those who need it the most can't even use it.
Why not try it first before getting upset?
 
I hope both technologies are here to stay, and hopefully they become an industry standard where all games ship with them out of the box.
I am confident that FSR will become the industry standard fast, as it's the only tech that runs on all platforms. Game developers may continue to implement both as long as Nvidia makes it easy for them and incentivises them to do so, but look at it from the game developer's perspective:

Do you:
  1. Implement a single solution (FSR) which works for all three of your target markets with no restrictions, and is officially endorsed by the exclusive GPU vendor for the two console markets.

    OR

  2. Do all the work to implement FSR, but also do additional work to add DLSS that is only usable by about one quarter of one of your three target markets, and adding it is redundant because it doesn't really do anything special that FSR doesn't already do.
Please, find me a good reason why a dev would pick the second option from now on. Outside of financial incentives from Nvidia, you just wouldn't.
 
Where I think FSR is vastly superior to DLSS is the adaptive resolution. Finally you can run something demanding at a target framerate and not have to pause, sacrifice some graphics options, potentially restart the game to apply them, and then wait until the next big firefight and hope it's enough. You'll (maybe) notice it getting a bit blurry in the heat of the moment, but it's only temporary and you don't have to sacrifice those higher quality settings or resolution for 99% of the gameplay just to cover those 1% peak demands.
DLSS in Deathloop does support dynamic resolution scaling. It was tested by someone on TPU months ago. It seems it's a game-specific feature.
 
DLSS in Deathloop does support dynamic resolution scaling. It was tested by someone on TPU months ago. It seems it's a game-specific feature.
Ah okay, I've not seen it as a feature in any games yet, though I have seen games that let you combine the in-game dynamic resolution scaling with DLAA.

If the dynamic scaling actually affects the internal render resolution of DLSS, then that's good. DLAA or DLSS combined with the game's own dynamic resolution is just a simple per-frame upscale, without any of the motion-vector or temporal buffers that make DLSS 2.3 and FSR 2.0 better.
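
To make the distinction concrete, here's a rough per-pixel sketch of the idea in C++ (my own illustration, not either vendor's actual kernel):

#include <algorithm>

struct Vec2  { float x, y; };
struct Color { float r, g, b; };

// Nearest-neighbour fetch with clamped coordinates (real upscalers filter properly).
static Color Sample(const Color* img, int w, int h, float u, float v) {
    int x = std::clamp(static_cast<int>(u * w), 0, w - 1);
    int y = std::clamp(static_cast<int>(v * h), 0, h - 1);
    return img[y * w + x];
}

// Plain per-frame upscale: the output can only contain what this frame has.
Color SpatialUpscale(const Color* lowRes, int w, int h, float u, float v) {
    return Sample(lowRes, w, h, u, v);
}

// Temporal upscale: reproject the accumulated history with the motion vector,
// then blend, so detail gathered from previous jittered frames survives.
Color TemporalUpscale(const Color* lowRes, int lw, int lh,
                      const Color* history, int hw, int hh,
                      Vec2 motion, float u, float v) {
    Color cur  = Sample(lowRes,  lw, lh, u, v);
    Color prev = Sample(history, hw, hh, u - motion.x, v - motion.y);
    const float a = 0.1f; // blend weight; real upscalers vary it per pixel
    return { a * cur.r + (1 - a) * prev.r,
             a * cur.g + (1 - a) * prev.g,
             a * cur.b + (1 - a) * prev.b };
}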
 
I thought that was only for DLSS 1.0? You can use DLSS in the real-time preview of Unity and Unreal Engine, and I really doubt that Nvidia's servers are computing every single project being made.

DLSS 2.x still uses a neural network, but it's not trained per game. Nvidia ships the trained network, and the inference is what runs on the tensor cores.

One of the things is that people think you need AI for a lot of things where a good algorithm can do the work just fine and be much easier and cheaper to run. But writing an algorithm requires more work than training an AI. AI is being used right now to brute-force so many things that could have been done with good programming.

There are areas where AI is really useful and cannot be beaten by a clever algorithm, but those areas are just a small subset of what people try to apply AI to.
 
It's too bad that the minimum requirements are so high. I could really use this with my 1060, but it's below min spec for 1080p upscaling. Those who need it the most can't even use it.

It will still work. Unlike NVIDIA, AMD doesn't stop you from trying things that aren't explicitly supported.
 
Instead of wasting precious developer time on mimicking lower settings, why don't you simply change the settings from ultra high to very high? The result will be the same with regards to the FPS improvement... :D
Do you really think that everyone does that, or has the hardware for it? I don't even use presets; I manage with low-to-high settings, with most set to medium. The technology was more interesting for low-end gamers, who might be able to use weaker hardware that otherwise couldn't run the game they want. The problem is still picture quality.

The main problem with gaming, and it has been a problem for at least a decade, is that games need faster and faster hardware to run, but very often there's nearly no visual quality gain in newer games. You can run a 10-year-old game at very high settings and it will look better than a new game at low, but the old game could run well on a GTX 650 while the new game will be a slideshow. I frankly don't want games to look awful, but I also don't care too much about visual quality. However, I hate it when newer games are more demanding and look worse or run worse for no obvious reason. It's a damn shame that many devs don't know how to dev properly.
 
DLSS and RT are proprietary "features" by Nvidia with no value for the user who can think.
The PS5 and new Xbox do not support RT, so gamers actually do not need it.

AMD's mistake is that it follows instead of thinking proactively about new unique features with real value.
Jesus, you're delusional.
RT is not proprietary; it has existed since the 1980s.
Ray Tracing is the future of graphics.
Xbox and PS5 DO support Ray Tracing.
 
Does sound like a hater. Plus, everyone seems to forget that, for all intents and purposes, Nvidia has limitless financial resources compared to AMD, yet people expect AMD to not only compete but do better, while also not seeking profit the way Nvidia does... so many people think AMD should be a non-profit company and hold them to standards they hold nobody else to. This is a great big step, and it should only get better, as long as Nvidia doesn't pull an Intel and, instead of innovating, just use vast amounts of money to box AMD out and get developers to be exclusive to Nvidia IP... which they will.
Yeah, it's like the pot and the kettle around here at times... His comment was typical of an NVIDIA fanboy, but you trying to say people want AMD to run as a non-profit is just as fanboyish.

AMD and their record-setting quarters clearly show they are as focused on profits as NVIDIA or Intel. Digging deeper, they got rid of their sub-$300 CPUs last generation and sold a silly number of slower 3000-series chips to those who couldn't budget $300+ for the improved 5000-series chips. Yesterday's GPU refresh, offering around 5% more performance for 10%+ more money, is as bad as any of their competitors' tactics. Well, NVIDIA's original MSRP on the 20 series being super high to help sell excess 10-series GPUs was worse, but it's along the same lines. The AMD 6500 XT was a bad joke, right? There's more if you want.

A lack of competition is the only thing keeping tech prices at these inflated levels despite the mining and PC booms being over. I'm hoping Intel jumps into the GPU game soon and succeeds hard. Both GPU makers took advantage of their customers, and any decent third option would help us consumers.
 
It will still work. Unlike NVIDIA, AMD doesn't stop you from trying things that aren't explicitly supported.
If it "works" but the impact is severe enough to leave you with (nearly) unplayable frame rate, then it doesn't work. I mean, those minimum recommendations weren't thrown out just for fun.

EDIT: Talking about 1080p, is there going to be a 1080p comparison?
 
RT... it has existed since the 1980s.
Ray Tracing is the future of graphics.

It has been the "future" since the '80s and it never came :D
We don't have the technology to make ray tracing work for real gaming, unless every computer is connected via the internet to many powerful ray-tracing supercomputers in order to give you the constant framerate you wish for, say between 60 FPS and 144 FPS.

you're delusional.

Am I delusional, or do you believe in unicorns? :D
 
It has been the "future" since the '80s and it never came :D
We don't have the technology to make ray tracing work for real gaming, unless every computer is connected via the internet to many powerful ray-tracing supercomputers in order to give you the constant framerate you wish for, say between 60 FPS and 144 FPS.

Am I delusional, or do you believe in unicorns? :D
Dude, you didn't even understand what I said. You said that RT is useless and proprietary, when in reality it's a technology that has existed since the 1980s and it's the only way to get photorealistic graphics. What does real-time RT in games have to do with this?
 
Jesus, you're delusional.
RT is not proprietary; it has existed since the 1980s.
Ray Tracing is the future of graphics.
Xbox and PS5 DO support Ray Tracing.
DirectX 12 has been expanded to cover ray tracing, machine learning, and faster storage. This is why these features should be a normal part of any benchmarking, not a special sub-section of tests. The reason these features are treated differently is the perception that NVIDIA supports them better and delivers more performance, and thus AMD has to be protected from negative results in benchmarks.
 
For developers that already support DLSS 2.0, adding FSR 2.0 support will be easy; AMD talks about days.

Exactly that!!
AMD has stated that when DLSS is already present, implementing FSR 2.0 is very easy. Deathloop is a game that already has DLSS implemented, so what we are seeing here is the best-case scenario for FSR 2.0.
Of course, the next logical question is how FSR 2.0 performs when DLSS is not present.
AMD has stated that this will be a much lengthier procedure (05:17 in the following video), and of course we don't know yet what kind of quality this implementation will have.
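
For context on why the port is quick when DLSS is already integrated: both upscalers consume essentially the same per-frame data, so the hard work is already done. A purely illustrative C++ struct of what the game has to supply (field names are mine, not the actual FidelityFX or Streamline API):

// Illustrative only, not a real SDK header. DLSS 2.x and FSR 2.0 consume
// essentially the same per-frame inputs, which is why a game that already
// has one can add the other quickly.
struct TemporalUpscalerInputs {
    void* colorLowRes;      // jittered, aliased low-resolution render target
    void* depth;            // scene depth buffer
    void* motionVectors;    // per-pixel motion, the expensive part to wire up
    float jitterX, jitterY; // sub-pixel camera offset used this frame
    float frameTimeDeltaMs; // frame timing, used for history management
    bool  resetHistory;     // set on camera cuts to avoid ghosting
};
// The "much lengthier procedure" AMD mentions is producing these buffers
// from scratch in an engine that never had a temporal upscaler.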
 
Exactly that!!
AMD has stated that when DLSS is already present, implementing FSR 2.0 is very easy. Deathloop is a game that already has DLSS implemented, so what we are seeing here is the best-case scenario for FSR 2.0.
Of course, the next logical question is how FSR 2.0 performs when DLSS is not present.
AMD has stated that this will be a much lengthier procedure (05:17 in the following video), and of course we don't know yet what kind of quality this implementation will have.
The customer wins, I guess. Now that FSR 2 is out, benchmarks will have to accept DLSS/FSR 2 results. There is no reason not to accept RT and upscaling now.
 
If it "works" but the impact is severe enough to leave you with (nearly) unplayable frame rate, then it doesn't work. I mean, those minimum recommendations weren't thrown out just for fun.
There is still a big performance hit versus running the game at the lower resolution directly.

For example, in the previous test:
1440p native with TAA: 70 FPS
4K with FSR 2.0 Quality (1440p internal resolution): 52 FPS, a 26% loss vs native 1440p
4K with DLSS 2.0 Quality (1440p internal resolution): 54 FPS, a 23% loss vs native 1440p

So if you can take a ~25% hit versus running at the internal resolution natively, you can probably run FSR 2.0, from what I see.
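
(For anyone checking the math, those percentages are just the FPS ratio minus one: 52 / 70 - 1 ≈ -0.257, so about -26% for FSR 2.0, and 54 / 70 - 1 ≈ -0.229, so about -23% for DLSS.)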
 
So wtf is the DL in DLSS these days? Apparently AMD is doing it without any neural-network shenanigans.
 
DLSS and RT are proprietary "features" by Nvidia with no value for the user who can think.
The PS5 and new Xbox do not support RT, so gamers actually do not need it.

AMD's mistake is that it follows instead of thinking proactively about new unique features with real value.
RT is not proprietary to Nvidia. Nvidia RTX existed before RT became part of the DX12 feature set.
 
RT is not proprietary to Nvidia. Nvidia RTX existed before RT became part of the DX12 feature set.

Okay, so Nvidia RTX is proprietary; sorry for missing the "X" at the end...

AMD said that you can get your ray-tracing only in the cloud. Good luck!

 