
AMD FSR 2.0 Quality & Performance

Woah, so did you actually try this tech on other games? Where did you come up with all these conclusions so fast? You sound like a hater, not gonna lie...
During gameplay I am sure spotting the difference between the two will be hard even for the most experienced. People forget that these reconstruction techniques aren't perfect by any means, and for AMD to provide such improvements without the need for any proprietary hardware is a massive step forward.

We know how techniques similar to this work in other games. We just recently had Epic's TSR in Ghostwire, which shows the same faults. There's a reason AMD kept using Deathloop as the example, with its stylized graphics and softer-looking image. Also, remember wizzard claiming that FSR 1.0 stacks up well against DLSS when that launched? We all know how that turned out.
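For anyone wondering why these temporal techniques (TSR, DLSS, FSR 2.0) tend to share the same failure modes like ghosting: they all accumulate jittered low-resolution samples into a full-resolution history buffer over time. Here's a toy 1D sketch of that accumulation idea — purely illustrative, with a made-up blend factor, and nothing like AMD's actual shader code:

```python
import numpy as np

def temporal_upscale_1d(frames, jitters, alpha=0.1):
    """Toy temporal upscaler: blend jittered half-res frames into a
    full-res history buffer with an exponential moving average.

    frames:  list of half-res sample arrays
    jitters: per-frame sub-pixel offset (0 or 1) saying which of the two
             full-res pixels each half-res sample covers this frame
    alpha:   how much a new sample overrides accumulated history
    """
    full_res = 2 * len(frames[0])
    history = np.zeros(full_res)
    for frame, jitter in zip(frames, jitters):
        for i, sample in enumerate(frame):
            dst = 2 * i + jitter  # full-res pixel this sample lands on
            # exponential blend of new sample against the history
            history[dst] = (1 - alpha) * history[dst] + alpha * sample
    return history
```

On a static scene the history converges to the full-resolution signal; when the scene suddenly changes, the history lags behind the new frame for a while — which is exactly the ghosting-in-motion that reviewers keep spotting.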

Why are we still acting like "proprietary hardware" means much when every NVIDIA card in the last four years has had RT and DLSS? It is not possible to buy an NVIDIA card, now or in the future, without DLSS and RT.
 
Wouldn't call it a DLSS killer when it only provides comparable IQ at 4K, and any game that can support FSR 2.0 could also support DLSS. However, it is free and doesn't use die space; AMD deserves all the congratulations here!
 
NVIDIA looks for problems to solve using the most difficult/expensive approach possible, which it can sell as exclusive features for a generation or two.
AMD looks at the problem NVIDIA discovered, the solution, and tries to find the most frugal alternative solution.
QFT. Very succinctly put and broadly what people could have expected and perhaps continue to expect unless market share shifts significantly.

Certainly not a bad thing, and it caters to both sides: if you want the new and shiny, there's a choice; if you want free and open, there's another. Both come with their own 'costs'.
 
To be fair, FSR 2.0 looks better than I expected. Really looking forward to trying it out.
 
The fuck, man, FSR 2.0 + sharpen often looks better than native.
There has to be some caveat, like lots of blur in motion; I refuse to believe it works this well (I'm happy, mind you, as I have a 6800 XT, but still).
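The "sharpen" half of that combo is easy to demystify. A toy 1D version of the contrast-adaptive idea behind AMD's CAS/RCAS — the kernel shape and strength here are made up for illustration; this is not the shipping shader:

```python
import numpy as np

def cas_sharpen_1d(img, strength=0.5):
    """Toy contrast-adaptive sharpen for values in [0, 1].

    The sharpening amount is scaled DOWN where the local neighbourhood
    already spans a wide range (a hard edge) and UP in soft regions,
    which is roughly why it can make upscaled output look crisper
    without ringing every edge in the frame."""
    img = np.asarray(img, dtype=float)
    out = img.copy()
    for i in range(1, len(img) - 1):
        lo = min(img[i - 1], img[i], img[i + 1])
        hi = max(img[i - 1], img[i], img[i + 1])
        amount = strength * (1.0 - (hi - lo))  # adapt to local contrast
        w = -amount / 4.0                      # negative-lobe kernel weight
        out[i] = (img[i] + w * (img[i - 1] + img[i + 1])) / (1.0 + 2.0 * w)
    return out
```

Flat regions pass through untouched, while soft (blurred) edges get steepened — which is why an upscaled-then-sharpened frame can subjectively beat a native frame that went through a soft TAA resolve.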
 
Maybe the time has come to return to AMD with their next GPU generation, after many years? FSR 2.0 looks great, identical to DLSS. I am guessing the 7000-series GPUs will be able to do 60 FPS with RT + FSR 2.0.
 
Also not sure how this gets the Innovation award; it's doing something that already exists and was pioneered by other people. I don't think it actually meets the definition of innovative.
 
Well, if you consider what FSR 2.0 does, and that it can be used by any graphics card out there with no restrictions, it is innovative, since there is nothing else like it for what it does.
 
Very nice improvements over 1.0; it's head-to-head with DLSS but without the 'AI magic'.
With the current GPU market and stagnating technology, developers really need to pick up the pace in implementing these kinds of upscalers if they want to keep improving graphics at playable performance.
 
and not treating this like a one-off science experiment as they have in the past (TressFX, standalone "FidelityFX sharpening").
If we are talking about RIS, then it's broken AF. It says it will run in DX9 and DX11 games, but my success rate with it has been way lower. Only Genshin Impact seems to work with it; it doesn't work with Horizon 5 or anything else I tried. Meanwhile, DXR's sharpening works in any game. More forgotten features like Chill or Radeon Boost have been either crap or non-functional. Chill has been buggy with the RX 580 since 2019. This lack of maintenance and lack of improvements is easily one of the worst "features" of AMD software.
 
Well, if you consider what FSR 2.0 does, and that it can be used by any graphics card out there with no restrictions, it is innovative, since there is nothing else like it for what it does.
My 2c... If we take what you said as absolutely true, it's still not really innovative... at all.

But, I accept your opinion, just not really interested in debating it.
 
That's the thing with today's world: everything is debatable, since most stuff is an extension of something that was already there, or uses ideas that emerged a long time ago.
 
Also not sure how this gets the Innovation award; it's doing something that already exists and was pioneered by other people. I don't think it actually meets the definition of innovative.
Have you watched the GDC FSR 2.0 video?

 
Where is the crowd that's always claiming DLSS is blurry? If FSR now achieves DLSS image quality, does that mean it has the same amount of blur?

Oh my god, it really looks great... NVIDIA must be sweating blood seeing their solution, which needs a dedicated ASIC wasting precious space on the die, being matched by open-source software. lol

Just like NVIDIA's software async gave AMD's hardware async a run for its money?

Another great, free, open technology by AMD.

TenZZor cores my aZZ!

Always remember: if it is free, the product is you!

AMD's mistake is that it answers these dirty initiatives by NVIDIA. Tessellation, and now RT... Do you remember when NVIDIA paid a game developer to REMOVE the DX10.1 implementation (Assassin's Creed DX10.1), in which the Radeons were better?

Stop the Russian propaganda. https://techreport.com/news/14707/ubisoft-comments-on-assassins-creed-dx10-1-controversy-updated/
 
Instead of wasting precious developer time on mimicking lower settings, why don't you simply change the settings from ultra high to very high? The result will be the same with regards to the FPS improvement. :D
 
Have you watched the GDC FSR 2.0 video?
I will, and I'll report back. :)

I might have misunderstood part of this or just straight-up missed it; willing to admit it if that's the case.
 
kudos to AMD
 
DLSS and RT are proprietary "features" by NVIDIA with no value for the user who can think.
The PS5 and new Xbox do not support RT, so gamers actually do not need it.

AMD's mistake is that it follows instead of thinking proactively about new unique features with real value.

Consoles do support RT.
 
Only "ray-traced" reflections... though.
Does Metro Exodus EE on consoles not support RTGI? I think it does. And the Matrix demo? Some games have RT shadows as well.

I think the problem is performance. You just cannot enable all the effects without sacrificing performance. That is why we might be looking at 30 FPS targets for UE5 games.


Anyway, FSR 2.0 is a huge upgrade. DLSS still resolves distant sub-pixel detail better, but you have to really look for it. Tensor cores are probably where the 3 FPS difference comes from.

I am still hoping for that SDK that will include all three technologies and will be easy to implement in all games.
 

Yes, performance is the problem. We don't have the transistor budget on these manufacturing processes to make everything ray-traced and keep performance above 30 FPS.
 
Still not sure if Tensor cores really do anything for DLSS; they were architected for AI workloads, but none of the DLSS "AI" happens on your RTX card; it's performed by NVIDIA on their large server farms and then baked into the game-ready driver as a preset.
I thought that was only the case for DLSS 1.0? You can use DLSS in the real-time preview of Unity and Unreal Engine, and I really doubt that NVIDIA's servers are computing every single project being made.
 