Wednesday, March 16th 2022

AMD Set to Announce FSR 2.0 Featuring Temporal Upscaling on March 17th

AMD is preparing to announce the successor to FidelityFX Super Resolution (FSR) tomorrow, March 17th, ahead of a showcase of the technology at GDC 2022, as we previously reported, according to leaked slides obtained by VideoCardz. AMD FSR 2.0 will use temporal data and optimized anti-aliasing to improve image quality across all presets and resolutions compared to its predecessor, making it a worthy competitor to NVIDIA DLSS 2.0. The slides also confirm that FSR 2.0 doesn't require dedicated machine-learning hardware acceleration and will be compatible with a "wide range of products and platforms, both AMD and competitors".

The technology has already been implemented in Deathloop, where FSR 2.0 "Performance" mode increased frame rates from 53 FPS at native 4K to 101 FPS, both with ray tracing enabled. The slides do not reveal whether AMD will open-source FSR 2.0, as it has done with FSR and as Intel plans to do with XeSS. AMD is also expected to release Radeon Super Resolution, a driver-level implementation of FSR available in all games, on March 17th.
Source: VideoCardz

33 Comments on AMD Set to Announce FSR 2.0 Featuring Temporal Upscaling on March 17th

#26
Space Lynx
Astronaut
maybe we should all stop bitching and just be thankful these companies are still trying to help gamers during the time of shortage? i mean nothing was stopping Nvidia or AMD from just saying fuck all gamers, lets go all in on crypto short term and make that money, the gamers will come back when we tell them its time to come back.

nothing was stopping them, literally they are a duopoly.
#27
wolf
Performance Enthusiast
Xex360: Nothing is better than native
This statement isn't true at all, why do people still say this?
erek: isn't 4x DSR / DL-DSR better than native?
Yes.
Vayra86: It's true, you can get better than native now.
Absolutely we can.

Just saying "native" as an all-encompassing, never-to-be-surpassed level of quality is misleading at best. What resolution? What pixel density? What display technology? What AA method? What game? In motion? The list goes on.

The sentiment needs to die already. There are multiple ways to improve IQ over rendering the exact number of pixels in the 'traditional' way at your display's native resolution (and more often than not, with an AA pass).
#28
Aretak
Xex360: Nothing is better than native, I think because DLSS can resolve some details better than native (but it ruins the image in many others) they are referring to this.
A lot of people just seem blind when it comes to the artifacts that DLSS introduces. I've heard many people say that Cyberpunk 2077 looks "better than native" with DLSS enabled, yet every time I enable it, even on Quality mode at 1440p, all I see is pixels crawling all over the place on distant fine detail and odd flickering on certain textures. Can't say the ghosting ever really bothered me though, even before they improved it.
#29
wolf
Performance Enthusiast
Aretak: A lot of people just seem blind when it comes to the artifacts that DLSS introduces. I've heard many people say that Cyberpunk 2077 looks "better than native" with DLSS enabled
Often in games, native is not artefact-free (or to frame it another way, it's not perfect); there can be shimmering, ghosting, crawling, popping textures, LOD changes, etc. So DLSS may fix some artefacts and introduce others, hence why some would claim it's better than native: the artefacts their eyes are most drawn to are solved, perhaps at the expense of others they might miss. That's on top of the typical aspects we know it does well, like fine detail and wire fences; those aren't really debated, and fine detail has that ability to grab the eye and impress.

Another factor might be something like using a slow VA panel, people might not see any ghosting because their panel produces it anyway so they're just used to it all the time, and DLSS doesn't seem to make it any worse.

I'm not blind to how DLSS can negatively affect an image, but it can certainly be said that aspects of the image are positively improved upon. One of my top bugbears is shimmering; with its built-in AA, DLSS often manages to greatly reduce or even completely eliminate shimmer. So for me, given a native presentation with distracting shimmer versus a DLSS implementation that solves the shimmer but perhaps has some pixel crawl or minor ghosting, I'll almost certainly find the DLSS image preferable - before even counting the extra FPS.
#30
Vayra86
CallandorWoT: maybe we should all stop bitching and just be thankful these companies are still trying to help gamers during the time of shortage? i mean nothing was stopping Nvidia or AMD from just saying fuck all gamers, lets go all in on crypto short term and make that money, the gamers will come back when we tell them its time to come back.

nothing was stopping them, literally they are a duopoly.
Maybe you should just speak for yourself because honestly... what?! This is a truly dumb statement in every possible way.
wolf: This statement isn't true at all, why do people still say this? [...] The sentiment needs to die already.
It's true, especially the bit about motion clarity, but there is also perception and getting comfortable with certain methods of rendering, in much the same way that applies to buying a new monitor and getting used to a different number of pixels at a different pitch. But I will concede that DLSS and FSR can be brought to a setup where there are only advantages. It's certainly a step forward.

And there is the implementation. I remember The Elder Scrolls Online: because the assets themselves are such low detail/low texture quality, the whole thing kinda falls apart and becomes an even more blurry mess. That's probably a game/engine/level of detail where the native sharpness adds fidelity, even if it also has some aliased edges. Sharpening passes in that game (I used to screw around with ReShade and such on it way before DLSS) were horrible too even back then; the best effort was adding some mild juice to the bland color palette.
#31
Space Lynx
Astronaut
Vayra86: Maybe you should just speak for yourself because honestly... what?! This is a truly dumb statement in every possible way.
yawn
#32
Unregistered
napata: Of course it can be better than native. When you change the AA implementation, like DLSS does, you can end up with much better IQ even if you're rendering from a lower resolution. Some games look awful at native.
It depends; globally it's worse, but for some fine details DLSS can improve things due to its nature.
Aquinus: I wouldn't call that better than native. :laugh:
Of course not, but some reviewers mislead people and say it's better. It does improve some details though.
erek: isn't 4x DSR / DL-DSR better than native? @Xex360

You got me there!
#33
InVasMani
It's worth pointing out that ATI has had temporal sparse-grid AA since around the X800 series, so we knew this was coming eventually.