Thursday, May 6th 2021

AMD's Elusive FidelityFX Super Resolution Coming This June?

AMD FidelityFX Super Resolution (FSR), the company's elusive rival to the NVIDIA DLSS technology, could be arriving next month (June 2021), according to a Coreteks report. The report claims that the technology is already at an advanced stage of development, in the hands of game developers, and offers certain advantages over DLSS. For starters, it doesn't require training from a generative adversarial network (GAN), and doesn't rely on ground truth data. It is a low-overhead algorithmic upscaler (not unlike the various MadVR upscalers you're used to). It is implemented early in the pipeline. Here's the real kicker—the technology is compatible with NVIDIA GPUs, so studios developing with FSR would be serving both AMD Radeon and NVIDIA GeForce gamers.
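To give a rough idea of what an "algorithmic" (non-ML) upscaler involves, here is a minimal sketch: a plain bilinear resample followed by a simple sharpening pass in NumPy. It is purely illustrative; AMD has not published FSR's actual filter chain, so none of this is FSR code, and the function names and parameters are invented for the example.

import numpy as np

def bilinear_upscale(img, scale):
    # Resample an (H, W) image to (H*scale, W*scale) with bilinear filtering.
    h, w = img.shape
    ys = (np.arange(h * scale) + 0.5) / scale - 0.5
    xs = (np.arange(w * scale) + 0.5) / scale - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    fy = np.clip(ys - y0, 0.0, 1.0)[:, None]
    fx = np.clip(xs - x0, 0.0, 1.0)[None, :]
    top = img[y0][:, x0] * (1 - fx) + img[y0][:, x0 + 1] * fx
    bottom = img[y0 + 1][:, x0] * (1 - fx) + img[y0 + 1][:, x0 + 1] * fx
    return top * (1 - fy) + bottom * fy

def sharpen(img, amount=0.5):
    # Unsharp mask: add back the difference between the image and a blurred copy.
    blur = (img
            + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
            + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1)) / 5.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

frame_1080p = np.random.rand(1080, 1920)              # stand-in for a rendered frame
frame_4k = sharpen(bilinear_upscale(frame_1080p, 2))  # 3840x2160 output

The point of the sketch is only that such a pass is cheap, needs no trained network and no history of previous frames, which is what would make it straightforward to run on GeForce hardware as well.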
Sources: Coreteks (YouTube), VideoCardz

104 Comments on AMD's Elusive FidelityFX Super Resolution Coming This June?

#26
Vya Domus
beedoo: In preference, I prefer to try and encourage people to have a more balanced and fair outlook on things - rather than just dismissing something as being garbage before it's even released.
No, you don't; you encourage them to shut up because supposedly they're not engineers and therefore can't have a valid opinion on anything. As if you need intricate knowledge of how something is constructed to judge the quality of the end product, but anyway.

You said you are a software engineer; that doesn't mean anything in particular. Have you worked with ML frameworks and written software similar to DLSS?
Posted on Reply
#27
clopezi
ZoneDymo: I maintain that DLSS marketing is fake. DLSS 4K isn't 4K, it's 1080p (depending on the setting) upscaled, so comparing 4K vs. 4K DLSS is just nonsense; it should instead be about comparing 1080p vs. 1080p with DLSS upscaling, and then talking about the image quality improvements (just like how any form of AA improves image quality).

Apart from that, I think stuff like DLSS should be about what it's meant for: making ray tracing easier to do. So, in my opinion, DLSS or its equivalents should reduce and then upscale the resolution of ray-traced effects only. Those ray-traced reflections, for example: render them at quarter resolution so it's easier on the hardware, then use DLSS etc. to upscale them so they don't look like crap, and do the same for shadows and so on.
Sorry but you are wrong.

Please, check this:



Nonsense? Better definition and 90 fps instead of 45 fps... and it keeps getting better over time. Yesterday Metro Exodus was released with DLSS 2.0; on low-end cards like the RTX 2060, 49 fps instead of 11.3 fps... c'mon, DLSS gets a lot more juice out of GPUs.



The tech is awesome.
Posted on Reply
#28
watzupken
ratirt: Whoa. So it is coming this June? Not bad. I really am curious how this FSR will stand against DLSS 2.0 and 2.1, if it actually is any good. But the idea of FSR working for both AMD and NV cards and being easier to implement in any game is a damn killer if this feature turns out to be good, or at least close to what DLSS 2.0 offers. A damn killer, I tell ya'll.
From the way AMD is selling FSR, it is certainly a more attractive proposition than Nvidia's bespoke DLSS. Having said that, I am not expecting FSR to beat DLSS, but as long as it gets 75% of the way there and is easier to implement, I think DLSS may have a hard time going forward.
Posted on Reply
#29
Prima.Vera
As an owner of an RTX 3080, let me make this clear beyond any doubt: DLSS 2 is good, but not in a million years is it the same quality as the native render. And yes, I've tested it on all supported games, including Control, Death Stranding, and Cyberpunk 2077... Nah, forget about the screenshots. The low-res feeling is instant when switching to DLSS, even disturbing sometimes.

P.S.
As for the "marketing" clips that shows DLSS 2 4K image looking better than the native 4K one, I only have 1 word for you: SHARPNESS.
They just add extra sharpness to fake the crispness of the texture, but if you look closer to the images you can clear distinguish the blurred jaggies due to the low res upscaled.
Posted on Reply
#30
Unregistered
clopezi: Sorry but you are wrong.

Please, check this:
Are you sure those images are correct? I don't recall my 4K image looking as poor as this one appears to be.
#31
wolf
Performance Enthusiast
Prima.Vera: As an owner of an RTX 3080, let me make this clear beyond any doubt: DLSS 2 is good, but not in a million years is it the same quality as the native render.
As valid as that view is, it is in fairly stark contrast to how others see it in the titles that show it best. I'd wager that if you blind-tested a large sample of gamers and didn't prime them on what to look for, the result would surprise you. I can't deny your own experience and what your eyes see, but DLSS 2.0 quality mode can be exceptionally close to native all things considered, with strengths and weaknesses at various points of the image, most of which need to be nitpicked to tell apart.

Plus, if that's how you feel about DLSS... I don't think you'll be impressed with FSR's image quality. I hope I'm wrong.
ZoneDymo: I maintain that DLSS marketing is fake. DLSS 4K isn't 4K, it's 1080p (depending on the setting) upscaled, so comparing 4K vs. 4K DLSS is just nonsense
All that really matters is the output image on your screen, marketing is always going to be marketing.

To me the output is all that matters; if it's comparable to native, who cares one bit what the input resolution is?

I'm not bothered at all about how the magic pixels make it to my screen; the image I get at the end is the important part. Rendering is all ridiculous computer magic to most people anyway, so judge the final image as the final image.
Posted on Reply
#32
clopezi
beedoo: Are you sure those images are correct? I don't recall my 4K image looking as poor as this one appears to be.
Ask Eurogamer and the DF team!

But the point is that the tech is fantastic, and it's great news that AMD users can also enjoy a similar solution, I hope.
Posted on Reply
#33
medi01
btarunr: For starters, it doesn't require training from a generative adversarial network (GAN), and doesn't rely on ground truth data.
For starters, neither does NV's TAA derivative, also known as DLSS 2.0.
clopezi: Please, check this:
All TAA derivatives, DLSS 2 among them, will exhibit the following "features":
1) Improved lines (long grass, hair, eyebrows, etc.)
2) Not that noticeable with blurry textures, which is why people keep reposting that face from that weird game that looks like it's from 2003, as it barely has any texture detail but does have hair
3) Wiping out fine details
4) Adding blur when things move fast (the entire screen can be blurred in Death Stranding just by quickly moving the mouse; see the Ars Technica review)
5) Particularly bad with small, quickly moving objects
Posted on Reply
#34
z1n0x
There are still people that believe and push Nvidia's marketing propaganda that DLSS2 is better than native? HILARIOUS!
Posted on Reply
#35
wolf
Performance Enthusiast
z1n0x: There are still people that believe and push Nvidia's marketing propaganda that DLSS2 is better than native
I believe what I see with my own eyes, yes. In well-executed titles, it does look better to my eyes, and it performs better. FSR can hope to achieve the same.
Posted on Reply
#36
bencrutz
wolf: I believe what I see with my own eyes, yes. In well-executed titles, it does look better to my eyes, and it performs better. FSR can hope to achieve the same.
Well, I'm certain it doesn't look better than native to me, though I agree it performs way better.
Posted on Reply
#37
wolf
Performance Enthusiast
bencrutz: Well, I'm certain it doesn't look better than native to me, though I agree it performs way better.
Well, a final output image is nuanced, especially in motion; there is lots to unpack there. For instance, shimmering on straight(er) edges is a nitpick of mine where DLSS helps immensely, along with intricate details; perhaps other parts of the image matter more to you? I'd encourage you to come join the conversation here if you have extended thoughts to share or anything you want to discuss with other people who use DLSS 2.0.
Posted on Reply
#38
AusWolf
Compatible with NVIDIA GPUs. That's the stuff! No more proprietary technologies, please!
Posted on Reply
#39
ZoneDymo
clopezi: Sorry but you are wrong.

Please, check this:



Nonsense? Better definition and 90 fps instead of 45 fps... and it keeps getting better over time. Yesterday Metro Exodus was released with DLSS 2.0; on low-end cards like the RTX 2060, 49 fps instead of 11.3 fps... c'mon, DLSS gets a lot more juice out of GPUs.



The tech is awesome.
it gets a "lot more juice" because you are not running it at 4k, that is what makes its so fake.

They know that saying it's "1080p upscaled" sounds less impressive, so instead they pretend it's magic and their cards can do "4K" with ray tracing, but in reality they are rendering at less than 4K to be able to do that.

My problem is the marketing, and I can only repeat myself: if they compared 1080p performance to 1080p performance, ya know, apples to apples, and then said "yeah, OK, the performance is the same between the two products, BUT ours can do this with our version of anti-aliasing so it looks better", then that would be fine.

But instead they knowingly say "our card with 4K DLSS does better than the competition at 4K"... well yeah, because you are not running it at 4K now, are ya?
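For what it's worth, the raw pixel counts show where the extra "juice" comes from. A quick back-of-the-envelope check, assuming DLSS Performance mode at a 4K output (which renders internally at 1080p; the exact internal resolution depends on the setting):

native_4k      = 3840 * 2160   # 8,294,400 pixels shaded per frame at native 4K
internal_1080p = 1920 * 1080   # 2,073,600 pixels actually rendered with DLSS Performance
print(native_4k / internal_1080p)   # 4.0 -> roughly a quarter of the shading work per frame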
Posted on Reply
#40
1d10t
wolf: Some people just flat out hate any blur whatsoever, I get that, but per-object motion blur intends to mimic the way we actually see objects in motion (try waving your hand back and forth in front of your eyes: do you see a blurred hand, or frame-by-frame snaps of a sharp hand?). I find that with it turned off, games can look juddery, even at 100+ fps now.
I just want to ask what your stance is based on, given your argument, if you can see from this picture that native 4K is blurrier than DLSS.
clopezi: Sorry but you are wrong.

Please, check this:

Posted on Reply
#41
Luminescent
I play around in Photoshop for work from time to time, and there is no trick or plugin in the world that can truly upscale or AI-enhance an image I throw at it; there is only the perception of detail from contrast and sharpening, and you can see it.
What does work, with hardly any visible difference, is taking a picture at 20 megapixels and downscaling it to 5 megapixels; looking at the same picture on a 2K monitor, you can hardly see a difference unless you start pixel peeping.
So we know DLSS only works well when you play at very high resolutions, but the question is why the game doesn't scale properly with the textures and all that.
I think DLSS has to start from a place of wasting resources, and then comes the miracle technology that helps with the waste.
I'm not holding my breath for AMD doing much in this area or getting this onto consoles in the future; if the game is properly made in the first place, then there is no need for DLSS and other tricks.
Posted on Reply
#42
tfdsaf
What some people describe as a "better" image is just a blander image. Yeah, the image is fuller in color, but it's a more monotone color and it doesn't have as many pixels as the original image.

I'd compare it to using too much sharpening: when you blur the image it can sort of appear as if the image quality is better, so in some instances DLSS can appear to some people to have better image quality, but in reality it doesn't.

In fact, in the universally acclaimed "best game" for ray tracing and DLSS, Cyberpunk 2077, DLSS Quality is much worse than native. Gamers Nexus had a great video where he shows a dozen pictures at the beginning and asks you to try and figure out which one is which, from native to DLSS Performance to maximum DLSS Quality. Literally 99% of people knew which was native vs. DLSS.

Now, in that video he showed small aspects where DLSS had slightly better text visibility, so there are some small, very specific improvements in image quality, but overall, for 95% or more of the image, it's easily distinguishable, much WORSE quality than native. We are also talking about rendering at 4K, which uses the best base resolution; don't forget, the further down you go, the worse the image quality is going to be! So if you play at 1080p and use DLSS at that resolution, your DLSS output is going to be much worse than at 4K.
Posted on Reply
#43
Punkenjoy
Also, as with all upscaling, the higher the resolution you want to display at, the better the results.

Even standard upscaling + sharpening on a 4K monitor is fine and usable.
Same thing on a 1440p monitor: not that great, but not that bad either. Somewhat usable.
On a 1080p monitor it's just bad.

That is true for all upscaling technologies, including DLSS. Digital Foundry (who have made many paid, Nvidia-sponsored presentations with very doubtful claims) aren't telling you the full story; they only look at what makes things shine. A media outlet with a truly neutral and objective view wouldn't have made that video claiming, for example, that a 3080 was twice as fast as the 2080 Ti. They say a lot of things that are true, but you can mislead people by pointing the narrative in a specific direction and omitting all the negatives.

It's clear that internally, DLSS adds a sharpening filter. Apply it to the native image and it will look even better; you don't need upscaling for that. Also, I think Nvidia is trying hard to trick people into thinking there is a huge AI part to it, because they want to convince people they need to buy the silicon space that was added for their pro/business AI accelerators, but many people doubt that part. Some bugs have shown that it probably uses some form of TAA, and it also offsets the rendering of each frame slightly so that the lower-resolution render captures a bit more detail each frame.

This is why fast-moving objects or scenes are so blurry while it manages to resolve some tiny details better. It doesn't seem to reconstruct much; it just reuses previous frame data, and if things move too quickly, there is no data to use.
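To sketch what that jitter-and-accumulate idea looks like (a toy illustration of generic TAA-style upsampling, not Nvidia's actual DLSS code; the helper names and the motion_amount heuristic are invented for the example):

import numpy as np

def halton(index, base):
    # Low-discrepancy sequence often used to pick a different sub-pixel offset each frame.
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def accumulate(history, low_res, frame_index, motion_amount, scale=2, alpha=0.1):
    # Blend one jittered low-res frame into a high-res history buffer.
    # history: (H*scale, W*scale) accumulated image, or None on the first frame.
    jx = int(halton(frame_index + 1, 2) * scale) % scale   # which high-res column phase this frame covers
    jy = int(halton(frame_index + 1, 3) * scale) % scale   # which high-res row phase this frame covers
    if history is None:
        history = np.kron(low_res, np.ones((scale, scale)))  # naive first-frame upsample
    # The faster things move, the less the history can be trusted, so we lean on the
    # current (soft) frame instead, which is exactly why fast motion ends up blurry.
    weight = alpha + (1.0 - alpha) * min(max(motion_amount, 0.0), 1.0)
    updated = history.copy()
    updated[jy::scale, jx::scale] = (
        (1.0 - weight) * history[jy::scale, jx::scale] + weight * low_res
    )
    return updated

history = None
for frame in range(8):
    low_res = np.random.rand(540, 960)        # stand-in for the game's 960x540 render
    history = accumulate(history, low_res, frame, motion_amount=0.05)

With a static scene, the sub-pixel phases fill in over a few frames and fine detail genuinely improves; as soon as the history has to be thrown away, you are back to a single soft frame, which matches the behaviour described above.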

Nothing there can't be implemented with a more open solution, and this is really where DLSS fails. It is a good technology; anything that helps improve performance is good, even if there are drawbacks. The main problem for me isn't whether it's a good or bad technology; it's a good tech with some drawbacks. The problem is that it is a closed technology.

Closed technologies are bad for PC gamers, end of story.
Posted on Reply
#44
Minus Infinity
As a photographer, I use a program called Topaz Gigapixel AI to do upscaling of images. I regularly upscale my bird photos by around 30% and sometimes 50%, and as long as my source material is good, I cannot tell the difference; often the upscaled image is better. The AI does a spectacular job of improving detail. In one comparison it was impossible to tell the difference between a native 24 MP photo taken by one camera and the same photo taken with a 61 MP camera when the smaller image was upscaled. This is the beauty of the new AI training, compared to old brain-dead bicubic upscaling. It's not just about sharpening low-res upscaled data; the AI can help create improved detail because it knows what the texture should look like.

The results I've seen for DLSS when upscaling from, say, 1440p-1800p to 4K are more than good enough to pass muster, and given that in a game you are not usually standing still looking for tiny flaws, I'll take image quality that looks almost as good but with 50% higher frame rates any day. I would never try to upscale 1080p to 4K, though; 1440p minimum.
Posted on Reply
#45
Fluffmeister
watzupken: From the way AMD is selling FSR, it is certainly a more attractive proposition than Nvidia's bespoke DLSS. Having said that, I am not expecting FSR to beat DLSS, but as long as it gets 75% of the way there and is easier to implement, I think DLSS may have a hard time going forward.
The key difference is those Nvidia users get to enjoy both.
Posted on Reply
#46
mtcn77
Minus Infinity: The results I've seen for DLSS when upscaling from, say, 1440p-1800p to 4K are more than good enough to pass muster, and given that in a game you are not usually standing still looking for tiny flaws, I'll take image quality that looks almost as good but with 50% higher frame rates any day. I would never try to upscale 1080p to 4K, though; 1440p minimum.
That is what sells video cards, dear. Don't take the silly consumer point of view; this is big business.
I do hope old-school computer graphics see a revival; everything looked unique to itself. This AI optimisation can do its thing, but I still hold out hope for new filtering modalities. Let's make MadVR the benchmark. There is much to gain if it at least overcomes this overshading 'bug' in the pipeline - much ado about nothing, a sales pitch, a graphics relegator, the quad helper pixel, a planned-obsolescence artifact!
Posted on Reply
#47
wolf
Performance Enthusiast
1d10t: I just want to ask what your stance is based on, given your argument, if you can see from this picture that native 4K is blurrier than DLSS.
Somewhat of a different bucket there. That's a blurrier (lower-resolution-looking) image overall, with very little to no movement happening at the time of the screenshot, so motion blur isn't a factor. Per-object motion blur shines when there is fast object movement, and camera blur is the same: it applies to fast camera movement, not a blurrier image all the time.
Posted on Reply
#48
Luminescent
Minus Infinity: As a photographer, I use a program called Topaz Gigapixel AI to do upscaling of images. I regularly upscale my bird photos by around 30% and sometimes 50%, and as long as my source material is good
Topaz plugins are exactly what I was talking about: you can see how hard the so-called "AI" tries; it creates fake pixels and then gives a little boost with maybe selective contrast or selective sharpening for shadows, mids, or highlights. I can see it and I don't like it.
In the videography world there is a plugin called Twixtor; it's a roughly 20-year-old plugin that converts a video at, say, 25 fps to 50 fps. It analyzes the video and creates new pixels in advance; pray you don't shoot trees or that the scene isn't too complex. It does basically the same thing as modern plugins that call themselves AI-powered.
I'm telling you, DLSS-sponsored titles need to waste a lot of hardware resources so that when you enable the miracle technology it "works" as intended.
Posted on Reply
#49
watzupken
clopezi: Sorry but you are wrong.

Please, check this:



Nonsense? Better definition and 90 fps instead of 45 fps... and it keeps getting better over time. Yesterday Metro Exodus was released with DLSS 2.0; on low-end cards like the RTX 2060, 49 fps instead of 11.3 fps... c'mon, DLSS gets a lot more juice out of GPUs.



The tech is awesome.
Again, I think DLSS is a great technology for balancing performance vs. image quality. However, I do sometimes wonder if game developers deliberately made the game look worse without DLSS, especially in games like Control, which is an Nvidia-sponsored title and seems to get the most spotlight from Nvidia when it comes to DLSS performance and image quality. I would rather look at a game that is not Nvidia-sponsored for this comparison. Blurriness aside, the reason I ask is that if I look at a game like Shadow of the Tomb Raider (also an Nvidia-sponsored title) and compare it with the 4K image from Control, where the hair somehow looks mangled at the ends, I certainly don't see this issue with Lara's hair. In short, the native 4K image looks suspiciously bad. It's possible for DLSS to make the image sharper in some cases, but at the same time introduce other image issues, especially when your character is moving in the game. So there is no perfect solution.
Fluffmeister: The key difference is those Nvidia users get to enjoy both.
I feel that is how AMD is trying to hamper adoption of DLSS. When you have a viable alternative to DLSS that is easier to implement without a significant compromise in quality, do you think game developers will still want to spend time on DLSS, especially on day 1? It may come as a subsequent add-on, but I suspect it will have a knock-on impact on DLSS uptake, unless the game is sponsored by Nvidia. While this is not a perfect comparison, you can take the G-Sync vs. FreeSync outcome as a reference. The latter is known to be not as good as G-Sync, but monitor makers have mostly adopted FreeSync over G-Sync. Yes, there is significant cost involved in implementing G-Sync, which makes this a different situation, but it also proves the point that if you have a viable alternative that is hardware-agnostic and cheaper, people will tend to go for it.
Posted on Reply
#50
P_G19
If it works with Turing GTX & RDNA 1, I'll be happy. (They work fine with non-Ultimate DX12.)

Edit: Does it really need ray tracing to be enabled, as some old rumors said?
Posted on Reply