
AMD Announces FSR 3.1, Improves Super Resolution Quality, Allows Frame Generation to Work with Other Upscaling Tech

TAA and better image quality in the same sentence? You're joking, right? In all of the games I've played, I've yet to see a good implementation of TAA that doesn't turn everything into a smeared mess (most recently the garbage implementation in Helldivers 2).
As in, upscaling (specifically DLSS, in my case at 4K on an OLED) producing a better result than the forced TAA at native res, which cannot be fully disabled in the majority of modern games and is often very average at best.

In the comparatively rare game where TAA is not forced, sure, you can get a crisper image at native with, say, FXAA or SMAA, or even no AA (if you can tolerate that unstable mess), at the expense of other faults in the image; no denying that.

Personally I can't stand shimmer, but that's highly personal; I 100% understand that for some people the softer resolve is what they can't stand, and they're willing to trade that away even if it means shimmer, for example.

But I try my best not to put forward my opinion as if it's a universal fact that applies to everyone. For me, a potentially slightly softer (4K mitigates the majority of the softness; this gets worse as resolution lowers) but very stable image without shimmer, fizzle, flicker and breakup is eminently preferable to a slightly sharper image with one or more of those artefacts persisting.

All decided on a per-game basis, mind you. I can't remember the titles right now, but recently I did play a couple of games where I went with FXAA or SMAA, as I had the rendering budget to spare and the art/geometry style didn't present many opportunities for the artefacts I can't stand.
 
Will we see an AAA title, new or old, supporting FSR 3.1 this year?
 
AMD has decoupled FSR 3.1 frame generation from the upscaling tech, which allows frame generation to work with other upscaling solutions, such as DLSS or XeSS. The possibilities of such a decoupling are endless—have an RTX 30-series "Ampere" GPU that lacks DLSS 3 frame generation support? No worries, use DLSS 2 for the upscaling, and FSR 3.1 for the frame generation.
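Conceptually, the decoupling just means the frame-generation pass consumes finished (already upscaled) frames and no longer assumes FSR produced them. Here is a minimal sketch of what that separation looks like in a render loop; every type and function name below is hypothetical and stands in for the real FidelityFX/DLSS integration, which is far more involved:

```cpp
#include <cstdio>
#include <memory>

// Hypothetical stand-in types -- not the actual FidelityFX or Streamline APIs.
struct Frame { int width = 0, height = 0; };

struct Upscaler {                         // e.g. DLSS 2, XeSS, or FSR 3.1 upscaling
    virtual Frame upscale(const Frame& lowRes) = 0;
    virtual ~Upscaler() = default;
};

struct FrameGenerator {                   // e.g. FSR 3.1 frame generation
    // Interpolates a new frame from the two most recent upscaled frames,
    // regardless of which upscaler produced them.
    virtual Frame interpolate(const Frame& prev, const Frame& curr) = 0;
    virtual ~FrameGenerator() = default;
};

struct Dlss2Upscaler : Upscaler {         // hypothetical stub
    Frame upscale(const Frame&) override { return {3840, 2160}; }
};

struct Fsr31FrameGen : FrameGenerator {   // hypothetical stub
    Frame interpolate(const Frame&, const Frame& curr) override { return curr; }
};

int main() {
    // The RTX 30-series scenario from the article: DLSS 2 does the upscaling,
    // FSR 3.1 does the frame generation -- mixed and matched freely.
    std::unique_ptr<Upscaler> upscaler = std::make_unique<Dlss2Upscaler>();
    std::unique_ptr<FrameGenerator> framegen = std::make_unique<Fsr31FrameGen>();

    Frame prev{3840, 2160};                                // last presented frame
    Frame rendered{2560, 1440};                            // internal render resolution
    Frame curr = upscaler->upscale(rendered);              // upscaling pass
    Frame generated = framegen->interpolate(prev, curr);   // frame-generation pass

    std::printf("present %dx%d (rendered) + %dx%d (generated)\n",
                curr.width, curr.height, generated.width, generated.height);
    return 0;
}
```

In practice the frame generator also needs motion vectors and depth from the game, but the point stands: once the interface is a finished frame plus that data, the choice of upscaler stops mattering.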
This! Allow us to give KUDOS to AMD, and both middle fingers to nGreedia!
Even so, I have a feeling they will bring their frame generation to the RTX 20 and 30 series too...


Both DLSS and TAA are image-blurring techniques, which is extremely annoying. I know what temporal filters are.
Yes, DLSS is crap in most games, except Starfield for some reason. Not sure why, but even with low-quality DLSS settings and a little sharpening, the image is almost as good as native for roughly 3x the FPS. Guess it is all in the implementation. Also, motion blur is very well done in the same game.
 
Even so, I have a feeling they will bring their frame generation to the RTX 20 and 30 series too

Hopefully that would be the expected FSR 3.1 consequence. Although, knowing Nvidia, everything and nothing is possible... nV doesn't like decoupling and sharing the good stuff (well, no one does, really) - we can only hope FSR's FG measures up to nV's offering on the 40-series with no compromises... a good knee in the balls to make nV surrender!
 
Hopefully that would be the expected FSR 3.1 consequence. Although, knowing Nvidia, everything and nothing is possible... nV doesn't like decoupling and sharing the good stuff (well, no one does, really) - we can only hope FSR's FG measures up to nV's offering on the 40-series with no compromises... a good knee in the balls to make nV surrender!
Don't forget nGreedia's lies regarding their "optical flow" accelerators! The 30x0 and 20x0 supposedly just don't have them to make it work! We all know it's BS, BTW, just like ReBAR support on the 20x0 series!
 
Honestly, the only good thing about upscaling is that you can render the UI natively and run the game at a lower resolution, but all of these comparisons to native on YouTube are disingenuous because it's very hard to really equate any upscaling with native in most games. Games like BG3, where movement is slow and predictable, are acceptable compromises for upscaling, but they're also the sort of games that need the least upscaling help in the first place.

Upscaling should only be used in scenarios where the base resolution isn't really all that low, like 75% of the native resolution. But instead of using it sparingly, developers are absolutely using it as a crutch all the time. And if you think we have it bad, there are games on consoles that are upscaled from resolutions close to 720p, lol.
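To put some numbers on that, here's a quick back-of-the-envelope comparison of what those scale factors mean at a 4K output. The 75% figure is the floor suggested above; the other presets and the ~720p console case are illustrative assumptions only, and exact per-game values vary:

```cpp
#include <cstdio>

// Internal render resolution implied by various upscaling factors at a
// 3840x2160 output, and the share of native pixels actually rendered.
int main() {
    struct Case { const char* label; double scale; };
    const Case cases[] = {
        {"Native 4K",                        1.00},
        {"75% scale (suggested floor)",      0.75},
        {"Typical 'Quality' preset (~67%)",  0.67},
        {"Typical 'Performance' preset",     0.50},
        {"Console worst case (~720p base)",  720.0 / 2160.0},
    };
    for (const Case& c : cases) {
        const int w = static_cast<int>(3840.0 * c.scale + 0.5);
        const int h = static_cast<int>(2160.0 * c.scale + 0.5);
        const double pixelShare = c.scale * c.scale * 100.0;
        std::printf("%-34s %4dx%-4d  (~%2.0f%% of native pixels)\n",
                    c.label, w, h, pixelShare);
    }
    return 0;
}
```

Even at the 75% floor, the GPU is only rendering a bit over half the native pixel count, and a ~720p-to-4K console upscale works from roughly a ninth of it, which is why it leans so hard on the reconstruction.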
 
Upscaling should only be used in scenarios where the base resolution isn't really all that low, like 75% of the native resolution. But instead of using it sparingly, developers are absolutely using it as a crutch all the time. And if you think we have it bad, there are games on consoles that are upscaled from resolutions close to 720p, lol.
Yup, I've been shouting that from the rooftops ever since DLSS first launched to prop up the untenable performance hit of RT.

Starfield might have been the worst example of this to date - even the highest presets included a pretty bad FSR implementation to cover up the pathetic performance of Bethesda's 20-year-old, woefully obsolete Creation Engine pushed so far beyond its capabilities that even a 4090 struggles to make it look okay.

I got Starfield free with a 7800 XT purchase, so it ran well enough to play, but I simply couldn't believe how bad the game looked and ran. There are DX9 Xbox 360 games that look better.
 
As in, upscaling (specifically DLSS, in my case at 4K on an OLED) producing a better result than the forced TAA at native res, which cannot be fully disabled in the majority of modern games and is often very average at best.

In the comparatively rare game where TAA is not forced, sure, you can get a crisper image at native with, say, FXAA or SMAA, or even no AA (if you can tolerate that unstable mess), at the expense of other faults in the image; no denying that.

Personally I can't stand shimmer, but that's highly personal; I 100% understand that for some people the softer resolve is what they can't stand, and they're willing to trade that away even if it means shimmer, for example.

But I try my best not to put forward my opinion as if it's a universal fact that applies to everyone. For me, a potentially slightly softer (4K mitigates the majority of the softness; this gets worse as resolution lowers) but very stable image without shimmer, fizzle, flicker and breakup is eminently preferable to a slightly sharper image with one or more of those artefacts persisting.

All decided on a per-game basis, mind you. I can't remember the titles right now, but recently I did play a couple of games where I went with FXAA or SMAA, as I had the rendering budget to spare and the art/geometry style didn't present many opportunities for the artefacts I can't stand.

I guess there are people who just can't grasp that with AI upscaling enabled, they will get superior image quality (higher res and higher details) than without upscaling at the same FPS :rolleyes:.

Though FSR2.2 is still pretty far from being ideal, with so many visual artifacts that negate all the benefits.
 
I guess there are people who just can't grasp that with AI upscaling enabled, they will get superior image quality (higher res and higher details) than without upscaling at the same FPS :rolleyes:.
Yeah, to some extent it is what it is with people: either they can't grasp it because they haven't seen it for themselves, or they just don't want to admit it and so make other random, disingenuous arguments unconnected to IQ and FPS. To what end? I'm really not certain, but some portion of it, for some people, is clearly political.

To me it's simply that the proof is in the pudding: if the IQ is as good or better, that's that, and in effect it doesn't matter how we got there. And you make a great case for performance-normalised IQ too. I am of course fascinated by all manner of rendering technology, but it has always struck me as odd that people will die on the hill of what's going on behind the curtain being somehow totally unacceptable. Granted, of course, it's a bit of a dog's breakfast as to which upscaling solution you use, which sub-version, which game and how well it's implemented, what output resolution and monitor technology you game at, and so on. You and I, on 4K120 OLEDs and using DLSS, are effectively in a best-case scenario for the technology.
Though FSR2.2 is still pretty far from being ideal, with so many visual artifacts that negate all the benefits.
I truly want it to be better, I really do, and I am keen to do some back-to-back testing myself when the 3.1 update drops in R&C: Rift Apart. It is unfortunate that one of the solutions with the broadest compatibility, and perhaps broadest appeal, has so many flaws (up to v3.0) at the resolutions where it's needed the most.
 
I guess there are people who just can't grasp that with AI upscaling enabled, they will get superior image quality (higher res and higher details) than without upscaling at the same FPS :rolleyes:.

Though FSR2.2 is still pretty far from being ideal, with so many visual artifacts that negate all the benefits.
We will just ignore the visual artifacts then...
 
80-90 FPS with upscaling looks much crisper than 60 FPS native in motion, though.
120 FPS with upscaling + frame gen is way crisper still than 60 FPS native.
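For what it's worth, the usual reasoning behind that claim is persistence blur on sample-and-hold displays: the smear you perceive during a pan scales with frame time, so more presented frames per second means less smear per frame. A rough illustration, assuming a full-persistence (non-strobed) display and ignoring pixel response times:

```cpp
#include <cstdio>

// Approximate persistence blur on a full sample-and-hold display:
// blur width (px) ~= pan speed (px/s) * frame time (s).
int main() {
    const double panSpeedPxPerSec = 1000.0;          // example camera pan speed
    const double fpsCases[] = {60.0, 90.0, 120.0};
    for (double fps : fpsCases) {
        const double frameTimeMs = 1000.0 / fps;
        const double blurPx = panSpeedPxPerSec / fps;
        std::printf("%5.0f FPS: %5.1f ms/frame -> ~%4.1f px of smear\n",
                    fps, frameTimeMs, blurPx);
    }
    return 0;
}
```

So, all else being equal, 120 presented FPS roughly halves the smear versus 60 FPS; whether that outweighs the upscaling and frame-generation artefacts is exactly the trade-off being argued over in this thread.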
I don't need upscaling at 60 FPS native. I would at 30 FPS, but then it's working from too little input to give me an acceptable picture. The only place for upscaling, imo, is when I fire up my 4K TV with the small HTPC it's connected to, because 1. there's no other way to play at 4K with a 1660 Ti, and 4K + FSR looks better than 1080p on a 4K screen, and 2. I sit far enough from the TV not to really care about the slight loss of image quality.

As for the article, I'm glad AMD is trying to improve. I'm wondering, though, if already released games will get FSR 3.1 support, or if you can just pop some DLLs in, like you can with DLSS.
 
We will just ignore the visual artifacts then...
Sure seems like people do ignore visual artefacts when decrying upscaling, when native without AA, or native + TAA, FXAA, SMAA, MSAA, etc., all have artefacts and visual/performance drawbacks of their own.
 
Sure seems like people do ignore visual artefacts when decrying upscaling, when native without AA, or native + TAA, FXAA, SMAA, MSAA, etc., all have artefacts and visual/performance drawbacks of their own.
It's the ghosting and motion smearing mostly.
 
It's the ghosting and motion smearing mostly.
And oftentimes that motion smearing and ghosting (plus other breakup in motion, especially in wires/thin lines, a blurrier image, etc.) can be distracting enough for some to prefer the shimmer of more conservative AA techniques (SMAA, MSAA, CMAA2, etc.) over the blur that more aggressive temporal techniques (TAA, DLAA, FSR Native AA) can cause.
I myself prefer the sharpness, and the lack of temporal artifacting, of SMAA over TAA (for reasons laid out in a very long text I put in my profile).
It's a choosing game; it becomes a problem when we can no longer choose (either because TAA/upscaling is forced, or because poor performance pushes us towards upscaling).

(Also, yes, I am lumping TAA in with upscaling; temporal upscalers are all just TAA at the end of the day, they're just better at it.)
 
AMD continuing to support open source is a huge plus.
 
I just tried a 7900 XTX and, to be honest, the driver is not very good. No better than the old days. While I was fiddling around, the power tuning refused to turn off again, resulting in a massive FPS drop every 10 seconds, rendering the game unplayable. Sorry, but I really wanted this to work. Sold the card and went straight back to the 3080, which just works. No new card until next-gen Nvidia.
 
I never know how people have these catastrophic failures, but somehow it's 99% TPU users that these issues happen to.
 
I never know how people have these catastrophic failures, but somehow it's 99% TPU users that these issues happen to.

His comment is also incredibly random and out of place.
For counterbalance? I've been running a PowerColor Red Devil 7900 XTX with basically zero issues for a little over a year now. It's been great :toast:
 
His comment is also incredibly random and out of place.
For counterbalance? I've been running a PowerColor Red Devil 7900 XTX with basically zero issues for a little over a year now. It's been great :toast:
I had more issues with my 6800 XT than my 7900 XT. I also love it for what gaming is for me today.
 