
AMD Working on an AI-powered FSR Upscaling Algorithm

Considering that you need an RTX 4090 to push 60 FPS at 4K maxed out in all games (not even considering RT!), "more power" is not the answer.
This is what's coming: the 5090 will indeed be more powerful, regardless of upscaling and FG.

Even the 4090 still struggles in some games at 4K.

More power is always the answer!
 
It's just a matter of time before we have one standard and can move on from this nonsense, and as stated before, I cannot wait. Christ, this is annoying.
 
There won't be one standard in this case.
But DirectSR should make applying those different standards in games easier. All three use the same data (motion vectors, depth info, etc.), so developers should only need to expose those to the DirectSR API, and then the vendor's solution takes over.
It should help with implementation, quality, and updates. But maybe I'm wrong; we'll see.
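To make the "same inputs, different backends" idea concrete, here's a minimal Python sketch of the per-frame data that DLSS, FSR 2, and XeSS all consume. All names here are illustrative assumptions, not the actual DirectSR API:

```python
from dataclasses import dataclass
from typing import Callable, Tuple

# Hypothetical sketch: temporal upscalers all want roughly the same
# per-frame inputs, which is what lets a DirectSR-style abstraction
# expose them once and hand off to whichever vendor backend is active.

@dataclass
class FrameInputs:
    color: bytes                  # low-resolution rendered frame
    motion_vectors: bytes         # per-pixel screen-space motion
    depth: bytes                  # depth buffer
    jitter: Tuple[float, float]   # sub-pixel camera jitter for this frame
    render_res: Tuple[int, int]   # e.g. (1920, 1080)
    output_res: Tuple[int, int]   # e.g. (3840, 2160)

def upscale(inputs: FrameInputs, backend: Callable[[FrameInputs], bytes]) -> bytes:
    # The game fills FrameInputs once; the vendor backend selected by
    # the API/driver consumes the same structure.
    return backend(inputs)
```

The point being that the engine-side work (exposing motion vectors, depth, and jitter) would be identical no matter which vendor's upscaler ends up running.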
 
How should I imagine that? A monopoly?

The exact opposite: just one open standard. Did we use different versions of anti-aliasing between vendors in the past? Nope, it was all SSAA or MSAA, then FXAA, and there is also TAA. Different versions of anisotropic filtering? Nope. Name any other... I want to say "generic", but that's the whole point of it: a feature in PC gaming settings that is somehow vendor-specific.

F off with that crap, nobody wants this nonsense.
 
They didn't; it was more or less forced upon them. Soon most games will start to integrate upscaling in a way that users won't be able to turn it off, like in Alan Wake.
First it was forced. But now, we've got crowds saying "bleh, I won't buy that pesky, ugly AMD card because DLSS is soooo much better!" Personally, I find it funny that people are comparing which image worsening feature gives them less bad images like it was the saving grace of PC gaming. You can sell literal sh** with good enough marketing, I guess. :kookoo:

Edit: We've also got game devs cranking up system requirements to 11, and we've also got gamers who want smooth 4K gaming with ultra graphics. Gaming has become a very needy and expensive hobby for many. Very first world-ish, so to speak. Something has got to give.
 
DLSS has officially won the upscaler war. It's joever.

Thankfully they'll probably all fold into DirectSR anyway, so in the end it's us gamers who will win as a whole over time.
 
First it was forced. But now, we've got crowds saying "bleh, I won't buy that pesky, ugly AMD card because DLSS is soooo much better!" Personally, I find it funny that people are comparing which image worsening feature gives them less bad images like it was the saving grace of PC gaming. You can sell literal sh** with good enough marketing, I guess. :kookoo:

Edit: We've also got game devs cranking up system requirements to 11, and we've also got gamers who want smooth 4K gaming with ultra graphics. Gaming has become a very needy and expensive hobby for many. Very first world-ish, so to speak. Something has got to give.
It always has been, I think; it's just that the pandemic cranked that up to eleven. I remember when high end cost about the going price of an RTX 4070 these days, a card that a lot of people now consider the bare minimum for a 40-series upgrade.
 
How about working on cards that can run native res at decent frame rates for a reasonable price?
They're not mutually exclusive; they're all still making faster and faster cards for 'standard' workloads. Pricing is also something all camps are doing the same way: pricing their cards at the limit of what the market will bear.
Who asked for this
Lots of people. LOTS of AMD fans; believe it or not, it's true.

This is all pretty simple: don't like it? Don't use it. This is about the level of saltiness and disingenuous arguments about upscaling that I have come to expect here. We want 4K ultra cards at bargain-bin prices and will only play games at "native", don't ya know!
 
Good to see them admit defeat. FSR 2.x at 4K looks good in stills, but in game the artifacts are everywhere, and 1080p is utter trash. I would never use it on my 6800 XT, even for 1440p.
One wonders if RDNA4 is getting tensor-type cores or an NPU.
 
Well, it's good that AMD is still continuing to improve its upscaling tech, and like ZoneDymo, I hope at some point we will have a single solution that works well and the same across vendors.


And yes, more power will always be the solution, whether you use upscaling or not. It's just that upscaling gives you a tool to achieve your performance goal.

I totally understand why people would want not to use them, but they can still be beneficial. If they had very wide support, one could buy a low/mid-range GPU and still get a 4K monitor.

I am a bit less in favor of cases where the game is just poorly optimized. But for those poorly optimized games, we have to vote with our wallets, upscaling or not. We had them before upscaling, and we would continue to have them even if upscaling went away.
 
At an all-time high in datacenter market share.
Could you please provide a source for this? All the sources I find point to AMD GPUs being at all-time lows, between 3% and 8%. Their CPU market share is rising meteorically, though.
 
Considering that you need an RTX 4090 to push 60 FPS at 4K maxed out in all games (not even considering RT!), "more power" is not the answer.



Oh yes, let us just undo this industry-wide shift.
True. But tell me why I can play many first-rate titles on a 6650 XT at 4K over 60 FPS on Ultra settings, while some won't even load. It's not about power; it's about devs getting lazy and cheap.
 
True. But tell me why I can play many first-rate titles on a 6650 XT at 4K over 60 FPS on Ultra settings, while some won't even load. It's not about power; it's about devs getting lazy and cheap.
What titles are you playing at 4K Ultra on a 6650 XT?
 
What titles are you playing at 4K Ultra on a 6650 XT?
Actually, I've just tried Hogwarts Legacy on my 1660 Ti, and to my great surprise, it plays at 4K Ultra at 60 FPS. Sure, with FSR/XeSS at Performance (so with a 720p render resolution), but still... :rockout:

With sharpening dragged to the max, it looks fairly acceptable, too.

I'm tempted to get back to some testing on my 6500 XT. I've kind of forgotten how much fun budget gaming can be. :ohwell:
 
Actually, I've just tried Hogwarts Legacy on my 1660 Ti, and to my great surprise, it plays at 4K Ultra at 60 FPS. Sure, with FSR/XeSS at Performance (so with a 720p render resolution), but still... :rockout:

With sharpening dragged to the max, it looks fairly acceptable, too.

I'm tempted to get back to some testing on my 6500 XT. I've kind of forgotten how much fun budget gaming can be. :ohwell:
4K FSR Performance would be 1080p, not 720p.

But that is great. In the end, this is what you want from an upscaler. Like I said earlier, those upscalers should be standard in games, and you should always be able to select a lower render resolution than your output resolution.

Then you get the benefits of having a 4K monitor for all the other stuff while still owning a low/mid-range GPU, or just an older one.
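The resolution math behind that correction is simple. A quick sketch using the per-axis scale factors AMD publishes for the FSR 2 quality presets (DLSS 2 uses the same ratios):

```python
# Per-axis scale factors for the standard upscaler quality presets.
# Render resolution = output resolution / factor.
SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(output_w: int, output_h: int, mode: str) -> tuple:
    """Return the internal render resolution for a given output and preset."""
    factor = SCALE[mode]
    return round(output_w / factor), round(output_h / factor)

# 4K + Performance renders at 1080p, not 720p:
# render_resolution(3840, 2160, "Performance") -> (1920, 1080)
# 720p is the Ultra Performance input at 4K:
# render_resolution(3840, 2160, "Ultra Performance") -> (1280, 720)
```

So a 720p internal resolution at 4K output means the Ultra Performance preset, which is why the earlier 1660 Ti result looked softer than Performance mode would suggest.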
 
Good. Hopefully this will work on consoles too. FSR 2 rendering at 720p looks like absolute garbage. And it's not that DLSS looks razor-sharp at low resolutions, but at least it doesn't flicker like crazy.

I wonder if Sony will have their own AI upscaler in the PS5 Pro.
 
But that is great. In the end, this is what you want from an upscaler. Like I said earlier, those upscalers should be standard in games, and you should always be able to select a lower render resolution than your output resolution.

Then you get the benefits of having a 4K monitor for all the other stuff while still owning a low/mid-range GPU, or just an older one.
It's pretty straightforward stuff and can go hand in hand with settings optimization, or not; personal preference and all. Some games offered lowering the input resolution before these new-gen upscalers were even a thing; FSR, DLSS, and XeSS just do that part even better. I'm all for fierce competition and advancement.

Zero regrets with a 4K monitor too; the other benefits are immense, and upscaling is in its prime in this territory. There's a positive and a negative spin for everything. I've been told I overbought on the monitor relative to my GPU and am now making sacrifices, but I don't see it that way at all. I'd rather upscale 1440p to 4K than still be stuck on a 1440p display, and as we've seen, at 4K output upscaling can and often does produce better results than native with a meh TAA. Also, OLED :)
 
It's pretty straightforward stuff and can go hand in hand with settings optimization, or not; personal preference and all. Some games offered lowering the input resolution before these new-gen upscalers were even a thing; FSR, DLSS, and XeSS just do that part even better. I'm all for fierce competition and advancement.

Zero regrets with a 4K monitor too; the other benefits are immense, and upscaling is in its prime in this territory. There's a positive and a negative spin for everything. I've been told I overbought on the monitor relative to my GPU and am now making sacrifices, but I don't see it that way at all. I'd rather upscale 1440p to 4K than still be stuck on a 1440p display, and as we've seen, at 4K output upscaling can and often does produce better results than native with a meh TAA. Also, OLED :)
There is a balance one must find when selecting a monitor and GPU.

It's the reason I stayed at 1440 UW with a 7900 XTX. If I were looking at 4K for the games I play and the settings I like to use, it would have pushed me to a 4090, and the extra $1,200 I would have needed for that GPU would have bitten the wallet hard. So I'm happy where I'm at currently.

If one plays older games, you can get away with 4K and something mid-range; many factors are in play when making these choices.
 
There is a balance one must find when selecting a monitor and GPU.

It's the reason I stayed at 1440 UW with a 7900 XTX. If I were looking at 4K for the games I play and the settings I like to use, it would have pushed me to a 4090, and the extra $1,200 I would have needed for that GPU would have bitten the wallet hard. So I'm happy where I'm at currently.

If one plays older games, you can get away with 4K and something mid-range; many factors are in play when making these choices.
There's absolutely a balance. I wouldn't pair a 6500 XT with a 4K 120 Hz OLED, or a 4090 with a 1080p 144 Hz IPS; both are obviously exaggerated on purpose to make the point. A 3080 and a 4K 120 Hz set, though? Honestly, the longer it goes, the more impressed I am with it. As someone vocally for upscaling as another [optional] tool for tweaking at my disposal, combined with smart (optimized) settings choices, it's giving me some incredible gaming experiences.
 
4090 with a 1080p144 IPS
But it makes sense if you belong to the "I'd rather not upgrade till the next decade" kind of people.

Comments à la "bring us more power, these upscalers are donkey dong" are nonsensical because, if we don't count some specific outliers like the 4060/6500 XT/etc., every gen is faster than the previous one by a very significant margin regardless of use case. We're getting both more speed and more upscaling, so 1080p gamers are happy because their 400-dollar GPUs are thriving at 100+ FPS in most games, and 4K gamers are also happy because they can use DLSS/FSR/XeSS and get their 60+ FPS with relative ease despite having a mid-three-figure USD GPU. And the image quality of 4K + DLSS at Quality is almost never noticeably worse than native; it's rarely worse (= eyes hurt) than native in the case of FSR and XeSS.

Badly optimised games? VRAM hogs? 20 FPS on 1,500-dollar GPUs? They have always been a thing. Nobody thought of 50 FPS as a bad experience when I was a kid (late 90s, early 00s). Standards have shifted.

Objectively, the main problems are:
• Artificial scarcity (= wares are more expensive than they would be under normal circumstances).
• A lack of SFF/ITX-friendly GPUs, as well as ridiculously growing PSU requirements.
• AMD being very far behind NVIDIA in everything that's not pure raster performance, with no apparent effort to change that, ultimately leading to value stagnation.

Upscalers were thoroughly underwhelming 5 years ago. Today, they are fine. Tomorrow, I bet they will shine, especially if AMD succeeds with AI in upcoming FSR versions (only possible in parallel universes, I'm afraid...).
 
But it makes sense if you belong to the "I'd rather not upgrade till the next decade" kind of people.
Sure, those people exist; they want extreme longevity from their purchase. I said I wouldn't make that pairing, and it is still an example at one end of the spectrum of possibilities.

As for the rest of your post, wholeheartedly agree :)
 
But it makes sense if you belong to the "I'd rather not upgrade till the next decade" kind of people.

Comments à la "bring us more power, these upscalers are donkey dong" are nonsensical because, if we don't count some specific outliers like the 4060/6500 XT/etc., every gen is faster than the previous one by a very significant margin regardless of use case. We're getting both more speed and more upscaling, so 1080p gamers are happy because their 400-dollar GPUs are thriving at 100+ FPS in most games, and 4K gamers are also happy because they can use DLSS/FSR/XeSS and get their 60+ FPS with relative ease despite having a mid-three-figure USD GPU. And the image quality of 4K + DLSS at Quality is almost never noticeably worse than native; it's rarely worse (= eyes hurt) than native in the case of FSR and XeSS.

Badly optimised games? VRAM hogs? 20 FPS on 1,500-dollar GPUs? They have always been a thing. Nobody thought of 50 FPS as a bad experience when I was a kid (late 90s, early 00s). Standards have shifted.

Objectively, the main problems are:
• Artificial scarcity (= wares are more expensive than they would be under normal circumstances).
• A lack of SFF/ITX-friendly GPUs, as well as ridiculously growing PSU requirements.
• AMD being very far behind NVIDIA in everything that's not pure raster performance, with no apparent effort to change that, ultimately leading to value stagnation.

Upscalers were thoroughly underwhelming 5 years ago. Today, they are fine. Tomorrow, I bet they will shine, especially if AMD succeeds with AI in upcoming FSR versions (only possible in parallel universes, I'm afraid...).
How are you going to get anywhere in anything other than raster when Nvidia has the whole software ecosystem locked down under CUDA?
 
How are you going to get anywhere in anything other than raster when Nvidia has the whole software ecosystem locked down under CUDA?

By making an ecosystem that captivates the players' attention, and by marketing it well.

AMD has never been successful at either.
 