Monday, March 4th 2024

AMD Working on an AI-powered FSR Upscaling Algorithm

AMD CTO Mark Papermaster confirmed that AMD is working on a new upscaling technology that leverages AI. A key technological difference between AMD FSR and the competing solutions, NVIDIA DLSS and Intel XeSS, has been AMD's remarkable restraint in implementing AI in any part of the upscaler's pipeline. Unlike FSR, both DLSS and XeSS utilize AI DNNs to overcome temporal artifacts in their upscalers. The Radeon RX 7000 series GPUs and Ryzen 7000 CPUs are AMD's first products with accelerators or ISA extensions that speed up AI workloads; and with the RX 7000 series capturing a sizable install base, AMD is finally turning to AI for the next generation of its FSR upscaling tech. Papermaster highlighted his company's plans for AI in upscaling technologies in an interview with No Priors.
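The distinction matters because temporal upscalers reuse data from previous frames, and deciding how much history to trust per pixel is exactly where the AI comes in. Below is a minimal conceptual sketch of one temporal-upscaling step, with the spot a DNN would occupy marked; this is illustrative only, not any vendor's actual pipeline, and names like blend_net are hypothetical:

```python
import numpy as np

def temporal_upscale(history, current_lr, motion_vectors, blend_net=None):
    """One conceptual step of a temporal upscaler (illustration only).

    history:        previous full-res output, shape (H, W, 3)
    current_lr:     current low-res frame, shape (h, w, 3)
    motion_vectors: per-pixel motion at output res, shape (H, W, 2)
    blend_net:      optional learned model that predicts per-pixel blend
                    weights; hand-tuned heuristics fill this role in an
                    analytical upscaler.
    """
    H, W, _ = history.shape

    # 1. Upsample the current frame to output resolution (nearest-neighbor
    #    for brevity; real pipelines accumulate jittered samples).
    ys = np.arange(H) * current_lr.shape[0] // H
    xs = np.arange(W) * current_lr.shape[1] // W
    upsampled = current_lr[ys][:, xs]

    # 2. Reproject last frame's output along the motion vectors so it
    #    lines up with the current frame.
    yy, xx = np.mgrid[0:H, 0:W]
    src_y = np.clip((yy - motion_vectors[..., 1]).astype(int), 0, H - 1)
    src_x = np.clip((xx - motion_vectors[..., 0]).astype(int), 0, W - 1)
    reprojected = history[src_y, src_x]

    # 3. Blend. A DNN-based upscaler predicts these weights per pixel to
    #    suppress ghosting and flicker; a heuristic one derives them from
    #    depth and disocclusion tests.
    if blend_net is not None:
        alpha = blend_net(upsampled, reprojected)  # learned weights
    else:
        alpha = np.full((H, W, 1), 0.1)            # fixed heuristic weight
    return alpha * upsampled + (1.0 - alpha) * reprojected
```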

Asked by No Priors about exploring AI for upscaling, Papermaster responded: "2024 is a giant year for us because we spent so many years in our hardware and software capabilities for AI. We have just completed AI-enabling our entire portfolio, so you know cloud, edge, PCs, and our embedded devices, and gaming devices. We are enabling gaming devices to upscale using AI and 2024 is a really huge deployment year." In short, Papermaster walked the interviewer through the two-step process by which AMD is getting into AI, with a hardware-first approach.
AMD spent 2022-23 introducing ISA-level AI enablement for Ryzen 7000 desktop processors and EPYC "Genoa" server processors. For notebooks, it introduced the Ryzen 7040 and 8040 series mobile processors with NPUs for accelerated AI, and it gave its Radeon RX 7000 series RDNA 3 GPUs AI accelerators. Around this time, AMD also introduced the Ryzen AI stack for Windows PC applications that leverage AI for certain client productivity experiences. 2024 will see the company implement AI into its technologies, and Papermaster couldn't be more clear that a new-generation FSR that leverages AI is in the works.
Sources: No Priors (YouTube), VideoCardz

70 Comments on AMD Working on an AI-powered FSR Upscaling Algorithm

#26
ZoneDymo
Just a matter of time before we have one standard and can move on from this nonsense. As stated before, I cannot wait; christ, this is annoying.
Posted on Reply
#27
TumbleGeorge
ZoneDymo: Just a matter of time before we have one standard
How would that work? A monopoly?
Posted on Reply
#28
remekra
There won't be one standard in this case.
But DirectSR should make it easier to integrate those different standards into games. All three use the same inputs (motion vectors, depth info, etc.), so developers should only need to expose those to the DirectSR API, and then the vendor's solution takes over.
Should help with implementation, quality, updates. But maybe I'm wrong, we'll see.
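Roughly, the idea looks like this (a hypothetical Python sketch of the shape of such an API, not Microsoft's actual DirectSR interface; all names here are illustrative):

```python
from dataclasses import dataclass
from typing import Callable, Tuple

# Hypothetical sketch of a vendor-agnostic upscaling API: the game
# exposes one set of per-frame inputs, and whichever vendor backend is
# present consumes them. Names and signatures are illustrative, not
# Microsoft's actual DirectSR interface.

@dataclass
class FrameInputs:
    color: object           # low-res color buffer
    depth: object           # depth buffer
    motion_vectors: object  # per-pixel motion vectors
    exposure: float = 1.0   # scene exposure, for tone-consistent history

# Every vendor backend implements the same callable shape, so game code
# never branches on DLSS vs. FSR vs. XeSS.
Backend = Callable[[FrameInputs, Tuple[int, int]], object]

def upscale(frame: FrameInputs, output_size: Tuple[int, int],
            backend: Backend) -> object:
    # The runtime selects the backend (vendor driver or a fallback);
    # the game only ever supplies the common inputs.
    return backend(frame, output_size)
```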
Posted on Reply
#29
ZoneDymo
TumbleGeorge: How would that work? A monopoly?
The exact opposite: just one open standard. Did we use different versions of anti-aliasing between vendors in the past? Nope, all SSAA or MSAA, then FXAA, and there's also TAA. Different versions of anisotropic filtering? Nope. Name any other... I wanna say generic, but that's the whole point of it... feature in PC gaming settings that is somehow vendor-specific.

F off with that crap, nobody wants this nonsense.
Posted on Reply
#30
AusWolf
Vya Domus: They didn't, it was more or less forced upon them. Soon most games will start to integrate upscaling in a way that users won't be able to turn it off, like in Alan Wake.
First it was forced. But now, we've got crowds saying "bleh, I won't buy that pesky, ugly AMD card because DLSS is soooo much better!" Personally, I find it funny that people are comparing which image worsening feature gives them less bad images like it was the saving grace of PC gaming. You can sell literal sh** with good enough marketing, I guess. :kookoo:

Edit: We've also got game devs cranking up system requirements to 11, and we've also got gamers who want smooth 4K gaming with ultra graphics. Gaming has become a very needy and expensive hobby for many. Very first world-ish, so to speak. Something has got to give.
Posted on Reply
#31
Dr. Dro
DLSS has officially won the upscaler war. It's joever.

Thankfully they'll probably all fold into DirectSR anyway, so in the end it's us gamers who will win as a whole over time.
Posted on Reply
#32
cmguigamf
AusWolf: First it was forced. But now, we've got crowds saying "bleh, I won't buy that pesky, ugly AMD card because DLSS is soooo much better!" Personally, I find it funny that people are comparing which image worsening feature gives them less bad images like it was the saving grace of PC gaming. You can sell literal sh** with good enough marketing, I guess. :kookoo:

Edit: We've also got game devs cranking up system requirements to 11, and we've also got gamers who want smooth 4K gaming with ultra graphics. Gaming has become a very needy and expensive hobby for many. Very first world-ish, so to speak. Something has got to give.
Always has been, I think; the pandemic just cranked that up to eleven. I remember when high-end cost about the going price of an RTX 4070 these days, a card that a lot of people now consider the bare minimum for a 40-series upgrade.
Posted on Reply
#33
wolf
Performance Enthusiast
mb194dc: How about working on cards that can run native res at decent frame rates for a reasonable price?
They're not mutually exclusive; all camps are still making faster and faster cards for 'standard' workloads. As for pricing, they are all doing the same thing: pricing their cards at the limit of what the market will bear.
thesmokingman: Who asked for this?
Lots of people, LOTS of AMD fans. Believe it or not, it's true.

This is all pretty simple: don't like it? Don't use it. About the expected level of saltiness and disingenuous arguments about upscaling that I have come to expect here. We want 4K ultra cards at bargain-bin prices and will only play games at "native", don't ya know!
Posted on Reply
#34
Minus Infinity
Good to see them admit defeat. FSR 2.x at 4K looks good in stills, but in-game the artifacts are everywhere. 1080p is utter trash; I would never use it on my 6800 XT even for 1440p.
One wonders if RDNA 4 is getting tensor-type cores or an NPU.
Posted on Reply
#35
Punkenjoy
Well, it's good that AMD is continuing to improve its upscaling tech, and like ZoneDymo, I hope at some point we will have a single solution that works well and works the same across vendors.


And yes, more power will always be the solution, whether you use upscaling or not; upscaling just gives you another tool to achieve your performance goals.

I totally understand why people would want not to use them, but they can still be beneficial. If they had very broad support, one could buy a low/mid-range GPU and still get a 4K monitor.

I am a bit less for the case where the game is just poorly optimized. But for those poorly optimized games, we have to vote with our wallets, upscaling or not. We had them before upscaling, and we would continue to have them even if upscaling went away.
Posted on Reply
#36
Firedrops
Redwoodz: At an all-time high in datacenter market share.
Could you please provide a source for this? All sources I find point to AMD GPUs being at all-time lows, between 3% and 8%. Their CPU market share is rising meteorically, though.
Posted on Reply
#37
ChosenName
ZoneDymo: Just a matter of time before we have one standard and can move on from this nonsense. As stated before, I cannot wait; christ, this is annoying.
Insert the obligatory xkcd "Standards" comic.
Posted on Reply
#38
Redwoodz
Frick: Considering how you need an RTX 4090 to push 60 FPS @ 4K maxed out in all games (not considering RT!), "more power" is not the answer.

Oh yes, let us just undo this industry-wide shift.
True. But tell me why I can play many first-rate titles on a 6650 XT @ 4K over 60 FPS on Ultra settings, while some won't even load. It's not about power; it's about devs getting lazy and cheap.
Posted on Reply
#39
Makaveli
Redwoodz: True. But tell me why I can play many first-rate titles on a 6650 XT @ 4K over 60 FPS on Ultra settings, while some won't even load. It's not about power; it's about devs getting lazy and cheap.
What titles are you playing at 4K Ultra on a 6650 XT?
Posted on Reply
#40
AusWolf
Makaveli: What titles are you playing at 4K Ultra on a 6650 XT?
Actually, I've just tried Hogwarts Legacy on my 1660 Ti, and to my great surprise, it plays at 4K Ultra at 60 FPS. Sure, with FSR/XeSS at Performance (so with a 720p render resolution), but still... :rockout:

With sharpening dragged to the max, it looks fairly acceptable, too.

I'm tempted to get back to some testing on my 6500 XT. I've kind of forgotten how much fun budget gaming can be. :ohwell:
Posted on Reply
#41
Punkenjoy
AusWolf: Actually, I've just tried Hogwarts Legacy on my 1660 Ti, and to my great surprise, it plays at 4K Ultra at 60 FPS. Sure, with FSR/XeSS at Performance (so with a 720p render resolution), but still... :rockout:

With sharpening dragged to the max, it looks fairly acceptable, too.

I'm tempted to get back to some testing on my 6500 XT. I've kind of forgotten how much fun budget gaming can be. :ohwell:
4K FSR Performance would be 1080p, not 720p.

But that is great. In the end, this is what you want from an upscaler. Like I said earlier, those upscalers should be standard in games, and you should always be able to select a lower render resolution than your output resolution.

Then you get the benefits of having a 4K monitor for all the other stuff while still owning a low/mid-range GPU, or just an older one.
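For reference, FSR 2's documented per-axis scale factors make this quick to check; a minimal Python sketch:

```python
# Per-axis scale factors as documented for FSR 2.
FSR2_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(output_w, output_h, mode):
    """Return the internal render resolution for a given output and mode."""
    scale = FSR2_MODES[mode]
    return round(output_w / scale), round(output_h / scale)

# 4K output: Performance renders at 1920x1080, as noted above;
# a 720p render resolution corresponds to Ultra Performance.
print(render_resolution(3840, 2160, "Performance"))        # (1920, 1080)
print(render_resolution(3840, 2160, "Ultra Performance"))  # (1280, 720)
```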
Posted on Reply
#42
THU31
Good. Hopefully this will work on consoles too. FSR 2 rendering at 720p looks like absolute garbage. Not that DLSS looks razor-sharp at low resolutions, but at least it doesn't flicker like crazy.

I wonder if Sony will have their own AI upscaler in the PS5 Pro.
Posted on Reply
#43
wolf
Performance Enthusiast
Punkenjoy: But that is great. In the end, this is what you want from an upscaler. Like I said earlier, those upscalers should be standard in games, and you should always be able to select a lower render resolution than your output resolution.

Then you get the benefits of having a 4K monitor for all the other stuff while still owning a low/mid-range GPU, or just an older one.
It's pretty straightforward stuff, and it can go hand in hand with settings optimization, or not; personal preference and all. Some games offered lowering the input resolution before these new-gen upscalers were even a thing; FSR, DLSS, and XeSS just do that part even better. I'm all for fierce competition and advancement.

Zero regrets with a 4K monitor, too; the other benefits are immense, and upscaling is in its prime in this territory. There's a positive and a negative spin for everything. I've been told I've overbought on the monitor relative to my GPU and am now making sacrifices; I don't see it that way at all. I'd rather upscale 1440p to 4K than still be stuck on a 1440p display, and as we've seen, at 4K output upscaling can and often does produce better results than native plus meh TAA. Also, OLED :)
Posted on Reply
#44
Makaveli
wolf: It's pretty straightforward stuff, and it can go hand in hand with settings optimization, or not; personal preference and all. Some games offered lowering the input resolution before these new-gen upscalers were even a thing; FSR, DLSS, and XeSS just do that part even better. I'm all for fierce competition and advancement.

Zero regrets with a 4K monitor, too; the other benefits are immense, and upscaling is in its prime in this territory. There's a positive and a negative spin for everything. I've been told I've overbought on the monitor relative to my GPU and am now making sacrifices; I don't see it that way at all. I'd rather upscale 1440p to 4K than still be stuck on a 1440p display, and as we've seen, at 4K output upscaling can and often does produce better results than native plus meh TAA. Also, OLED :)
There is a balance one must find when selecting a monitor and GPU.

It's the reason I stayed at 1440 UW and a 7900 XTX. If I were looking at 4K, for the games I play and the settings I like to use, it would have pushed me to a 4090. However, the extra $1,200 I would have needed for that GPU would have bitten the wallet hard. So I'm happy where I'm at currently.

If one plays older games, you can get away with 4K and something mid-range; many factors are in play when making these choices.
Posted on Reply
#45
wolf
Performance Enthusiast
Makaveli: There is a balance one must find when selecting a monitor and GPU.

It's the reason I stayed at 1440 UW and a 7900 XTX. If I were looking at 4K, for the games I play and the settings I like to use, it would have pushed me to a 4090. However, the extra $1,200 I would have needed for that GPU would have bitten the wallet hard. So I'm happy where I'm at currently.

If one plays older games, you can get away with 4K and something mid-range; many factors are in play when making these choices.
There's absolutely a balance. I wouldn't pair a 6500 XT with a 4K 120 Hz OLED, or a 4090 with a 1080p 144 Hz IPS; both are obviously exaggerated on purpose to make the point. A 3080 and a 4K 120 Hz set, though? Honestly, the longer it goes, the more impressed I am with it. As someone vocally for upscaling as another [optional] tool for tweaking at my disposal, along with smart (optimised) settings choices, it is giving me some incredible gaming experiences.
Posted on Reply
#46
Beginner Micro Device
wolf: a 4090 with a 1080p 144 Hz IPS
But it makes sense if you belong to the "I'd rather not upgrade till the next decade" kind of people.

Comments à la "bring us more power, these upscalers are donkey dong" are nonsensical because, if we don't count some specific outliers like the 4060/6500 XT/etc., every gen is faster than the previous one by a very significant margin regardless of use case. We're getting both more speed and more upscaling: 1080p gamers are happy because their 400-dollar GPUs are thriving at 100+ FPS in most games, and 4K gamers are also happy because they can use DLSS/FSR/XeSS and get their 60+ FPS with relative ease despite having a mid-three-figure USD GPU. And the image quality of 4K + DLSS at Quality is almost never noticeably worse than native; with FSR and XeSS it's only rarely worse (= eyes hurt) than native.

Badly optimised games? VRAM hogs? 20 FPS on 1,500-dollar GPUs? Always has been a thing. Nobody thought of 50 FPS as a bad experience when I was a kid (late 90s, early 00s). Standards have shifted.

Objectively, the main problems are:
• Artificial deficit (= wares are more expensive than they would be under normal circumstances).
• Lack of SFF/ITX-friendly GPUs, as well as ridiculously growing PSU requirements.
• AMD are very far behind NVIDIA in everything that's not pure raster performance, and they don't seem to be trying to change that, ultimately leading to value stagnation.

Upscalers were thoroughly underwhelming 5 years ago. Today, they are fine. Tomorrow, I bet they will shine. Especially if AMD succeeds with AI in upcoming FSR versions (only possible in parallel universes, I'm afraid...).
Posted on Reply
#47
wolf
Performance Enthusiast
Beginner Micro Device: But it makes sense if you belong to the "I'd rather not upgrade till the next decade" kind of people.
Sure, those people exist; they want extreme longevity from their purchase. I said I wouldn't make that pairing, and it is still an example at one end of the spectrum of possibilities.

As for the rest of your post, wholeheartedly agree :)
Posted on Reply
#48
Redwoodz
Beginner Micro Device: But it makes sense if you belong to the "I'd rather not upgrade till the next decade" kind of people.

Comments à la "bring us more power, these upscalers are donkey dong" are nonsensical because, if we don't count some specific outliers like the 4060/6500 XT/etc., every gen is faster than the previous one by a very significant margin regardless of use case. We're getting both more speed and more upscaling: 1080p gamers are happy because their 400-dollar GPUs are thriving at 100+ FPS in most games, and 4K gamers are also happy because they can use DLSS/FSR/XeSS and get their 60+ FPS with relative ease despite having a mid-three-figure USD GPU. And the image quality of 4K + DLSS at Quality is almost never noticeably worse than native; with FSR and XeSS it's only rarely worse (= eyes hurt) than native.

Badly optimised games? VRAM hogs? 20 FPS on 1,500-dollar GPUs? Always has been a thing. Nobody thought of 50 FPS as a bad experience when I was a kid (late 90s, early 00s). Standards have shifted.

Objectively, the main problems are:
• Artificial deficit (= wares are more expensive than they would be under normal circumstances).
• Lack of SFF/ITX-friendly GPUs, as well as ridiculously growing PSU requirements.
• AMD are very far behind NVIDIA in everything that's not pure raster performance, and they don't seem to be trying to change that, ultimately leading to value stagnation.

Upscalers were thoroughly underwhelming 5 years ago. Today, they are fine. Tomorrow, I bet they will shine. Especially if AMD succeeds with AI in upcoming FSR versions (only possible in parallel universes, I'm afraid...).
How are you going to get anywhere in anything other than raster when Nvidia has the whole software ecosystem locked down under CUDA?
Posted on Reply
#49
Dr. Dro
Redwoodz: How are you going to get anywhere in anything other than raster when Nvidia has the whole software ecosystem locked down under CUDA?
By making an ecosystem that will captivate the players' attention and marketing it well.

AMD has never been successful at either of these.
Posted on Reply
#50
kapone32
Makaveli: There is a balance one must find when selecting a monitor and GPU.

It's the reason I stayed at 1440 UW and a 7900 XTX. If I were looking at 4K, for the games I play and the settings I like to use, it would have pushed me to a 4090. However, the extra $1,200 I would have needed for that GPU would have bitten the wallet hard. So I'm happy where I'm at currently.

If one plays older games, you can get away with 4K and something mid-range; many factors are in play when making these choices.
You don't need a 4090 for 4K. I have the FV43U, and my 7900 XT drives that just fine.
Posted on Reply