
RDNA4-Exclusive AMD FSR 4 Technology Comes to 30+ Games at Launch, Over 75 Games by End of Year

It runs on the AI cores and most certainly won't be heavily taxing them. 9070 XT has 128 AI cores and I'm sure FSR4 will work on all RDNA4 cards, right down to the weakest ones with like 32.

7900 XTX has 192 AI cores. If it doesn't get FSR4 then honestly, screw AMD.

If I'm not mistaken, the key issue is that RDNA 3's AI cores lack the matrix-multiplication support that RDNA 4 has, which is why FSR 4 is exclusive to RDNA 4.
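For anyone wondering why matrix-multiply hardware is the sticking point: an ML upscaler is basically a small neural network, and almost all of its per-frame work is convolutions, which GPUs execute as big matrix multiplies. Here's a minimal sketch of that reduction (made-up layer sizes, nothing to do with FSR 4's actual network):

```python
# Illustrative only: one conv layer of a hypothetical upscaler, lowered to a
# matrix multiply (the "im2col + GEMM" trick). Dedicated matrix units
# (WMMA / tensor-core style) accelerate exactly this kind of GEMM.
import numpy as np

def conv_as_gemm(feature_map, weights):
    """feature_map: (H, W, C_in), weights: (3, 3, C_in, C_out) -> (H, W, C_out)."""
    H, W, C_in = feature_map.shape
    kh, kw, _, C_out = weights.shape
    padded = np.pad(feature_map, ((1, 1), (1, 1), (0, 0)))
    # Unfold 3x3 neighbourhoods into rows ("im2col"): shape (H*W, 3*3*C_in)
    cols = np.stack([
        padded[y:y + H, x:x + W, :].reshape(H * W, C_in)
        for y in range(kh) for x in range(kw)
    ], axis=1).reshape(H * W, kh * kw * C_in)
    # The heavy lifting is one large matrix multiply -- this is what AI cores do fast.
    out = cols @ weights.reshape(kh * kw * C_in, C_out)
    return out.reshape(H, W, C_out)

# Made-up sizes: a small feature map with 8 input / 16 output channels.
fm = np.random.rand(270, 480, 8).astype(np.float32)
w = np.random.rand(3, 3, 8, 16).astype(np.float32)
print(conv_as_gemm(fm, w).shape)  # (270, 480, 16)
```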
 
For FSR 4 to win market share, there are many things that need to happen.
It needs to be better than DLSS 4. Being equal, losing here, winning there, wouldn't be enough, because the press will be showing and focusing only on those cases where DLSS 4 is better.

DLSS will always be better, but FSR 2 got much closer to DLSS 2 than expected; then DLSS 3 pulled ahead again.


Tell that to Nvidia; each DLSS release is unable to run on every previous generation.

Why should AMD create inferior features just to make sure GPUs from 5+ years ago can also have them?

I think this is a good thing, they don't have to waste time making it work on older hardware and can just focus on the present and future!

DLSS runs on Tensor Cores, hence GTX GPUs not being able to use it. But all RTX GPUs can use DLSS 4! Only the Frame Generation part is limited to RTX 40/50 cards, and Multi Frame Generation to RTX 50 only (as of now).
FSR 4 using AI cores means that only RDNA 3 & 4 GPUs should be able to use it. But dropping RDNA 3 support would be pretty bad (even though it might not work as well, just like the DLSS 4 performance hit on RTX 20s/30s is bigger than on 40s/50s).
 
So those Nvidia-sponsored games like Cyberpunk and Black Myth: Wukong never really get optimized for AMD; reviewers should eliminate these games from benchmarks.
If the game developer goes out of their way to never optimize for one vendor, then how is that vendor ever going to compete in benchmarks in that game? Nvidia will always win no matter the hardware; that superior software support could be just cheating.
 
So those Nvidia-sponsored games like Cyberpunk and Black Myth: Wukong never really get optimized for AMD; reviewers should eliminate these games from benchmarks.
If the game developer goes out of their way to never optimize for one vendor, then how is that vendor ever going to compete in benchmarks in that game? Nvidia will always win no matter the hardware; that superior software support could be just cheating.
1. If we excluded every game sponsored by anyone, then we would not be benchmarking at all.
2. Just because a game is sponsored by X company, it doesn't mean that nobody ever plays it / nobody is interested in how it runs on various different hardware.
 
Intel's XeSS works better than FSR 3.1 even without AI cores.
I expect AMD to do the same for FSR 4 and have two code paths: one for AI GPUs and one based on GPU shaders.
If that doesn't happen, it would be a shame for AMD.
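For what it's worth, that kind of dual path is conceptually just a capability check at init time that picks one of two backends for the same upscale step. Rough sketch of the idea, with entirely hypothetical names (this is not XeSS or FSR SDK code):

```python
# Conceptual sketch of a dual code path, XeSS-style: one backend built on
# dedicated matrix/AI instructions, one fallback on plain compute shaders.
# All names here are hypothetical -- not any vendor's actual API.
from dataclasses import dataclass

@dataclass
class GpuCaps:
    has_matrix_units: bool   # e.g. WMMA / XMX / tensor-core style instructions
    has_int8_dot: bool       # e.g. DP4a-style packed dot products

class MatrixCoreUpscaler:
    def upscale(self, frame):
        # Would dispatch the network through the matrix/AI instructions.
        return f"upscaled({frame}) via matrix cores"

class ShaderFallbackUpscaler:
    def upscale(self, frame):
        # Would run a cheaper (or slower) variant on generic compute shaders.
        return f"upscaled({frame}) via shader fallback"

def create_upscaler(caps: GpuCaps):
    # One-time selection at initialisation; the game only sees "an upscaler".
    if caps.has_matrix_units:
        return MatrixCoreUpscaler()
    if caps.has_int8_dot:
        return ShaderFallbackUpscaler()
    raise RuntimeError("no supported upscaler path on this GPU")

print(create_upscaler(GpuCaps(True, True)).upscale("frame_42"))
print(create_upscaler(GpuCaps(False, True)).upscale("frame_42"))
```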

I hope someone in the opensource community will work on an alternative model that can run on every modern GPU and be swapped in place of XeSS/FSR.
 
1. If we excluded every game sponsored by anyone, then we would not be benchmarking at all.
2. Just because a game is sponsored by X company, it doesn't mean that nobody ever plays it / nobody is interested in how it runs on various different hardware.
Then a disclaimer should be put IN THE TITLE so everyone understands the situation: Cyberpunk is basically an Nvidia tech demo. If they want to showcase a new upscaling method they point everyone to Cyberpunk, and clearly AMD has no chance of winning anything there.
To be fair, the current best-selling games should be used for testing; that's what people are supposed to be playing RIGHT NOW. Looking at Steam, that's Monster Hunter Wilds, Marvel Rivals, Kingdom Come: Deliverance 2, Star Wars Outlaws, Civilization VII.
If the games used are "Nvidia path tracing" showcases, then the deck is stacked and bye-bye reasonable GPU prices; we go back to the days of Intel 4-core CPUs forever.
 
Tell that to Nvidia; each DLSS release is unable to run on every previous generation.

Why should AMD create inferior features just to make sure GPUs from 5+ years ago can also have them?

I think this is a good thing, they don't have to waste time making it work on older hardware and can just focus on the present and future!
The new transformer model in DLSS 4 works on RTX 2000 cards.

And why should AMD? Because they need customers; Nvidia doesn't.
 
The new transformer model in DLSS 4 works on RTX 2000 cards.

And why should AMD? Because they need customers; Nvidia doesn't.
They don't need worthless whiners that only buy Nvidia anyway. They should focus on new tech and the future, not kowtow to whiners over GPUs that are 5-plus years old.

It's clear that FSR4 uses a lot more AI processing power, and making that work on older hardware would have just slowed down progress and bogged down the team.
 
They don't need worthless whiners that only buy Nvidia anyway. They should focus on new tech and the future, not kowtow to whiners over GPUs that are 5-plus years old.

It's clear that FSR4 uses a lot more AI processing power, and making that work on older hardware would have just slowed down progress and bogged down the team.
They need customers. They're a tiny part of the share of cards gamers use. They're comparing their new cards to cards from 2020 in slides, lol.
The AI upscaler may be more demanding, but in no way will it be maxing out a 9070 XT upscaling an image. Hell, the feature will no doubt work on the eventual 9050/9060 with far weaker AI processing. I'm sure it could be adapted to the magnificent 7900 XTX, and that would look good to consumers, knowing that their cards aren't gonna be cut off from future software. And it comes at a bad time, when DLSS4 has only just arrived and works on old crap like the RTX 2060.
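To put some very rough numbers on that: assume a per-pixel network cost and a usable AI-throughput figure (both numbers below are made up for illustration, not AMD specs or measured FSR 4 costs), and the upscale pass comes out to a small slice of the frame budget:

```python
# Back-of-envelope only: every number here is an assumption for illustration,
# not a measured FSR 4 cost or an official AMD throughput figure.
ops_per_output_pixel = 4000          # assumed network cost per upscaled pixel
output_pixels = 3840 * 2160          # upscaling to 4K output
total_ops = ops_per_output_pixel * output_pixels   # ops per upscaled frame

assumed_tops = 100                   # assumed usable AI throughput, trillions of ops/s
ms_per_frame = total_ops / (assumed_tops * 1e12) * 1e3

print(f"{total_ops / 1e9:.1f} Gops per frame -> {ms_per_frame:.2f} ms at {assumed_tops} TOPS")
# ~33.2 Gops -> ~0.33 ms: a fraction of a 16.7 ms (60 fps) frame budget
# under these assumptions.
```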
 
They need customers. They're a tiny part of the share of cards gamers use. They're comparing their new cards to cards from 2020 in slides, lol.
The AI upscaler may be more demanding, but in no way will it be maxing out a 9070 XT upscaling an image. Hell, the feature will no doubt work on the eventual 9050/9060 with far weaker AI processing. I'm sure it could be adapted to the magnificent 7900 XTX, and that would look good to consumers, knowing that their cards aren't gonna be cut off from future software. And it comes at a bad time, when DLSS4 has only just arrived and works on old crap like the RTX 2060.
Dude, go complain about Nvidia doing that thing for the past 8 years!!!! Where have you been for 8 years?

They need to focus on the 9000 series because they need customers, not kowtow to whiners and support old hardware that can't easily or fully run these new technologies. Why should they waste time and resources making FSR4 work on the 7000 series when they can spend that time optimizing for the new 9000 series and making FSR4 work smoothly on the new GPUs and in more games?
 
Dude, go complain about Nvidia doing that thing for the past 8 years!!!! Where have you been for 8 years?

They need to focus on the 9000 series because they need customers, not kowtow to whiners and support old hardware that can't easily or fully run these new technologies. Why should they waste time and resources making FSR4 work on the 7000 series when they can spend that time optimizing for the new 9000 series and making FSR4 work smoothly on the new GPUs and in more games?
Because it makes them look good and they need that since everyone sees them as inferior to Nvidia? You're making a lot of assumptions there about bringing FSR4 to RDNA3. And what should I complain about to Nvidia? That DLSS has been great for many years and FSR is only getting good now?

Funny you talk about whiners when you seem to be one desperately defending AMD. They're not some tiny company struggling to find engineers to work on software development.
 
Because it makes them look good and they need that since everyone sees them as inferior to Nvidia? You're making a lot of assumptions there about bringing FSR4 to RDNA3. And what should I complain about to Nvidia? That DLSS has been great for many years and FSR is only getting good now?

Funny you talk about whiners when you seem to be one desperately defending AMD. They're not some tiny company struggling to find engineers to work on software development.
Bringing a new technology that is designed to run through matrix AI cores to old hardware that can barely support it is stupid; it likely would have resulted in lower quality for the older gen, while taking away from the focus on supporting FSR4 in as many games as possible and making it run as well as possible on the new hardware.

Plus, this way people have something to want to upgrade to, rather than having it on older hardware. Now they will want to upgrade to the 9000 series partly for FSR4; it's a reason to upgrade, and it would likely have been an inferior version anyway, similar to how Intel's XeSS works on hardware that doesn't have those AI accelerator cores!
 
Bringing a new technology that is designed to run through matrix AI cores to old hardware that can barely support it is stupid; it likely would have resulted in lower quality for the older gen, while taking away from the focus on supporting FSR4 in as many games as possible and making it run as well as possible on the new hardware.

Plus, this way people have something to want to upgrade to, rather than having it on older hardware. Now they will want to upgrade to the 9000 series partly for FSR4; it's a reason to upgrade, and it would likely have been an inferior version anyway, similar to how Intel's XeSS works on hardware that doesn't have those AI accelerator cores!
It's an ML-based upscaler, not magic. Of course they could adapt it. As I said, look at DLSS 4's transformer model.
 
It's an ML-based upscaler, not magic. Of course they could adapt it. As I said, look at DLSS 4's transformer model.
They're not the same thing, though.
 
It runs on the AI cores and most certainly won't be heavily taxing them. 9070 XT has 128 AI cores and I'm sure FSR4 will work on all RDNA4 cards, right down to the weakest ones with like 32.

7900 XTX has 192 AI cores. If it doesn't get FSR4 then honestly, screw AMD.
Keep it in your pants please…
You are missing something fundamental.

The 9070 has dedicated hardware/cores for AI applications and, of course, RT, whereas the 7900 series uses its regular shaders for AI and RT.
If you try to implement it on the 7900 series, it will cripple raster performance.
At best, maybe they can send a lighter version of FSR4 towards the 7900 series, but without the same capabilities, so what's the point of that?
I'm not very optimistic about the if and when.
 
Keep it in your pants please…
You are missing something fundamental.

The 9070 has dedicated hardware/cores for AI applications and, of course, RT, whereas the 7900 series uses its regular shaders for AI and RT.
If you try to implement it on the 7900 series, it will cripple raster performance.
At best, maybe they can send a lighter version of FSR4 towards the 7900 series, but without the same capabilities, so what's the point of that?
I'm not very optimistic about the if and when.
They've supposedly said they'd like to do it. Aren't some of these previous gen cards less than a year old? I'm sure a 24GB XTX card can be put to good use. If not, how long until "planned obsolescence" catches you out?! Maybe we'll all be left holding some sort of bag (quite literally in my case as the bag I bought fairly recently is now two iterations old).

 
They work in a very similar way though.
They do, but RDNA 3's AI cores don't.

RDNA 3's AI cores are a lot slower in several applications, and also lack instructions / formats that RDNA 4 has. This could be a reason why FSR 4 is only supported on RDNA 4.
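On the missing formats point: a lot of the gap typically comes from lower-precision types (FP8/INT8-style) that newer matrix units handle natively, and upscaler weights tolerate that kind of quantization well. A tiny, generic illustration of INT8 weight quantization (nothing FSR-specific, just the general idea):

```python
# Generic symmetric INT8 quantization of a weight tensor -- illustrative only.
# Lower-precision formats let matrix units move and compute more values per
# cycle, which is one reason hardware format support matters for ML upscalers.
import numpy as np

w = np.random.randn(64, 64).astype(np.float32)        # made-up weight matrix
scale = np.abs(w).max() / 127.0                        # one scale for the tensor
w_int8 = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
w_back = w_int8.astype(np.float32) * scale             # dequantized for comparison

err = np.abs(w - w_back).max()
print(f"max quantization error: {err:.4f} (scale {scale:.4f})")
```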
 
They've supposedly said they'd like to do it. Aren't some of these previous gen cards less than a year old? I'm sure a 24GB XTX card can be put to good use. If not, how long until "planned obsolescence" catches you out?! Maybe we'll all be left holding some sort of bag (quite literally in my case as the bag I bought fairly recently is now two iterations old).

It doesn't matter if any previous GPU came out even last month. It's still RDNA3 from Q4 2022, and it lacks the tech. AMD tried for too long to keep things as open as possible and working on as many GPUs as possible, but in the end, today's circumstances require dedicated hardware for certain things.
And if AMD wants some market share going forward, they have no real choice but to go for it.

I don't like it as a 7900XTX user, but it is what it is.
And I understand it. By the time I have to use something like that or I can't play a game, it will be time to change GPUs anyway. I don't keep GPUs for more than 3-4 years.
 
It doesn't matter if any previous GPU came out even last month. It's still RDNA3 from Q4 2022, and it lacks the tech. AMD tried for too long to keep things as open as possible and working on as many GPUs as possible, but in the end, today's circumstances require dedicated hardware for certain things.
And if AMD wants some market share going forward, they have no real choice but to go for it.

I don't like it as a 7900XTX user, but it is what it is.
And I understand it. By the time I have to use something like that or I can't play a game, it will be time to change GPUs anyway. I don't keep GPUs for more than 3-4 years.
Yea, I agree that they did have to make the jump at some point.

I'm personally wondering if the fact that these new cards don't have ROCm support at launch is an indication of them not wanting to step on toes, for the time being at least...
 
What makes you think that?

The only thing certain in life is death.

Unless AMD figures out something extraordinary, they'll always be late... Nvidia has been running their DL/ML solution for DLSS for years already, and DLSS 4 with the new transformer model is much better now. Even though I don't think FSR 4 will be that far off, I doubt it will really be on par with or beat DLSS. Nvidia has too much money for R&D compared to AMD; they could double or triple the number of GPUs they're using if they wanted to. They're just ahead and too focused on the AI sectors, but if AMD came too close, I'm sure Nvidia would release an even better version soon after.

1. If we excluded every game sponsored by anyone, then we would not be benchmarking at all.
2. Just because a game is sponsored by X company, it doesn't mean that nobody ever plays it / nobody is interested in how it runs on various different hardware.

Sure, but we all know that when games are optimized for Nvidia, and mostly the RTX-heavy ones, they usually run poorly on AMD, mostly with RT/PT... even though RDNA 4 might be the beginning of a new era for AMD, and UDNA should continue that path. We have to hope that RDNA 4 & UDNA are great and sell well; otherwise Nvidia will hold a near-monopoly for many years to come.
 