
AMD is Taking Time with Radeon RX 9000 to Optimize Software and FSR 4

Let's not get into the whole upscaling/framegen debate and what is or isn't fake/bad/low-quality/etc. here, please? All that's relevant is that the market has spoken and said that it wants upscaling and framegen; whether you personally like or dislike it, and why, is irrelevant because it's here to stay.
Sure, and I don't intend to, but it's important to highlight why people do prefer it over other ways to improve FPS. That's why the market speaks like it does. It ain't just marketing.
 
I have the weird suspicion that we might need another PowerVR moment where the industry finds another “work smarter, not harder” way, especially with RT.
I think we hit that point already 2-3 generations ago. AMD even tried with chiplets, only that it didn't work as well as intended.
 
I have the weird suspicion that we might need another PowerVR moment where the industry finds another “work smarter, not harder” way, especially with RT.
This was obvious to me from the outset. We've already seen tweaks to reduce the amount of actual calculation being done. We're moving further and further away from truly accurate RT, relying instead on sparse, singular rays to 'determine' what else needs to happen. It all looks brutally inefficient nonetheless, with PT being the king of the castle of waste. As if Nvidia doesn't know this approach is never going to be here to stay, lol.
 
"old games" includes titles like CP2077, God of War Ragnarok, Horizon Forbidden West, Elden Ring (with RT), Ratchet and Clank, Jedi Fallen Order - these are all old titles (up to 6 years old) that will cause a 4070 to stumble at 4K, and fail to qualify as "high-refresh" in 1440p.
I'd subjectively say that's a problem with the expectations. The 4070 isn't a 4K card.

We have very little info on the 9060 models yet, but presumably they'll outsell the 9070 models and need FSR4 even more.
I wouldn't make that assumption just yet. The 7600 didn't outsell the 7800 XT, as far as I know.
 
It seems like Nvidia doesn't want to take a hit on profit margins to allocate larger dies, or doesn't want to spend the R&D on creating a chiplet architecture for their consumer GPUs. I said it in another thread: if AMD can make a chiplet design for consumers, then Nvidia definitely can. It's largely done for cost reasons, but it also allows more efficient use of die space. If Nvidia wants to keep their profit margins astronomically high, they need to implement a chiplet architecture to increase raster performance instead of resorting to graphics trickery with fake frames.
Agreed.
We have great CPUs now thanks to chiplets, and I'll be honest: considering the price and performance of my 7900 XTX, I think it's doable on GPUs.
"old games" includes titles like CP2077, God of War Ragnarok, Horizon Forbidden West, Elden Ring, Ratchet and Clank, Jedi Fallen Order - these are all old titles (up to 6 years old) that will cause a 4070 to stumble at 4K, and fail to qualify as "high-refresh" in 1440p.
Man, you just reminded me of a meme that mentions 20-year-old consoles. In my mind I'm thinking Sega Genesis, but in reality we are looking at PS3s.

Edit: found it!

[Image: genesis.jpg]
 
This was obvious to me from the outset. We've already seen tweaks to reduce the amount of actual calculation being done. We're moving further and further away from truly accurate RT, relying instead on sparse, singular rays to 'determine' what else needs to happen. It all looks brutally inefficient nonetheless, with PT being the king of the castle of waste.
If we assume RT/PT is the target, then upscalers are absolutely needed (and probably so is FG). Even if we get 100% gen-on-gen improvements, that's not enough to keep up with RT demands. I'm not sure people understand how insanely taxing RT is.

If RT/PT isn't the target, then what is? Graphics have pretty much stagnated the traditional way; RDR2 back in 2018 already looked good enough, so why would anyone really care about GPUs anymore?
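To put rough numbers on just how taxing: a back-of-envelope sketch (all figures here are illustrative assumptions, not measured specs for any card):

```python
# Back-of-envelope: ray budget for path tracing at 4K60.
# All numbers are illustrative assumptions, not measured specs.
pixels_4k = 3840 * 2160      # ~8.3 million pixels
bounces   = 4                # assumed path length per sample
fps       = 60

for spp in (1, 2, 8, 32):    # samples per pixel; offline film uses hundreds
    rays = pixels_4k * spp * bounces * fps
    print(f"{spp:>2} spp -> {rays / 1e9:5.1f} billion rays/s")

# Even 1 spp needs ~2 billion rays/s; every doubling of sample count
# eats a whole "100% gen-on-gen" hardware improvement by itself.
```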
 
I have the weird suspicion that we might need another PowerVR moment where the industry finds another “work smarter, not harder” way, especially with RT.
That is what we could argue Nvidia is doing now: working smarter rather than harder. Upscale with AI with minimal loss of detail, generate a frame or two instead of rendering them, and use ReSTIR for GI and DI.
People like to say that path tracing performance is shit without even considering that real-time path tracing was a pipe dream 4-5 years ago; it took minutes to render one frame. And ReSTIR is also working smarter, because you don't need to take every sample into account. It also completely removes any trickery when it comes to lighting the game: you place whatever light you like, wherever you want, and you have GI and DI in real time.
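For anyone curious what "not taking every sample into account" looks like in practice, here's a minimal, purely illustrative sketch of the weighted reservoir resampling at the core of ReSTIR, in Python for readability. The light list and p_hat target function are made-up placeholders; real ReSTIR runs on the GPU and adds temporal and spatial reuse on top of this:

```python
import random

class Reservoir:
    """Keeps exactly one sample out of a weighted stream of candidates."""
    def __init__(self):
        self.sample = None
        self.w_sum = 0.0   # running sum of resampling weights
        self.m = 0         # candidates seen so far

    def update(self, candidate, weight):
        # Keep the newcomer with probability weight / w_sum, which leaves
        # each candidate retained in proportion to its weight.
        self.w_sum += weight
        self.m += 1
        if self.w_sum > 0 and random.random() * self.w_sum < weight:
            self.sample = candidate

def pick_light(lights, p_hat, num_candidates=32):
    """Choose 1 light by streaming a handful of uniform candidates,
    instead of evaluating every light for every pixel."""
    r = Reservoir()
    source_pdf = 1.0 / len(lights)
    for _ in range(num_candidates):
        light = random.choice(lights)
        r.update(light, p_hat(light) / source_pdf)   # importance weight
    if r.sample is None:
        return None, 0.0
    # RIS weight that keeps the estimator unbiased despite resampling.
    return r.sample, r.w_sum / (r.m * p_hat(r.sample))

# Hypothetical usage: 1000 lights, p_hat = raw intensity at the shading point.
lights = [{"pos": (i, 5.0, 0.0), "intensity": 1.0 + i % 7} for i in range(1000)]
light, weight = pick_light(lights, lambda L: L["intensity"])
print(light, weight)
```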
 
Why are people in here so salty because other people aren't buying the brand they want them to buy? For the last few weeks I've been reading the same stuff over and over and over again: sheep buy Nvidia because stores tell them not to buy AMD, and whatever other excuses you guys are coming up with.

Let's for once have a decent AMD card, and then we can look at the sales. I'm pretty confident they will be good.
Because people are tired of the same people always going "but Nvidia" in every AMD thread, or the same stuff over again saying AMD is screwed. And yes, people do buy Nvidia because a store tells them to, or friends tell them to buy Nvidia because they heard the outdated claim about the drivers being bad.

If the cards are good, we can wait and see. I don't see the need for a bunch of threads spreading FUD about the launch when AMD never said it would be right after CES.
Last I heard, Cyberpunk got a long-overdue FSR update, but it's still not great, and people on TPU wondered what had changed at all.


Read this just now... but that's very simple; Nvidia shows us how that works, right? They introduce features that make games run like you bought a low-end card (create the problem), and then they introduce features that fix that performance while still allowing you to use the performance-hogging features, so you feel like you didn't actually buy something underspecced for the features on offer.

And the kicker is, Nvidia isn't wrong, because they pull this forward and innovate on it. Because they lead, they get to dictate how the paradigm works. AMD never takes that risk/chance, and a big reason they don't is that they feel they can't, i.e. a lack of relevant competence in the company.
I'm not surprised, because Cyberpunk is essentially an Nvidia tech demo game. CDPR, like most AAA devs, put little effort into actually implementing good FSR, because it's easier to add DLSS when Nvidia hands companies piles of money to implement the feature first.

As for marketing features, I agree AMD sucks horribly at it. They really could've made sure features worked right before launch and put some ambition behind making sure consumers know about them. I also think AMD should be spreading more awareness of their features, which aren't locked down to their brand.
 
Because people are tired of the same people always going "but Nvidia" in every AMD thread, or the same stuff over again saying AMD is screwed. And yes, people do buy Nvidia because a store tells them to, or friends tell them to buy Nvidia because they heard the outdated claim about the drivers being bad.
But that's not what's happening. It's you and a bunch of other people beating the ngreedia drum in every damn thread, man.
 
Last I heard, Cyberpunk got a long-overdue FSR update, but it's still not great, and people on TPU wondered what had changed at all.
Just frame-gen really. FSR3 with framegen was still pretty smeary, but so is DLSS in CP2077, so it's hardly an AMD-only problem.
 
Because people are tired of the same people always going "but Nvidia" in every AMD thread, or the same stuff over again saying AMD is screwed. And yes, people do buy Nvidia because a store tells them to, or friends tell them to buy Nvidia because they heard the outdated claim about the drivers being bad.

If the cards are good, we can wait and see. I don't see the need for a bunch of threads spreading FUD about the launch when AMD never said it would be right after CES.
My favorite thing to see repeated ad nauseam is "well well well, people just buy Nvidia because they are dumb poopy heads and can't see AMD is clearly better, it's just mindshare bro, AMD just needs marketing bro".

Like, the concept that people want consistent support, better RT, and better upscaling, and AMD is generally worse at these things, so they buy Nvidia, is just a totally lost concept; no, the consumer just buys what they are told, they have no free will!
I'm not surprised, because Cyberpunk is essentially an Nvidia tech demo game. CDPR, like most AAA devs, put little effort into actually implementing good FSR, because it's easier to add DLSS when Nvidia hands companies piles of money to implement the feature first.
As for marketing features, I agree AMD sucks horribly at it. They really could've made sure features worked right before launch and put some ambition behind making sure consumers know about them. I also think AMD should be spreading more awareness of their features, which aren't locked down to their brand.
So....... game devs use DLSS and ignore FSR because nGREEDia pays them, and at the same time AMD's features suck? Surely their focus on DLSS has NOTHING to do with AMD's features sucking. Hmmm......
 
My favorite thing to see repeated ad nauseam is "well well well, people just buy Nvidia because they are dumb poopy heads and can't see AMD is clearly better, it's just mindshare bro, AMD just needs marketing bro".

Like, the concept that people want consistent support, better RT, and better upscaling, so they buy Nvidia, is just a totally lost concept; no, the consumer just buys what they are told, they have no free will!


So....... game devs use DLSS and ignore FSR because nGREEDia pays them, and at the same time AMD's features suck? Surely their focus on DLSS has NOTHING to do with AMD's features sucking. Hmmm......
Just FYI, since I'm using FSR extensively on my AMD laptop: FSR is present in 95% of Nvidia-sponsored games. There are Nvidia-sponsored games that have FSR and not DLSS, lol. Now, are you ready to know what % of AMD-sponsored games have DLSS? It will be a shock...
 
That is what we could argue Nvidia is doing now: working smarter rather than harder. Upscale with AI with minimal loss of detail, generate a frame or two instead of rendering them, and use ReSTIR for GI and DI.
People like to say that path tracing performance is shit without even considering that real-time path tracing was a pipe dream 4-5 years ago; it took minutes to render one frame. And ReSTIR is also working smarter, because you don't need to take every sample into account. It also completely removes any trickery when it comes to lighting the game: you place whatever light you like, wherever you want, and you have GI and DI in real time.
Maybe, but at the same time, think about how the current options remove data from the visible field of view (lowering the res, then upscaling) and, worse, insert data (fake frames) that was never there or placed by the game developer.

It's something that can and will be abused. There are rumors that Ngreedia wants to insert 10 or more frames per each real one, and since the market will be OK with that, AMD will have to do the same.

So what will developers do then? It's crazy to think about what is coming.
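Some rough arithmetic on how the two effects stack (the per-axis scales are FSR's published ratios; the 4x and 10x frame counts are the rumored scenarios above, not shipping features):

```python
# Fraction of displayed pixels that were actually rendered, when
# upscaling and frame generation stack. Per-axis scales are FSR's
# published ratios; the 4x/10x frame counts are rumors, not products.
upscale_modes = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7, "Performance": 1 / 2.0}

for mode, axis_scale in upscale_modes.items():
    rendered = axis_scale ** 2        # both axes are downscaled
    for frames_out in (1, 2, 4, 10):  # displayed frames per rendered frame
        print(f"{mode:>11} @ {frames_out:>2}x frames: "
              f"{rendered / frames_out:6.1%} of displayed pixels rendered")
```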
 
If we assume RT/PT is the target, then upscalers are absolutely needed (and probably so is FG). Even if we get 100% gen-on-gen improvements, that's not enough to keep up with RT demands. I'm not sure people understand how insanely taxing RT is.

If RT/PT isn't the target, then what is? Graphics have pretty much stagnated the traditional way; RDR2 back in 2018 already looked good enough, so why would anyone really care about GPUs anymore?
I've always been convinced the RT push is a gamble. There is no conclusion yet on whether it'll be here to stay, especially in its current form. PT is also an innovation. Is it one in the right direction? We don't know.
 
I've always been convinced the RT push is a gamble. There is no conclusion yet on whether it'll be here to stay, especially in its current form. PT is also an innovation. Is it one in the right direction? We don't know.
Agreed.

I honestly can't justify the performance hit for what is given in return.

Granted, a few games do look better, but very few in my book.
 
It isn't, but it can still play a good chunk of games at 4K.
Especially with DLSS Q. The 4070 is a 1440p card, but the most common TV resolution has been 4K for the better part of a decade now, and consoles have been targeting 4K for over 8 years with the XB1S / XB1X / PS4 Pro.

I also fall into the "would rather not use DLSS unless I have to" camp, but sometimes the display resolution outpaces the hardware power available, and sometimes the game isn't optimised very well, so you have to do something to get the framerate up to acceptable speeds.
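For context, assuming DLSS's commonly documented scale factors (they can vary per game), the internal render resolution behind a 4K output looks like this:

```python
# Internal render resolution behind a 3840x2160 output, assuming
# DLSS's commonly documented scale factors (may vary per title).
out_w, out_h = 3840, 2160
scales = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
for mode, s in scales.items():
    print(f"DLSS {mode}: ~{round(out_w * s)}x{round(out_h * s)} internal")
# Quality at 4K is 2560x1440 internally, i.e. exactly the resolution
# class a "1440p card" like the 4070 is comfortable with.
```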
 
I don't disagree, but selling one's sympathy for DLSS as a "must-have feature" rather than a personal opinion is a bit daft, especially when 45% of people in a front-page poll say they don't use any upscaling.
And 80% of people with RTX GPUs use an upscaler.
Also, 40% of those 45% are AMD users; FSR is so bad it's better not to use it.
 
Just frame-gen really. FSR3 with framegen was still pretty smeary, but so is DLSS in CP2077, so it's hardly an AMD-only problem.
I haven't bought that game yet, but I'll admit that in YouTube videos it does always look "smeary", as you said, and "grainy". I recall other games where the devs put in extra effort and both DLSS and FSR looked really good.
 
I also fall into the "would rather not use DLSS unless I have to" camp, but sometimes the display resolution outpaces the hardware power available, and sometimes the game isn't optimised very well, so you have to do something to get the framerate up to acceptable speeds.
Yessir. Same boat with the Ti :)
 
I've always been convinced the RT push is a gamble. There is no conclusion yet on whether it'll be here to stay, especially in its current form.
RT is here to stay. It will only continue to be developed. AMD jumped on board this time because they support it, unlike last gen, where they had to pretend to support it.
 
My favorite thing to see repeated ad nauseam is "well well well, people just buy Nvidia because they are dumb poopy heads and can't see AMD is clearly better, it's just mindshare bro, AMD just needs marketing bro".

Like, the concept that people want consistent support, better RT, and better upscaling, and AMD is generally worse at these things, so they buy Nvidia, is just a totally lost concept; no, the consumer just buys what they are told, they have no free will!
And my favorite thing to see repeated is people ignoring reality while repeating the same "but Nvidia" marketing points reviewers always highlight.
So....... game devs use DLSS and ignore FSR because nGREEDia pays them
Exactly, and everyone whined when FSR got added to a game first, but there was nothing when Indiana Jones only had DLSS.
 
Exactly, and everyone whined when FSR got added to a game first, but there was nothing when Indiana Jones only had DLSS.
That's easily explained. When FSR is missing, nobody complains, since, as you and your comrades keep telling us, upscalers are crap and you are not using them. So who's gonna complain, exactly, and why? Do you want me to complain that you can't use a feature you weren't going to use anyway because you think it's crap?
 
I have the weird suspicion that we might need another PowerVR moment where the industry finds another “work smarter, not harder” way, especially with RT.
I've always been convinced the RT push is a gamble. There is no conclusion yet on whether it'll be here to stay, especially in its current form. PT is also an innovation. Is it one in the right direction? We don't know.
There's already something being worked on: neural rendering. Not as a post-render filter, but to render the game itself. Some of it looks dodgy (neural faces look like they don't belong in the game; something just looks off), but other parts seem promising: neural texture compression, using machine learning to enable more complex shaders at a lower cost, using machine learning to lower the computational cost of path tracing... Because Nvidia was the first to heavily market it, people are already doubting the tech, but they aren't the only ones who've been researching it.

Intel Is Working on Real-Time Neural Rendering
Neural Supersampling and Denoising for Real-time Path Tracing - AMD GPUOpen
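For a sense of what neural texture compression even means, here's a toy sketch: a low-resolution latent grid plus a tiny shared MLP decoder standing in for full-resolution texels. Every shape and weight below is invented for illustration; the actual methods in the links above differ substantially:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Compressed" texture: an 8-channel latent grid at reduced resolution...
latent = rng.standard_normal((64, 64, 8)).astype(np.float32)
# ...plus one tiny decoder MLP shared by every texel (untrained here).
W1 = rng.standard_normal((8, 16)).astype(np.float32)
b1 = np.zeros(16, dtype=np.float32)
W2 = rng.standard_normal((16, 3)).astype(np.float32)
b2 = np.zeros(3, dtype=np.float32)

def decode_texel(u, v):
    """Fetch the nearest latent vector and decode it to RGB on the fly."""
    feat = latent[int(v * 63), int(u * 63)]
    h = np.maximum(feat @ W1 + b1, 0.0)            # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid -> RGB in [0, 1]

print(decode_texel(0.5, 0.25))
# Storage in this toy setup: 64*64*8 latents + ~200 MLP weights, versus
# 256*256*3 raw texels: roughly 6x smaller, decoded per sample at runtime.
```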
 