
AMD Debuts Radeon RX 9070 XT and RX 9070 Powered by RDNA 4, and FSR 4

Now we know why AMD bailed on presenting RDNA 4 this afternoon. It’s bottom market trash compared to what Nvidia is showing now.

This Nvidia announcement isn’t a stomping, it’s a shredding. A year from now AMD will have 5% market share from their loyalists and that’s it.

Radeon is dead. Long live Radeon.
Wait for it. Frame-generation-enabled fake data doesn't mean anything.
 
Now we know why AMD bailed on presenting RDNA 4 this afternoon. It’s bottom market trash compared to what Nvidia is showing now.

Radeon is dead. Long live Radeon.
nah, can't be THAT bad.....
what I find funny though is the hardware-locking of FSR4. This has to come as a cognitive shock to people who thought FSR2/3 were open because AMD wanted older cards to have equal features.
 
No, it's that bad. If you were paying attention to all the stuff that nVidia was NOT talking about during the presentation, all the wild features they've put into their cards means AMD is going to be playing catch up for a LOOOOONG time. The neural rendering thing alone is basically a long list of cheats at every turn. DLSS4 is SHARP and Reflex 2 is basically dropping a mini nuke on everything Radeon. Having such insanely low input latency isn't something that should be walled off to either company and it's been an issue in VR for a while but <1ms desktop is actually insane. Most of the other stuff hasn't even been covered. It was all AI this and that x200. Absolute massacre. It's no wonder AMD didn't even bother but we need something. Where's the effort? I really wanted to give AMD a chance but they never miss an opportunity to miss the opportunity. Meme reflects reality.
 
No, it's that bad. If you were paying attention to all the stuff that nVidia was NOT talking about during the presentation, all the wild features they've put into their cards means AMD is going to be playing catch up for a LOOOOONG time. The neural rendering thing alone is basically a long list of cheats at every turn. DLSS4 is SHARP and Reflex 2 is basically dropping a mini nuke on everything Radeon. Having such insanely low input latency isn't something that should be walled off to either company and it's been an issue in VR for a while but <1ms desktop is actually insane. Most of the other stuff hasn't even been covered. It was all AI this and that x200. Absolute massacre. It's no wonder AMD didn't even bother but we need something. Where's the effort? I really wanted to give AMD a chance but they never miss an opportunity to miss the opportunity. Meme reflects reality.
All gimmicks, the whole lot, with not much performance data to go by. AMD has a chance to respond with a better price-to-performance ratio. I hope they won't mess it up.
 
All gimmicks, the whole lot, with not much performance data to go by. AMD has a chance to respond with a better price-to-performance ratio. I hope they won't mess it up.
not revealing performance numbers doesn't bode well. if they really reached the targeted "4080 performance", they'd say that.
if this, by chance, gets beat by $550 5070, it's gonna be really disappointing, at least for mid/upper-mid range buyers like me.
that said, nvidia never showed 5070-5080 numbers, and 5090 is the only one shown because it has a hefty core count upgrade over 4090.
imo it looks like 5070 will be 4070 + 25%, nothing earth shattering, 4070TiS performance at $250 less.
 
Word has leaked - AMD pulled their RDNA 4 and Z2 Extreme presentations at the last minute due to what Nvidia was going to show. That's why the tech tubers had their videos all produced and posted at the start of the keynote.

The information comes from developers that were supposed to show their wares. If you know the forums where the AA and AAA devs hang out, you can see their posts saying they had already arrived in Vegas before they were told of the change of plans.

Radeon division is in crisis mode. Don’t be surprised if RDNA 4 gets delayed to mid-year.
 
At present I'm confused. They knew Nvidia had their event today. They could have withheld pricing information and launched later at $450 to $500, depending on the actual performance of the RTX 5070. It doesn't look like the RTX 5070 is faster than expected, just more fake frames.
 
not revealing performance numbers doesn't bode well. if they really reached the targeted "4080 performance", they'd say that.
if this, by chance, gets beat by $550 5070, it's gonna be really disappointing, at least for mid/upper-mid range buyers like me.
that said, nvidia never showed 5070-5080 numbers, and 5090 is the only one shown because it has a hefty core count upgrade over 4090.
imo it looks like 5070 will be 4070 + 25%, nothing earth shattering, 4070TiS performance at $250 less.
Well, if the 5070 performs at 4080 level for $550, and the 9070 XT performs at 7900 XT level, then it'll have to be priced at around $450 max. It'll be an interesting match, that's for sure.
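Just to put rough numbers on that logic - a minimal sketch with placeholder performance figures (indexed to RTX 4070 = 100; nothing here is confirmed data, only the $549 MSRP is announced):

```python
# Rough price check under the assumptions in the post above.
# All performance numbers are placeholders, not measured figures.
perf_5070   = 145   # assumed: roughly RTX 4080 level
perf_9070xt = 130   # assumed: roughly RX 7900 XT level
price_5070  = 549   # announced US MSRP

# Price at which the 9070 XT would only MATCH the 5070's price-to-performance;
# to actually be the better deal it has to come in below this.
breakeven = perf_9070xt * price_5070 / perf_5070
print(f"${breakeven:.0f}")  # ~$492 with these placeholders
```

With those placeholders the break-even sits just under $500, so undercutting it by a meaningful margin is where a ~$450 ceiling comes from.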
 
Don't spin this around, if it's anti-consumer when Nvidia does it, warranting years of rhetoric and scorn, it's anti-consumer when AMD does it. Where is all the outrage of AMD going back on their word and "betraying the trust of the loyal Radeon customer"? It's simple, really, it's pure hypocrisy. You people never once cared for it being closed source software, you only ever cared that it was better and you couldn't use it.
Is that surprising? Didn't we know that already? Wait - there is worse. If FSR 4 is actually GOOD suddenly upscaling will become a necessity and TAA a smearing mess. Wait for it, wait for it :D

Hey look, it's another one. RX 6800 XT/6900 XT/7900 XT/X cards were impossible to get for months after launch. Surely that wasn't because they didn't make any, right?
Are you in Europe? Most places in Europe (UK excluded, as it's not in the EU) didn't have listings of RDNA2 cards for months, let alone stock :D
 
Is that surprising? Didn't we know that already? Wait - there is worse. If FSR 4 is actually GOOD suddenly upscaling will become a necessity and TAA a smearing mess. Wait for it, wait for it :D
It won't become a necessity for me, that's for sure. I just hope raw performance on Blackwell and/or RDNA 4 will be decent relative to price, but I don't have high hopes considering that AMD gave us nothing while Nvidia only gave us faked data with different versions of FG used on the 40 and 50 series.
 
I think you’re being unreasonable here. Unless you know what it costs AMD to design, develop, build, and support these cards, you have no idea at what price “greed” kicks in.
Sure, I don't know what it costs them, but you know who does? AMD!! Which means they could have easily announced a price that would make them profitable. But they didn't, because they just wanted the highest margins possible, which is basically Nvidia minus $9.99.

And since the 5070 is at $549 now, I wasn't far off - the 9070 XT should be $399-449 at the most, and that's assuming FSR 4 is good and RT is up there.
 
Sure, I don't know what it costs them, but you know who does? AMD!! Which means they could have easily announced a price that would make them profitable. But they didn't, because they just wanted the highest margins possible, which is basically Nvidia minus $9.99.
How do you know what low price still makes them profitable? How do you know they're not currently selling at that price?

And since the 5070 is at $549 now, I wasn't far off - the 9070 XT should be $399-449 at the most, and that's assuming FSR 4 is good and RT is up there.
We know nothing about the 5070's performance as of yet. We only know FG vs FG 4x data which is nothing to go by.
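The FG vs FG 4x point in a nutshell - a minimal sketch with invented frame rates, just to show why those charts say nothing about raw performance:

```python
# Why "FG 4x" bar charts hide raw performance (frame rates are invented).
def displayed_fps(rendered_fps: float, frames_per_rendered: int) -> float:
    """Approximate on-screen frame rate when each rendered frame yields N output frames."""
    return rendered_fps * frames_per_rendered

raw = 60  # hypothetical rendered frame rate, identical on both cards
print(displayed_fps(raw, 2))  # 120 "fps" with 2x frame generation
print(displayed_fps(raw, 4))  # 240 "fps" with 4x frame generation
# Same underlying performance, yet a 2x gap on the marketing slide.
```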
 
How do you know what price makes them profitable? How do you know they're not currently selling at that price?
As I've said, I don't, AMD does. They are not currently selling the 9070xt at that price, since they haven't announced any prices for it.

We know nothing about the 5070's performance as of yet. We only know FG vs FG 4x data which is nothing to go by.
Well we know it's faster than the 4070, and we know that the 9070xt isn't faster than the 7900xtx.
 
As I've said, I don't, AMD does. They are not currently selling the 9070xt at that price, since they haven't announced any prices for it.
Ah, I see what you mean. But then, why would you want to sell a card at the lowest possible profit level if you could sell it for a bit more?

Nvidia asking what they're asking is fine, but AMD only undercutting it by a small bit is wrong? What's this double standard?

Well we know it's faster than the 4070, and we know that the 9070xt isn't faster than the 7900xtx.
The 7900 XTX is 62% faster than the 4070 (according to the TPU database). That's a lot of room for both the 5070 and 9070 XT. We don't know where they land relative to each other.
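Roughly, on a 4070 = 100 scale (only the 62% figure comes from the TPU quote above; the 5070 entry just reuses the "4070 + 25%" guess from earlier in the thread):

```python
# Where the cards sit on a relative-performance scale, RTX 4070 = 100.
# Only the 7900 XTX figure is from the TPU database quote; the rest is guesswork.
rel_perf = {
    "RTX 4070":    100,
    "RTX 5070":    125,   # assumed: the "4070 + 25%" guess from earlier
    "RX 7900 XTX": 162,   # ~62% faster than the 4070
}
headroom = rel_perf["RX 7900 XTX"] - rel_perf["RTX 5070"]
print(f"Room left under the 7900 XTX: {headroom} points")  # 37 with these guesses
```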
 
Ah, I see what you mean. But then, why would you want to sell a card at the lowest possible profit level if you could sell it for a bit more?
Because you (presumably of course) would want to get marketshare?

Nvidia asking what they're asking is fine, but AMD only undercutting it by a small bit is wrong? What's this double standard?
Doesn't the same apply in reverse? People are calling nvidia ngreedia because they want to make more, but when amd does it, it's fine? Btw, I have no issue with how much money amd wants to charge, I just don't like them hiding it / reacting to nvidia's prices. Just grow a pair and ask what you think your card is worth.

The 7900 XTX is 62% faster than the 4070 (according to the TPU database). That's a lot of room for both the 5070 and 9070 XT. We don't know where they land relative to each other.
True, I thought they were closer together. Still, I expect the two to land within 15-20% of each other, but time will tell.
 
Because you (presumably of course) would want to get marketshare?
Why is everyone obsessed with marketshare? A company's goal is to make profit.

Doesn't the same apply in reverse? People are calling nvidia ngreedia because they want to make more, but when amd does it, it's fine? Btw, I have no issue with how much money amd wants to charge, I just don't like them hiding it / reacting to nvidia's prices. Just grow a pair and ask what you think your card is worth.
I don't care what people call who. I think 50 series prices aren't bad at all, provided there's a half-decent performance uplift, too... which we didn't see because the presentation was hidden behind frame-generated fake data instead of giving us raw performance numbers. Nvidia is hiding just as much as AMD is (even though Nvidia maintains the illusion that they gave us something when they didn't). Neither is good, imo.

True, I thought they were closer together. Still, I expect the two to land within 15-20% of each other, but time will tell.
Yep, we'll see. Personally, I don't trust the fake data Nvidia gave us in the presentation, so I go by the assumption that we don't have any info on either card. We'll see it all in the reviews.
 
Why is everyone obsessed with marketshare? A company's goal is to make profit.


I don't care what people call who. I think 50 series prices aren't bad at all, provided there's a half-decent performance uplift, too... which we didn't see because the presentation was hidden behind frame-generated fake data instead of giving us raw performance numbers. Nvidia is hiding just as much as AMD is (even though Nvidia maintains the illusion that they gave us something when they didn't). Neither is good, imo.


Yep, we'll see. Personally, I don't trust the fake data Nvidia gave us in the presentation, so I go by the assumption that we don't have any info on either card. We'll see it all in the reviews.
Well marketshare does 2 things. Reduces the cost of r&d per gpu, and increases software penetration. See fsr for example, regardless of how good or bad it is, it is widely adopted because it works on nvidia gpus as well. If it only worked on amd, no developer in their right mind would put it in their games instead of dlss.

If fsr4 indeed only works on amd's cards, they need a big marketshare to drive it into games.
 
Well marketshare does 2 things. Reduces the cost of r&d per gpu,
Huh? Market share is the number of products sold relative to other members of the market. How does a relative unit number reduce any cost?

and increases software penetration. See fsr for example, regardless of how good or bad it is, it is widely adopted because it works on nvidia gpus as well. If it only worked on amd, no developer in their right mind would put it in their games instead of dlss.

If fsr4 indeed only works on amd's cards, they need a big marketshare to drive it into games.
That I agree with. Personally, I don't think making FSR 4 AMD exclusive is / would be a good idea.
 
Who did their keynote first, Nvidia or AMD? I haven't seen any of CES yet.
 
Huh? Market share is the number of products sold relative to other members of the market. How does a relative unit number reduce any cost?
Oh come on, higher marketshare would mean higher number of cards sold.
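Put differently, the development cost of a generation is mostly fixed, so each extra card sold spreads it thinner - a toy example with invented numbers:

```python
# Fixed R&D cost spread over unit volume (every figure here is invented).
rnd_cost = 1_000_000_000  # hypothetical cost to develop the generation, in dollars

for units in (1_000_000, 5_000_000, 20_000_000):
    print(f"{units:>10,} cards sold -> ${rnd_cost / units:,.0f} of R&D per card")
# 1,000,000 -> $1,000 ... 20,000,000 -> $50
```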
 
Who did their keynote first, Nvidia or AMD? I haven't seen any of CES yet.
AMD did. They masterfully avoided talking about RDNA 4. Then Nvidia followed, giving us some great-looking prices (not much higher than Ada except for the 5090), only to follow it up with frame generation 4x (it makes 4x the frames on Blackwell) and false performance data with FG 4x enabled on Blackwell, but only standard FG enabled on Ada cards. So real performance remains a complete mystery.

This whole CES show was a big nothing burger GPU-wise.

Oh come on, higher marketshare would mean higher number of cards sold.
Not necessarily. If AMD sells 2x as many cards as they did last year, and Nvidia sells 4x as many, then AMD's market share is still in decline. Or if AMD sells only half as many while Nvidia sells only 1/3 as many, then AMD's marketshare is trending up while still operating at a loss. You also didn't consider profit margins in your equation.
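In numbers (unit counts invented, assuming an equal starting point last year just for illustration):

```python
# Market share = own units / total units, so share and unit growth can diverge.
def share(own: float, rival: float) -> float:
    return own / (own + rival)

baseline = share(1, 1)               # assume both sold the same amount last year
print(f"{share(2, 4):.0%} vs {baseline:.0%}")      # 33% vs 50%: units doubled, share fell
print(f"{share(0.5, 1/3):.0%} vs {baseline:.0%}")  # 60% vs 50%: units halved, share rose
```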
 
Not necessarily. If AMD sells 2x as many cards as they did last year, and Nvidia sells 4x as many, then AMD's market share is still in decline. Or if AMD sells only half as many while Nvidia sells only 1/3 as many, then AMD's marketshare is trending up while still operating at a loss. You also didn't consider profit margins in your equation.
But the whole market won't grow by 10x in a year... You are just grasping at straws now, come on man. Plus it doesn't even matter: if amd sells twice as many cards as last year and their marketshare still drops to 3%, their software penetration would still take a nosedive.
 
But the whole market won't grow by 10x in a year...
It was an example to show that market share isn't the be-all-end-all of business. As a company, you don't need to be a dominant force in a market. You just need to be profitable.

Plus it doesn't even matter: if amd sells twice as many cards as last year and their marketshare still drops to 3%, their software penetration would still take a nosedive.
That much is true, I give you that. Like I said, that's why I think making FSR 4 dependent on AMD hardware is a bad idea.

Besides, when DLSS came out, I was all against the idea of hardware-dependent software, and was a supporter of open standards. I still am.
 
It was an example to show that market share isn't the be-all-end-all of business. As a company, you don't need to be a dominant force in a market. You just need to be profitable.


That much is true, I give you that. Like I said, that's why I think making FSR 4 dependent on AMD hardware is a bad idea.

Besides, when DLSS came out, I was all against the idea of hardware-dependent software, and was a supporter of open standards. I still am.
In a competitive market, hardware-dependent software is fine; with marketshare at 30-70, 40-60, 50-50, etc., both brands would get coverage and support. And in such a competitive market, open solutions don't make sense - why would amd waste money developing a good solution only for nvidia users to benefit from it? That's never going to happen.
 