
AMD Radeon RX 9070 Series Technical Deep Dive

Everyone complaining about GPU prices, complaining that Radeon, the "off-brand" as they call it, only undercuts Nvidia by $150 for a comparable GPU: if you have an Nvidia card in your system, you only have yourself to blame. People who a few years ago considered any GPU over $400 to be "excessive" will now line up to pay $750 for a third-rate GPU that gives the same performance as a $600 card. Do you want AMD to sell the cards at cost?

Of course, if it's a 4090/5090 you're after, you use CUDA, or some other reason prevents you from going AMD, then by all means choose Nvidia. Radeon has been doing very poorly. RDNA2 was a great generation with great potential for the future, and they dropped the ball massively with their silly ideas for RDNA3.

I will also point out, as others have stated, that the 9070 is a slap in the face and a disgrace of a SKU. How long will AMD keep doing this nonsense where the harvested chips only exist to upsell the full die? I guess their yields must really be spectacular. This is almost as bad as the 5080 turd scenario, where the first-tier card is over 50% faster than the second tier at 4K to force everyone to buy a $2,000 GPU. That is really anti-consumer. Remember when the second-tier chips actually saved you a nice chunk of change for only a 5% difference, in every tier and every generation?
 
I don't get you.

The lowest price for a 7900 XT is $1,168.90 on Newegg. It's nowhere near the 9070 XT's announced performance either, especially once you factor in RT, and the 9070 XT is announced at only $600. What's not to like?
Learn to camelcamelcamel:

[camelcamelcamel price-history screenshot]
 
Learn to camelcamelcamel:

[camelcamelcamel price-history screenshot]

Doesn't really change his point much. At $599, lower than the 7900 XT's ~$700 average selling price over its entire retail lifetime, you'd be getting a ~15% faster card, improved ML/AI upscaling, new connectivity, and significantly better RT performance from the hardware; overall a better product for less (minus the 20 GB VRAM buffer).

Rather silly if someone were to upgrade from a 7900 XT, sure, but this will crush the 5070 if sold anywhere near MSRP. Overall it's a solid value in today's shitty GPU market.
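For what it's worth, here's a quick perf-per-dollar check on that claim in Python; all inputs are the rough figures quoted in this thread (~$700 ASP, ~15% faster, $599 MSRP), so treat the output as ballpark only:

```python
# Rough value check using the numbers claimed above (ASP and perf delta are
# this thread's figures, not measured data).
rx_7900xt = {"price": 700, "perf": 1.00}   # ~$700 average selling price, baseline perf
rx_9070xt = {"price": 599, "perf": 1.15}   # $599 MSRP, ~15% faster (claimed)

def perf_per_dollar(card):
    return card["perf"] / card["price"]

gain = perf_per_dollar(rx_9070xt) / perf_per_dollar(rx_7900xt) - 1
print(f"Perf per dollar vs 7900 XT at ASP: {gain:+.0%}")   # roughly +34%
```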
 
Wait. You're saying the company with a 4.6:1 ratio on the Steam survey is gonna outsell their competitor by 5:1?

Crazy. Just walk back slowly from that ledge and it'll aaaalllll be OK.

If you're going to start using the Steam survey, you might want to do your homework first. The Steam survey is all GPUs - including laptop and integrated graphics.

Nvidia is recently outselling AMD 9:1 in discrete GPUs.


5:1 is giving them credit.
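If anyone wants to sanity-check what those ratios mean as a share split, the math is trivial; here's a Python one-liner version (two-vendor simplification, so it ignores Intel dGPUs, laptops, and iGPUs):

```python
# Convert an "X:1 sales ratio" into a two-vendor market-share split.
def share_from_ratio(nvidia, amd):
    total = nvidia + amd
    return nvidia / total, amd / total

ratios = {
    "Steam survey install base": (4.6, 1),
    "recent dGPU sales claim":   (9, 1),
    "the 5:1 guess":             (5, 1),
}
for label, (n, a) in ratios.items():
    nv, amd = share_from_ratio(n, a)
    print(f"{label}: Nvidia {nv:.0%} / AMD {amd:.0%}")
# 4.6:1 -> ~82%/18%, 9:1 -> 90%/10%, 5:1 -> ~83%/17%
```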
 
It will be interesting to see how the Nvidia RTX 5070 will end up cheaper than the RX 9070 XT... :laugh: IQ may lose to stupidity this time. MSRP is one thing, but actual price is something completely different.
 
Thanks for the article @W1zzard. The article says that N4C is more advanced than N4P, but TSMC claims that it's a lower-cost variant:
It's not an inferior or superior process; it just has different priorities. Of the three PPA pillars (power, performance, area), density (area) is better on N4C, and I suppose it's probably a little bit behind in frequency (performance), though I'd guess very close, and probably about the same in power once you account for the frequency/area difference. If it were exactly the same in performance and power characteristics, TSMC probably would have mentioned that. But that's my guess, and I may be wrong; there are more specialized sites whose forums you can check, like SemiWiki.
 
Perhaps it’s time to upgrade my WC 3080 10GB, only IF there’s no coil whine. Haven’t had that since my last AMD card.
 
Perhaps it’s time to upgrade my WC 3080 10GB, only IF there’s no coil whine. Haven’t had that since my last AMD card.
You're going to need some luck when it comes to coil whine. Even different samples among the same model of card can have varying levels of coil whine. External factors like your PSU can also affect it.
 
There are a lot of things to like about RDNA4.
I don't know if I'm a sucker for believing their quality claims, but I like this architecture.
RDNA4 shouldn't have support for neural rendering, because that would require support for cooperative vectors.
My understanding of the statement that FSR4 (just an ML upscaler + frame generator) is "neural rendering ready" is that when UDNA launches, all the titles that support FSR4 will instantly be supported on the new UDNA tensor shading array architecture (which will support cooperative vectors by then).
But this is my interpretation; I may be wrong.
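To make the cooperative-vectors point a bit more concrete: as I understand it, the feature is about running tiny per-pixel matrix-vector multiplies (small neural nets inside shaders) on the matrix hardware. Here's a purely illustrative NumPy sketch of that kind of workload; the layer sizes, weights, and feature vector are all made up for the example:

```python
import numpy as np

# Illustrative only: the sort of per-pixel "tiny MLP" workload that
# cooperative-vector support is meant to accelerate inside shaders.
rng = np.random.default_rng(0)
w1, b1 = rng.standard_normal((32, 16)), np.zeros(32)   # hidden layer
w2, b2 = rng.standard_normal((3, 32)), np.zeros(3)     # RGB output

def shade_pixel(features):
    """features: a 16-element vector (e.g. jittered samples, motion, depth)."""
    h = np.maximum(w1 @ features + b1, 0.0)   # small matvec + ReLU
    return w2 @ h + b2                        # another small matvec -> colour

print(shade_pixel(rng.standard_normal(16)))
```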
Regarding performance, I'll still stick to the ~5% difference between the leaked RX 9070 XT Red Devil specs and the 4080S in 4K raster:

RTX 4080 SUPER: 108
RTX 5070 Ti: 106
9070 XT 3100 MHz 340 W (Nitro or Mercury): 104
9070 XT 3060 MHz 329 W (Red Devil): 103
9070 XT 2970 MHz 304 W (Reference): 100
RTX 4070 Ti SUPER: 90
RX 9070 (Reference): 86
RX 7900 GRE: 75
Edit: corrected RTX 4070Ti S from 91 to 90
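Taking those leaked numbers at face value, the overclocked boards don't buy much efficiency; a quick perf-per-watt check in Python (same caveat, these are unverified leaks):

```python
# Perf-per-watt check on the leaked figures above
# (relative 4K raster index, board power in W).
cards = {
    "9070 XT 3100 MHz (Nitro/Mercury)": (104, 340),
    "9070 XT 3060 MHz (Red Devil)":     (103, 329),
    "9070 XT 2970 MHz (Reference)":     (100, 304),
}
ref_perf, ref_power = cards["9070 XT 2970 MHz (Reference)"]
for name, (perf, power) in cards.items():
    rel_ppw = (perf / power) / (ref_perf / ref_power)
    print(f"{name}: {perf - ref_perf:+d} perf, {power - ref_power:+d} W, "
          f"perf/W vs reference: {rel_ppw:.2f}x")
# The OC boards trade roughly 8-12% more power for 3-4% more performance.
```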
 
You're going to need some luck when it comes to coil whine. Even different samples among the same model of card can have varying levels of coil whine. External factors like your PSU can also affect it.
Suppose so, yeah. My AX1500i hasn't seen any AMD cards yet, so that may be part of why I've had zero issues for years.
 
With this kind of performance, $599 is a competitive price for the RX 9070 XT reference (if the quality-improvement claims are true). For the higher-tier models it depends on the price premium, but only a $50 difference for the RX 9070 won't cut it, and if (big IF) RTX 5070 reference models really do have a $550 street price, AMD will be forced to lower the price.
 
AMD snatches defeat from the jaws of victory again... $600/$550 are too expensive for these cards to be compelling.

Single digit market share incoming within two years. I hate this garbage GPU timeline we're on.
I disagree. Yeah, for $600 it doesn't have that wow factor, and I think at $550 it would have been a screamer, an instant buy for everyone. But FSR4 is clearly shaping up to be a major upgrade over FSR3, and according to some of the leaks it's actually competing with DLSS4 and outperforms DLSS 3.5. If that's true, then Nvidia's so-called advantage there is gone. I don't use upscaling and I'm not inclined to start now, but for those who do care and do use it, if AMD is close to or on par with DLSS4 then there is no reason to buy Nvidia for that!

We know that RT is improved significantly, and remember that most RT games are optimized specifically for Nvidia; Cyberpunk 2077 is essentially an Nvidia tech demo! RT optimized for all GPUs would likely put the 9000 series on par with Nvidia's latest 5000 series in terms of RT performance. So that is a big win for AMD: even if RT ends up being 10% slower on the 9000 series, it is still 20% cheaper than the $750 MSRP of the 5070 Ti and about 37% cheaper than the actual ~$950 prices.

So: FSR4 close to or equal to DLSS4; very close to, equal to, or even faster than Nvidia in RT; an impressive new media engine with quality improvements to streaming, decoding, and encoding; the same power consumption as the competition, and even lower in the case of the vanilla 5070. All that for $150 less than the 5070 Ti's MSRP, and remember, in reality $350-400 cheaper than most 5070 Ti cards right now! That's a huge deal.
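For the price gaps quoted above, here's the straight math (MSRP and street figures as mentioned in this thread; street prices obviously move around, so treat this as a snapshot):

```python
# Price deltas quoted in the thread; street prices fluctuate.
rx_9070xt_msrp = 600
rtx_5070ti_msrp = 750
rtx_5070ti_street = 950   # rough current street price mentioned above

def cheaper_by(ours, theirs):
    return (theirs - ours) / theirs

print(f"vs 5070 Ti MSRP:   {cheaper_by(rx_9070xt_msrp, rtx_5070ti_msrp):.0%} cheaper")    # 20%
print(f"vs 5070 Ti street: {cheaper_by(rx_9070xt_msrp, rtx_5070ti_street):.0%} cheaper")  # ~37%
```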
 
Well, here's hoping the bottom-end card and all the other cards get the new media engine, unlike the RX 6400...
 
It's not an inferior or superior process; it just has different priorities. Of the three PPA pillars (power, performance, area), density (area) is better on N4C, and I suppose it's probably a little bit behind in frequency (performance), though I'd guess very close, and probably about the same in power once you account for the frequency/area difference. If it were exactly the same in performance and power characteristics, TSMC probably would have mentioned that. But that's my guess, and I may be wrong; there are more specialized sites whose forums you can check, like SemiWiki.
Wow, so AMD really is trying to squeeze us for that extra margin. At least Intel has the right idea and sells at cost/at a loss for market share; AMD can't even do that after going from 40% to 10% in less than a decade and having a successful CPU division that can soak up most of the cost.
Well, here's hoping the bottom-end card and all the other cards get the new media engine, unlike the RX 6400...
Navi 44 lacks any hardware encoders. It's over.
 
Navi 44 lacks any hardware encoders. It's over.
Where did you get that?
RDNA4 has them or else there would be nothing to improve.
 
Doesn't really change his point much. At $599, lower than the 7900 XT's ~$700 average selling price over its entire retail lifetime, you'd be getting a ~15% faster card, improved ML/AI upscaling, new connectivity, and significantly better RT performance from the hardware; overall a better product for less (minus the 20 GB VRAM buffer).

Rather silly if someone were to upgrade from a 7900 XT, sure, but this will crush the 5070 if sold anywhere near MSRP. Overall it's a solid value in today's shitty GPU market.

Why do you think it will sell anywhere near MSRP? This isn't an AMD question; the only way it will stay near MSRP is if there is more supply than demand.

We all know these cards will sell out in the first few days, if not the first few hours or minutes. It will not stay near MSRP, same as the 5070 Ti etc. have not.
 
The 9070 XT matches the 5070 Ti in raster and the 4070 Ti Super in ray tracing, all for $600 instead of $750. Well done, AMD.

The 5070 SuperDOA, if I were Jensen I would call it that.
It's slower than the 5070 Ti because it's slower than the 4080.
 
Wow, so AMD really is trying to squeeze us for that extra margin. At least Intel has the right idea and sells at cost/at a loss for market share; AMD can't even do that after going from 40% to 10% in less than a decade and having a successful CPU division that can soak up most of the cost.
Everyone is trying to squeeze us for that extra margin, not just AMD, but my answer wasn't about that.
Don't believe that Intel is selling at cost or at a loss; their margins are just very low. AMD simply doesn't want a price war with Nvidia; it would serve neither of them any good (on the other hand, it would be great for us consumers). They probably also chose N4C because all of their latest high-margin CPU products are on N4P and they want every wafer they can get for that process (my guess, no inside info).
 
Everyone is trying to squeeze us for that extra margin, not just AMD, but my answer wasn't about that.
Don't believe that Intel is selling at cost or at a loss; their margins are just very low. AMD simply doesn't want a price war with Nvidia; it would serve neither of them any good (on the other hand, it would be great for us consumers). They probably also chose N4C because all of their latest high-margin CPU products are on N4P and they want every wafer they can get for that process (my guess, no inside info).
A "price war" would be what they need to actually gain market share, and it's one I think NVIDIA might not actually be willing to win. It's awful that a card that costs nearly the same to produce as a 7800 XT gets upcharged 20%. They don't have customers! Who do they think they're gonna sell their next generation of products to?

Intel may or may not be selling at cost/at a loss, but I think it really illustrates what's needed to actually gain market share.
 
Why do you think it will sell anywhere near MSRP? This isn't an AMD question; the only way it will stay near MSRP is if there is more supply than demand.

We all know these cards will sell out in the first few days, if not the first few hours or minutes. It will not stay near MSRP, same as the 5070 Ti etc. have not.

That's what the "if" covers…
 
Hah, we sure do live in interesting times.

I want to play in 4K these days, so I sold my 4070 and was ready for MSRP 5080. That didn't go too well, what with the unimpressive reviews and the stock fiasco, so I switched to 5070Ti. Better reviews, 0 stock. Ok. Time to play retro games and roguelikes until (or if?) this clown world normalizes a bit.

Now AMD, which tbh I didn't have high hopes for, entered the chat. The price for the XT can not be ignored, and they promise loads of stock on launch, so....hmmm....what's a girl to do?
I guess I'll wait for the in-depth benchmarks. I need to see the new FSR vs new DLSS image-quality comparison, and also latency for AMD's FG. The big problem is that, judging from these initial "vs Ti" slides, the XT seems to lag behind the Ti in the games I'm most interested in (RT CP2077, Stalker, Warhammer).

But if FSR is on par with DLSS quality-wise nowadays, and the FG is decent enough to help drag things over the 60 fps threshold in 4K, then I'll certainly consider getting one. At least to tide me over until there's more Nvidia stock, or maybe even until next year's Super refresh, which might be more interesting than the Blackwell launch.
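On the "FG to drag things over 60 fps" point, the usual caveat is that frame generation raises the presented frame rate but not responsiveness, since it has to hold back a rendered frame to interpolate. A toy sketch of that trade-off, with all numbers invented just to show the shape of it:

```python
# Toy model: frame generation roughly doubles presented fps, but input latency
# stays tied to the base render rate plus some hold-back and overhead.
# Numbers are invented for illustration; real pipelines differ.
def with_frame_gen(base_fps, fg_overhead_ms=3.0):
    base_frame_ms = 1000 / base_fps
    presented_fps = base_fps * 2
    # Interpolation needs the next rendered frame, so expect very roughly one
    # extra base frame of latency on top of the usual chain, plus overhead.
    extra_latency_ms = base_frame_ms + fg_overhead_ms
    return presented_fps, extra_latency_ms

for base in (40, 55):
    fps, extra = with_frame_gen(base)
    print(f"{base} fps base -> ~{fps} fps presented, ~{extra:.0f} ms extra latency")
```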
 
5070 Ti for $150 less, and with AMD's famous Linux support? Sign me up.

While RT is massively improved, I doubt it'll reach Nvidia's level, but that's hardly a problem for me.
 
Wish they would sell the reference model. It looks the best, and I don't trust AIBs/resellers to offer the 9070 XT at that price.

It also sucks that it's UHBR13.5 and not UHBR20. It will need DSC for some high-refresh displays.
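Rough link-budget math on why UHBR13.5 falls short for some high-refresh modes (uncompressed RGB, blanking overhead approximated at ~10%, so treat the margins loosely):

```python
# Why UHBR13.5 needs DSC for some high-refresh modes while UHBR20 has headroom.
def needed_gbps(h, v, hz, bpc=10, blanking=1.10):   # ~10% blanking assumed
    return h * v * hz * bpc * 3 * blanking / 1e9    # 3 colour channels

# Effective payload rate after 128b/132b encoding, 4 lanes.
uhbr13_5 = 4 * 13.5 * 128 / 132   # ~52.4 Gbps
uhbr20   = 4 * 20.0 * 128 / 132   # ~77.6 Gbps

mode = needed_gbps(3840, 2160, 240)   # 4K 240 Hz, 10-bit RGB
print(f"4K240 10-bit needs ~{mode:.0f} Gbps "
      f"(UHBR13.5 ~{uhbr13_5:.0f}, UHBR20 ~{uhbr20:.0f})")
# -> ~66 Gbps uncompressed: over UHBR13.5, under UHBR20, hence DSC on the former.
```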
 