
AMD Radeon RX 9070 Series Technical Deep Dive

Learn to camelcamelcamel:

Probably right about where my trust in Amazon sits. That said, I don't think I've once received a price alert from pcpartpicker. Maybe I'm setting my expectations too high?!

RDNA4 supposedly shouldn't have support for neural rendering, because neural rendering needs support for cooperative vectors.
My understanding of the statement that FSR4 (just an ML upscaler plus frame generator) is "neural rendering ready" is that when UDNA launches, all the titles that support FSR4 will instantly be supported on the new UDNA tensor shading array architecture (which will support cooperative vectors by then).
The definition I keep reading is "Neural rendering broadly defines the suite of techniques that leverage AI/ML to dramatically transform traditional graphics pipelines." Perhaps this is similar to "HD Ready" vs "Full HD" with regard to the inclusion of cooperative vectors?!
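Purely as an illustration (not FSR4 code or any real API), the workload that "cooperative vectors" are meant to accelerate is tiny matrix-vector math evaluated per pixel inside a shader. Here's a minimal NumPy sketch of that idea, with made-up layer sizes and feature names:

```python
import numpy as np

# Illustrative only: a tiny per-pixel MLP, the kind of small matrix-vector
# workload that cooperative-vector hardware paths are designed to run
# efficiently inside shaders. Layer sizes and feature names are invented.
IN_FEATURES, HIDDEN, OUT_FEATURES = 12, 32, 3

rng = np.random.default_rng(0)
w1 = rng.standard_normal((HIDDEN, IN_FEATURES)).astype(np.float32)
w2 = rng.standard_normal((OUT_FEATURES, HIDDEN)).astype(np.float32)

def neural_shade(pixel_features: np.ndarray) -> np.ndarray:
    """Toy inference for one pixel: two small matrix-vector products plus ReLU."""
    h = np.maximum(w1 @ pixel_features, 0.0)  # hidden layer with ReLU
    return w2 @ h                             # e.g. an RGB output term

# One pixel's (invented) feature vector; a real pipeline would feed normals,
# albedo, motion vectors, etc.
print(neural_shade(rng.standard_normal(IN_FEATURES).astype(np.float32)))
```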
 
I guess from what I read here these will sell like crazy (I'll admit that at launch they probably will, given the lack of new cards on the market), but I'm very doubtful. Places like this are usually a very poor representation of the real world. I like to keep it real, and it doesn't seem like a good deal: sure, the XT is $150 less rather than $50 less, but it's still $150 off some insane pricing. The pricing on the non-XT is just stupid and a shitty upsell move.

This will probably replicate the success of the original RDNA series (5700 XT) and nothing more. That's still a big improvement, but nothing to write home about.
 
$599 is probably the upper limit that buyers will accept for that GPU, though it's a much, much better deal than what Nvidia will give you. If the slides hold up, it looks like RTX 4080 raster performance and 4070 Ti ray tracing performance for $600. I do think this will create problems for Nvidia's 5070 and 5070 Ti.
 
I do think this will create problems for Nvidia's 5070 and 5070 Ti.
At Nvidia's MSRP there would be very tight competition, but honestly I don't see Nvidia staying at MSRP. When prices come down it will probably be something like RTX 5070 = $650 and RTX 5070 Ti = $850. AMD is usually cheaper, at least in the EU.
 
$599 is probably the upper limit that buyers will accept for that GPU, though it's a much, much better deal than what Nvidia will give you. If the slides hold up, it looks like RTX 4080 raster performance and 4070 Ti ray tracing performance for $600. I do think this will create problems for Nvidia's 5070 and 5070 Ti.

Nvidia will release a cheaper Super or just lower prices as soon as stock normalises, and we will still have overpriced GPUs.

The 1070 Ti's launch price, adjusted for inflation, works out to about $600 today, and back in 2017 the 1070 Ti was 33% slower than the flagship card. The 9070 XT costs the same $600 but is 70% SLOWER than the flagship. You're all just normalising greed. It's like the sheep thanking the wolf for being eaten.
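For anyone who wants to sanity-check that framing, here is a small back-of-the-envelope script. The launch MSRP, the inflation factor and the relative-performance ratios are assumptions (the performance figures are the poster's own numbers, not benchmarks):

```python
# Back-of-the-envelope check of the "same money, further from the flagship" argument.
# All figures below are assumptions taken from the thread, not measured data.

def pct_slower_than_flagship(card_perf: float, flagship_perf: float) -> float:
    """How far below the flagship a card sits, as a percentage."""
    return (1.0 - card_perf / flagship_perf) * 100.0

# 2017: GTX 1070 Ti vs the flagship (poster's claim: ~33% slower)
print(round(pct_slower_than_flagship(67, 100)))   # 33

# 2025: RX 9070 XT vs RTX 5090 (poster's claim: ~70% slower)
print(round(pct_slower_than_flagship(30, 100)))   # 70

# Inflation side: ~$449 launch MSRP in 2017 times a rough ~1.3x cumulative CPI factor
print(round(449 * 1.3))                           # ~584, i.e. roughly the $600 quoted
```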
 
Ah, pricing in Europe: €636 for the non-XT version and €696 for the XT version. The USD prices will likely be lower.
That's still not horrible. It should also drop a bit over time.

Nvidia will release a cheaper Super or just lower prices as soon as stock normalises, and we will still have overpriced GPUs.

The 1070 Ti's launch price, adjusted for inflation, works out to about $600 today, and back in 2017 the 1070 Ti was 33% slower than the flagship card. The 9070 XT costs the same $600 but is 70% SLOWER than the flagship. You're all just normalising greed. It's like the sheep thanking the wolf for being eaten.
Yes... and in 2017 the flagship card was a mere 30-40% more expensive.
The 9070 XT is... 66-70+% cheaper than the 5090.

Seems like things are quite neatly in order here. I'm sure I'm off by a few percent, I didn't calculate anything, but I think you're omitting a lot of facts in that bolded part. The better card to compare the 9070 XT against, relative to something like Pascal, is probably the 5080 as the 'flagship card' against the 1080 Ti. The 5090 is 'special' and clearly in its own price bracket. Just because Ada nudged people toward the 4090 because the 4080 was >$1200 and made no sense doesn't make the x90 suddenly 'a regular release', imho. Though I don't disagree the pricing is bonkers overall.
 
The 1070 Ti's launch price, adjusted for inflation, works out to about $600 today, and back in 2017 the 1070 Ti was 33% slower than the flagship card. The 9070 XT costs the same $600 but is 70% SLOWER than the flagship. You're all just normalising greed. It's like the sheep thanking the wolf for being eaten.
For $3000 that is not a gaming GPU; it's only for millionaires or for true idiots. The GTX 1080 Ti was $699, not $3000! The GTX 1080 Ti kicks the shit out of the RTX 5090, "easy".
 
Yes... and in 2017 the flagship card was a mere 30-40% more expensive.
The 9070 XT is... 66-70+% cheaper than the 5090.

Seems like things are quite neatly in order here. I'm sure I'm off by a few percent, I didn't calculate anything, but I think you're omitting a lot of facts in that bolded part. The better card to compare the 9070 XT against, relative to something like Pascal, is probably the 5080 as the 'flagship card' against the 1080 Ti. The 5090 is 'special' and clearly in its own price bracket.

You guys keep doing this idiotic justification over and over. So the justification for AMD's greed is "oh look, Nvidia is even greedier, so it's OK". How does that make even a little bit of sense in your head?
 
You guys keep doing this idiotic justification over and over. So the justification for AMD's greed is "oh look, Nvidia is even greedier, so it's OK". How does that make even a little bit of sense in your head?
Don't make me do the math... It's not a justification for anything; who said I'm buying?

Markets are what they are; at best we can try to nudge people to make a smarter choice. But you and I both know it doesn't work like that, for most.

For $3000 that is not a gaming GPU; it's only for millionaires or for true idiots. The GTX 1080 Ti was $699, not $3000! The GTX 1080 Ti kicks the shit out of the RTX 5090.
Exactly. Different time, different market; the comparison doesn't really make sense. There was no x90.
 
Don't make me do the math... It's not a justification for anything; who said I'm buying?

OMG, how are you not justifying a stupid price by presenting me with an even stupider price and using it as an argument?
I guess if the 5090 had a $5000 MSRP you would go and buy a dozen 9070 XTs because the math would look even better. If you don't see what you're doing, I don't think you can be helped.
 
Nvidia will release a cheaper Super or just lower prices as soon as stock normalises, and we will still have overpriced GPUs.

The 1070 Ti's launch price, adjusted for inflation, works out to about $600 today, and back in 2017 the 1070 Ti was 33% slower than the flagship card. The 9070 XT costs the same $600 but is 70% SLOWER than the flagship. You're all just normalising greed. It's like the sheep thanking the wolf for being eaten.
What flagship? There's no flagship in the 9000 series. The 9070 XT is $150, or a good 20%, cheaper than the 5070 Ti, and that's that. If AMD's own -2% performance number is correct, then the 9070 XT is definitely the better choice. There's no point talking about some imaginary flagship in this market segment.
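For reference, the "$150, or a good 20%" figure follows directly from the announced MSRPs; a quick check (MSRPs as announced, street prices will differ):

```python
# Quick check of the "$150, or a good 20% cheaper" claim using announced MSRPs.
rtx_5070_ti_msrp = 749  # USD, announced MSRP
rx_9070_xt_msrp = 599   # USD, announced MSRP

delta = rtx_5070_ti_msrp - rx_9070_xt_msrp
print(delta)                                  # 150
print(round(delta / rtx_5070_ti_msrp * 100))  # 20 (% cheaper)
```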
 
OMG, how are you not justifying a stupid price by presenting me with an even stupider price and using it as an argument?
I guess if the 5090 had a $5000 MSRP you would go and buy a dozen 9070 XTs because the math would look even better. If you don't see what you're doing, I don't think you can be helped.
Dude. We are looking at a comparison with the 1070 Ti, a sub-top-performance card that you yourself say is $600 today counting inflation.
And then you're saying a $599 GPU that covers the exact same relative performance bracket in 2025 (competitive with similar-bracket GPUs of today) is 'greed that we are justifying'? It's literally THE SAME offering, relatively speaking, as we had in front of us with Pascal, lauded by many as one of the last great GPU generations...

I think you need a coffee. AMD gave us major progress on perf/$ here. The 9070 XT is finally moving the performance bracket forward after generations of subpar offerings in this segment. From the 2060 onwards it's mostly been so-so offerings: either short on VRAM, missing tech, or priced way beyond sense.
 
Well, here's to hoping the bottom-end card and all the other cards get the new media engine, unlike the RX 6400...
Hopefully circumstances will never arise where a GPU like the 6400/6500 series has to exist as a discrete GPU ever again.

IMO they're fantastic GPUs for their intended purpose (providing extra shader power to weak integrated Radeon laptop graphics that already have a full media engine and all the expected display outputs).

They were never designed to be discrete GPUs, because they're not complete GPUs by modern expectations. I bet the engineers who worked on them cry themselves to sleep at night knowing how their great idea was misused in the most inappropriate and disappointing way imaginable. :\
 
You guys keep doing this idiotic justification over and over. So the justification for AMD's greed is "oh look, Nvidia is even greedier, so it's OK". How does that make even a little bit of sense in your head?
And your justification for Nvidia's greed is "oh look, AMD is doing it too", even if to a much lesser extent, right?

Hopefully circumstances will never arise where a GPU like the 6400/6500 series has to exist as a discrete GPU ever again.
The 8700G has a 6400-level iGPU in it, and the one in the new Ryzen mobile chips is much stronger, so I think what you're saying can pretty much be guaranteed.
 
Dude. We are looking at a comparison with the 1070 Ti, a sub-top-performance card that you yourself say is $600 today counting inflation.
And then you're saying a $599 GPU that covers the exact same relative performance bracket in 2025 (competitive with similar-bracket GPUs of today) is 'greed that we are justifying'?

I think you need a coffee. AMD gave us major progress on perf/$ here. The 9070 XT is finally moving the performance bracket forward after generations of subpar offerings in this segment. From the 3080 onwards it's mostly been so-so offerings: either short on VRAM, missing tech, or priced way beyond sense.

Now we are using artificial naming to justify pricing; it gets even worse. :shadedshu:
Forget naming then, and just use pricing vs performance vs the flagship, if that makes it easier on your brain:

the xxxx is 70% slower than the flagship now
the yyyy was 33% slower than the flagship in 2017
and they cost the same adjusted for inflation

The names are irrelevant; it just happens that the 1070 Ti and the 9070 XT are the same price, so I thought it was an easy comparison, but I guess brains stopped working.
AMD just artificially changed the naming, and I guess it worked: people ate it up, and there is no more brain usage today.
 
That's just marketing bullshit for children... It's just a name...

Apparently that worked; people can't understand it. They are being played and can't see it. They are using names to fool the less smart among us, and it's working.
 
Now we are using artificial naming to justify pricing; it gets even worse. :shadedshu:
Forget naming then, and just use pricing vs performance vs the flagship, if that makes it easier on your brain:

the xxxx is 70% slower than the flagship now
the yyyy was 33% slower than the flagship in 2017
and they cost the same adjusted for inflation

The names are irrelevant; it just happens that the 1070 Ti and the 9070 XT are the same price, so I thought it was an easy comparison, but I guess brains stopped working.
AMD just artificially changed the naming, and I guess it worked: people ate it up, and there is no more brain usage today.
Okay. Here's a real comparison for you. You are aware of the concept of die size, I hope... You keep saying these are similar products.

[Attached: GPU spec screenshots comparing the die sizes of the two 'flagships' being discussed]

We're looking at a kinda different 'flagship' don't ya think?
And with that, this discussion and your idiotic comparison ends abruptly.
 
Apparently that worked; people can't understand it. They are being played and can't see it. They are using names to fool the less smart among us, and it's working.
If you understand that, then why are you talking about some flagship in the context of the 9070 XT?
 
The MOST important takeaway from this release: the 5070 Ti and 9070 XT have the same die size.

IF the performance figures are accurate... they finally reached parity in performance/area/power with Nvidia (the last time was 10+ years ago).
Although, to be honest, Nvidia hasn't advanced much since the 3xxx series.

I would still prefer separate versions without any dedicated RT/AI transistors, just pure raster (a GTX vs RTX style split never happened again).
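If you want to check the parity claim yourself once reviews land, the metric in question is just performance divided by die area and by board power. A minimal sketch, with placeholder numbers that should be replaced by real review data:

```python
# Perf-per-area and perf-per-watt comparison with placeholder numbers.
# relative_perf, die_mm2 and board_w are illustrative; substitute review data.
cards = {
    "RX 9070 XT":  {"relative_perf": 100, "die_mm2": 357, "board_w": 304},
    "RTX 5070 Ti": {"relative_perf": 100, "die_mm2": 378, "board_w": 300},
}

for name, c in cards.items():
    per_area = c["relative_perf"] / c["die_mm2"]
    per_watt = c["relative_perf"] / c["board_w"]
    print(f"{name}: {per_area:.3f} perf/mm^2, {per_watt:.3f} perf/W")
```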
 
That's just marketing bullshit for children... It's just a name...
No, it's not. The x90 is a new entry in Nvidia's line-up, and they even added an SKU to cover the gap and the increased disparity in shader counts between the top and bottom end. In other words, since Ampere we've gotten an x103 SKU in addition to the x102, and the shader counts are miles apart from x104, a gap that has widened substantially since then, and widened further with Blackwell.

You can't just compare across four or five generations, even if it's funny to do to make a silly point. Still, I think the comparison works if you drop the weird link to 'flagships'. The 9070 XT is quite similar to the 1070 Ti, in a relative sense:

[Attached: GPU spec screenshots comparing the GTX 1070 Ti and RX 9070 XT, including die sizes]

$600 adjusted for inflation in.... vs $599 today for a somewhat larger die.

The 9070 XT is a BETTER offering than what Pascal gave us ;) AMD just gives you 43 mm² for free.
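A hedged version of that arithmetic; the MSRP, die sizes and inflation factor below are assumptions based on commonly cited figures for GP104 and Navi 48, not official statements:

```python
# "Same money, slightly bigger die" comparison; all figures are assumptions.
gtx_1070_ti = {"msrp_usd": 449, "launch_year": 2017, "die_mm2": 314}  # GP104
rx_9070_xt  = {"msrp_usd": 599, "launch_year": 2025, "die_mm2": 357}  # Navi 48

cpi_factor_2017_to_2025 = 1.32  # rough cumulative inflation, an assumption

adjusted = gtx_1070_ti["msrp_usd"] * cpi_factor_2017_to_2025
print(round(adjusted))                                 # ~593 USD in today's money
print(rx_9070_xt["die_mm2"] - gtx_1070_ti["die_mm2"])  # 43 mm^2 larger die
```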
 
If you understand that, then why are you talking about some flagship in the context of the 9070 XT?

How do you make a fair price-vs-performance comparison, then?

This is the only thing that makes sense to compare; you can't run away from it:
The 1070 Ti's launch price, adjusted for inflation, works out to about $600 today, and back in 2017 the 1070 Ti was 33% slower than the flagship card. The 9070 XT costs the same $600 but is 70% SLOWER than the flagship. You're all just normalising greed. It's like the sheep thanking the wolf for being eaten.

You're now paying more and getting much less than in 2017.
 
IF the performance figures are accurate... they finally reached parity in performance/area/power with Nvidia (the last time was 10+ years ago).
Although, to be honest, Nvidia hasn't advanced much since the 3xxx series.
If I'm not mistaken, the last time was the RX 6800.

[Attached: performance-per-watt chart, 2560×1440]
 
The MOST important takeaway from this release: the 5070 Ti and 9070 XT have the same die size.

IF the performance figures are accurate... they finally reached parity in performance/area/power with Nvidia (the last time was 10+ years ago).
Although, to be honest, Nvidia hasn't advanced much since the 3xxx series.

I would still prefer separate versions without any dedicated RT/AI transistors, just pure raster (a GTX vs RTX style split never happened again).

The 5070 Ti is cut down, though: only 70 out of 84 SMs are enabled on GB203.

The 5080 would be the direct comparison based on die size alone (both fully enabled, at a similar mm²).
 
How do you make a fair price-vs-performance comparison, then?
9070 XT vs 5070 Ti. That's it. What other comparison do you want in this segment?

This is the only thing that makes sense to compare; you can't run away from it:
The 1070 Ti's launch price, adjusted for inflation, works out to about $600 today, and back in 2017 the 1070 Ti was 33% slower than the flagship card. The 9070 XT costs the same $600 but is 70% SLOWER than the flagship. You're all just normalising greed. It's like the sheep thanking the wolf for being eaten.

You're now paying more and getting much less than in 2017.
So tell me, where is the 9000 series flagship?

Why are you attacking AMD for being greedy when you have Nvidia with a similar card for $150 more?

Also, why bring up the 1070 Ti? It's not 2017 anymore; you have to think in terms of what you can buy now.
 