
AMD Radeon RX 9070 Series Technical Deep Dive

AMD snatches defeat from the jaws of victory again... $600/$550 are too expensive for these cards to be compelling.

Single digit market share incoming within two years. I hate this garbage GPU timeline we're on.
I don't have an issue with the price if the reports on performance are in the ballpark. I think the non-XT should have been a bit cheaper, maybe $525.
 
I'm doing the same thing. I was dead set on buying a 5090, but at this level of greed, I simply cannot support it. I know I won't get anything remotely close to the performance I want, but to hell with it, team red for another 2 years or so.
I mean, yeah, it may be only 60% of the performance of the 5090, but it's also 20-30% of the price.
 
Got to say, yesterday I was cautiously optimistic. After reading about 70-80% of this thread, I am realizing the train has left the station FOREVER and I'm just living in the past.

We're here talking about $700-800 GPUs being fine, and many selling out at $1000+ - FOR WHAT?!? Playing rehashed games, just this time remastered with "RTX"?

As someone said, I AM getting old. But is 40+ really that old? I do remember GPUs always being the most expensive part of gaming, but I also remember drooling over getting a 300€ (tax included!) card that could run ANYTHING I threw at it, and it paid for itself in 2 months as I worked as a game reviewer at the time. I'm no longer in the review business, that was "just" a job that got me through college without parents/banks/debts... I make roughly 8x more easily these days.

Then I look back and see my last card being a GTX 1070 Ti, which I got for free from a business partner when it was already old and leftover in inventory. That PC is also now rotting in a closet, because I switched to just using my company laptop (with a built-in mobile RTX 3060). Realistically, the last GPUs I actually bought with my own money (that I can remember) were the likes of a Radeon 9700, then some Nvidia card, I think a GeForce 6800 LE with unlocked units throwing it one tier up, then a Radeon X1950 Pro, then an HD 4870/4890, then a long gap until I found, I think, a second-hand R9 2x0 or something, and later swapped that for an R9 3xx something; I barely remember those two cards anymore. Most of that was 2002-2009 or thereabouts, and cards from that period had high OCs, custom (air) cooling, tweaked BIOSes, unlocked features and the like. It was a hobby and fun, and relatively affordable even for a student and first low-paid jobs. But that means I haven't bought a real GPU since 2014/2015?! That GTX 1070 Ti was a gift around 2019/2020, and my current laptop was bought around Sept/Oct 2022 as an EoL model released about a year before that.

The way things are going I'll become a console gamer, maybe switch to a handheld like the Steam Deck and just forget about all this bullsh** forever.

Tell me, people, is it logical that a laptop with an RTX 3060 was 1000€ (with 25% sales tax included!!), including the screen, battery, and all the PC parts, while at the same time roughly the same specs in a desktop WITHOUT a screen and battery cost more? It's the same with the topic at hand. We're talking about companies having 50-60% margins, but let's not forget, that's a 50% ASML margin selling to TSMC, then a 50% TSMC margin selling to AMD, then a 50% AMD margin selling to the OEM, then a more sane 15% margin for the OEM, and then God knows how much margin when retailers and scalpers take over, and another 20-25% sales tax depending on the country. If we start at $100, that ends up at $700-800 by the time it reaches our consumer homes. So unless you're in the stock market and profiting from all that, everyone else is getting milked to the limit... Actually beyond the limit. Ask your boss for a 50% raise and see what they think when you try to milk them. You'll get slapped back to reality. Then why on earth are people still buying god damn $1k+ GPUs... Slap yourselves out of it - please!
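For what it's worth, that markup chain compounds roughly like this (a quick sketch; the per-step margins are the ones mentioned above, and the retail/scalper step is just a guess picked to land in that $700-800 ballpark):

```
# Back-of-the-envelope version of the margin-stacking argument above.
# The per-step margins are illustrative, not audited figures.
def compound_price(base_cost, margins, sales_tax):
    """Apply each supply-chain margin in sequence, then sales tax on top."""
    price = base_cost
    for m in margins:
        price *= 1 + m
    return price * (1 + sales_tax)

# 50% ASML -> 50% TSMC -> 50% AMD -> 15% OEM -> ~50% retail/scalpers (guess) -> 25% VAT
print(round(compound_price(100, [0.50, 0.50, 0.50, 0.15, 0.50], 0.25)))  # ~728
```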

Sorry for the rant, but it was heavily needed...

Edit: Oh, and someone said we'll end up 10 years from now with people unable to afford it all... It already happened, like 3 years ago. They're just running on inertia until it hits everyone. With 8.5 billion people, selling under 75 million COMBINED iGPUs + dGPUs in 2025 is telling already, and note only about 10% of that is dGPUs... If I haven't forgotten my math, that's roughly 0.09% of people who can (and are willing to) buy a dGPU. And it's already declining.

Source: https://www.tomshardware.com/pc-com...te-igpus-increase-while-discrete-gpus-decline
 
If FSR 4 ends up being close to DLSS 4, I'm inclined to go 9070/XT instead of the 5070 Ti. MSRP on the 5070 Ti is a joke; the actual price where I live is 1300 USD for the cheapest model in stock. If I can get a 9070 XT for about 700 USD, which is what it will retail for at MSRP in Norway, and get the same performance and power consumption, with FSR 4 being close to DLSS 4, I'm going team red again. This reminds me of the 5700/5700 XT, where they offered similar performance to the 2060S/2070S at a much lower price.
 
Yep, I know the feeling. 2nm is the last time SRAM gets more dense, then we're hardstuck forever. Crazy people shilling "$699 actually good because the REAL MSRP for NVIDIA is $999", it's just embarrassing how much people will refuse to take their own side.

Too bad none of these specialized parts will be worth making at some point in the future, iGPUs deprecating the low end and creeping upward and all. Least they could do is pack them with VRAM to keep things relevant for a while but nope. All just gonna be sand in the wind at the end of the day.

Maybe node shrinks dying will finally force devs, engines, and publishers to push optimization to the fore, but I feel like it's more likely the whole industry will crash before it gets to that point.
 
AMD should have priced the 9070 XT at less than $600 because the 5080, which is $1000, has a similar die size? Are you serious? :kookoo:


Let me try to justify why I mentioned this earlier.
It was not about Nvidia, but about AMD catching up in performance per area:

1. Foundries have 300 mm wafers; they are cut into parts we call dies, and die size has a physical limit.
(YES, I KNOW there is a new method to use the full wafer, but no consumer chip is using that.)
2. The cost of a single wafer has gone up.
3. Transistor density has hit a few walls: no more >2x density from a single node shrink, and it's harder to cool tiny dies when they dissipate 300 W, so peak frequencies have hit a wall too.
4. TDP has hit the 300-400 W wall due to material limitations, and the average consumer doesn't want a 1 kW heater on their desk.

So what is left? If you want to make a guess at the best possible high-end GPU on a specific architecture, for example "what would be the peak theoretical performance of a Navi 4x at a 750 mm² die size", the only route of improvement left for monolithic dies is size.
Since TSMC releases their "average density" improvements for each node, we can use that.

If you take Navi 3x and AD10x, calculate what a shrink to the new process node would give, and compare that to what was actually released, you end up with a performance number that is approximately the architectural improvement between the previous and next gen.
It's not perfect 1:1 scaling, usually around 70-90% for the largest dies depending on power constraints, but you can get a very good estimate.
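Roughly like this, as a sketch of the estimate described above; every number in it (die sizes, density uplift, measured uplift, scaling efficiency) is a made-up placeholder and only illustrates the method:

```
# Rough sketch of the die-shrink estimate described above.
# Every number here is an illustrative assumption, not an official figure.
def architectural_gain(old_perf, new_perf, old_die_mm2, new_die_mm2,
                       density_scaling, area_scaling_eff=0.8):
    """Split a generational uplift into 'node + area' vs 'architecture'.

    density_scaling:  foundry-quoted logic density uplift of the new node
    area_scaling_eff: the 70-90% (not 1:1) performance scaling with area
    """
    # Performance you'd "expect for free" from more effective transistors:
    effective_area_ratio = (new_die_mm2 * density_scaling) / old_die_mm2
    expected_perf = old_perf * (1 + (effective_area_ratio - 1) * area_scaling_eff)
    # Whatever measured performance exceeds that is attributed to architecture.
    return new_perf / expected_perf

# Example with placeholder numbers:
gain = architectural_gain(old_perf=1.00, new_perf=1.40,
                          old_die_mm2=530, new_die_mm2=360,
                          density_scaling=1.3, area_scaling_eff=0.8)
print(f"Estimated architectural contribution: {gain:.2f}x")
```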


While it's nice to compare perf/watt, absolute performance and all the other metrics, there is one metric that matters most to the companies, and that's the cost per GPU to bring these to market.
AMD has been behind in performance per die size for a long time, I think since Pascal, which is one of the reasons they skipped "high-end" models in some generations.
It makes no sense for AMD to compete on performance when Nvidia can achieve the same performance at a much lower cost.

Before that, it was the opposite. At some point, they got so far ahead that Nvidia had its own GPU-oven memes, but then Bulldozer happened and AMD has been playing catch-up ever since.
Unlike Intel, Nvidia did not sit back and watch but actually kept innovating, so there wasn't a "Zen moment" on the GPU side.

While the Navi 4x architecture seems to be the closest they have been, they are still not there, but maybe they're close enough.

I don't think we'll ever see 350 mm² dies sold for $400 again, since it's not only the cost of the die that increased but everything around it: VRAM chips, cooler and MOSFET costs driven up by higher power, PCB quality driven up by higher signal requirements... etc.

HOWEVER, IF AMD reaches parity on performance per die size, the cost of the GPU board is pretty much the same, so while NV might be pricing their top end higher and higher... AMD desperately wants that market and would definitely be able to release cheaper cards without losing money. And I WISH Intel stays in the game; then we might see interesting innovations.

Edit:
All that inflation and VRAM-requirement talk is bullshit: the 980 Ti was a 600 mm² die with 6 GB VRAM released for $650, and the 1080 Ti was a 470 mm² die with 11 GB VRAM released for $700 ten months later, because there was no competition from AMD anymore. This was ~9 years ago.
Funny - all those cherry-picked charts of GPU die sizes and prices begin at 2014, when AMD stopped competing.
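Putting those two data points side by side, using just the die sizes and launch prices quoted above (ignoring inflation, VRAM cost and everything else on the board):

```
# Launch price per mm² of die for the two cards mentioned above.
# Figures are the ones quoted in the post; ignores inflation, VRAM cost, etc.
cards = {
    "GTX 980 Ti":  {"die_mm2": 600, "launch_usd": 650, "vram_gb": 6},
    "GTX 1080 Ti": {"die_mm2": 470, "launch_usd": 700, "vram_gb": 11},
}
for name, c in cards.items():
    usd_per_mm2 = c["launch_usd"] / c["die_mm2"]
    print(f"{name}: ${usd_per_mm2:.2f}/mm², {c['vram_gb']} GB VRAM")
```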
 
AMD opting to go for performance comparisons against 4-year-old GPUs. Great! And to top it off, the majority of slides feature MBA designs that we were told were "only an artist's concept", which I strongly doubt.
I don't see an issue here since there's a lot of us Ampere users like me and you.
 
Power Color recommends a 900W psu for their 9700 XT and Asus recommends a 750W psu for theirs so I'm going to guess a 850W is the sweet spot?
 
Power Color recommends a 900W psu for their 9700 XT and Asus recommends a 750W psu for theirs so I'm going to guess a 850W is the sweet spot?
On average the RX 9070 XT will not be as power hungry as the RX 7900 XTX, so a good quality 750 W PSU should be fine.
 
What a "shot in the foot". This way, AMD encourages people to buy mid and low-end cards from Nvidia.

What "great" administrators AMD has...
Or any and every single Intel dGPU on the planet. They must think it's okay because they can just have their CPUs do it.
 
Let's see if the lack of integer cores vs the Nvidia 5000 series lets it boost higher, like 3.5-3.7 GHz for a good OC. Almost becoming CPU-like frequency. Also, the scalar unit works like a CPU, I guess. But is it for scheduling?
 
The Hyper-RX slide's performance chart makes no sense. What the hell is "native" supposed to be? Which game and what fps figure? It's clearly not an average of measurements at native 4K in the 3 games shown in the chart.

And why do they only show performance mode? What about the other profiles???
 
At one point I might have said they needed to price the 9070 XT lower, but Nvidia isn't even trying in the consumer GPU space this generation, and their supply is SO low that prices are exorbitant, to the point that 5070 Tis are selling for real-world prices of well over a thousand bucks. So IF you can actually get a 9070 XT at retail prices, it will be at least $300-400 less than the competing Nvidia card, and that's a LOT of persuasion. We've already seen that Nvidia loyalists will happily pay $200 more for the brand name, but make it $400 and they are going to start rethinking things.

Personally, I think Nvidia is devoting all the production capacity they have allocated at TSMC to datacenter and AI, and consumer GPUs only get whatever production is left over after they've fabbed all the DC/AI parts they can sell, because those are several times as profitable. I'm probably not alone in this belief; it seems kind of obvious that AI is all the rage ATM.

As for the 9070 non-XT, that's the x600X3D of their GPU line; in other words, it's just "bad" Navi 48s that won't run at full clocks and/or have bad CUs, but can be run slower and with fewer CUs. I don't think you'll be seeing a huge supply of them, as the process is pretty mature and it costs the same to produce as the full Navi 48, and who in their right mind isn't going to spend $50 / ~9% more money to get 35% better performance if they have a choice between the two? I believe THIS is why they priced it so close to the XT: it's intentional, because the worst thing that could happen is they end up with more demand than they have throwaway Navi 48s, and have to cripple good ones they could sell for $600 to meet demand for the lower-priced part. So I expect that, like the 7900 XT, the 9070 will drop in price to create a bigger gap once inventory of bad Navi 48s builds up a little, to increase demand.
 
Power Color recommends a 900W psu for their 9700 XT and Asus recommends a 750W psu for theirs so I'm going to guess a 850W is the sweet spot?
1) It's 9070 XT, not 9700 XT.
2) False. Depends on the model, e.g. Reaper has a recommendation of 750W.

Realistically, I'd never hit over 500 W unless I was running power viruses on the CPU and GPU at the same time (and even then only by 50 W max). Yeah, by that point I'd be on the downward slope of the efficiency curve of my 750 W PSU, but nowhere near actual overload.
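For anyone doing this math themselves, here's a minimal sizing sketch; the ~304 W board power, the transient headroom factor, and the load target are all assumptions, so plug in your own parts:

```
# Minimal PSU-sizing sketch along the lines of the estimate above.
# All wattages and factors are placeholder assumptions; substitute your own.
def recommended_psu(gpu_w, cpu_w, rest_w=75, transient_factor=1.2, load_target=0.8):
    """Size a PSU so worst-case draw (plus GPU transient headroom)
    lands at roughly load_target of the unit's rated capacity."""
    peak_draw = gpu_w * transient_factor + cpu_w + rest_w
    return peak_draw / load_target

# Example: ~304 W board-power GPU + 150 W CPU + drives/fans/etc.
print(round(recommended_psu(gpu_w=304, cpu_w=150)))  # ~737 -> a quality 750 W unit fits
```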
 
Probably no Big Navi coming with RDNA 4, but there's some speculation from some on X and on Wccftech that a higher-tier 9080 XT is coming.


 