
AMD RDNA 4 and Radeon RX 9070 Series Unveiled: $549 & $599

I have a 7900 XTX already. I see nothing here to make me upgrade: loss of VRAM, using a 7900 GRE as a comparison, and FSR4 (what little was shown seems good). If Nvidia hadn't had such terrible problems/performance, I would have bought. There's nothing here for me from either camp. This gets better by the day!
 
I have a 7900 XTX already. I see nothing here to make me upgrade: loss of VRAM, using a 7900 GRE as a comparison, and FSR4 (what little was shown seems good). If Nvidia hadn't had such terrible problems/performance, I would have bought. There's nothing here for me from either camp. This gets better by the day!
It is better though, isn't it? Because now you keep your money. Any progress you would buy into is fleeting anyway. I mean, this gen is a complete and utter standstill, unless you consider a 600 W GPU on a 375 W-specced connector at $2.5k progress.
 
I will be upgrading to the 9070 XT 16 GB once the Ollama developers support it.
 
[a lot of true things]
This is why 9070 should be cheaper, and 9070xt should be the same price.

I totally agree.

I *hope* people realize the limitations of <45 TF and 12 GB of RAM, but it really does take certain ways of showing performance for people to understand. You can skew a lot of things depending on the data shown.
Games used; DLSS used or not; minimum frame rates. W1zzard sometimes has a problem in his reviews by choosing not to show these limitations, which is why I've championed a testing overhaul.
For instance, RT minimums, or games that require >12 GB VRAM (especially at 1440p/4K). This is why you shouldn't take his reviews as gospel, even though they have a lot of good information, especially outside frame rates.

If you want more proof of how important these two metrics are, look at how the 9070 will overclock. If it exceeds 45 TF or so, consider my mind blown (as a low-tier product; not for this price).

45 TF / (7168 shaders × 2 FLOPs per clock) ≈ 3139 MHz. Watch for reviews... The point of this is product segmentation. Many games need 45 TF in many common scenarios.
That is why they (and Nvidia) limit cards below this... and in other scenarios... but those aren't important (right now). The next tier is >60 TF and >16 GB. This is why the 5080 needs >16 GB, but doesn't have it.
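If you want to sanity-check that clock figure, the rule of thumb is peak FP32 = lanes × 2 FLOPs per clock (one FMA) × clock. A minimal sketch, assuming the 9070's 7168 dual-issue FP32 lanes (3584 shaders × 2) and the 9070 XT's 8192 lanes at its rated ~2970 MHz boost:

```python
# Peak FP32 rule of thumb: lanes x 2 FLOPs per clock (one FMA) x clock.
# Lane counts assume RDNA4 dual-issue: 3584 x 2 = 7168 (9070),
# 4096 x 2 = 8192 (9070 XT).

def fp32_tflops(lanes: int, clock_mhz: float) -> float:
    """Peak FP32 TFLOPS at a given clock."""
    return lanes * 2 * clock_mhz * 1e6 / 1e12

def clock_for_tflops(lanes: int, target_tf: float) -> float:
    """Clock (MHz) needed to reach a target peak TFLOPS."""
    return target_tf * 1e12 / (lanes * 2) / 1e6

print(clock_for_tflops(7168, 45))  # ~3139 MHz: what the 9070 must sustain for 45 TF
print(fp32_tflops(8192, 2970))     # ~48.7 TF: 9070 XT at its rated boost
```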

With the 9070 XT, things are trickier, because you have to think in terms of actual usable raster performance.

As I've said before, regardless of whether it's the 64 ROPs or simply bandwidth, the 9070 XT is pretty much limited to ~45 TF with 20 Gbps memory. Yes, you can overclock it, and you SHOULD. This card will then pass that threshold.
So does an OC'd 7900 GRE... but we don't talk about that, I guess, even though that is literally its reason to exist. That little tiny bit over the 7800 XT that actually does make a difference.

Which is why the 9070 XT is a good card, especially with decent RT/upscaling (unlike the GRE), and many nVIDIA cards are not. Because nVIDIA (and sometimes AMD) put these purposeful limitations in place per product.
I can show it 37 different ways, but it's true, and these are the 'tiers' of performance. This is why a 9070 XT, even if slower, is as good as anything *up to* something that can exceed 60 TF and/or 16 GB.
The 7800 XT, any 12 GB 4070 SKU, the 5070, and probably the 9070 are all limited to <45 TF, pretty much, one way or another. This is why AMD didn't want people to overclock the GRE, and the 7800 XT is limited below this as well.
The 7900 XT exceeds 16 GB of RAM, but purposely doesn't really exceed 60 TF raster perf. The 5080 can exceed 60 TF in compute, but has only 16 GB. This limitation isn't understood by many for some reason, but it will begin to make sense.
At some point I hope people get it, as it really bears out across a ton of game scenarios; just understand the 9070 XT fills a perfect market niche as cheaply as possible, right now. That niche is 1080p RT / 1440p FSR.
Well, cheap for AMD. Hopefully for us eventually. :p

You can see why either company is so hesitant to do this, because it is the bare minimum for a lot of things long-term. The 4090 is also similar, but a different tier. Astonishingly (but not at all), it is literally double these thresholds:
90 TF/24 GB, where neither is a limitation, and 90 TF makes the most out of 24 GB of RAM.
Will they add features and/or update their software so this is no longer the case at some point? Hmm... gonna say yes. Because that is what they do. It is performance regression; some people just don't understand it.
Now you have to overclock a 4090 to really get to the perf threshold I'm talking about (because of DLSS4; it was 60 fps at stock in games like Wukong with DLSS3), but it will (just barely, on purpose), and that's the scale.
What about the next DLSS? For 24 GB cards just a *little* faster than the 4090, but 256-bit and cheaper for nVIDIA to make. What do you think will happen to the 4090? What has almost, or already has, happened, depending on whether you OC.
Essentially, think 1440p->4K 'quality' upscaling with 60 fps minimums in a game like Wukong. 45 TF/16 GB will probably get you 1080p60 minimums in a game like Wukong, or at least 960p->1440p 'quality' with FSR4.
The next important tier is upscaling 1080p->4K, and that is the point of >60 TF, best matched with >16 GB. Like I say, next-gen 18 GB.
The reason those 2 GB are important is the added RAM hit for things like minimums, the upscaler, RT, and frame gen, versus running pure raster (which is all most cards currently account for).
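For reference, the preset scales behind those resolution pairs: 'quality' typically renders at ~2/3 of the output resolution per axis and 'performance' at 1/2 (the exact ratios vary a little between DLSS and FSR versions, so treat these as approximations):

```python
# Internal render resolution per upscaler preset (per-axis scale factor).
# 2/3 for quality and 1/2 for performance are the common DLSS/FSR
# conventions; exact ratios differ slightly by vendor and version.
PRESETS = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    scale = PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

print(internal_res(3840, 2160, "quality"))      # (2560, 1440): the 1440p -> 4K case
print(internal_res(2560, 1440, "quality"))      # (1707, 960): the ~960p -> 1440p case
print(internal_res(3840, 2160, "performance"))  # (1920, 1080): the 1080p -> 4K tier
```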

This is why some will argue the next-gen 18 GB cards that replace the 5080 are 'amazing progress per card per generation'. The truth is they just fucked you out of the RAM you needed to make the 16 GB 5080 last. :p
Some will see it as 'less of a performance hit', 'better mins', or 'greater performance using those features'. It will not be those things; it will simply expose the limitation they put in place by not giving the 5080 >16 GB.

Other cards will not hit their former thresholds due to other 'feature improvements' that are somehow always just barely enough to relegate old cards to a lower tier. So, so weird how that always happens.
This is why I say AMD should not focus on pure IQ this gen; they should focus on hitting the frames for each feature set/resolution, and improve those things when they can make better options (on 3 nm).

I know a lot of people don't understand this stuff... I truly do. But that's why you read this forum, right? How I explain everything isn't perfect, I know that, but hopefully as things progress you're prepared.

I could post a bazillion links to show this over the lifetime of nVIDIA cards, but it really is a pain in the ass, most don't care/get it, and at some point I hope people just go, "That makes sense".

IOW... trust me, bro. It's easiest to show on the 12 GB nVIDIA cards, but there are many more instances in other cards as well (sometimes more granular and more difficult to explain than pure VRAM limitations).
 
[Attached image: press-deck-19.jpg]


If it's beating the GRE at those margins, that's essentially 7900 XTX/4080/4080S-level perf, or better, no?

If $600 is the magic number, count me in! Scalp me at $700? Still in! But let's be real, these cards are probably going to hover at $800+. AMD clearly took notes from Nvidia's "How to Rob 'Em Silly". Maybe a lucky few will snatch one at $600 in an early drop, but not me; I don't beta test hardware unless it comes with a free stress therapist.
 
This is why 9070 should be cheaper, and 9070xt should be the same price.

I totally agree.

I *hope* people realize the limitations of <45 TF and 12 GB of RAM, but it really does take certain ways of showing performance for people to understand. You can skew a lot of things depending on the data shown.
Games used; DLSS used or not; minimum frame rates. W1zzard sometimes has a problem in his reviews by choosing not to show these limitations, which is why I've championed a testing overhaul.
For instance, RT minimums, or games that require >12 GB VRAM (especially at 1440p/4K). This is why you shouldn't take his reviews as gospel, even though they have a lot of good information, especially outside frame rates.
So much this. I've found as well that my personal experience with games, and how they stress systems, has been a more useful guide than just relying on review numbers. The benches are great, though, and I applaud the way TPU does its reviews and how the testing is shown. It's extremely neutral and free of bias. Sure, it doesn't cover every single in and out. But a review is always a snapshot of the moment; you will barely, if ever, see reviewers cover the 'lifetime of 12 GB VRAM' in depth over time. Nobody does that. At best you get the occasional re-test of a set of GPUs on a very small selection of games, which still doesn't show you the better half of reality. The same applies to 8 GB; that's why Nvidia's still selling that crap.
 
yes, because feature wise, AMD was
No.
Not because "feature wise".

The 3050 outsold the 6600, a card that was:
1) a FULL TIER faster (there goes the bazinga upscaling argument)
2) faster even at the RT gimmick
3) cooler running
4) CHEAPER

Not simply outsold; outsold 4 to 1.

Because lots of buyers are rather clueless.

Stop seeking rationale where there is none.

I will be upgrading to the 9070 XT 16 GB once the Ollama developers support it.
Ollama developers support ROCm, I thought.
So your point should sound like "once ROCm supports it".

Which it should.

If it's beating the GRE at those margins, that's essentially 7900 XTX/4080/4080S-level perf, or better, no?
"Up to" 40%-ish. From 20%-ish.
 
Ollama developers support ROCm, I thought.
So your point should sound like "once ROCm supports it".

Yes and no. My current 6750 XT is not "supported" by Ollama but runs using ROCm with an additional flag. Ollama is actively working on direct support for more AMD cards. Direct software support means faster TPS for a lot of their models.
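For anyone curious, the "additional flag" for cards like the 6750 XT (gfx1031) is usually the HSA_OVERRIDE_GFX_VERSION environment variable, which tells ROCm to treat the GPU as the officially supported gfx1030. A minimal sketch of the workaround (whether it's still needed depends on your card and your ROCm/Ollama versions):

```python
# Common workaround for "unsupported" RDNA2 consumer cards under ROCm:
# spoof the GPU target (gfx1031 -> gfx1030) before starting the Ollama
# server. From a shell this is just:
#   HSA_OVERRIDE_GFX_VERSION=10.3.0 ollama serve
import os
import subprocess

env = os.environ.copy()
env["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"  # report the die as gfx1030

subprocess.run(["ollama", "serve"], env=env)  # server inherits the override
```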
 
Ollama developers support ROCm, I thought.
So your point should sound like "once ROCm supports it".

Which it should.
ROCm has no support for most consumer GPUs; you need workarounds to get it to work, which is what most folks and even developers have to do.
So yeah, it's a matter of Ollama supporting the new generation, not only ROCm.
 
So AMD partially learned and priced the 9070 XT at $599 ($549 would have been better, but OK, this isn't unreasonable). But then they go and price the 9070 at $549. A mere 8% less, but we can already see it has 12.5% fewer CUs (56 vs 64), and at default power limits it draws 28% less than the 9070 XT (220 W vs 304 W). This is going to be a 15-20% slower card at only 8% lower cost. It's going to get all the same bad press the 7900 XT did at launch and inevitably drop in price a few months from now, but not until it gets called "garbage value" by the review press.
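Spelling out those deltas from the announced numbers ($549 vs $599, 56 vs 64 CUs, 220 W vs 304 W):

```python
# 9070 vs 9070 XT deltas, from the announced specs.
def pct_less(a: float, b: float) -> float:
    """How much smaller a is than b, in percent."""
    return (1 - a / b) * 100

print(f"price: {pct_less(549, 599):.1f}% less")   # ~8.3%
print(f"CUs:   {pct_less(56, 64):.1f}% fewer")    # 12.5%
print(f"power: {pct_less(220, 304):.1f}% lower")  # ~27.6%
```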
 
Which competed with the even worse 4060 and 4060 Ti; what's your point?
Two wrongs do not make a right. If you are going to criticize Nvidia for something, it only makes sense that you criticize AMD when they do the same thing. Otherwise your whole argument starts to devolve into "my side good, your side bad", which isn't debate, it's blind fanboyism.
I know you and the rest of the GeForce buyers want AMD to hand out 9070 XTs for free, though.
You should check the sig before making a further fool of yourself.
You say that, yet you post things like this. I expect the hyperbolic childish BS from Reddit, not here.
Hyperbole.
 
So AMD partially learned and priced the 9070 XT at $599 ($549 would have been better, but OK, this isn't unreasonable). But then they go and price the 9070 at $549. A mere 8% less, but we can already see it has 12.5% fewer CUs (56 vs 64), and at default power limits it draws 28% less than the 9070 XT (220 W vs 304 W). This is going to be a 15-20% slower card at only 8% lower cost. It's going to get all the same bad press the 7900 XT did at launch and inevitably drop in price a few months from now, but not until it gets called "garbage value" by the review press.
I can imagine the discussion at AMD HQ went like this.

"We really need the most competitive offering for the XT"
"Well something's gotta give, 599 is going to eat into margins bigtime"
"Perhaps we'll recoup some of the lost margin on the XT, by pricing the non XT slightly closer to it? Market should eat up any cards we throw at it in this performance/price segment... right? Stores can set the gap straight"
 
I can imagine the discussion at AMD HQ went like this.

"We really need the most competitive offering for the XT"
"Well something's gotta give, 599 is going to eat into margins bigtime"
"Perhaps we'll recoup some of the lost margin on the XT, by pricing the non XT slightly closer to it? Market should eat up any cards we throw at it in this performance/price segment... right? Stores can set the gap straight"
I think it's more likely the suits in charge are incredibly slow to learn, and figure that if they just do it again like they did last time, then THIS time it will definitely work!
 
I think it's more likely the suits in charge are incredibly slow to learn, and figure that if they just do it again like they did last time, then THIS time it will definitely work!
Have I told you the definition of insanity?
 
I think it's more likely the suits in charge are incredibly slow to learn, and figure that if they just do it again like they did last time, then THIS time it will definitely work!
Well, it's not really the same thing, is it? The 9070 XT has the perfect price, especially if/since AMD gets a decent margin on it. And it's not on the level of a 7900 XTX this time, which is unobtainium for most.
 
The trajectory of FSR is fast becoming similar to that of DLSS, which makes me an instant non-supporter. If you can deploy this tech GPU-agnostic (to a reasonable degree; say, all DX12 feature-level cards, for example), by all means. If not? GTFO. I don't need it, don't want it, won't use it. End of. History only repeats; we've been here before, and we know how it ends.
I've said it before, but it seems that each ML-based upscaling tech requires a specific set of hardware to run, and you've got to convince the competition not just to support the software, but to design their hardware around it as well. Intel can't make XeSS run on Nvidia or AMD AI accelerators, so other GPUs end up with an inferior version.

So far Microsoft has only done a half-assed ML upscaler that doesn't compare, and it's still limited to Snapdragon.
 
I've said it before, but it seems that each ML-based upscaling tech requires a specific set of hardware to run, and you've got to convince the competition not just to support the software, but to design their hardware around it as well. Intel can't make XeSS run on Nvidia or AMD AI accelerators, so other GPUs end up with an inferior version.

So far Microsoft has only done a half-assed ML upscaler that doesn't compare, and it's still limited to Snapdragon.
There's just no effort put into it. But they all do the exact same thing, and they are likely to become ever more similar as we go forward. All these companies achieve by doing it separately is burning cash. Consensus takes time.
 
ROCm has no support for most consumer GPUs; you need workarounds to get it to work, which is what most folks and even developers have to do.
So yeah, it's a matter of Ollama supporting the new generation, not only ROCm.
Yeah, from what I read, ROCm is very HPC-focused, while HIP is supposed to be the "option of choice" for consumer-level Radeon.
 
$599 for the XT is amazing (provided there are models sold at this MSRP). $549 for the non-XT, though? Why even bother?
Why even bother? Well, potential buyers of the vanilla 5070 will be asking themselves serious questions when they realize that the 9070 has significantly better hardware than the 5070, for the same price.

And then many of them will also realize that the 9070 XT is even better for $50-70 more. So, the 9070 has a role to play here.
 
I'll give my input (you won't like it, frankly): the XT price seems "fine"-ish, but only in comparison to the 5070 Ti. However, the non-XT version is too pricey. By AMD's own slides, it's about 16% less powerful, but only 8.5% cheaper. That's unacceptable, even in the direst GPU market conditions. Except this is intentional: it's the 7900 XT all over again, just to upsell the XTX. Or the 7700 XT to upsell the 7800 XT.

Also, let's not forget that these MSRPs set by AMD are as fake as nVidia's. Notably, AMD this time is not making any MBA cards, meaning the MSRP is utter BS, as AIBs are about to charge whatever prices they want. And knowing how well AMD's partners stick to MSRP, and their QA, Sapphire, TUL, and XFX will basically end up $50-$150 on top of MSRP, before VAT/tariffs, even for the most basic close-to-reference-design cards. The "GPU vendor agnostic/neutral" AIBs like Asus and Gigabyte will end up with an even bigger price hike, for even more inferior solutions.

Also, I don't get everyone's hysteria and euphoria while neither real performance reviews have been seen nor the real, final AIB prices set. It's clear that any company's slides, be it AMD, Intel, or nVidia, are about as good and trustworthy as toilet paper.

Happy with the prices; I was worried they were going for $699.
It will! Do not doubt it for even a second.
Hopefully this will shake Nvidia's ridiculous pricing up.
How will it shake anything? Even if AMD somehow "keeps" the AIB prices within MSRP while having enough supply... they simply do not have enough supply to "eclipse" nVidia's quickly dwindling previous-gen stock. Once (if!?) nVidia fixes their issues with the RTX 5000 series, AMD will once again end up in a dire situation. Simply because AMD has lost the perfect "window", a gap to fill, by not releasing the cards earlier, before nVidia, while nVidia had basically a "paper" launch and essentially no stock. AMD, on the other hand, reportedly had stock gathering dust since as early as late December. But they waited for nVidia's final pricing on both the 5070 Ti and the 5070.
$599 for the XT is amazing (provided there are models sold at this MSRP). $549 for the non-XT, though? Why even bother?
Exactly! A lame and lazy upsell. Who knows, maybe the 9070 XT has more sufficient/less defective dies, so its supply is even better than the non-XT's. Or... this is simply just another cash grab.
What makes you think these launch at MSRP as well?
Indeed. People have too much hope and cope. Over at least the last five to six years, people could have built up experience regarding the GPU companies' pricing tactics/strategies. How much more do people need to suffer to understand this obvious thing? AMD just pulled an "nVidia move" and "unleashed" themselves, shifting the "guilt" over gouging/overpricing to the AIBs instead. Who's gonna know how much of the end price goes directly to AMD? The companies will just find another BS excuse, like "these cards are expensive to make...", "the foundry time is expensive...", etc.
I have to agree; just spend the extra $50 at that point. AMD should have made it $499 for the non-XT. The 9070 XT is going to be sold out for a while, I expect, so I guess a lot of people will be impatient and buy the non-XT anyway.

If I could get a 9070 XT in hand at the $599 MSRP, I'd gladly sell my 7900 XT. I'm not selling until I have it in hand and have tested that it works well, though; I've learned my lesson on supply and demand once too many times.
Yep! $500 for the 9070 would be a nicer price, if the performance claims are real.

However, the 9070 is much superior to the 7900 XT, not only due to the fake frames of FSR4, but due to potentially superior encoding/decoding capabilities and presumably better power efficiency, regardless of TBP and power connector count. Also, the AI accelerator count is bigger, which makes it much more desirable for the AI crowd, which BTW (much like with crypto) is a crucial factor in (predatory) price formation.
 
I have a 7900 XTX already. I see nothing here to make me upgrade: loss of VRAM, using a 7900 GRE as a comparison, and FSR4 (what little was shown seems good). If Nvidia hadn't had such terrible problems/performance, I would have bought. There's nothing here for me from either camp. This gets better by the day!
You are 18 months late to this discussion. We have known for over a year that AMD was not going to produce a top die this generation, as the Navi 41 project with 9 GCDs was too complicated to develop in time for the RDNA4 launch. That project should come back with the UDNA architecture, so you can upgrade next gen, like me and many other 7900 XTX owners.
 
Yeah, from what I read, ROCm is very HPC-focused, while HIP is supposed to be the "option of choice" for consumer-level Radeon.
HIP is part of the ROCm stack. Think of it as a small component within an ecosystem, similar to something like cuBLAS within CUDA.
ROCm itself gives second-grade support to consumer GPUs; their main focus is indeed on CDNA (and even that is far from good).

The move to UDNA should improve that by quite a bit.
 
I can imagine the discussion at AMD HQ went like this.

"We really need the most competitive offering for the XT"
"Well something's gotta give, 599 is going to eat into margins bigtime"
"Perhaps we'll recoup some of the lost margin on the XT, by pricing the non XT slightly closer to it? Market should eat up any cards we throw at it in this performance/price segment... right? Stores can set the gap straight"
The 9070 costs AMD the same to produce as the 9070 XT: same die, same memory config. The 9070 only exists so AMD can harvest some defective Navi 48 dies, but TSMC still charges AMD the same (you pay by wafer, not by die). So if AMD isn't getting a lot of defective Navi 48 dies (which is likely the case, given the relatively mature process node and the reasonable 357 mm² die size), then AMD likely doesn't want to sell many 9070s when it could sell 9070 XTs that cost the same to produce for more money. And that is how they have priced both products: the 9070 won't be overly popular, so low demand will match low supply, and the 9070 XT will be popular and will be where most of the supply is.

I don't expect to see a price drop for the 9070 relative to the 9070 XT. AMD might drop the price of both in the future depending on supply and demand, but the 9070 is deliberately priced to be less appealing than the 9070 XT; it isn't because AMD is clueless. It is the exact same thing Apple does with all its iPads: create a pricing ladder that encourages you to reach for the next model, because the lower tier offers disproportionately less compared to the price differential.
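A back-of-envelope sketch of why the harvesting math works out this way; the dies-per-wafer approximation is the standard one, but the defect density is an illustrative guess, not a known TSMC figure:

```python
# Rough wafer economics for a 357 mm^2 die on a 300 mm wafer. The defect
# density (D0) below is an illustrative assumption, not a published number.
import math

def dies_per_wafer(die_mm2: float, wafer_mm: float = 300) -> int:
    """Standard approximation: gross wafer area minus edge loss."""
    r = wafer_mm / 2
    return int(math.pi * r**2 / die_mm2 - math.pi * wafer_mm / math.sqrt(2 * die_mm2))

def yield_rate(die_mm2: float, d0_per_cm2: float) -> float:
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-d0_per_cm2 * die_mm2 / 100)

gross = dies_per_wafer(357)           # ~162 candidate dies per wafer
good = gross * yield_rate(357, 0.05)  # ~136 fully working dies at D0 = 0.05
print(gross, round(good))
# The remaining ~26 defective dies are 9070 candidates; the wafer bill
# is the same either way, which is the point of harvesting.
```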
 
The 9070 costs AMD the same to produce as the 9070 XT: same die, same memory config. The 9070 only exists so AMD can harvest some defective Navi 48 dies, but TSMC still charges AMD the same (you pay by wafer, not by die). So if AMD isn't getting a lot of defective Navi 48 dies (which is likely the case, given the relatively mature process node and the reasonable 357 mm² die size), then AMD likely doesn't want to sell many 9070s when it could sell 9070 XTs that cost the same to produce for more money. And that is how they have priced both products: the 9070 won't be overly popular, so low demand will match low supply, and the 9070 XT will be popular and will be where most of the supply is.

I don't expect to see a price drop for the 9070 relative to the 9070 XT. AMD might drop the price of both in the future depending on supply and demand, but the 9070 is deliberately priced to be less appealing than the 9070 XT; it isn't because AMD is clueless. It is the exact same thing Apple does with all its iPads: create a pricing ladder that encourages you to reach for the next model, because the lower tier offers disproportionately less compared to the price differential.

In the future, there will probably be a price drop. Unless there are barely any defective dies, I imagine that at only a $50 difference, 9070 stock will start to build. Then, in 8 months, everyone who wanted the 9070 XT will have it, and they'll cut the 9070 down to $500 or even $450 (if that's still profitable) to move the volume and keep selling Navi 48, albeit at lower margins.
 