
AMD Plans Aggressive Price Competition with Radeon RX 9000 Series

Nvidia and AMD are both for-profit companies. It's not in their interest to sell anything at a lower price than market conditions dictate. Good morning, the Easter bunny isn't real.
I know, but they need to offer a product that is mainstream, which AMD said they were targeting: the midrange. And it needs to be positioned so that you would have to be a completely brainwashed idiot to buy the alternative, because the AMD offering is so good. For market share.
It will always be the poor relative in GPUs unless the customer base is large enough that devs have to give as much time to AMD optimisation as to Nvidia.

Nvidia, on the other hand, are one of the most anti-consumer companies on the planet, treat their business partners like shit, and I would gladly move away from them. But they make good products and have no competition.

Are you getting paid to spread this BS around?
The GPUs are the 9070 and 9070 XT, not what your agenda wants to call them.
They're up against the 5070 and 5070 Ti at $100-150 less.

My question is the weird one, and not what you want to spread around. Ok…
Yeah, let's end it. All those cards are grossly overpriced and AMD is not aggressive enough, was my point.
 
Yeah, let's end it. All those cards are grossly overpriced and AMD is not aggressive enough, was my point.
We are going to let benchmarks give the final verdict on how these compare to the previous gen and the competition.
But yeah, I too would like 7900XTX performance with +50% RT for half the MSRP at $500, but here we are.
 
I know, but they need to offer a product that is mainstream, which AMD said they were targeting: the midrange.
What's midrange? I thought there was no definition of it. Therefore, AMD and Nvidia can call anything midrange that they want. Your choice is whether you buy it or not.

And it needs to be positioned so that you would have to be a completely brainwashed idiot to buy the alternative, because the AMD offering is so good. For market share.
$750 for the 5070 Ti, or $600 for the 9070 XT. If they match up in performance, then I'd say you'd have to be brainwashed to buy the 5070 Ti purely for gaming.

It will always be the poor relative in GPUs unless the customer base is large enough that devs have to give as much time to AMD optimisation as to Nvidia.
You keep forgetting about consoles that all run on AMD.

But they make good products and have no competition.
That's just bullshit being spread around by Nvidia fans.

Yeah, let's end it. All those cards are grossly overpriced and AMD is not aggressive enough, was my point.
Why would they be more aggressive? What would be the point?
 
What's midrange? I thought there was no definition of it. Therefore, AMD and Nvidia can call anything midrange that they want. Your choice is whether you buy it or not.


$750 for the 5070 Ti, or $600 for the 9070 XT. If they match up in performance, then I'd say you'd have to be brainwashed to buy the 5070 Ti purely for gaming.


You keep forgetting about consoles that all run on AMD.


That's just bullshit being spread around by Nvidia fans.


Why would they be more aggressive? What would be the point?

Did the 7900 XT at $650, competing with the 4070 Ti Super that was $800+ at the time, make a dent in the market?

Radeon cards sold well when they had the same features, better build quality, better long-term driver support and a ~20% price/perf advantage.
Now on the productivity side they are very limited; people that take advantage of CUDA will go Nvidia out of necessity.
RT is beginning to get forced onto games as the only option, so RT and upscaling performance are no longer something you can brush off.

And according to AMD, the 9070 XT is at best a 7900 XT in raster with 4 GB less memory. Even if it has a lot better RT and upscaling, a $50 discount doesn't look that great: if you didn't buy a 7900 XT at $650, are you really buying a 9070 XT at $600?

The reality of the market is this. If AMD wants to be relevant again they need a huge price advantage. In my opinion, even 30-40% less won't guarantee the market share of 10 years ago.
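
A quick back-of-the-envelope sketch of the price/perf arithmetic being argued over here (Python). The prices are the ones quoted in the thread; the FPS figure is a made-up placeholder for illustration, not a benchmark result:

# Rough price/perf math for the prices discussed above.
# Prices come from the thread; the FPS number is purely hypothetical.

def price_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per average frame per second."""
    return price_usd / avg_fps

# Assume, for illustration only, that all three cards land at the same raster FPS.
assumed_fps = 100.0
cards = {
    "RX 7900 XT (street price per the post above)": 650,
    "RX 9070 XT (rumoured)": 600,
    "RTX 5070 Ti (MSRP)": 750,
}

for name, price in cards.items():
    print(f"{name}: ${price_per_frame(price, assumed_fps):.2f} per fps")

# Relative discounts implied by those prices:
print(f"9070 XT vs 5070 Ti MSRP: {1 - 600 / 750:.0%} cheaper")   # 20%
print(f"9070 XT vs $650 7900 XT: {1 - 600 / 650:.0%} cheaper")   # ~8%

At face value that is the whole disagreement in two numbers: ~20% under the 5070 Ti's MSRP sounds decent, while ~8% under a discounted 7900 XT does not.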
 
Did the 7900 XT at $650, competing with the 4070 Ti Super that was $800+ at the time, make a dent in the market?

Radeon cards sold well when they had the same features, better build quality, better long-term driver support and a ~20% price/perf advantage.
Now on the productivity side they are very limited; people that take advantage of CUDA will go Nvidia out of necessity.
RT is beginning to get forced onto games as the only option, so RT and upscaling performance are no longer something you can brush off.

And according to AMD, the 9070 XT is at best a 7900 XT in raster with 4 GB less memory. Even if it has a lot better RT and upscaling, a $50 discount doesn't look that great: if you didn't buy a 7900 XT at $650, are you really buying a 9070 XT at $600?

The reality of the market is this. If AMD wants to be relevant again they need a huge price advantage. In my opinion, even 30-40% less won't guarantee the market share of 10 years ago.
The 9070XT is more like in between the 7900XT and XTX: closer to the XTX in raster and 20~50% better in RT than the XTX… and FSR4.
And since RT and upscalers are “in” for 2025 and onwards, it’s quite the leap from the 7900XT.
 
The 9070XT is more like in between the 7900XT and XTX: closer to the XTX in raster and 20~50% better in RT than the XTX… and FSR4.
And since RT and upscalers are “in” for 2025 and onwards, it’s quite the leap from the 7900XT.
What is your source for those values? I'm basing my argument on the CES slide from AMD; as far as I know, that's the only real info we have.

How do you assign a % improvement to FSR4? Image quality is intrinsic to its quality, not just FPS.

In the AMD presentation there was no reference to RT performance.

Your entire argument seems based on wishful thinking and not on any info from official or trustworthy sources.

The reality is that we will have to wait till the 28th to have any real info besides what AMD has already told us, and for real performance, till the reviews. Getting hyped on baseless speculation will only lead to a less favourable view of the product when it comes out.
 
I was referring to vanilla drivers, not FSR 4, which could have been shipped later on - just like any non-critical software update.
The "later on" does NOT work.

The majority of reviewers will not do a full-blown re-review "later on".

The verdict on AMD GPUs will be stuck as it was at the moment they were released.

So polishing FSR4 prior to the major launch event is one of the best things to do.

After somehow cracking the OEMs, that is.

Do you really need to log in to an account to update NVIDIA drivers on a laptop? I've just been using the installer here and MSI Afterburner for clock stuff on my desktop system.
If you want a comfortable "one click" experience like with AMD drivers, yes, the goddamn green drivers ask you to log in.

Devs don't bother their ass targeting AMD hardware because it's such a small %.
This is so delusional it hurts.

Merely: Xbox, PS5, PS4, older Xboxes, Steam Deck and 16.5% of the PC gaming market, vs NV's 75%.

Another example: ray traced games run like ass on AMD because AMD's RT hardware is inferior, and not because games are targeted at Nvidia.
Yeah, cool story, bruh:

[attached benchmark screenshots]

A lot of per-vendor optimization is possible. As with raster, a lot of good old shader code runs to support RT, so the gaps range from monumental in pieces of scheisse like Control down to nothing in Metro EE.


And, yeah, "but in many games RT on doesn't make a difference".
Indeed. In many games RT makes only one major difference: an FPS drop.


that devs have to give as much time to AMD optimisation as to Nvidia.
The devs that give "as much time" to optimizing for Filthy Green in non-green-sponsored games, are they in the room with you?
 
The "later on" does NOT work.

The majority of reviewers will not do a full-blown re-review "later on".

The verdict on AMD GPUs will be stuck as it was at the moment they were released.

So polishing FSR4 prior to the major launch event is one of the best things to do.
I don't buy this. It's all a matter of communication.

If AMD could provide compelling price/perf (including good drivers) from the get-go, along with a clear timeline regarding FSR4's release and features (which AFAIK it has been unable to do up to now), then reviewers would be able to recommend the card on the basis of the existing features - and wouldn't forget to mention that FSR4 is a real prospect to make the card even better.

But of course, with not-yet-dismissed people like Frank Azor at the helm of marketing operations, this wouldn't be possible.
 
Yeah, cool story, bruh:

[attached benchmark screenshots]

A lot of per-vendor optimization is possible. As with raster, a lot of good old shader code runs to support RT, so the gaps range from monumental in pieces of scheisse like Control down to nothing in Metro EE.


And, yeah, "but in many games RT on doesn't make a difference".
Indeed. In many games RT makes only one major difference: an FPS drop.
You managed to pick the few rare examples where RT is light enough so that even AMD's hardware works well. Congratulations!

I agree with the rest of your post, though.
 
Starting at $750 for the 9070 XT is not an aggressive price.
Is that official pricing? I know it is for Nvidia, but we all know that's highly unlikely to be the street pricing.
 
Is that official pricing? I know it is for Nvidia, but we all know that's highly unlikely to be the street pricing.
There is, and will be, no official pricing on the 9070 XT until 28th Feb. The latest rumour indicates $600 as the starting price.
 
$750 for the 5070 Ti, or $600 for the 9070 XT. If they match up in performance, then I'd say you'd have to be brainwashed to buy the 5070 Ti purely for gaming.
I agree with the purely for gaming part. Compute and production are pretty important in their own right though, and I'd personally only start considering it at $550. $500 and I'm buying day one no questions asked, especially if the 9070 is $400. But that's just me.

Why would they be more aggressive? What would be the point?
Because Intel has jumped in and said they don't have to be #1 in the market, they just have to be #2. Also to increase market share.
 
I agree with the purely for gaming part. Compute and production are pretty important in their own right though, and I'd personally only start considering it at $550. $500 and I'm buying day one no questions asked, especially if the 9070 is $400. But that's just me.
Do you know how it performs in compute? Do you know if it's powerful enough to be worth $500?

People used to say $50 cheaper than Nvidia is not good enough, we need at least -$100. Now, -$150 is not enough, you want -$250. If you had that, what would you want? -$400? Why not give away GPUs for free? Or even better, with a cashback? This bullshit never ends.

If you rely on CUDA, then no discount will make any AMD card attractive to you ever, so there's no point in complaining about its price.

Because Intel has jumped in and said they don't have to be #1 in the market, they just have to be #2. Also to increase market share.
Market share doesn't mean jack if you're selling at a loss.
 
Do you know how it performs in compute? Do you know if it's powerful enough to be worth $500?

People used to say $50 cheaper than Nvidia is not good enough, we need at least -$100. Now, -$150 is not enough, you want -$250. If you had that, what would you want? -$400? Why not give away GPUs for free? Or even better, with a cashback? This bullshit never ends.

If you rely on CUDA, then no discount will make any AMD card attractive to you ever, so there's no point in complaining about its price.
No, but I do know how well the 7000 series performed in compute for the first ten months after release: 0% of the performance of a GT 1030, because they didn't bother building ROCm for RDNA 3. Even now with ROCm support, when Intel cards can match the performance in a lot of workloads for half the price, there's not a lot of pull for compute.

Giving away GPUs for free or for cashback is bad because you're the product in those cases. Not accepting free stuff from big companies is a given.

Market share doesn't mean jack if you're selling at a loss.
The production costs for the 7800 XT and the 9070 XT should be around the same. They're not selling at a loss unless the 7800 XT has been selling at a loss since release.
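
On the ROCm point a few posts up: a minimal sketch of the kind of availability check that came up empty on RDNA 3 at launch, assuming a ROCm build of PyTorch is installed. The helper name is made up for illustration, not anything from the thread:

import torch

def describe_gpu_backend() -> str:
    """Report which GPU backend this PyTorch build can actually use."""
    if torch.version.hip is not None:      # ROCm/HIP build of PyTorch
        if torch.cuda.is_available():      # ROCm devices are exposed through the torch.cuda API
            return f"ROCm {torch.version.hip}: {torch.cuda.get_device_name(0)}"
        return "ROCm build, but no supported AMD GPU detected"
    if torch.version.cuda is not None:     # CUDA build
        return f"CUDA {torch.version.cuda} build"
    return "CPU-only build"

print(describe_gpu_backend())

On a CPU-only or CUDA-only build, or on a card the ROCm runtime doesn't support (as RDNA 3 was for months), the AMD branch never reports a usable device.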
 
No, but I do know how well the 7000 series performed in compute for the first ten months after release: 0% of the performance of a GT 1030, because they didn't bother building ROCm for RDNA 3. Even now with ROCm support, when Intel cards can match the performance in a lot of workloads for half the price, there's not a lot of pull for compute.
So you're talking about the past, and have zero clue how the 9070 XT will perform. But "it should be cheaper" based on that nonexistent knowledge. Right.

The production costs for the 7800 XT and the 9070 XT should be around the same. They're not selling at a loss unless the 7800 XT has been selling at a loss since release.
And you know that from where exactly?
 
Another Danish retailer has listed an RX 9070 and RX 9070 XT online; they're the Gigabyte Radeon RX 9070 and RX 9070 XT Gaming OC models.

[screenshot of the retailer listing]


Link: https://www.computersalg.dk/i/21878227/gigabyte-grafikkort-16-gb-pcie-4-0

Price for the non-XT is about £845 / $1,066

[screenshot of the retailer listing]

Link: https://www.computersalg.dk/i/21878227/gigabyte-grafikkort-16-gb-pcie-4-0
Price for the XT is about £1,021 / $1,288

I really hope these are placeholder prices, otherwise these AMD cards have to perform or they are gonna flop at these prices :kookoo:
 
I really hope these are placeholder prices, otherwise these AMD cards have to perform or they are gonna flop at these prices :kookoo:
Then again, why would placeholder prices be credible enough to look like actual ones?

All in all, either these are real prices and AMD is too incompetent to price its cards correctly, or these are fake prices and AMD is too incompetent to rein in its resellers.
 
Then again, why would placeholder prices be credible enough to look like actual ones?

All in all, either these are real prices and AMD is too incompetent to price its cards correctly, or these are fake prices and AMD is too incompetent to rein in its resellers.

As Mr. Frank Azor pointed out, the starting price was never going to be $899 USD, but sadly it looks like it might be. And the RX 9070 must punch above expectations if it is to sell for that price in Denmark, because if it performs below an RTX 4080/Super, people will be disappointed.

Personally I was hoping for an RX 9070 XT, but not at the 9k price in Denmark; that's too much for me to justify. The price of the RX 9070 is still high and a little above what I would really love to pay, but I guess the world has changed: now it's not crypto demanding graphics power but AI, still leaving gamers without many options unless you pay more than what's realistic for many.
 
So you're talking about the past, and have zero clue how the 9070 XT will perform. But "it should be cheaper" based on that nonexistent knowledge. Right.
"Ohh no just become an early adopter of our new cards, ROCm support will come out soon this time, promise!"

Yeah nah, not happening.

And you know that from where exactly?
The specs. It's the same card as the 7800 XT with the rest of the cores unlocked on a monolithic 4nm die. Every other component on it is identical.
 
"Ohh no just become an early adopter of our new cards, ROCm support will come out soon this time, promise!"

Yeah nah, not happening.
Still talking about the 7000 series, I assume.

The specs. It's the same card as the 7800 XT with the rest of the cores unlocked on a monolithic 4nm die. Every other component on it is identical.
Navi 32 is a full die with 3840 cores. Navi 48 is a different one with 4096 cores. Navi 32 was made on 5 nm, and can clock in the 2.4 GHz range. Navi 48 is made on TSMC 4N, it can clock up to 3 GHz, and comes with a new RT engine, new AI cores, and who knows what else. If you think they're the same, you're delusional.

Nvidia's recent architectures since Turing have a lot more in common with one another.
 
Still talking about the 7000 series, I assume.
Yeah I'm talking about the 7000 series. I'm not gonna baghold like that this generation for a slight discount. Period.
Navi 32 is a full die with 3840 cores. Navi 48 is a different one with 4096 cores. Navi 32 was made on 5 nm, and can clock in the 2.4 GHz range. Navi 48 is made on TSMC 4N, it can clock up to 3 GHz, and comes with a new RT engine, new AI cores, and who knows what else. If you think they're the same, you're delusional.
I said production costs should be similar, not performance. If you have more info on what the hardware differences are between the cards I'd like to know. R&D is not a production cost by the way.
 
Yeah I'm talking about the 7000 series. I'm not gonna baghold like that this generation for a slight discount. Period.
Then we have nothing to talk about here. The past doesn't interest me with regards to the 9070 XT.

I said production costs should be similar, not performance. If you have more info on what the hardware differences are between the cards I'd like to know. R&D is not a production cost by the way.
You said specs. You can find the latest (rumoured) versions here: 7800 XT and 9070 XT.
 
You said specs. You can find the latest (rumoured) versions here: 7800 XT and 9070 XT.

The rumours I have seen are that the RX 9070 XT is built on the Navi 48 XTX and the RX 9070 on the Navi 48 XT, which is why there ain't a higher-end card from AMD this time around.
 