
AMD Denies Radeon RX 9070 XT $899 USD Starting Price Point Rumors

As usual, the most obvious explanation proves correct. AMD had an over-optimistic pricing scheme at the outset, then panicked when Nvidia announced the MSRP on Blackwell. That much has been pretty clear from the outset. The only remaining question was, "so how delusional were AMD's intended prices?" Padded-room levels of delusional, it turns out.

Whatever you want to say about the products specifically, or the market generally, it's clownish for AMD to spend the last year making a big deal about how they're bowing out of the high end and then turn around and price their next GPU at just ever so slightly less than their previous "high end" cards. Given that context, even $750 would look goofy, irrespective of the competition's pricing.

I don't have a lot of sympathy for the "AMD can't just give their products away" defense. Profit margins are sky high on these chips. If AMD wants to continue their decline into total irrelevance on the GPU market, then by all means, cling to those profit margins. But cutting prices to move product seems like a much smarter play. AMD doesn't even have the excuse that all these consumer GPUs could have gone for a much higher price in the enterprise market. From that perspective, I'm almost coming around to the idea that AMD is greedier than Nvidia.
”From that perspective, I'm almost coming around to the idea that AMD is greedier than Nvidia.”

What? Nvidia's gross margins are 50 percent higher than AMD's or Apple's. Look it up.
IMO it is not possible for a company to be greedier than one with a significantly higher gross margin…
 
”From that perspective, I'm almost coming around to the idea that AMD is greedier than Nvidia.”

What? Nvidia's gross margins are 50 percent higher than AMD's or Apple's. Look it up.
IMO it is not possible for a company to be greedier than one with a significantly higher gross margin…
Nvidia isn't greedy because it offers last gen performance for last gen price - no price increase is a positive thing, right? AMD is greedy because... well... the same. They should decrease prices, or offer more performance, obviously. :oops:

People will say anything to justify buying Nvidia. It boggles the mind.
 
No, they do not. Not for the past 10 years.
RDNA 2 is the generation that put AMD back onto the map after GCN. You can disagree with me all you want, but I won't budge on this one.

Yeah, RDNA 2 was the last time AMD had a full stack with competitive perf and sensible pricing/timing, more or less from top to bottom--or it would have been, if not for the crypto-mining boom. RDNA 2 was also the first time AMD had a full stack scheme with all of the above characteristics in many years prior (since the 200 series in ~2013?). Obviously I can't say how things would have played out without the crypto/COVID mess, but RDNA2 sure looked like a genuine contender on paper, a serious product launch by a serious GPU company.

That was then. It feels like AMD is still punch drunk from that bit of bad luck in 2021. "We tried to do things properly for once, and God smote us, so let's do all sorts of random shit instead!" Of course, COVID wasn't the only bit of bad luck. Their chiplet strategy didn't pan out as well as they'd hoped, either.

I'm clearly more pro-AMD than JustBenching here. I do think AMD makes good products. Unfortunately, everything surrounding those products ranges from mediocre to dumpster fire. What's so frustrating about these discussions is that on the one hand, despite its second-place status, AMD is still one of the most high-tech organizations in the history of human endeavor, staffed with world class engineers. On the other hand, their public behavior is goofy enough to make you forget that fact on a near daily basis. It's as if someone handed a toddler the keys to a Ferrari.

What? Nvidia's gross margins are 50 percent higher than AMD's or Apple's. Look it up.
IMO it is not possible for a company to be greedier than one with a significantly higher gross margin…
The point is that Nvidia at least has the excuse that their consumer GPUs could have been sold for much more on the enterprise market. I don't quite subscribe to the popular notion that Nvidia's consumer GPUs represent "charity," but a case could be made. AMD has no such excuse.
 
This sounds interesting. I vaguely remember hearing something about Forza using Ray Tracing for their audio.
Bwhahahaa ray traced audio. Imagine that. What's next, AI shoelaces?
 
Yeah, RDNA 2 was the last time AMD had a full stack with competitive perf and sensible pricing/timing, more or less from top to bottom--or it would have been, if not for the crypto-mining boom. RDNA 2 was also the first time AMD had a full stack scheme with all of the above characteristics in many years prior (since the 200 series in ~2013?). Obviously I can't say how things would have played out without the crypto/COVID mess, but RDNA2 sure looked like a genuine contender on paper, a serious product launch by a serious GPU company.

That was then. It feels like AMD is still punch drunk from that bit of bad luck in 2021. "We tried to do things properly for once, and God smote us, so let's do all sorts of random shit instead!" Of course, COVID wasn't the only bit of bad luck. Their chiplet strategy didn't pan out as well as they'd hoped, either.
Chiplets are all about cost savings: when a die turns out defective, you throw away a small chiplet instead of a big monolithic chip - nothing else. Whoever expected AMD to introduce chiplets into the GPU space and dominate Nvidia a million times over lives in la-la land. Sure, it didn't work out because they couldn't divide the compute die into smaller chunks, but at least they tried something new, which deserves a point in my books.
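The yield argument can be sketched with the classic Poisson defect model (a rough illustration only; the defect density and die sizes below are assumptions, not AMD's real numbers):

```python
import math

def die_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: probability a die has zero defects."""
    return math.exp(-area_mm2 * defects_per_mm2)

D0 = 0.002  # defects per mm^2 -- illustrative assumption

# One monolithic 500 mm^2 die vs. the same silicon as five 100 mm^2 chiplets.
mono = die_yield(500, D0)
chiplet = die_yield(100, D0)

print(f"monolithic 500 mm^2 die yield:  {mono:.1%}")    # ~36.8%
print(f"per-chiplet 100 mm^2 die yield: {chiplet:.1%}") # ~81.9%
```

Smaller dice waste less silicon per defect, which is exactly the cost-saving argument above; the trade-off is the extra packaging work and inter-die latency discussed elsewhere in the thread.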

I'm clearly more pro-AMD than JustBenching here. I do think AMD makes good products. Unfortunately, everything surrounding those products ranges from mediocre to dumpster fire. What's so frustrating about these discussions is that on the one hand, despite its second-place status, AMD is still one of the most high-tech organizations in the history of human endeavor, staffed with world class engineers. On the other hand, their public behavior is goofy enough to make you forget that fact on a near daily basis. It's as if someone handed a toddler the keys to a Ferrari.
What I don't understand is if we have people like yourself saying that AMD makes good products, then why on Earth do we have to talk about their "public behaviour"? Yes, they're goofy, yes they're awkward on stage, their marketing is f*ed up beyond measure, but who the heck cares if the product is good? I pay to be able to play games, not to watch a CEO waffle some shit in front of a bunch of journalists.

The point is that Nvidia at least has the excuse that their consumer GPUs could have been sold for much more on the enterprise market. I don't quite subscribe to the popular notion that Nvidia's consumer GPUs represent "charity," but a case could be made. AMD has no such excuse.
That's not a case. The consumer and enterprise markets are entirely different. Besides, AMD has enterprise stuff as well. I've heard their MI Instinct stuff isn't half bad (I don't know, I've just heard).
 
Yeah, RDNA 2 was the last time AMD had a full stack with competitive perf and sensible pricing/timing, more or less from top to bottom--or it would have been, if not for the crypto-mining boom. RDNA 2 was also the first time AMD had a full stack scheme with all of the above characteristics in many years prior (since the 200 series in ~2013?). Obviously I can't say how things would have played out without the crypto/COVID mess, but RDNA2 sure looked like a genuine contender on paper, a serious product launch by a serious GPU company.

That was then. It feels like AMD is still punch drunk from that bit of bad luck in 2021. "We tried to do things properly for once, and God smote us, so let's do all sorts of random shit instead!" Of course, COVID wasn't the only bit of bad luck. Their chiplet strategy didn't pan out as well as they'd hoped, either.

I'm clearly more pro-AMD than JustBenching here. I do think AMD makes good products. Unfortunately, everything surrounding those products ranges from mediocre to dumpster fire. What's so frustrating about these discussions is that on the one hand, despite its second-place status, AMD is still one of the most high-tech organizations in the history of human endeavor, staffed with world class engineers. On the other hand, their public behavior is goofy enough to make you forget that fact on a near daily basis. It's as if someone handed a toddler the keys to a Ferrari.


The point is that Nvidia at least has the excuse that their consumer GPUs could have been sold for much more on the enterprise market. I don't quite subscribe to the popular notion that Nvidia's consumer GPUs represent "charity," but a case could be made. AMD has no such excuse.
I wouldn't say they were competitive from top to bottom with RDNA 2, but surely they had some really compelling options. Especially the 6800 (non-XT) was a killer product. It cost more than the 3070, but I'd gladly pay it for 2x the VRAM and faster raster. Mining killed their momentum, though.
 
Chiplets are all about cost savings: when a die turns out defective, you throw away a small chiplet instead of a big monolithic chip - nothing else. Whoever expected AMD to introduce chiplets into the GPU space and dominate Nvidia a million times over lives in la-la land. Sure, it didn't work out because they couldn't divide the compute die into smaller chunks, but at least they tried something new, which deserves a point in my books.
I can't seem to find the article I read, but AMD mentioned how it provides a lot of flexibility to tailor chips (or configurations) to use cases. Now they seemingly have communication between CCDs on the Halo chip, and with DeepSeek taking advantage of parallelization (something CUDA apparently can't do) and neural rendering on the horizon, I'm interested to see what they have planned for their next architecture.
 
What I don't understand is if we have people like yourself saying that AMD makes good products, then why on Earth do we have to talk about their "public behaviour"? Yes, they're goofy, yes they're awkward on stage, their marketing is f*ed up beyond measure, but who the heck cares if the product is good? I pay to be able to play games, not to watch a CEO waffle some shit in front of a bunch of journalists.
We all have an interest in healthy competition. I want AMD to stop stepping on its own dick. What I don't understand is why you'd defend them when they make avoidable errors. "B-But Nvidia is bad too" doesn't move me at this point. I have dozens of posts shitting on Nvidia. I'm a Linux cultist, after all. But AMD is not some battered wife or tiny baby in need of coddling. It is a multi-billion-dollar high-tech company with some of the world's most brilliant engineers, who, by the way, are probably more annoyed than I am with corporate/marketing's mistakes.

That's not a case. The consumer and enterprise markets are entirely different. Besides, AMD has enterprise stuff as well. I've heard their MI Instinct stuff isn't half bad (I don't know, I've just heard).
AMD uses different chips for enterprise vs consumer. Nvidia does not. That is the point. AMD will be merging their architecture with the upcoming UDNA. Hopefully that will be successful.

I wouldn't say they were competitive from top to bottom with RDNA 2, but surely they had some really compelling options. Especially the 6800 (non-XT) was a killer product. It cost more than the 3070, but I'd gladly pay it for 2x the VRAM and faster raster. Mining killed their momentum, though.
Yeah, banger of a card. I have one. There still isn't a compelling upgrade option for it, and it looks like there won't be for some years to come. Not that I'm in the market anyway. I don't play demanding games often enough to care. Frankly every time I look at the AAA space, I wonder why anyone does.
 
I can't seem to find the article I read, but AMD mentioned how it provides a lot of flexibility to tailor chips (or configurations) to use cases. Now they seemingly have communication between CCDs on the Halo chip, and with DeepSeek taking advantage of parallelization (something CUDA apparently can't do) and neural rendering on the horizon, I'm interested to see what they have planned for their next architecture.
It provides a lot of flexibility on CPUs that don't need a super tight latency between cores. Not so much on GPUs. That's why their CPU business is booming while RDNA 3 as a whole generation was just a bit meh despite all the money they pumped into making chiplets work.

We all have an interest in healthy competition. I want AMD to stop stepping on its own dick. What I don't understand is why you'd defend them when they make avoidable errors. "B-But Nvidia is bad too" doesn't move me at this point. I have dozens of posts shitting on Nvidia. I'm a Linux cultist, after all. But AMD is not some battered wife or tiny baby in need of coddling. It is a multi-billion-dollar high-tech company with some of the world's most brilliant engineers, who, by the way, are probably more annoyed than I am with corporate/marketing's mistakes.
I'm not defending the company. I'm defending their products which you yourself admitted were good. That's all I care about - products. Dr Su herself could come on stage during the next show and say that RDNA 4 is shit and no one should buy it, but I don't care. If it's good, I'll buy it. Marketing is for idiots.

Same goes for Nvidia (despite all the crap I've given them lately) - if they produce something solid, they'll get my vote (again). The problem is that they don't. They keep pushing the same architecture again and again for a higher price, hidden behind more smoke and mirrors with every gen. AMD is at least trying (even if they fail sometimes), but Nvidia clearly doesn't give a crap about gaming.

Oh, and I'm a (recent) Linux cultist as well thanks to Bazzite. :)

AMD uses different chips for enterprise vs consumer. Nvidia does not. That is the point. AMD will be merging their architecture with the upcoming UDNA. Hopefully that will be successful.
That's a fair point.
 
[Took this from another post and migrated it here, because it was a bit off topic in the other thread]

The price difference between the cheapest 7900xtx & 4080Super/5080 is pretty big right now in my region (VAT included):

The cheapest 4080 Super is 1132 € (and steadily increasing price)

The cheapest 5080 will be 1169€

Currently the (almost) cheapest 7900 XTX is 899 €


There is now a 230 € price difference between the cheapest 4080 Super/5080 and the 7900 XTX (where three months ago it was a 100 € difference)

Seems like Nvidia really doesn't care about the 7900 XTX, or they want to signal AMD with "Hey, you don't want to increase P/P this gen, right? Just price your next card linearly (P/P) against your old stack (7000 series)"

Either way, this gen looks rather bad, pretty sad. Hopefully I am wrong. *pls*

Edit: Also, the 899 € price makes absolutely no sense if the 7900 XTX costs the same right now, unless they value FSR4 & lower power draw highly
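For reference, the premiums implied by the quoted listings work out as follows (trivial arithmetic on the prices above):

```python
# Lowest listings quoted above (EUR, VAT included).
prices = {"RX 7900 XTX": 899, "RTX 4080 Super": 1132, "RTX 5080": 1169}

base = prices["RX 7900 XTX"]
premiums = {card: price - base for card, price in prices.items() if price != base}
for card, delta in premiums.items():
    print(f"{card}: +{delta} € over the 7900 XTX")  # +233 € and +270 €
```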
 
It provides a lot of flexibility on CPUs that don't need a super tight latency between cores. Not so much on GPUs. That's why their CPU business is booming while RDNA 3 as a whole generation was just a bit meh despite all the money they pumped into making chiplets work.


I'm not defending the company. I'm defending their products which you yourself admitted were good. That's all I care about - products. Dr Su herself could come on stage during the next show and say that RDNA 4 is shit and no one should buy it, but I don't care. If it's good, I'll buy it. Marketing is for idiots.

Same goes for Nvidia (despite all the crap I've given them lately) - if they produce something solid, they'll get my vote (again). The problem is that they don't. They keep pushing the same architecture again and again for a higher price, hidden behind more smoke and mirrors with every gen. AMD is at least trying (even if they fail sometimes), but Nvidia clearly doesn't give a crap about gaming.

Oh, and I'm a (recent) Linux cultist as well thanks to Bazzite. :)


That's a fair point.
Chiplets are good for yields and reducing costs, but they also allow scaling to higher performance than a single die: see the MI300X. As for latency, GPUs actually don't have better core to core latency than CPUs.

[attachment: GPU core-to-core latency chart]

Contrast GPUs' core to core latency with CPUs: latencies range from sub 20 ns to 80 ns for Zen 4. On the same CCD, latencies are much lower than GPUs could ever dream of.
[attachment: Zen 4 core-to-core latency chart]
 
Chiplets are good for yields and reducing costs, but they also allow scaling to higher performance than a single die: see the MI300X. As for latency, GPUs actually don't have better core to core latency than CPUs.

Contrast GPUs' core to core latency with CPUs: latencies range from sub 20 ns to 80 ns for Zen 4. On the same CCD, latencies are much lower than GPUs could ever dream of.
I see. But what about splitting the GPU into chiplets? As far as I recall, AMD experimented with that on RDNA 3, and that introduced the unwanted latency. That's why they ended up cutting only the cache / memory controller off of the main die in the end.
 
I see. But what about splitting the GPU into chiplets? As far as I recall, AMD experimented with that on RDNA 3, and that introduced the unwanted latency. That's why they ended up cutting only the cache / memory controller off of the main die in the end.
That apparent increase was an artifact of AMD's aggressive power saving on the GCD-to-MCD link. There's a latency difference, but it isn't as dramatic as initial tests suggested.

We see a similar pattern with vector accesses, where measured Infinity Cache latency dropped from 199 ns to 150.3 ns with the card in a higher power state.

For the monolithic 7600, power saving didn't impact it as badly.

Vector accesses show similar behavior, with just a 9.5% Infinity Cache latency difference depending on whether the Infinity Fabric was in power saving state.
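Working through the quoted figures as a quick sanity check (the 199 ns / 150.3 ns and ~9.5% numbers are from the excerpts above):

```python
# Chiplet 7900-class card: Infinity Cache vector-access latency by power state.
low_power_ns, high_power_ns = 199.0, 150.3

penalty = (low_power_ns - high_power_ns) / high_power_ns
print(f"chiplet power-saving latency penalty: {penalty:.1%}")  # ~32.4%

# The monolithic RX 7600 reportedly saw only ~9.5% under the same test,
# so most of the measured gap was fabric power management, not the
# chiplet link itself.
```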
 
When AMD does have a design win, it sells well, but Nvidia still outsells them and AMD continues to lose market share. I don't know how much consistency alone would help Radeon; they would have to offer something Nvidia doesn't to attract customers, since Nvidia has been successful in selling its software at a significant premium. Before RDNA, AMD did have consistency, and it didn't seem to help them enough.
However, the CPU market isn't the same as the GPU market. AMD would still have to overcome Nvidia's proprietary software features and compete on gaming when most of those features run better on Nvidia hardware. AMD would have to launch faster cards than Nvidia for several generations, with better RT and upscaling, to win over the market the way Ryzen has. AMD would also have to shake the old stigma of bad drivers and the complaints of their cards being too expensive. AMD always has to undercut to the point of having low margins, and there can't be consistency when there is likely barely enough R&D budget to develop the next product.
Nvidia also gets away with a lot of anti-consumer things like GPP, handing game devs piles of money to develop games with features only available to Nvidia users, marketing lies like saying the 5070 is faster than a 4090, or the MSRPs being completely fake, as the MSRP only applies to the FE cards, which are usually purposely supply-limited.
AMD has always been seen as the underdog, even though their products have been pretty good in recent years.
But they also need more capacity to build the AI GPU clusters (supercomputers) that companies use, because they do well on that front as far as I know.
The info from within AMD has dried up for me, as the friend who worked there has retired; he knew every secret and never told any of it.
That of course did not stop me from trying to get the latest and greatest, but he always waited until the secret had already leaked :laugh:
 