
AMD Mentions Sub-$700 Pricing for Radeon RX 9070 GPU Series, Looks Like NV Minus $50 Again

Absolutely nobody should believe leaks. Nor should they believe marketing. "5070 = 4090" anyone?

The assumption should be that, at BEST, the 9070 XT will match a 7900 XT in raster and a 7900 GRE in RT, given AMD's previous promises with RT performance. Anything above that is candy.
We shouldn't discount an efficiency bump because they've gone back to monolithic; there is no doubt in my mind that chiplets introduced latency. It'll be a 7900 XT match, almost certainly, and likely a speck more in RT, perhaps even in raster.
 
AMD wasn't ready in January, but now they're finally prepared to disappoint everyone!
I bet $679
 
If you say so :roll:

Leaks are not marketing. Countless "leaks" have been right, especially close to launch. A few examples here.

Countless leaks have also been wrong.

Don't allow survivorship bias to cloud your judgement.
What are your assumptions based on, especially with RT?
You remember all the hay made about RDNA3, all the performance claims and the huge increases in RT performance?

None of it was true. Core for core, clock for clock, RDNA3 is no faster than RDNA2. Same as Ampere vs. Ada. AMD claimed their new shader design would boost performance; it went nowhere.

There is no reason to believe RDNA4 will be any different.

On performance: the 7900 GRE keeps coming up in conversation around the 9070 series, and given how AMD marketing has overestimated their cards' performance for the last decade, it's safe to assume the 7900 GRE was their target. If we go by the leaks you swear up and down are legit, the 9070 XT is shaping up to have 4096 cores, 64 RT cores, and 96 ROPs.

The 7900 GRE has 5,120 cores, 80 RT cores, and 160 ROPs; the 7900 XT has 5,376, 84, and 192, respectively.

You'd be putting a LOT of faith in that 2.97 GHz clock speed to make up such a gulf in hardware. That's an alleged 30% clock-speed bump against a 25% reduction in hardware compared to the 7900 XT. Somehow, the 7900 XTX doesn't look to be anywhere in sight.
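As a back-of-envelope sanity check (treating all 9070 XT numbers as unconfirmed leaks, and using the 7900 XT's roughly 2.4 GHz boost clock), cores times clock gives a crude first-order throughput proxy:

```python
# First-order throughput proxy: cores * clock.
# 9070 XT figures are leaks; the 7900 XT boost clock is approximate.
specs = {
    "7900 XT": {"cores": 5376, "clock_ghz": 2.40},
    "9070 XT": {"cores": 4096, "clock_ghz": 2.97},
}

def proxy(s):
    return s["cores"] * s["clock_ghz"]

base = proxy(specs["7900 XT"])
for name, s in specs.items():
    print(f"{name}: {proxy(s) / base:.2f}x")  # 9070 XT lands around 0.94x
```

By this crude measure the alleged clock bump almost, but not quite, closes the shader-count gap; it says nothing about per-CU architectural gains either way.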

People have, for over 15 years, hyped up AMD's performance only to be massively disappointed when the cards finally release. There's no evidence that AMD has changed their ways here.

We shouldn't discount an efficiency bump because they've gone back to monolithic; there is no doubt in my mind that chiplets introduced latency. It'll be a 7900 XT match, almost certainly, and likely a speck more in RT, perhaps even in raster.
Here's the thing: if AMD has such efficiency gains, performance gains, etc. in store... why be so coy about the cards? Why not show them off as intended back in January?

The only reason to do this is if the performance leaks are wildly inaccurate. Otherwise, this would be a great opportunity to crush Nvidia by the balls with the disaster that is Blackwell.
 
AMD should follow the competition: use a fake MSRP to game reviews. Have two cards that are never in stock for $550, and sample those to reviewers for the MSRP AIB review.
A solid plan.

You could already get last-gen obsolete cards for less.

I don't think AMD would mind anyhow.

We shouldn't discount an efficiency bump because they've gone back to monolithic; there is no doubt in my mind that chiplets introduced latency. It'll be a 7900 XT match, almost certainly, and likely a speck more in RT, perhaps even in raster.
The 5000 series fares worse at the RT gimmick; why would it be different for AMD?
 
Countless leaks have also been wrong.

Of course some leaks are wrong. You're not taking into account what I've stated: "leaks" towards the release of a product tend to be a lot more accurate, certainly when the source is reliable. I've just shown you so. The "leaks" for the 5070 Ti's performance were 100% accurate.

There is no reason to believe that rDNA4 will be any different.

There is: the "leak" that shows a huge increase in RT performance, released by a reliable source a week before the official reviews.

Don't believe it, sure. But saying it should never be trusted doesn't make any sense, see point #1.
 
Countless leaks have also been wrong.

Don't allow survivorship bias to cloud your judgement.

You remember all the hay made about RDNA3, all the performance claims and the huge increases in RT performance?

None of it was true. Core for core, clock for clock, RDNA3 is no faster than RDNA2. Same as Ampere vs. Ada. AMD claimed their new shader design would boost performance; it went nowhere.

There is no reason to believe RDNA4 will be any different.

On performance: the 7900 GRE keeps coming up in conversation around the 9070 series, and given how AMD marketing has overestimated their cards' performance for the last decade, it's safe to assume the 7900 GRE was their target. If we go by the leaks you swear up and down are legit, the 9070 XT is shaping up to have 4096 cores, 64 RT cores, and 96 ROPs.

The 7900 GRE has 5,120 cores, 80 RT cores, and 160 ROPs; the 7900 XT has 5,376, 84, and 192, respectively.

You'd be putting a LOT of faith in that 2.97 GHz clock speed to make up such a gulf in hardware. That's an alleged 30% clock-speed bump against a 25% reduction in hardware compared to the 7900 XT. Somehow, the 7900 XTX doesn't look to be anywhere in sight.

People have, for over 15 years, hyped up AMD's performance only to be massively disappointed when the cards finally release. There's no evidence that AMD has changed their ways here.


Here's the thing: if AMD has such efficiency gains, performance gains, etc. in store... why be so coy about the cards? Why not show them off as intended back in January?

The only reason to do this is if the performance leaks are wildly inaccurate. Otherwise, this would be a great opportunity to crush Nvidia by the balls with the disaster that is Blackwell.
The 9070 XT has a similar number of transistors to the 7900 XTX while having less cache; it is a super dense design overall. This means there are significant changes in the architecture, and there's no denying that. C'mon, use logic.
 
The 9070 XT has a similar number of transistors to the 7900 XTX while having less cache; it is a super dense design overall. This means there are significant changes in the architecture, and there's no denying that. C'mon, use logic.
As we all know, it's the density of transistors that determines performance, not core counts, not ROP counts, and not clock speeds!
C'mon, use logic.
Of course some leaks are wrong. You're not taking into account what I've stated: "leaks" towards the release of a product tend to be a lot more accurate, certainly when the source is reliable. I've just shown you so. The "leaks" for the 5070 Ti's performance were 100% accurate.



There is: the "leak" that shows a huge increase in RT performance, released by a reliable source a week before the official reviews.

Don't believe it, sure. But saying it should never be trusted doesn't make any sense, see point #1.
You are free to believe leaks, and free to be disappointed once again when performance doesn't line up with them. Exceptions do not prove the rule. Remember when the 7900 XTX was supposed to be a 2x performance increase over the 6900 XT?


How did that work out?

[attached image: Capture.PNG]

Oh, um... erm... huh. Guess that didn't pan out too well...
 
As we all know, it's the density of transistors that determines performance, not core counts, not ROP counts, and not clock speeds!
C'mon, use logic.

You are free to believe leaks, and free to be disappointed once again when performance doesn't line up with them. Exceptions do not prove the rule. Remember when the 7900 XTX was supposed to be a 2x performance increase over the 6900 XT?


How did that work out?

[attachment 386647]

Oh, um... erm... huh. Guess that didn't pan out too well...
You're now pushing even further from what was discussed. You've gone from leaks close to launch from reliable sources, to obscure leaks, to... rumors. lol. That's great logic.

I'm not here to prove you wrong, pal. Leaks are leaks. Stating you can't believe any of them is not very smart, as demonstrated; that's all.
 
Well, $699 is 40% more than the "price of a 7800XT" they repeatedly bragged about last autumn (blog, investor presentation slides, tweets), but I guess Nvidia's offerings are selling for $900 and that's the reality of capitalism. The 5070Ti is artificially scarce right now and there's no competition from the 40-series stock on shelves either. When the $749 models start to appear on shelves again, the $699 XT is going to look pointless.

"Why take a risk on AMD if you're only saving $50?" It's the single biggest reason their otherwise excellent RDNA2 and RDNA3 offerings failed in the market. If the 5070Ti ever becomes available at $749 again, then an equivalent AMD card needs to be $600 or less in order to appeal to existing Nvidia users.
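For what it's worth, the arithmetic holds if "price of a 7800XT" means its $499 launch MSRP (an assumption; the exact figure AMD cited isn't quoted here):

```python
# Price-gap arithmetic behind "40% more" and "NV minus $50".
msrp_7800xt = 499  # 7800 XT launch MSRP (assumed reference point)
amd_price = 699    # rumored sub-$700 ceiling for the 9070 XT
nv_msrp = 749      # 5070 Ti MSRP

print(f"vs 7800 XT: +{(amd_price / msrp_7800xt - 1) * 100:.0f}%")  # ~+40%
print(f"vs 5070 Ti MSRP: ${nv_msrp - amd_price} cheaper")          # $50
```

Both talking points are the same two numbers viewed from different baselines: one against AMD's last-gen pricing, one against Nvidia's MSRP.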
 
That's why it's funny and stupid to think the two teams are really competing, just like Democrats and Republicans, or left and right here in Europe, who stage fake controversies and then all go to dinner together in front of the consumer. It's easier to believe there are under-the-table market agreements: one takes the market share for CPUs and leaves the road open to the other for GPUs, while in the meantime we are all enchanted by marketing made of drip-fed fake rumors, all steered to make us buy more and more hardware we don't need, and above all hardware that is increasingly defective and even dangerous :banghead:
 
As we all know, it's the density of transistors that determines performance, not core counts, not ROP counts, and not clock speeds!
C'mon, use logic.

You are free to believe leaks, and free to be disappointed once again when performance doesn't line up with them. Exceptions do not prove the rule. Remember when the 7900 XTX was supposed to be a 2x performance increase over the 6900 XT?


How did that work out?

[attachment 386647]

Oh, um... erm... huh. Guess that didn't pan out too well...
Shaders mean nothing when comparing different architectures. The 7900 XTX's GCD has only ~45B transistors; add 64 MB of L3 and you get ~53B, which is close to the value AMD confirmed for Navi 48: 53.9B. So how difficult is it to understand that RDNA4 is targeting XTX performance when they are so close in transistor budget?
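The budget roughly checks out against public Navi 31 figures (about 57.7B transistors total, about 2.05B per 16 MB MCD), with the caveat that MCDs include memory PHYs, not just cache; treat the per-die numbers as approximations:

```python
# Transistor-budget sketch from published Navi 31 numbers (approximate).
navi31_total = 57.7e9       # full 7900 XTX: GCD + six MCDs
mcd = 2.05e9                # one MCD: 16 MB L3 plus memory PHY
gcd = navi31_total - 6 * mcd      # GCD alone, ~45.4B
gcd_plus_64mb = gcd + 4 * mcd     # GCD plus 64 MB worth of MCDs, ~53.6B
n48 = 53.9e9                      # AMD-confirmed Navi 48 count

print(f"GCD {gcd/1e9:.1f}B, GCD+64MB {gcd_plus_64mb/1e9:.1f}B, N48 {n48/1e9:.1f}B")
```

So the ~53B versus 53.9B comparison is in the right ballpark, though a similar budget spent differently (RT hardware, density) need not land on the same performance.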

Well, $699 is 40% more than the "price of a 7800XT" they repeatedly bragged about last autumn (blog, investor presentation slides, tweets), but I guess Nvidia's offerings are selling for $900 and that's the reality of capitalism. The 5070Ti is artificially scarce right now and there's no competition from the 40-series stock on shelves either. When the $749 models start to appear on shelves again, the $699 XT is going to look pointless.

"Why take a risk on AMD if you're only saving $50?" It's the single biggest reason their otherwise excellent RDNA2 and RDNA3 offerings failed in the market. If the 5070Ti ever becomes available at $749 again, then an equivalent AMD card needs to be $600 or less in order to appeal to existing Nvidia users.

Because competitors are defective and could burn your house down? :')
 
Yeah, and... 80% of gamers buy below $600 and 75% buy below $500. With only 10% market share, I think you can go way lower, AMD. It's time to boycott both AMD and Nvidia. This is ridiculous.
 
You're now pushing even further from what was discussed. You've gone from leaks close to launch from reliable sources, to obscure leaks, to... rumors. lol. That's great logic.

I'm not here to prove you wrong, pal. Leaks are leaks. Stating you can't believe any of them is not very smart, as demonstrated; that's all.
You can reclassify and recontextualize all you want; if you want to believe leaks, that is your hot potato to hold. Don't blame me when the pattern continues.
Shaders mean nothing when comparing different architectures. The 7900 XTX's GCD has only ~45B transistors; add 64 MB of L3 and you get ~53B, which is close to the value AMD confirmed for Navi 48: 53.9B. So how difficult is it to understand that RDNA4 is targeting XTX performance when they are so close in transistor budget?
Convenient that you ignored the ROP and RT core counts, which I also quoted.

It's not difficult to understand: denser transistors do not mean that 96 ROPs perform like 160 do. Do you believe AMD will see greater than 50% gains from ROP count?
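Putting numbers on it (boost clocks are approximate, and the 9070 XT figure is a leak): matching 160 ROPs with 96 needs roughly a 1.67x per-ROP gain at equal clocks, and still roughly 1.26x after the alleged clock bump:

```python
# Per-ROP gain needed for 96 ROPs to match the 7900 GRE's 160.
rops_gre, rops_n48 = 160, 96
clk_gre, clk_n48 = 2.25, 2.97  # approx boost clocks; the 2.97 GHz is a leak

at_equal_clocks = rops_gre / rops_n48                            # ~1.67x
after_clock_bump = (rops_gre * clk_gre) / (rops_n48 * clk_n48)   # ~1.26x
print(f"{at_equal_clocks:.2f}x at equal clocks, {after_clock_bump:.2f}x after clocks")
```

That is, even granting the leaked clocks, each ROP would still need to do about a quarter more work per clock to reach parity on this metric alone.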
 
I'd likely buy at $649 (~$971 CAD), assuming you can get it for that and it's half decent. A 5070 Ti in Canada is ~$1,300 right now before taxes.
 
I want to be hopeful, especially after hearing that apparently some tech reporters and reviewers were asked about pricing (don't ask me for a source; I think I heard it from Daniel Owen, but it could've just been a dream), but AMD is capable of misplacing gold placed directly in their hands.

€649 is what I am hoping for, but I am a fool.
 
Absolutely nobody should believe leaks. Nor should they believe marketing. "5070 = 4090" anyone?

The assumption should be, at best, the 9070xt will be a 7900xt in raster and a 7900gre in RT, at BEST, given AMD's previous promises with RT performance. Anything above that is candy.
Dude, they have a dedicated engine for RT; the new GPUs will be on average 60% faster in existing RT games over their predecessors. So if we compare to the 7800 XT, which has essentially the same specs, the 9070 XT's RT will be over 60% faster.

AMD will miss a huge opportunity if they price this at $699 or $649; it needs to be $600 or less to actually be disruptive and for them to actually gain market share. If they want to keep having 9-11% market share, then they should price it above $600 and nothing will change.

If they want to actually have a great product launch, gain market share, and have some excitement follow them, then the 9070 XT should cost at most $599. Heck, I'd even say $550 for them to make a big dent, if they actually want to go from 9% market share to 25% or more. The 9070 should also not be a penny over $499; in fact, I think a far more opportunistic price would be $449 to really make a dent in the market.

That would allow them to price the 9060 XT at $350 and the 9060 at $250.
 
Let's wait for the actual presentation. I'm suffering from leak overload already.
 
You can reclassify and recontextualize all you want; if you want to believe leaks, that is your hot potato to hold. Don't blame me when the pattern continues.
Wait: while you were busy stating "Absolutely nobody should believe leaks", another leak that turned out to be right was just used in today's article.

Trying to help.
 
Anyone worrying about the leak from Best Buy should calm down a bit. Best Buy is not known for their website accuracy; have any of you shopped at one lately?
Not to mention, the SKU numbers in that photo can't even be verified to be RX 9000 series cards. There's nothing linking them yet (at least from what I can tell).
Also, listed prices can be updated at a moment's notice, and who knows (if those are indeed RX 9000 series cards) how long ago those prices were last updated?

Trust me, I totally believe AMD can and will botch this up. It seems like something they'd totally do. But I think we'll just have to wait until Friday to know for sure.
I'll be keeping my fingers crossed until then.
 
"Sub-$700 price" + street increase over the fake price + dollar/euro scam exchange rate + VAT = "well over €1,200 real price"
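A sketch of how those multipliers compound; every factor below is a hypothetical illustration, not a reported number:

```python
# Compounding a US MSRP into a European street price (all inputs hypothetical).
msrp_usd = 699
street_markup = 1.50   # assumed shortage markup over MSRP
usd_to_eur = 0.96      # assumed exchange rate
vat = 1.21             # assumed VAT (varies by country, roughly 19-25%)

street_eur = msrp_usd * street_markup * usd_to_eur * vat
print(f"~EUR {street_eur:.0f}")  # ~EUR 1218 with these assumptions
```

With those made-up but not implausible factors, a $699 sticker does land "well over €1,200"; the point is that the multipliers stack, not that any single one is outrageous.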
 
My 7900 XT boosts to 2,900 MHz, but that doesn't mean it's a clock you can really do a lot at. It's far more comfortable running at around 2,500-2,600. The example bench used in your link is bloody Unigine Heaven. That's yesterday's tech, and sure, cards boost high on that, because a lot of processing systems just aren't being used, or aren't used to the fullest. Heck, they should have used Valley; I can hit 3 GHz there already :)

Now enable PT in Black Myth: Wukong and be happy if you see 2,500 MHz consistently; your undervolt might just fail too, so now you're pushing 300+ W to get there and there's simply no more board power to give. That also shines some light on AMD's choice to give this much leaner GPU a rather high power target: it needs that to run RT and keep reasonable clocks. It's fair to assume at least 10% of AMD's RT gain comes just from better clocking. RDNA4 special sauce? We might be unpleasantly surprised by very minor architectural changes here.

Let's not oversell what AMD is offering here. It looks to be a somewhat faster 7900 XT, a card with 20 GB that sells for $620-650. AMD wants to launch something similar with 4 GB less and some +30% RT performance for about the same price? What's the point? FSR4?

I'm getting a nagging feeling this release is not much other than a refresh and a better margin proposition for AMD with very minor tweaks in the architecture.
If 64 CUs manage to outpace the 7900 XT and catch up to the XTX, then it certainly isn't a minor tweak to the architecture.

Well, $699 is 40% more than the "price of a 7800XT" they repeatedly bragged about last autumn (blog, investor presentation slides, tweets), but I guess Nvidia's offerings are selling for $900 and that's the reality of capitalism. The 5070Ti is artificially scarce right now and there's no competition from the 40-series stock on shelves either. When the $749 models start to appear on shelves again, the $699 XT is going to look pointless.

"Why take a risk on AMD if you're only saving $50?" It's the single biggest reason their otherwise excellent RDNA2 and RDNA3 offerings failed in the market. If the 5070Ti ever becomes available at $749 again, then an equivalent AMD card needs to be $600 or less in order to appeal to existing Nvidia users.
I think it's likely to be closer to $649, as $699 is essentially $700, and that would make a mockery of the slide VideoCardz is using to support their argument.

AMD says that 85% of gamers buy cards below $700, and this is what the RDNA4 series will focus on.
 

Looks Like NV Minus $50 Again

Oh, does it?

Where can I get 5070Ti for $750? :D

[attached screenshot]
 
Well, AMD didn't say whether $699 will be for the XT model or the 9070 non-XT model. So, it could be $699 and $799. (Plot twist.)


The negativity in the news title shouldn't be there, unless this isn't a news story but a forum thread.
That negativity only helps Nvidia's narrative, and I will keep repeating the Civilization VII conclusion example, where the lack of DLSS was mentioned with references to the Starfield RUMOR about AMD blocking DLSS, and something about devs fearing gamers' wrath. It seems that gamers go into full rage mode when Nvidia's techs are missing, based on TPU.

In any case, if AMD starts at $699 for the top model, I expect the price to drop fast IF Nvidia isn't lying and we start seeing more RTX 5000 series cards in the market. If we don't see RTX 5070 Tis in the market soon, then please, spare me Nvidia's lies: it is NOT -$50, it's -$300.


Obviously, while most people would/should avoid buying AMD or Nvidia cards at these prices, OEMs and people who pay for prebuilt systems will probably pay them anyway. And companies like AMD, Nvidia, or Intel don't care what we say here, because we are probably a single-digit part of the market.
 
Well, AMD didn't say whether $699 will be for the XT model or the 9070 non-XT model. So, it could be $699 and $799. (Plot twist.)


The negativity in the news title shouldn't be there, unless this isn't a news story but a forum thread.
That negativity only helps Nvidia's narrative, and I will keep repeating the Civilization VII conclusion example, where the lack of DLSS was mentioned with references to the Starfield RUMOR about AMD blocking DLSS, and something about devs fearing gamers' wrath.

In any case, if AMD starts at $699 for the top model, I expect the price to drop fast IF Nvidia isn't lying and we start seeing more RTX 5000 series cards in the market. If we don't see RTX 5070 Tis in the market soon, then please, spare me Nvidia's lies: it is NOT -$50, it's -$300.
That's not what the article says.

The article says the series is under $700, that includes the XT model.
 