Monday, January 29th 2024

Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

We've known since as far back as August 2023 that AMD is rumored to be retreating from the enthusiast graphics segment with its next-generation RDNA 4 graphics architecture, which means we likely won't see successors to the RX 7900 series squaring off against the upper end of NVIDIA's fastest GeForce RTX "Blackwell" series. What we'll get instead is a product stack closely resembling that of the RDNA-based RX 5000 series, with its top part providing a highly competitive price-performance mix around the $400 mark. A more recent report by Moore's Law is Dead sheds more light on this part.

Apparently, the top Radeon RX SKU based on the next-gen RDNA4 graphics architecture will offer performance comparable to that of the current RX 7900 XTX, but at less than half its price (around the $400 mark). It is also expected to achieve this performance target using smaller, simpler silicon with a significantly lower board cost, which is what makes that price possible. What's more, there could be energy efficiency gains from the switch to a newer 4 nm-class foundry node and from the RDNA4 architecture itself, which could hit its performance target with fewer compute units than the 96 of the RX 7900 XTX.
When it came out, the RX 5700 XT offered an interesting performance proposition, beating the RTX 2070 and forcing NVIDIA to refresh its product stack with the RTX 20-series SUPER, which produced the RTX 2070 SUPER. Things could go down slightly differently with RDNA4. Back in 2019, ray tracing was a novelty, and AMD could surprise NVIDIA in the performance segment even without it. There is no such advantage now, as ray tracing has become relevant; instead, AMD could count on timing its launch ahead of the Q4 2024 debut of the RTX 50-series "Blackwell."
Sources: Moore's Law is Dead (YouTube), Tweaktown

292 Comments on Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

#126
Minus Infinity
Kohl Baas: Not happy about this. The RX 7900 XTX is a bit lacking in 4K, and apparently I won't have any options anytime soon to remedy that problem... :banghead:
I'm gobsmacked that, with the RDNA4 high end dead and RDNA5 not out for nearly two years, they are not doing a refresh of the 7900 series, even more so in light of Nvidia's Super refresh and with a 4090 Super back on the table. It would look ridiculous to see an 8700-class card match the 7900 XTX in raster for half the price.
#127
Dr. Dro
kapone32: 1. I guess you are in the camp that thinks a 5800X3D is just as fast as a 7900X3D. There is also no comparison between a 7600 and a 5600, but anyway.
2. Radeon VII, really. How many months was that on the market? I should tell you my story where Nvidia disabled features on my card.
3. I love this one. AMD said that most X370 BIOS ROMs were too small. You should be glad that community pressure made them fix that. They did not stop giving us Threadripper. Do you enjoy modern media? Well, that was a real market for Threadripper, and AMD made AM4 faster than Threadripper anyway. Don't worry, I still have my X399 board.
4. Relatively high price because reviewers who make their money making videos complained about it. I know I paid $219 for mine, and at that time a 6600 was $650, much less a 6800 XT for $1400 and a 6900 XT for $1500. What was crazy was that all of the media reviews were the exact opposite of the experience of users who bought the card. Though in a world where you have a 120 Hz Freesync TV, you can enable it with a 6500 XT. Yes, the full 45-120 Hz that TV supports. That translates to smooth gaming vs 60 Hz.
5. All of that was before the US banned them, but you can go on. Most governments were for Chinese 5G until they were not.
6. AMD did not have to bring V-Cache to their 12 and 16 core CPUs. The community lamented, chastised and demanded that. AMD tried to give us dual V-Cache, but it was not a tangible benefit. I guess when your game crashes and that message comes up, AMD does nothing with it. I guess when people complained about boot times for AM5, AMD did nothing. I bet AMD gave us HYPR-RX to make gaming worse. I guess the 8700G is not what we have been asking for, but anyhow, you are entitled to your opinion.
1. I never said that. I said AMD hasn't introduced any unorthodox technologies since 3D V-Cache. Ryzen is a "safe" design with a very traditional "big core only" architecture.

2. It was their flagship, a $700+ graphics card based on their strongest architecture at the time. It really was better at compute than gaming; no wonder it became the foundation of what is now known as CDNA. 16 GB of HBM2 across four fully enabled stacks, the first GPU to breach the 1 TB/s mark, and the one I had comfortably hit 1.25 TB/s because it also overclocked the HBM well. I think a GPU like this should have been supported for more than just about 4 years... yes, of course AMD gets a pass. I assure you, if Nvidia came out tomorrow and said "that's it folks, your 7 year old 1080 Ti's had enough of a good run, no more drivers for you", the pitchforks would be scorching hot and ready to poke. But then again, AMD has been no stranger to axing hardware they don't want to spend resources on maintaining, like the R9 Fury X before it (the card was what, 5 years old at the time?). And don't get me started on the Vega FE, which I also had.

3. No use sugar-coating it: AMD lied. All it took was Alder Lake crashing their $300+ Ryzen 5 5600X party, with $180 12400Fs outperforming them, and suddenly X370 supported everything and the CPU prices cratered. TRX40 owners are still waiting, btw. Oh wait, AMD alone re-entered the HEDT market this generation, with Zen 4 Threadrippers at prices that not even Intel dared charge back when they had utter domination with Core i7 Extreme CPUs several times faster than the FX-9590... with Zen 4 HEDT starting at $1,499 and going all the way to $4,999 MSRP. Not a word from the legions of AMD fans calling them the devil or "emergency edition", I see.

4. I won't criticize the pricing much; as I said before, it was an unfortunate market situation (the mining craze and Covid overlapping), but you should be well aware that the alleged limitations regarding maximum refresh rate were always unfounded to begin with. As long as the display on the other end is compatible and the port has enough bandwidth, you should get an image. Now I must question the usefulness of a GPU such as Navi 24 for 120/144 Hz gaming; unless you're playing games from the mid-2000s, I don't think you're getting good frame rates on anything like that.

5. That doesn't excuse it; after all, the RTX 4090 D's sole reason for existing is US sanctions on China. It's just about as powerful as it can be while complying with the government's regulation, and thus a legal product to export. I don't think that gamers should be punished and prevented from buying a product because of their nationality; I'd already feel quite slighted to get something nerfed simply because I am not American.

6. Overlapping with point one: the 7900X3D and 7950X3D rely on a software driver for core allocation, as only one of the CCDs is equipped with the 3D V-Cache. No tangible benefit because they refused to provide such a chip anyway; why threaten their own HEDT or server business?

If you get messages after your game crashes (it shouldn't crash to begin with), it's because you got a TDR, and this is so incredibly common with AMD that they've actually decided to use it as a point for data collection. I can't remember the last time my RTX 4080 crashed and caused a TDR, probably because it hasn't happened since I bought it. Boot times were caused by buggy AGESA, meaning the platform should never have launched in that state to begin with. It was never the DDR5 training, it was never the fact that the platform is "new"; it's just AMD's low-level firmware code being horribly broken, as it traditionally has been.
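For context, a TDR is Windows' Timeout Detection and Recovery watchdog resetting a GPU driver that stopped responding. A minimal sketch of where those knobs live, assuming a Windows machine and Python's standard winreg module; the value names and defaults are Microsoft-documented, but the script itself is purely illustrative:

```python
import winreg  # standard library; Windows only

# The TDR watchdog settings live under this documented registry key.
TDR_KEY = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, TDR_KEY) as key:
    # Both values are optional; a stock install omits them and Windows
    # falls back to the documented defaults used here.
    for name, default in (("TdrDelay", 2), ("TdrLevel", 3)):
        try:
            value, _ = winreg.QueryValueEx(key, name)
        except FileNotFoundError:
            value = default
        print(f"{name} = {value}")  # TdrDelay is in seconds; TdrLevel 3 = recover on timeout
```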

Most of the features that AMD has implemented in their graphics drivers in recent memory, if not all, are clones of existing Nvidia technologies. If you look at what they introduced in the latest release, GeForce has supported HAGS, video upscaling, etc. for years now. And overlapping with your initial point, AFMF is exclusive to RDNA 3, not that you'd want to play with frame reprojection on... since it's clearly not generative, and the unchanged frame rate counter rats that out big time.
#128
kapone32
1. I guess you are referencing Intel slapping a mobile and a desktop chip together and calling that advancing.
2. I wanted a Radeon VII. I too salivated over the specs. Though you seem to forget that AMD drivers are universal. I had Vega 64 in Crossfire, so what are you saying? I am pretty sure my 5600G still gets driver updates, and I am pretty sure that is Vega.
3. Yep, here we go with the 12400F. This is after 5 different chipsets on AM4 that just about every CPU was compatible with. At least you don't lose 8 lanes if you install an M.2 drive in the top slot. Oh wait, that is 12th Gen... I guess that means Z590 and no PCIe 5.0.
4. Unfounded? I had an RX 570 before, and I can promise you that a 6500 XT is much smoother with Freesync support than that. I was one of those people too. I used to argue in the PC store that refresh rate did not matter, until I started playing The Division. I had a 60 Hz panel, and when using the machine gun it would go all over the screen. I upgraded to a 32QC and all of a sudden I could use a scope with the machine gun to make headshots.
5. US sanctions? Who is the government? You make it seem like there are no geopolitics influencing that. Seeing that as gamers being punished, in a country that drove online gaming stocks down with its tactics... and I did not know the entire stack was banned?
6. Here we go with "they rely on software". Show me a piece of PC hardware that does not rely on software to work.

I am so happy that you have had no crashes. You see, I play a modded version of TWWH. When Creative Assembly updates the game, it breaks the mod. That is what causes the issue that brings up the message. I know they were not perfect releasing a brand new platform, because they are human, but at least memory support is better, and long memory-training times persist if you pick the 2nd EXPO profile, but it doesn't matter.

It is because Nvidia uses propaganda to make people feel triggered when you mention that AMD could be competitive and in some cases better. Hence the ad nauseam talk about ray tracing (in the 2 games that supported it) made what we were only promised in one game seem moot. Ashes of the Singularity was one of the first DX12 games and is still one of the most optimized, but of course we want DLSS, and FSR is always blurry. Video upscaling: you should Google whether Sapphire Trixx supported upscaling across all games. You see, that is where you miss the argument. I have a 7900 XT. I was watching a PC World podcast and they were talking about "bad console ports". I asked in a Super Chat, "I have a 7900X3D/7900XT combo, why do I not have any of these issues?" The response from Gordon was, "Well, if you have the horsepower it's not an issue for you." Maybe in 3 years, when my 7900 XT is older, I will look into upscaling. As per my previous argument, upscaling makes the most sense at the low end. The thing is that though Nvidia may create, AMD makes it for the masses. AFMF is another Freesync moment that will improve with time, and people who buy an 8700G will be happy for it.
#129
Dr. Dro
kapone32: 1. I guess you are referencing Intel slapping a mobile and a desktop chip together and calling that advancing.
2. I wanted a Radeon VII. I too salivated over the specs. Though you seem to forget that AMD drivers are universal. I had Vega 64 in Crossfire, so what are you saying? I am pretty sure my 5600G still gets driver updates, and I am pretty sure that is Vega.
3. Yep, here we go with the 12400F. This is after 5 different chipsets on AM4 that just about every CPU was compatible with. At least you don't lose 8 lanes if you install an M.2 drive in the top slot. Oh wait, that is 12th Gen... I guess that means Z590 and no PCIe 5.0.
4. Unfounded? I had an RX 570 before, and I can promise you that a 6500 XT is much smoother with Freesync support than that. I was one of those people too. I used to argue in the PC store that refresh rate did not matter, until I started playing The Division. I had a 60 Hz panel, and when using the machine gun it would go all over the screen. I upgraded to a 32QC and all of a sudden I could use a scope with the machine gun to make headshots.
5. US sanctions? Who is the government? You make it seem like there are no geopolitics influencing that. Seeing that as gamers being punished, in a country that drove online gaming stocks down with its tactics... and I did not know the entire stack was banned?
6. Here we go with "they rely on software". Show me a piece of PC hardware that does not rely on software to work.

I am so happy that you have had no crashes. You see, I play a modded version of TWWH. When Creative Assembly updates the game, it breaks the mod. That is what causes the issue that brings up the message. I know they were not perfect releasing a brand new platform, because they are human, but at least memory support is better, and long memory-training times persist if you pick the 2nd EXPO profile, but it doesn't matter.

It is because Nvidia uses propaganda to make people feel triggered when you mention that AMD could be competitive and in some cases better. Hence the ad nauseam talk about ray tracing (in the 2 games that supported it) made what we were only promised in one game seem moot. Ashes of the Singularity was one of the first DX12 games and is still one of the most optimized, but of course we want DLSS, and FSR is always blurry. Video upscaling: you should Google whether Sapphire Trixx supported upscaling across all games. You see, that is where you miss the argument. I have a 7900 XT. I was watching a PC World podcast and they were talking about "bad console ports". I asked in a Super Chat, "I have a 7900X3D/7900XT combo, why do I not have any of these issues?" The response from Gordon was, "Well, if you have the horsepower it's not an issue for you." Maybe in 3 years, when my 7900 XT is older, I will look into upscaling. As per my previous argument, upscaling makes the most sense at the low end. The thing is that though Nvidia may create, AMD makes it for the masses. AFMF is another Freesync moment that will improve with time, and people who buy an 8700G will be happy for it.
No, the AMD drivers are not universal. Pre-RDNA drivers are on a separate maintenance branch... and of course a 6500 XT is going to feel better than an RX 570; I'd be surprised if it didn't, the 570 is an ancient relic at this point.

www.amd.com/en/support/kb/release-notes/rn-rad-win-24-1-1



More on this:

www.anandtech.com/show/21126/amd-reduces-ongoing-driver-support-for-polaris-and-vega-gpus

Mate... 12th Gen is LGA 1700, which means Z690 and Z790... they all have PCIe 5.0 support... DDR5 support... it was just AMD price gouging while Intel had no competition.

Geopolitics aren't relevant to money... Chinese gamers aren't getting cards for free... and there's no propaganda here, you're just... wrong, man.
#130
Dawora
Daven: There is also a good chance that AMD is betting on Nvidia abandoning the mid-range to budget discrete desktop GPU space. This would leave AMD (RDNA4) and Intel (Battlemage) to compete in the sub $500 price bracket.
Hahahah, in ur dreams :roll:
Also it's very stupid to say this kind of thing, because u already know it's not true and not happening.
Redwoodz: Nice try, dude. Remove that useless, budget-busting ray tracing and you see why AMD is very comfortable where they are. No one really wants a $1600 gaming GPU for a personal PC. It's just stupid.
Ppl say no one wants a new $100,000 Corvette, it's just stupid...
But I bought one, just like many others.
If u don't buy a $1600 GPU then someone else will.

If someone has only $1600 in their bank account and uses all that money to buy a $1600 GPU, then that's stupid.
It's all about money, u know.
Kohl Baas: Not happy about this. The RX 7900 XTX is a bit lacking in 4K, and apparently I won't have any options anytime soon to remedy that problem... :banghead:
U can buy the new upcoming high-end Nvidia.
#131
wolf
Performance Enthusiast
Beginner Micro Device: Or about
For me the simple sum is playable vs non-playable on competing products. Percentages, frame-times etc. are academic and useful, but ultimately RTX cards, in combination with the software stack, give you playable RT across the stack; can't say the same for AMD.
#132
Jism
3valatzy: It's not just the rumours; it is pretty plainly visible that Navi 31 in the RX 7900 series lacks performance. Nvidia's RTX 4090, with a partially disabled AD102, is 25% faster in ordinary raster and a whopping 60-65% faster in ray-traced gaming. AMD failed miserably with the chiplets approach, what do they think?
AMD opted for chiplets due to the ever-rising cost of building monolithic GPUs the way Nvidia does. On top of that, with monolithic you risk having more faulty chips per wafer, driving up costs because you can sell fewer.

The RDNA3 series might not have been that impressive; it will take a few generations to fully uncork the approach's potential. It was just the start.

The good margins are in the midrange of GPUs, not the $2000 GPUs. Failed? I think you only half understand what they are doing.

Meanwhile, the whole CDNA line (i.e., the MI300X) is expected to skyrocket in terms of sales.
#133
kapone32
Dr. Dro: No, the AMD drivers are not universal. Pre-RDNA drivers are on a separate maintenance branch... and of course a 6500 XT is going to feel better than an RX 570; I'd be surprised if it didn't, the 570 is an ancient relic at this point.

www.amd.com/en/support/kb/release-notes/rn-rad-win-24-1-1

More on this:

www.anandtech.com/show/21126/amd-reduces-ongoing-driver-support-for-polaris-and-vega-gpus

Mate... 12th Gen is LGA 1700, which means Z690 and Z790... they all have PCIe 5.0 support... DDR5 support... it was just AMD price gouging while Intel had no competition.

Geopolitics aren't relevant to money... Chinese gamers aren't getting cards for free... and there's no propaganda here, you're just... wrong, man.
Of course you are going to use the 3 months in which AMD did not give people the monthly update they were used to. Did those other cards suddenly fail? And you claimed that only RDNA3 supports AFMF, yet look, my 5600G's GPU gets it as well.

Of course a 6500 XT is going to feel better than an RX 570, when people all claimed that the RX 570 was the better card.

Forgive me for forgetting that Intel likes to release 2 main chipsets a year. I stand corrected, forgive me.

If you think geopolitics and money are not connected... The US did not ban the 4080, 4070 or 4060. F me, they did not even ban the 3090. Just the 4090. You can go ahead and blindly trust that China is no different from the West. I guess the US only has carrier strike groups to waste resources on in that part of the world.

I saw that story too, and I still get updates for my 5600G, but go on. They are releasing new Vega SKUs anyway, but we can go on.
#134
Firedrops
napata: Déjà vu.
MLID literally repeats this for every generation of GPUs.
#135
Daven
Dawora: Hahahah, in ur dreams :roll:
Also it's very stupid to say this kind of thing, because u already know it's not true and not happening.

Ppl say no one wants a new $100,000 Corvette, it's just stupid...
But I bought one, just like many others.
If u don't buy a $1600 GPU then someone else will.

If someone has only $1600 in their bank account and uses all that money to buy a $1600 GPU, then that's stupid.
It's all about money, u know.

U can buy the new upcoming high-end Nvidia.
It's already happening, with the MX mobile series gone and no 4050. Nvidia is packing up its budget and midrange as iGPUs and additional players enter that space, but mostly because fab allocation will go to AI compute GPUs. Again, the 5000 series might start with the 5070 at $500 or higher. Similar reasoning is behind the rumors that AMD is abandoning the high end: increased competition and wafer allocation going to AI compute GPUs.

Sometimes you have to reprioritize your product offerings based on what's hot and on capacity-constrained manufacturing.
#136
theouto
kapone32: I guess you are right, though a video can be many things. The thing is, of all the YouTubers, I trust Wendell at Level1 more than most.
I mean, I trust Wendell too, but Wendell was simply the person going to AMD; he is not a private inspector or an undercover detective, and he was likely invited by AMD to boost PR. It would be the same if it was GN, HUB, LTT, JTC, W1zzard or anyone else. It's not that these people are paid to put on a good show, but rather that AMD is making sure that AMD is putting on a good show.
#137
AusWolf
evernessince: There are three problems with this theory. The first is that AMD doesn't need multiple GCDs to reach the high-end market; RDNA3 is evidence of that, where they have one GCD and multiple cache dies. The second is that it assumes AMD completely bungles its ability to have multiple GCDs on a single package for the second generation in a row. It's nonsense that the Radeon group doesn't have the resources; AMD has been pouring money into them. I'd assume after the first generation of failure they have a general idea of the bandwidth required for multiple GCDs. At the very least, if they couldn't reach the required bandwidth number, I'd expect them to further modularize their GPU, stack cache, etc. There are plenty of options for AMD to reach the high end while maintaining RDNA3's small die size and without the use of multiple GCDs.

Most of all, though, economically it makes zero sense for AMD to stop at the mid-range. AMD can add or subtract chiplets from a design at a near-linear cost. This is particularly pertinent because for Nvidia the cost increase is exponential at the high end, due to the fact that yield drastically decreases at the size of a high-end GPU. By extension this means AMD has a large cost advantage at the high end (not that it really needs it, given Nvidia's margins have always been large on those high-end cards). AMD might not compete with Nvidia's high-end chip with a single GCD, but at the very least I'd expect them to stack up as many chiplets as needed to have a competitive high-end product, simply because that's what stands to make them the most money. There's really no reason for AMD to simply leave money on the table.
I've just read an article somewhere speculating that it's not about the cost to AMD, but about the cost to the consumer. Theoretically, a large MCM design on RDNA 4 would be way too expensive for the performance level it targets. Who would be interested in a 4090-beater if it costs shy of $2,000-2,500? Personally, I take this theory with a pinch of salt, considering that the 7900 XTX manages to match the 4080 in raster performance while being considerably cheaper, so why couldn't AMD pull off the same with RDNA 4?
#138
Daven
AusWolf: I've just read an article somewhere speculating that it's not about the cost to AMD, but about the cost to the consumer. Theoretically, a large MCM design on RDNA 4 would be way too expensive for the performance level it targets. Who would be interested in a 4090-beater if it costs shy of $2,000-2,500? Personally, I take this theory with a pinch of salt, considering that the 7900 XTX manages to match the 4080 in raster performance while being considerably cheaper, so why couldn't AMD pull off the same with RDNA 4?
Some of this is complicated by fab allocation and the AI surge.
#139
Broken Processor
evernessince: There are three problems with this theory. The first is that AMD doesn't need multiple GCDs to reach the high-end market; RDNA3 is evidence of that, where they have one GCD and multiple cache dies. The second is that it assumes AMD completely bungles its ability to have multiple GCDs on a single package for the second generation in a row. It's nonsense that the Radeon group doesn't have the resources; AMD has been pouring money into them. I'd assume after the first generation of failure they have a general idea of the bandwidth required for multiple GCDs. At the very least, if they couldn't reach the required bandwidth number, I'd expect them to further modularize their GPU, stack cache, etc. There are plenty of options for AMD to reach the high end while maintaining RDNA3's small die size and without the use of multiple GCDs.

Most of all, though, economically it makes zero sense for AMD to stop at the mid-range. AMD can add or subtract chiplets from a design at a near-linear cost. This is particularly pertinent because for Nvidia the cost increase is exponential at the high end, due to the fact that yield drastically decreases at the size of a high-end GPU. By extension this means AMD has a large cost advantage at the high end (not that it really needs it, given Nvidia's margins have always been large on those high-end cards). AMD might not compete with Nvidia's high-end chip with a single GCD, but at the very least I'd expect them to stack up as many chiplets as needed to have a competitive high-end product, simply because that's what stands to make them the most money. There's really no reason for AMD to simply leave money on the table.
Why even bother with a chiplet-based approach if they don't intend to go down the multi-GCD route and still suffer the latency penalty? They had a chance to sell for less because of the chiplet approach and still managed to screw that up because of greed. I don't mind buying a slower product if it's priced right, but yeah. And let's not get into the 7900 XTX, a product so broken in some games it has to be a hardware fault; I returned mine and went back to my 6800 XT, and glad I did, because it was the right move as it turns out.
Compute cards have multiple compute dies because they aren't as latency-sensitive. Trust me, if AMD had it working for graphics cards, it would be released. It's been widely known that the CPU division has had the majority of the R&D allocation, and rightly so; it was where AMD could shine against a sleeping Intel.
remekra: Don't know if they will implement it or not in their consumer cards, but their Instinct MI300X already has 2x GCD, and it's working and being treated by software as a single GPU. I know that there is a difference between that arch and RDNA, but it still seems they are making good progress on that.

If they could make a GPU with 2 GCDs, each performing like a 7900 XTX, with MCDs on them, and improve RT perf, then I would sell my 7900 XTX and buy it instantly.

EDIT
Did a read-up on that Instinct card:

chipsandcheese.com/2023/12/17/amds-cdna-3-compute-architecture/

It uses XCDs, has eight of them, and they are all exposed as a single GPU.
Question is whether that translates to the consumer market and a GPU used for gaming; we'll see I guess.
It's easier for compute; it's less latency-sensitive.
#140
Redwoodz
Dawora: Hahahah, in ur dreams :roll:
Also it's very stupid to say this kind of thing, because u already know it's not true and not happening.

Ppl say no one wants a new $100,000 Corvette, it's just stupid...
But I bought one, just like many others.
If u don't buy a $1600 GPU then someone else will.

If someone has only $1600 in their bank account and uses all that money to buy a $1600 GPU, then that's stupid.
It's all about money, u know.

U can buy the new upcoming high-end Nvidia.
Corvette. Very good analogy. Really one of the biggest POS road cars out there. Did OK on the track, but they are really useless to drive around in. Miserable, even.
#141
kapone32
theouto: I mean, I trust Wendell too, but Wendell was simply the person going to AMD; he is not a private inspector or an undercover detective, and he was likely invited by AMD to boost PR. It would be the same if it was GN, HUB, LTT, JTC, W1zzard or anyone else. It's not that these people are paid to put on a good show, but rather that AMD is making sure that AMD is putting on a good show.
For me, it was the obvious excitement when talking about products between him and the employees he spoke to.
Broken Processor: Why even bother with a chiplet-based approach if they don't intend to go down the multi-GCD route and still suffer the latency penalty? They had a chance to sell for less because of the chiplet approach and still managed to screw that up because of greed. I don't mind buying a slower product if it's priced right, but yeah. And let's not get into the 7900 XTX, a product so broken in some games it has to be a hardware fault; I returned mine and went back to my 6800 XT, and glad I did, because it was the right move as it turns out.
Compute cards have multiple compute dies because they aren't as latency-sensitive. Trust me, if AMD had it working for graphics cards, it would be released. It's been widely known that the CPU division has had the majority of the R&D allocation, and rightly so; it was where AMD could shine against a sleeping Intel.

It's easier for compute; it's less latency-sensitive.
I want to know which games you are talking about. I have a 7900 XT and have had no issues.
#142
Makaveli
Broken Processor: Why even bother with a chiplet-based approach if they don't intend to go down the multi-GCD route and still suffer the latency penalty? They had a chance to sell for less because of the chiplet approach and still managed to screw that up because of greed. I don't mind buying a slower product if it's priced right, but yeah. And let's not get into the 7900 XTX, a product so broken in some games it has to be a hardware fault; I returned mine and went back to my 6800 XT, and glad I did, because it was the right move as it turns out.
Compute cards have multiple compute dies because they aren't as latency-sensitive. Trust me, if AMD had it working for graphics cards, it would be released. It's been widely known that the CPU division has had the majority of the R&D allocation, and rightly so; it was where AMD could shine against a sleeping Intel.

It's easier for compute; it's less latency-sensitive.
lol, I went from a 6800 XT to a 7900 XTX; the broken experience you had, I didn't see it.

"screw that up because of greed"

It's like you guys don't understand that AMD has shareholders and they demand profit. They are not pricing it to do you a favor; shareholders > consumers when it comes to priority. They are a business at the end of the day.

Team Green has 4090s for $2000; are they also greedy?
#143
Kohl Baas
Dawora: U can buy the new upcoming high-end Nvidia.
That company is not an option.
Makaveli: you do have options: drop settings or use upscaling.

But to be honest, if one is looking to push 4K, the 4090 is a better option, but it will cost you.
nVidia is not an option here so I probably have to do that upSCAMing thing...
#144
TechLurker
I read this more as saying that RDNA4's potential would allow for a strong multi-chip GPU, if they can further improve the links between cores. RDNA3 was a worthwhile attempt that showed it can work; they need to mature it more, and if the cores for RDNA4 are as good as claimed, pairing two of them for even more performance would be a viable option. But the weak link has been the interconnects, and the programming to make the most of them, which AMD is probably still working on.

That said, rumors claim RDNA5 is going back to a monolithic chip, or a hybrid monolithic+chiplet design, as that is showing more promise in the near term, while RDNA4 is a second attempt to refine the multi-chiplet GPU route until they can improve it for RDNA6 or 7. At the same time, RDNA3 is showing very good performance in AI workloads, so it's possible that their approach there will quickly pay dividends in the rising AI sector (more so since CDNA3 was based on RDNA3 work), and if RDNA4 is just refined RDNA3, it could lead to even better AI performance.

If AMD is also leveraging two rival groups internally for GPU development, like they do for Ryzen (to ensure they always have options), then I expect we'll probably see alternating periods of performance uplift (monolithic or semi-monolithic) and efficiency gains (chiplet-based) until the chiplet method becomes good enough to compete on both while saving cost.
#145
evernessince
AusWolf: I've just read an article somewhere speculating that it's not about the cost to AMD, but about the cost to the consumer. Theoretically, a large MCM design on RDNA 4 would be way too expensive for the performance level it targets. Who would be interested in a 4090-beater if it costs shy of $2,000-2,500? Personally, I take this theory with a pinch of salt, considering that the 7900 XTX manages to match the 4080 in raster performance while being considerably cheaper, so why couldn't AMD pull off the same with RDNA 4?
Me for one, given I use a 4090 for AI where I could use a much faster card, and I'm sure plenty more people as well. $2,000-2,500 is well within the range Nvidia has charged for their prosumer parts with their top-end die in the past. Factoring in inflation, that price range would be reasonable for a prosumer card.
Broken Processor: Why even bother with a chiplet-based approach if they don't intend to go down the multi-GCD route and still suffer the latency penalty? They had a chance to sell for less because of the chiplet approach and still managed to screw that up because of greed. I don't mind buying a slower product if it's priced right, but yeah.
RDNA3 actually improves memory subsystem latency over RDNA2: chipsandcheese.com/2023/01/07/microbenchmarking-amds-rdna-3-graphics-architecture/

There are other advantages to chiplets, like modularization, increased yields, reduced costs, vastly improved binning, scalability, and design simplicity, but I assume AMD 100% intended to have multiple GCDs when it first set out to design RDNA3. AMD explicitly stated that it was unable to make it work due to bandwidth restrictions, which means they put significant effort into trying before finding that out.

Maybe latency would be worse if AMD did have multiple GCDs, or maybe it would be possible for them to keep data from hopping between them often. In any case, latency is not something that has regressed compared to RDNA2.
Broken Processor: And let's not get into the 7900 XTX, a product so broken in some games it has to be a hardware fault; I returned mine and went back to my 6800 XT, and glad I did, because it was the right move as it turns out.
I think perhaps you were just unlucky and got a defective product / had driver issues. This is not something that has been reported to be a widespread issue.
TechLurker: That said, rumors claim RDNA5 is going back to a monolithic chip, or a hybrid monolithic+chiplet design
I have never heard this rumor, and likely for good reason; it's the most nonsensical I've read to date. Going back to monolithic would require them to re-incorporate the MCDs back into the GCD, which makes no sense given it increases cost, decreases yield, decreases flexibility, and brings a whole host of other negatives. MI300 would be impossible on such a design, and AMD would again have to do different tapeouts for different products, as compared to a chiplet-based design where AMD does two tapeouts, one for the MCD and one for the GCD.

Hybrid monolithic? If you have one big chip and smaller chips working together, that's just a chiplet-based architecture. You can't have a hybrid between a chiplet architecture and a monolithic architecture, as they are mutually exclusive. Monolithic implies one primary chip carries out all the functions of the architecture, whereas chiplet splits those functions between multiple chips installed on an interposer. In other words, you either have one chip (monolithic) or more than one (chiplets). There are products with multiple monolithic CPUs / GPUs on one PCB, but those are just monolithic chips in a group, not a hybrid or chiplet design. Monolithic and chiplet refer to the architecture of the silicon product (CPU / GPU), not the grouping. You wouldn't call SLI GPUs or dual-GPU cards a chiplet architecture.
#146
Dr. Dro
Makaveli: lol, I went from a 6800 XT to a 7900 XTX; the broken experience you had, I didn't see it.

"screw that up because of greed"

It's like you guys don't understand that AMD has shareholders and they demand profit. They are not pricing it to do you a favor; shareholders > consumers when it comes to priority. They are a business at the end of the day.

Team Green has 4090s for $2000; are they also greedy?
No, AMD is our friend, the company is just wonderful, everyone is happy and they really care about the small people, you meanie. nGreedia is not an option!
#147
Makaveli
Dr. Dro: No, AMD is our friend, the company is just wonderful, everyone is happy and they really care about the small people, you meanie. nGreedia is not an option!
A lot of people in forums, not just this one, seem to either not understand or skip over the business side of both of these companies.
  1. The enterprise market means more to them than the consumer market does.
  2. Shareholders take priority over consumers.
  3. Profit > feeling good or doing the right thing.
Pay more attention to what they do and how they operate, and less to AMD vs Nvidia.
#148
Dr. Dro
Makaveli: A lot of people in forums, not just this one, seem to either not understand or skip over the business side of both of these companies.
  1. The enterprise market means more to them than the consumer market does.
  2. Shareholders take priority over consumers.
  3. Profit > feeling good or doing the right thing.
Pay more attention to what they do and how they operate, and less to AMD vs Nvidia.
That's what I've been trying to say all along :D
#149
Tek-Check
evernessince: I have never heard this rumor, and likely for good reason; it's the most nonsensical I've read to date. Going back to monolithic would require them to re-incorporate the MCDs back into the GCD, which makes no sense given it increases cost, decreases yield, decreases flexibility, and brings a whole host of other negatives. MI300 would be impossible on such a design, and AMD would again have to do different tapeouts for different products, as compared to a chiplet-based design where AMD does two tapeouts, one for the MCD and one for the GCD.
Agreed. Chiplets are the way, on both CDNA and RDNA. There is no going back. At the end of the day, chiplets are the first stage in preparing for High-NA EUV lithography, which will halve the current reticle limit to just above ~400 mm² per chiplet. Nvidia is squeezing out its last years of profits on monolithic designs baked on current EUV scanners. They too will be forced to move onto chiplets once High-NA EUV production starts on sub-2 nm dies in 2026/2027.
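For reference, the arithmetic behind that ~400 mm² figure, a sketch assuming today's standard 26 mm × 33 mm EUV exposure field, which High-NA's anamorphic optics halve along one axis:

```latex
% Full-field (0.33 NA) EUV reticle vs. the halved High-NA (0.55 NA) field
\[
  26\,\mathrm{mm} \times 33\,\mathrm{mm} = 858\,\mathrm{mm}^2
  \quad\longrightarrow\quad
  26\,\mathrm{mm} \times 16.5\,\mathrm{mm} = 429\,\mathrm{mm}^2
\]
```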
#150
AusWolf
evernessince: Me for one, given I use a 4090 for AI where I could use a much faster card, and I'm sure plenty more people as well. $2,000-2,500 is well within the range Nvidia has charged for their prosumer parts with their top-end die in the past. Factoring in inflation, that price range would be reasonable for a prosumer card.
The article was on about consumer cards. Would I ever pay more than £1,000 for a gaming card? Well, if you triple my salary, then sure. Otherwise, not a chance.