
Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

Joined
Jun 2, 2017
Messages
8,969 (3.31/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
1. Arguably, yes. There have been no major breakthroughs in their CPUs since 3D V-Cache was introduced. All chips ever since have been incremental upgrades, and the next major leap since 2022's Zen 4 launch is pretty much the Zen 5 core scheduled for this year. In the meantime, Intel has developed a hybrid architecture processor with hardware-based thread scheduling, enhanced upon it, reheated it shamelessly with the "14th Gen" scam, and then moved onto a processor that unites those technologies with tile-based packaging and developed a neural co-processor for it, which is what you see in Meteor Lake.
2. Yes. They've discontinued less than 5 year old GPUs recently (such as the Radeon VII) and heavily prioritize features for latest-generation cards. Most of these features may eventually reach older cards, but they certainly were never a priority. Reminder that le ebil nGreedia still supports the GTX 900 series from 2014.
3. Yes, this has happened in the past with the X370 chipset, the elaborate lie surrounding BIOS ROM capacity, and the TRX40 chipset that was aborted mid-way and never received a Zen 3 Threadripper upgrade, leaving people with expensive workstations that never received a CPU upgrade and on a previous-generation architecture.
4. Yes, the 6500 XT launched in an unfortunate market situation and it commanded a relatively high price, and it was one of the most pathetic GPU launches in history (perhaps just not as bad as the GTX 1630). It also has several limitations that not even the 1630 has, such as the complete absence of a hardware encoder and a hard limit of two display outputs (hardware limitation, you will never see any Navi 24 design with more than two display outs).
5. AMD has historically released several China-specific SKUs and has openly licensed its processor IP to Chinese technology companies, going as far as investing itself in joint ventures in order to keep it "by the books" (see: Hygon Dhyana). Not exactly a clean sheet if you want to bring that conversation up.
6. No, AMD does not listen to the community.
1. I guess you are in the camp that thinks a 5800X3D is just as fast as a 7900X3D. There is also no comparison between a 7600 and a 5600, but anyway.
2. Radeon VII, really? How many months was that on the market? I should tell you my story about Nvidia disabling features on my card.
3. I love this one. AMD said most X370 boards' BIOS ROMs were too small. You should be glad that community pressure made them deliver a fix for that. They did not stop giving us Threadripper. Do you enjoy modern media? Well, that was the real market for Threadripper, and AMD made AM4 faster than Threadripper anyway. Don't worry, I still have my X399 board.
4. "Relatively high price" because reviewers who make their money making videos complained about it. I paid $219 for mine when a 6600 was $650, never mind a 6800 XT at $1,400 or a 6900 XT at $1,500. What was crazy was that the media reviews were the exact opposite of the experience of users who actually bought the card. And in a world where you have a 120 Hz FreeSync TV, you can enable it with a 6500 XT: the full 45-120 Hz range that the TV supports. That translates to smooth gaming versus 60 Hz.
5. All of that was before the US banned them, but you can go on. Most governments were for Chinese 5G until they were not.
6. AMD did not have to bring 3D V-Cache to their 12- and 16-core CPUs; the community lamented, chastised and demanded it. AMD tried giving us V-Cache on both CCDs, but it was not a tangible benefit. I guess when your game crashes and that message comes up, AMD does nothing with it. I guess when people complained about boot times for AM5, AMD did nothing. I bet AMD gave us HYPR-RX to make gaming worse. I guess the 8700G is not what we have been asking for. But anyhow, you are entitled to your opinion.
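The 45-120 Hz VRR point above works roughly like this: when the frame rate drops below the panel's minimum, the driver repeats each frame so the effective refresh stays inside the window (Low Framerate Compensation). A minimal sketch, with the window bounds as assumed parameters rather than anything from a real driver:

```python
def lfc_refresh(fps, vrr_min=45, vrr_max=120):
    """Pick a refresh rate inside the VRR window by repeating frames
    (a simplified model of Low Framerate Compensation)."""
    if fps >= vrr_max:
        return vrr_max              # capped at the panel's maximum
    mult = 1
    while fps * mult < vrr_min:     # double/triple frames until in range
        mult += 1
    return fps * mult

print(lfc_refresh(100))  # 100 fps -> panel refreshes at 100 Hz
print(lfc_refresh(30))   # 30 fps -> frames doubled, panel runs at 60 Hz
```

Real drivers are more sophisticated (hysteresis, fractional behaviour), but this is why a 45-120 Hz window still gives smooth presentation even below 45 fps.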
 
Joined
May 3, 2018
Messages
2,881 (1.21/day)
Not happy about this. The RX7900XTX is a bit lacking in 4K and apparently I won't have any options anytime soon to remedy that problem...:banghead:
I'm gobsmacked that, with the RDNA4 high end dead and RDNA 5 not out for nearly two years, they are not doing a refresh of the 7900 series, even more so in light of Nvidia's Super refresh and with a 4090 Super back on the table. It would look ridiculous to see an 8700-class card match the 7900 XTX in raster for half the price.
 
Joined
Dec 25, 2020
Messages
6,509 (4.63/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
1. I guess you are in the camp that thinks a 5800X3D is just as fast as a 7900X3D. There is also no comparison between a 7600 and a 5600, but anyway.
2. Radeon VII, really? How many months was that on the market? I should tell you my story about Nvidia disabling features on my card.
3. I love this one. AMD said most X370 boards' BIOS ROMs were too small. You should be glad that community pressure made them deliver a fix for that. They did not stop giving us Threadripper. Do you enjoy modern media? Well, that was the real market for Threadripper, and AMD made AM4 faster than Threadripper anyway. Don't worry, I still have my X399 board.
4. "Relatively high price" because reviewers who make their money making videos complained about it. I paid $219 for mine when a 6600 was $650, never mind a 6800 XT at $1,400 or a 6900 XT at $1,500. What was crazy was that the media reviews were the exact opposite of the experience of users who actually bought the card. And in a world where you have a 120 Hz FreeSync TV, you can enable it with a 6500 XT: the full 45-120 Hz range that the TV supports. That translates to smooth gaming versus 60 Hz.
5. All of that was before the US banned them, but you can go on. Most governments were for Chinese 5G until they were not.
6. AMD did not have to bring 3D V-Cache to their 12- and 16-core CPUs; the community lamented, chastised and demanded it. AMD tried giving us V-Cache on both CCDs, but it was not a tangible benefit. I guess when your game crashes and that message comes up, AMD does nothing with it. I guess when people complained about boot times for AM5, AMD did nothing. I bet AMD gave us HYPR-RX to make gaming worse. I guess the 8700G is not what we have been asking for. But anyhow, you are entitled to your opinion.

1. I never said that, I said AMD didn't introduce any unorthodox technologies since 3D V-Cache. Ryzen is a "safe" design with a very traditional "big core only" architecture.

2. It was their flagship, a $700+ graphics card based on their strongest architecture at the time. It really was better at compute than gaming; no wonder it became the foundation of what is now known as CDNA. 16 GB of HBM2 across four fully enabled stacks, the first GPU to breach the 1 TB/s mark, and the one I had hit 1.25 TB/s effortlessly because it overclocked the HBM well, too. I think a GPU like this should have been supported for more than just about 4 years... yes, of course AMD gets a pass. I assure you, if Nvidia came out tomorrow and said "that's it folks, your 7-year-old 1080 Tis have had enough of a good run, no more drivers for you", the pitchforks would be scorching hot and ready to poke. But then again, AMD has been no stranger to axing hardware they don't want to spend resources on maintaining, like the R9 Fury X before it (the card was what, 5 years on the clock?). And don't get me started on the Vega FE, which I also had.

3. No use sugar-coating it: AMD lied. All it took was Alder Lake crashing their $300+ Ryzen 5 5600X party, with $180 12400Fs outperforming them, for X370 boards to suddenly support everything and CPU prices to crater. TRX40 owners are still waiting, btw. Oh wait, AMD re-entered the HEDT market alone this generation, with Zen 4 Threadrippers at prices not even Intel dared back when it utterly dominated with Core i7 Extreme CPUs that were several times faster than the FX-9590... Zen 4 HEDT starts at $1,499 and goes all the way to $4,999 MSRP. Not a word from the legions of AMD fans calling them the devil or "emergency edition", I see.

4. I won't criticize the pricing much; as I said before, it was an unfortunate market situation (the mid-mining-craze and Covid overlap). But you should be well aware that the alleged limitations regarding maximum refresh rate were always unfounded to begin with. As long as the display on the other end is compatible and the port has enough bandwidth, you should get an image. Now, I must question the usefulness of a GPU such as Navi 24 for 120/144 Hz gaming; unless you're playing games from the mid-2000s, I don't think you're getting frame rates that high on anything like it.

5. That doesn't excuse it, after all, the RTX 4090 D's sole reason for existing is US sanctions on China. It's just about powerful enough to comply with the government's regulation and thus is a legal product to export. I don't think that gamers should be punished and prevented from buying a product because of their nationality, I'd already feel quite slighted to get something nerfed simply because I am not American.

6. Overlapping with point one, the 7900X3D and 7950X3D rely on a software driver for core allocation as only one of the CCDs is equipped with the 3D V-Cache. No tangible benefit because they refused to provide such a chip anyway - why threaten their own HEDT or server business?

If you get messages after your game crashed (it shouldn't crash to begin with), it's because you got a TDR and this is so incredibly common with AMD that they've actually decided to use it as a point for data collection. I can't remember the last time my RTX 4080 crashed and caused a TDR, probably because it hasn't happened since I bought it. Boot times were caused by buggy AGESA, meaning this platform should never have launched in that state to begin with. It was never the DDR5 training, it was never the fact that it's "new", it's just AMD's low level firmware code being horribly broken - as it had traditionally been.

Most of the features that AMD has implemented in their graphics drivers in recent memory, if not all, are clones of existing Nvidia technologies. Look at what they introduced in the latest release: GeForce has supported HAGS, video upscaling, etc. for years now. And overlapping with your initial point, AFMF is exclusive to RDNA 3, not that you'd want to play with frame reprojection on, since it's clearly not generative, and the unchanged frame rate counter rats that out big time.
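The "enough bandwidth" claim in point 4 is easy to sanity-check. A rough sketch, assuming simplified figures (a ~8% reduced-blanking overhead and nominal post-encoding link rates; exact numbers depend on CVT-RB timings and the link encoding in use):

```python
def mode_gbps(width, height, refresh_hz, bpp=24, blanking=1.08):
    """Approximate uncompressed data rate of a video mode in Gbit/s."""
    return width * height * refresh_hz * bpp * blanking / 1e9

# Nominal payload rates after encoding (approximate)
DP14_GBPS = 25.92    # DisplayPort 1.4, HBR3 x4 after 8b/10b
HDMI20_GBPS = 14.4   # HDMI 2.0 after 8b/10b

for name, w, h, hz in [("1080p144", 1920, 1080, 144),
                       ("4K120 8-bit", 3840, 2160, 120)]:
    need = mode_gbps(w, h, hz)
    print(f"{name}: ~{need:.1f} Gbit/s, fits DP 1.4: {need <= DP14_GBPS}")
```

So even 4K120 at 8 bpc just squeezes into DP 1.4 uncompressed; the port, not the GPU model, is what decides whether the mode lights up.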
 
Joined
Jun 2, 2017
Messages
8,969 (3.31/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
1. I guess you are referencing Intel slapping a mobile and a desktop chip together and calling that advancing.
2. I wanted a Radeon VII. I too salivated over the specs. Though you seem to forget that AMD drivers are universal. I had Vega 64s in CrossFire, so what are you saying? I am pretty sure my 5600G still gets driver updates, and I am pretty sure that is Vega.
3. Yep, here we go with the 12400F. This is after 5 different chipsets on AM4 that just about every CPU was compatible with. At least you don't lose 8 lanes if you install an M.2 drive in the top slot. Oh wait, that is 12th Gen; I guess that means Z590 and no PCIe 5.
4. Unfounded? I had an RX 570 before, and I can promise you that a 6500 XT with FreeSync support is much smoother than that. I was one of those people too: I used to argue in the PC store that refresh rate did not matter, until I started playing The Division. I had a 60 Hz panel, and using the machine gun, the aim would go all over the screen. I upgraded to a 32QC, and all of a sudden I could use a scope with the machine gun to make headshots.
5. US sanctions? Who is the government here? You make it seem like there are no geopolitics influencing that. You call it gamers being punished, in a country whose tactics drove online gaming stocks down, and I did not know the entire stack was banned anyway.
6. Here we go with "they rely on software". Show me a piece of PC hardware that does not rely on software to work.

I am so happy that you have had no crashes. You see, I play a modded version of TWWH. When Creative Assembly updates the game, it breaks the mod, and that is what brings up the message. I know they were not perfect releasing a brand-new platform, because they are human, but at least memory support is better now, and the long memory training times only persist if you pick the 2nd EXPO profile, but it doesn't matter.

It is because Nvidia uses propaganda that people feel triggered when you mention that AMD could be competitive and in some cases better. The ad nauseam talk about ray tracing (in the 2 games that supported it) made what we were promised in only one game seem moot. Ashes of the Singularity was one of the first DX12 games and is still one of the most optimized, but of course we want DLSS, and FSR is "always blurry". Video upscaling? You should Google whether Sapphire TriXX supported upscaling across all games. You see, that is where you miss the argument. I have a 7900 XT. I was watching a PC World podcast where they were talking about "bad console ports". I asked in a Super Chat, "I have a 7900X3D/7900XT combo, why do I not have any of these issues?" The response from Gordon was, "Well, if you have the horsepower, it's not an issue for you." Maybe in 3 years, when my 7900 XT is older, I will look into upscaling. As per my previous argument, upscaling makes the most sense at the low end. The thing is that though Nvidia may create, AMD makes it for the masses. AFMF is another FreeSync moment that will improve with time, and people who buy an 8700G will be happy for it.
 
Joined
Dec 25, 2020
Messages
6,509 (4.63/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
1. I guess you are referencing Intel slapping a mobile and a desktop chip together and calling that advancing.
2. I wanted a Radeon VII. I too salivated over the specs. Though you seem to forget that AMD drivers are universal. I had Vega 64s in CrossFire, so what are you saying? I am pretty sure my 5600G still gets driver updates, and I am pretty sure that is Vega.
3. Yep, here we go with the 12400F. This is after 5 different chipsets on AM4 that just about every CPU was compatible with. At least you don't lose 8 lanes if you install an M.2 drive in the top slot. Oh wait, that is 12th Gen; I guess that means Z590 and no PCIe 5.
4. Unfounded? I had an RX 570 before, and I can promise you that a 6500 XT with FreeSync support is much smoother than that. I was one of those people too: I used to argue in the PC store that refresh rate did not matter, until I started playing The Division. I had a 60 Hz panel, and using the machine gun, the aim would go all over the screen. I upgraded to a 32QC, and all of a sudden I could use a scope with the machine gun to make headshots.
5. US sanctions? Who is the government here? You make it seem like there are no geopolitics influencing that. You call it gamers being punished, in a country whose tactics drove online gaming stocks down, and I did not know the entire stack was banned anyway.
6. Here we go with "they rely on software". Show me a piece of PC hardware that does not rely on software to work.

I am so happy that you have had no crashes. You see, I play a modded version of TWWH. When Creative Assembly updates the game, it breaks the mod, and that is what brings up the message. I know they were not perfect releasing a brand-new platform, because they are human, but at least memory support is better now, and the long memory training times only persist if you pick the 2nd EXPO profile, but it doesn't matter.

It is because Nvidia uses propaganda that people feel triggered when you mention that AMD could be competitive and in some cases better. The ad nauseam talk about ray tracing (in the 2 games that supported it) made what we were promised in only one game seem moot. Ashes of the Singularity was one of the first DX12 games and is still one of the most optimized, but of course we want DLSS, and FSR is "always blurry". Video upscaling? You should Google whether Sapphire TriXX supported upscaling across all games. You see, that is where you miss the argument. I have a 7900 XT. I was watching a PC World podcast where they were talking about "bad console ports". I asked in a Super Chat, "I have a 7900X3D/7900XT combo, why do I not have any of these issues?" The response from Gordon was, "Well, if you have the horsepower, it's not an issue for you." Maybe in 3 years, when my 7900 XT is older, I will look into upscaling. As per my previous argument, upscaling makes the most sense at the low end. The thing is that though Nvidia may create, AMD makes it for the masses. AFMF is another FreeSync moment that will improve with time, and people who buy an 8700G will be happy for it.

No, the AMD drivers are not universal. Pre-RDNA drivers are on a separate maintenance branch... and of course a 6500 XT is going to feel better than an RX 570; I'd be surprised if it didn't, the 570 is an ancient relic at this point.




Mate... 12th Gen is for LGA 1700, that means Z690 and Z790... they all have PCIe 5.0 support... DDR5 support... it was just AMD price gouging while Intel had no competition.

Geopolitics aren't relevant to money... Chinese gamers aren't getting cards for free... and there's no propaganda here, you're just... wrong, man.
 
Joined
Sep 19, 2014
Messages
38 (0.01/day)
There is also a good chance that AMD is betting on Nvidia abandoning the mid-range to budget discrete desktop GPU space. This would leave AMD (RDNA4) and Intel (Battlemage) to compete in the sub $500 price bracket.
Hahaha, in your dreams :roll:
Also, it's very stupid to say these kinds of things, because you already know it's not true and not happening.

Nice try, dude. Remove that useless, budget-busting ray tracing and you see why AMD is very comfortable where they are. No one really wants a $1,600 gaming GPU for a personal PC. It's just stupid.
People say no one wants a new $100,000 Corvette, it's just stupid...
But I bought one, just like many others.
If you don't buy a $1,600 GPU, then someone else will.

If someone has only $1,600 in their bank account and spends it all on a $1,600 GPU, then that's stupid.
It's all about money, you know.

Not happy about this. The RX7900XTX is a bit lacking in 4K and apparently I won't have any options anytime soon to remedy that problem...:banghead:
You can buy the new upcoming high-end Nvidia card.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,147 (1.28/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
For me the simple sum is playable vs. non-playable on competing products. Percentages, frame-times, etc. are academic and useful, but ultimately RTX cards, in combination with the software stack, give you playable RT across the stack; I can't say the same for AMD.
 
Joined
Dec 30, 2010
Messages
2,194 (0.43/day)
It's not just the rumours - it is pretty plainly visible that Navi 31 in the RX 7900 series lacks performance. Nvidia's RTX 4090, with a partially disabled AD102, is 25% faster in ordinary raster and a whopping 60-65% faster in ray-traced gaming. AMD failed miserably with the chiplet approach; what were they thinking?

AMD opted for chiplets due to the ever-rising cost of building monolithic GPUs the way Nvidia is doing. On top of that, with monolithic designs you risk having more faulty chips per wafer, driving up costs because you can sell fewer of them.
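That yield point follows from basic defect statistics: the chance a die is defect-free falls roughly exponentially with its area. A toy Poisson model as a sketch (the defect density and die sizes here are illustrative assumptions, not foundry figures):

```python
import math

def die_yield(area_cm2, d0=0.1):
    """Poisson yield model: fraction of defect-free dies at d0 defects/cm^2."""
    return math.exp(-d0 * area_cm2)

# Illustrative sizes: ~600 mm^2 monolithic die vs ~300 mm^2 graphics chiplet
print(f"600 mm^2 die yield: {die_yield(6.0):.0%}")   # ~55%
print(f"300 mm^2 die yield: {die_yield(3.0):.0%}")   # ~74%
```

Halving the die area does more than halve the scrap, which is the whole appeal of splitting a big GPU into chiplets.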

The RDNA3 series might not have been that impressive; it will take a few generations to fully uncork the approach's potential. It was just the start.

The good margins are in the midrange of GPUs, not in $2,000 GPUs. Failed? I think you only half understand what they are doing.

CDNA as a whole (i.e., the MI300X) is expected to skyrocket in terms of sales.
 
Joined
Jun 2, 2017
Messages
8,969 (3.31/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
No, the AMD drivers are not universal. Pre-RDNA drivers are on a separate maintenance branch... and of course a 6500 XT is going to feel better than an RX 570; I'd be surprised if it didn't, the 570 is an ancient relic at this point.




Mate... 12th Gen is for LGA 1700, that means Z690 and Z790... they all have PCIe 5.0 support... DDR5 support... it was just AMD price gouging while Intel had no competition.

Geopolitics aren't relevant to money... Chinese gamers aren't getting cards for free... and there's no propaganda here, you're just... wrong, man.
Of course you are going to use the 3 months when AMD did not give people the monthly update they were used to. Did those other cards suddenly fail? Yes, you claimed that only RDNA3 supported AFMF, and look, my 5600 gets it as well.

Of course a 6500 XT is going to feel better than an RX 570, when people all claimed that the RX 570 was the better card.

Forgive me for forgetting that Intel likes to release 2 main chipsets a year. I stand corrected; forgive me.

If you think geopolitics and money are not connected... The US did not ban the 4080, 4070 or 4060. F me, they did not even ban the 3090. Just the 4090. You can go ahead and blindly trust that China is no different from the West. I guess the US only has carrier strike groups to waste resources on in that part of the world.

I saw that story too, and I still get updates for my 5600G, but go on. They are releasing new Vega SKUs anyway, but we can go on.
 
Joined
Dec 12, 2016
Messages
1,751 (0.61/day)
Hahaha, in your dreams :roll:
Also, it's very stupid to say these kinds of things, because you already know it's not true and not happening.


People say no one wants a new $100,000 Corvette, it's just stupid...
But I bought one, just like many others.
If you don't buy a $1,600 GPU, then someone else will.

If someone has only $1,600 in their bank account and spends it all on a $1,600 GPU, then that's stupid.
It's all about money, you know.


You can buy the new upcoming high-end Nvidia card.
It's already happening, with the MX mobile series gone and no 4050. Nvidia is packing up its budget and midrange as iGPUs and additional players enter that space, but mostly because fab allocation will go to AI compute GPUs. Again, the 5000 series might start with the 5070 at $500 or higher. Similar reasoning is behind the rumors that AMD is abandoning the high end: increased competition and wafer allocation to AI compute GPUs.

Sometimes you have to reprioritize your product offerings based on what's hot and on capacity-constrained manufacturing.
 
Joined
Jun 29, 2023
Messages
543 (1.11/day)
Location
Spain
System Name Gungnir
Processor Ryzen 5 7600X
Motherboard ASUS TUF B650M-PLUS WIFI
Cooling Thermalright Peerless Assasin 120 SE Black
Memory 2x16GB DDR5 CL36 5600MHz
Video Card(s) XFX RX 6800XT Merc 319
Storage 1TB WD SN770 | 2TB WD Blue SATA III SSD
Display(s) 1440p 165Hz VA
Case Lian Li Lancool 215
Audio Device(s) Beyerdynamic DT 770 PRO 80Ohm
Power Supply EVGA SuperNOVA 750W 80 Plus Gold
Mouse Logitech G Pro Wireless
Keyboard Keychron V6
VR HMD The bane of my existence (Oculus Quest 2)
I guess you are right, though a video can be many things. The thing is, of all the YouTubers, I trust Wendell at Level1 more than most.
I mean, I trust Wendell too, but Wendell was simply the person going to AMD; he is not a private investigator or an undercover detective. He was likely invited by AMD to boost PR. It would be the same if it were GN, HUB, LTT, JTC, W1zzard or anyone else. It's not that these people are paid to put on a good show, but rather that AMD is making sure AMD is putting on a good show.
 
Joined
Jan 14, 2019
Messages
12,167 (5.74/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
There are three problems with this theory. The first is that AMD doesn't need multiple GCDs to reach the high-end market; RDNA3 is evidence of that, with one GCD and multiple cache dies. The second is that it assumes AMD completely bungled their ability to put multiple GCDs on a single package for the second generation in a row. It's nonsense that the Radeon group doesn't have the resources; AMD has been pouring money into them. I'd assume that after the first generation of failure they have a general idea of the bandwidth required for multiple GCDs. At the very least, if they couldn't reach the required bandwidth number, I'd expect them to further modularize their GPU, stack cache, etc. There are plenty of options for AMD to reach the high end while maintaining RDNA3's small die size and without the use of multiple GCDs.

Most of all, though, economically it makes zero sense for AMD to stop at the mid-range. AMD can add or subtract chiplets from a design at near-linear cost. This is particularly pertinent because, for Nvidia, cost rises superlinearly at the high end, due to the fact that yield drops drastically at the size of a high-end GPU. By extension, AMD has a large cost advantage in the high end (not that it really needs it, given Nvidia's margins have always been large on those high-end cards). AMD might not match Nvidia's top chip with a single GCD, but at the very least I'd expect them to stack up enough chiplets to have a competitive high-end product, simply because that's what stands to make them the most money. There's really no reason for AMD to simply leave money on the table.
I've just read an article somewhere speculating that it's not about the cost to AMD, but about the cost to the consumer. Theoretically, a large MCM design on RDNA 4 would be way too expensive for the performance level it targets. Who would be interested in a 4090-beater if it cost $2,000-2,500? Personally, I take this theory with a pinch of salt, considering that the 7900 XTX manages to match the 4080 in raster performance while being considerably cheaper, so why couldn't AMD pull off the same on RDNA 4?
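The near-linear-cost argument quoted above can be sketched numerically: combine a Poisson yield model with dies-per-wafer, and two small dies come out cheaper than one big one. Every number here (wafer cost, defect density, die sizes) is an illustrative assumption, not real foundry data:

```python
import math

WAFER_COST = 17000               # assumed $ per leading-edge wafer
WAFER_AREA = math.pi * 15 ** 2   # 300 mm wafer in cm^2, ignoring edge loss

def cost_per_good_die(area_cm2, d0=0.1):
    """$ per working die under a Poisson yield model."""
    dies = WAFER_AREA / area_cm2
    good = dies * math.exp(-d0 * area_cm2)   # yield falls with area
    return WAFER_COST / good

two_chiplets = 2 * cost_per_good_die(3.0)   # two ~300 mm^2 GCDs
one_big_die = cost_per_good_die(6.0)        # one ~600 mm^2 monolithic die
print(f"2 chiplets: ${two_chiplets:.0f} vs monolithic: ${one_big_die:.0f}")
```

Packaging and interconnect costs eat into that gap, which is part of why the advantage is "near-linear" rather than free, and why the consumer-price question above still matters.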
 
Joined
Dec 12, 2016
Messages
1,751 (0.61/day)
I've just read an article somewhere speculating that it's not about the cost to AMD, but about the cost to the consumer. Theoretically, a large MCM design on RDNA 4 would be way too expensive for the performance level it targets. Who would be interested in a 4090-beater if it cost $2,000-2,500? Personally, I take this theory with a pinch of salt, considering that the 7900 XTX manages to match the 4080 in raster performance while being considerably cheaper, so why couldn't AMD pull off the same on RDNA 4?
Some of this is complicated by fab allocation and the AI surge.
 
Joined
Aug 26, 2021
Messages
369 (0.32/day)
There are three problems with this theory. The first is that AMD doesn't need multiple GCDs to reach the high-end market; RDNA3 is evidence of that, with one GCD and multiple cache dies. The second is that it assumes AMD completely bungled their ability to put multiple GCDs on a single package for the second generation in a row. It's nonsense that the Radeon group doesn't have the resources; AMD has been pouring money into them. I'd assume that after the first generation of failure they have a general idea of the bandwidth required for multiple GCDs. At the very least, if they couldn't reach the required bandwidth number, I'd expect them to further modularize their GPU, stack cache, etc. There are plenty of options for AMD to reach the high end while maintaining RDNA3's small die size and without the use of multiple GCDs.

Most of all, though, economically it makes zero sense for AMD to stop at the mid-range. AMD can add or subtract chiplets from a design at near-linear cost. This is particularly pertinent because, for Nvidia, cost rises superlinearly at the high end, due to the fact that yield drops drastically at the size of a high-end GPU. By extension, AMD has a large cost advantage in the high end (not that it really needs it, given Nvidia's margins have always been large on those high-end cards). AMD might not match Nvidia's top chip with a single GCD, but at the very least I'd expect them to stack up enough chiplets to have a competitive high-end product, simply because that's what stands to make them the most money. There's really no reason for AMD to simply leave money on the table.
Why even bother with a chiplet-based approach if they don't intend to go down the multi-GCD route and still suffer the latency penalty? They had a chance to sell for less because of the chiplet approach and still managed to screw that up because of greed. I don't mind buying a slower product if it's priced right, but yeah. And let's not get into the 7900 XTX, a product so broken in some games it has to be a hardware fault. I returned mine and went back to my 6800 XT, and I'm glad I did, because it was the right move as it turns out.
Compute cards have multiple compute dies because they aren't as latency sensitive; trust me, if AMD had it working for graphics cards, it would be released. It's been widely known that the CPU division has had the majority of the R&D allocation, and rightly so; it was where AMD could shine against a sleeping Intel.

Don't know if they will implement it or not in their consumer cards, but their Instinct MI300X already has 2x GCDs, and it's working and being treated by software as a single GPU. I know that there is a difference between that arch and RDNA, but it still seems they are making good progress on it.

If they could make a GPU with two GCDs, each performing like a 7900 XTX, with MCDs on them, and improve RT performance, then I would sell my 7900 XTX and buy it instantly.

EDIT
Did a read up on that Instinct card:


It uses XCDs, has eight of them, and they are all exposed as a single GPU.
The question is whether that translates to the consumer market and a GPU used for gaming; we'll see, I guess.
It's easier for compute, which is less latency sensitive.
 
Joined
Mar 29, 2014
Messages
448 (0.12/day)
Hahahah, in your dreams :roll:
Also, it's very stupid to say these kinds of things, because you already know it's not true and not happening.


People say no one wants a new $100,000 Corvette. That's just stupid...
But I bought one, just like many others.
If you don't buy a $1,600 GPU, then someone else will.

If someone has only $1,600 in their bank account and spends all of it on a $1,600 GPU, then that's stupid.
It's all about money, you know.


You can buy the new upcoming high-end Nvidia.
Corvette. Very good analogy. Really one of the biggest POS road cars out there. Did ok on the track but they are really useless to drive around in. Miserable even.
 
Joined
Jun 2, 2017
Messages
8,969 (3.31/day)
I mean, I trust Wendell too, but Wendell was simply the person going to AMD; he is not a private investigator or an undercover detective, and he was likely invited by AMD to boost PR. It would be the same if it were GN, HUB, LTT, JTC, W1zzard, or anyone else. It's not that these people are paid to put on a good show, but rather that AMD makes sure AMD is putting on a good show.
For me, it was the obvious excitement in the conversations about products between him and the employees he spoke to.

Why even bother with a chiplet-based approach if they don't intend to go down the multi-GCD route and still suffer the latency penalty? They had a chance to sell for less because of the chiplet approach and still managed to screw that up because of greed. I don't mind buying a slower product if it's priced right, but yeah. And let's not get into the 7900 XTX, a product so broken in some games it has to be a hardware fault. I returned mine and went back to my 6800 XT, and I'm glad I did, because it was the right move as it turns out.
Compute cards have multiple compute dies because they aren't as latency sensitive; trust me, if AMD had it working for graphics cards, it would be released. It's been widely known that the CPU division has had the majority of the R&D allocation, and rightly so; it was where AMD could shine against a sleeping Intel.


It's easier for compute, which is less latency sensitive.
I want to know which games you are talking about. I have a 7900 XT and have had no issues.
 
Joined
Feb 21, 2006
Messages
2,210 (0.32/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Ca.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.10.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
Why even bother with a chiplet-based approach if they don't intend to go down the multi-GCD route and still suffer the latency penalty? They had a chance to sell for less because of the chiplet approach and still managed to screw that up because of greed. I don't mind buying a slower product if it's priced right, but yeah. And let's not get into the 7900 XTX, a product so broken in some games it has to be a hardware fault. I returned mine and went back to my 6800 XT, and I'm glad I did, because it was the right move as it turns out.
Compute cards have multiple compute dies because they aren't as latency sensitive; trust me, if AMD had it working for graphics cards, it would be released. It's been widely known that the CPU division has had the majority of the R&D allocation, and rightly so; it was where AMD could shine against a sleeping Intel.


It's easier for compute, which is less latency sensitive.
lol, I went from a 6800 XT to a 7900 XTX; the broken experience you had, I didn't see it.

"screw that up because of greed"

It's like you guys don't understand that AMD has shareholders, and they demand profit. They are not pricing it to do you a favor; shareholders > consumers when it comes to priority. They are a business at the end of the day.

Team Green has 4090s for $2,000; are they also greedy?
 
Joined
Sep 10, 2015
Messages
526 (0.16/day)
System Name My Addiction
Processor AMD Ryzen 7950X3D
Motherboard ASRock B650E PG-ITX WiFi
Cooling Alphacool Core Ocean T38 AIO 240mm
Memory G.Skill 32GB 6000MHz
Video Card(s) Sapphire Pulse 7900XTX
Storage Some SSDs
Display(s) 42" Samsung TV + 22" Dell monitor vertically
Case Lian Li A4-H2O
Audio Device(s) Denon + Bose
Power Supply Corsair SF750
Mouse Logitech
Keyboard Glorious
VR HMD None
Software Win 10
Benchmark Scores None taken
You can buy the new upcoming high-end Nvidia.

That company is not an option.



you do have options: drop settings or use upscaling.

But to be honest, if one is looking to push 4K, the 4090 is a better option, but it will cost you.

nVidia is not an option here so I probably have to do that upSCAMing thing...
 
Joined
Jul 7, 2019
Messages
905 (0.47/day)
I read this as more that RDNA4's potential would allow for a strong multi-chip GPU, if they can further improve the links between cores. RDNA3 is a worthwhile attempt that showed it can work; they need to mature it more, and if the cores in RDNA4 are as good as claimed, pairing two of them for even more performance would be a viable option. But the weak links have been the interconnects and the programming needed to make the most of them, which AMD is probably still working on.

That said, rumors claim RDNA5 is going back to a monolithic chip, or a hybrid monolithic+chiplet, as it's showing more promise in the near term, while RDNA4 is a second attempt to refine the multi-GPU chiplet route until they can improve it for RDNA6 or 7. At the same time, RDNA3 is showing very good performance in AI workloads, so it's possible that their approach there will quickly pay dividends in the rising AI sector (more so since CDNA3 was based on RDNA3 work), and if RDNA4 is just refined RDNA3, it could lead to even better AI performance.

If AMD is also leveraging two rival design teams internally for GPU development, like they do for Ryzen (to ensure they always have options), then I expect we'll see alternating periods of performance uplift (monolithic or semi-monolithic) and efficiency gains (chiplet-based) until the chiplet method becomes good enough to compete on both while saving cost.
 
Joined
Jul 13, 2016
Messages
3,243 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
I've just read an article somewhere that speculates that it's not about the cost to AMD, but about the cost to the consumer. Theoretically, a large MCM design on RDNA 4 would be way too expensive for the performance level it targets. Who would be interested in a 4090-beater if it costs just shy of $2,000-2,500? Personally, I take this theory with a pinch of salt, considering that the 7900 XTX manages to match the 4080 in raster performance while being considerably cheaper, so why couldn't AMD pull off the same on RDNA 4?

Me, for one, given I use a 4090 for AI, where I could use a much faster card, and I'm sure plenty of other people would be interested as well. $2,000-2,500 is well within the range Nvidia has charged for prosumer parts with their top-end die in the past. Factoring in inflation, that price range would be reasonable for a prosumer card.

Why even bother with a chiplet-based approach if they don't intend to go down the multi-GCD route and still suffer the latency penalty? They had a chance to sell for less because of the chiplet approach and still managed to screw that up because of greed. I don't mind buying a slower product if it's priced right, but yeah.

RDNA3 actually improves memory subsystem latency over RDNA2: https://chipsandcheese.com/2023/01/07/microbenchmarking-amds-rdna-3-graphics-architecture/

There are other advantages to chiplets, like modularization, increased yields, reduced costs, vastly improved binning, scalability, and design simplicity, but I assume that AMD 100% intended to have multiple GCDs when it first set out to design RDNA3. AMD explicitly stated that it was unable to make it work due to bandwidth restrictions, which means they put significant effort into trying to make it work in order to find that out.

Maybe latency would be worse if AMD did have multiple GCDs, or maybe it would be possible for them to keep data from hopping between dies too often. In any case, latency is not something that has regressed compared to RDNA2.


And let's not get into the 7900 XTX, a product so broken in some games it has to be a hardware fault. I returned mine and went back to my 6800 XT, and I'm glad I did, because it was the right move as it turns out.

I think perhaps you were just unlucky and got a defective product or had driver issues. This is not something that has been reported as a widespread issue.

That said, rumors claim RDNA5 is going back to a monolithic chip, or a hybrid monolithic+chiplet

I have never heard this rumor, and likely for good reason; it's the most nonsensical one I've read to date. Going back to monolithic would require them to re-incorporate the MCDs back into the GCD, which makes no sense to do given it increases cost, decreases yield, decreases flexibility, and brings a whole host of other negatives. MI300 would be impossible on such a design, and AMD would again have to make different tapeouts for different products, as opposed to a chiplet-based design where AMD makes two tapeouts: one for the MCD and one for the GCD.

Hybrid monolithic? If you have one big chip and smaller chips working together, that's just a chiplet-based architecture. You can't have a hybrid between a chiplet architecture and a monolithic architecture, as they are mutually exclusive. Monolithic implies one primary chip carries out all the functions of the architecture, whereas chiplet splits those functions between multiple chips installed on an interposer. In other words, you either have one chip (monolithic) or more than one (chiplets). There are products with multiple monolithic CPUs/GPUs on the PCB, but those are just monolithic chips in a group, not a hybrid or chiplets. Monolithic and chiplet refer to the architecture of the silicon product (CPU/GPU), not the grouping. You wouldn't call SLI GPUs or dual-GPU cards a chiplet architecture.
 
Joined
Dec 25, 2020
Messages
6,509 (4.63/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
lol, I went from a 6800 XT to a 7900 XTX; the broken experience you had, I didn't see it.

"screw that up because of greed"

It's like you guys don't understand that AMD has shareholders, and they demand profit. They are not pricing it to do you a favor; shareholders > consumers when it comes to priority. They are a business at the end of the day.

Team Green has 4090s for $2,000; are they also greedy?

No, AMD is our friend, the company is just wonderful, everyone is happy and they really care about the small people, you meanie. nGreedia is not an option!
 
Joined
Feb 21, 2006
Messages
2,210 (0.32/day)
Location
Toronto, Ontario
No, AMD is our friend, the company is just wonderful, everyone is happy and they really care about the small people, you meanie. nGreedia is not an option!
A lot of people in forums, not just this one, seem to either not understand or skip over the business side of both of these companies.
  1. The enterprise market means more to them than the consumer market.
  2. Shareholders take priority over consumers.
  3. Profit > feeling good or doing the right thing.
Pay more attention to what they do and how they operate, and less to AMD vs. Nvidia.
 
Joined
Dec 25, 2020
Messages
6,509 (4.63/day)
Location
São Paulo, Brazil
A lot of people in forums, not just this one, seem to either not understand or skip over the business side of both of these companies.
  1. The enterprise market means more to them than the consumer market.
  2. Shareholders take priority over consumers.
  3. Profit > feeling good or doing the right thing.
Pay more attention to what they do and how they operate, and less to AMD vs. Nvidia.

That's what I've been trying to say all along :D
 
Joined
Aug 25, 2021
Messages
1,140 (0.98/day)
I have never heard this rumor, and likely for good reason; it's the most nonsensical one I've read to date. Going back to monolithic would require them to re-incorporate the MCDs back into the GCD, which makes no sense to do given it increases cost, decreases yield, decreases flexibility, and brings a whole host of other negatives. MI300 would be impossible on such a design, and AMD would again have to make different tapeouts for different products, as opposed to a chiplet-based design where AMD makes two tapeouts: one for the MCD and one for the GCD.
Agreed. Chiplets are the way, on both CDNA and RDNA; there is no going back. At the end of the day, chiplets are the first stage of preparation for High-NA EUV lithography, which will halve the current reticle limit to just above ~400 mm² per chiplet. Nvidia is squeezing out the last years of profit on monolithic designs baked on current EUV scanners. They too will be forced to move to chiplets once High-NA EUV production starts on dies below 2 nm in 2026/2027.
 