Sunday, August 6th 2023

AMD Retreating from Enthusiast Graphics Segment with RDNA4?

AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes at the high end with NVIDIA, but rather one that competes in the performance segment and below. It's possible AMD can't justify the cost of developing high-end GPUs without being able to push enough volume over the product lifecycle. The company's "Navi 21" GPU benefited from the cryptocurrency-mining swell, but just like NVIDIA, the company can no longer move enough GPUs at the high end.

With RDNA4, the company will focus on the market segments that sell the most, which would be the x700 series and below. This generation would be essentially similar to the RDNA1-powered RX 5000 series, which did enough to stir things up in NVIDIA's lineup and trigger the introduction of the RTX 20 SUPER series. The generation after could see RDNA4 square off against NVIDIA's next-generation GPUs and, hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz
Add your own comment

363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?

#151
Space Lynx
Astronaut
Vayra86Ehh yeah. Next you're going to say the 7900XT took you to Mars. It's not only one game, it's no single game.
Let's not exaggerate and try to see things for what they are. There is nearly 25% between the XT and the XTX :) There are no OCs on a 7900 XT for more than 15% perf, and even then you're doing something special.
Can't find the TechSpot article right now, but this is it here: the 4090 gets beaten by 10-15 fps in Valhalla by the XTX, and my XT OC'd beats a reference XTX in this game... so you do the math.

Posted on Reply
#152
Vayra86
Space LynxCan't find the TechSpot article right now, but this is it here: the 4090 gets beaten by 10-15 fps in Valhalla by the XTX, and my XT OC'd beats a reference XTX in this game... so you do the math.

No, I don't care, and it's irrelevant, plus YouTuber bullshit. There's 1.5 tiers between the 7900 XT and the 4090. We all know the top end runs into CPU bottlenecks, and this is an eternal Ubisoft open-world shitstorm, so you do the math ;) You're 'beating' an XTX by extrapolating your setup to what's on a YouTube video in a single game that is obviously not limited by the GPU.
Posted on Reply
#153
Space Lynx
Astronaut
Vayra86No, I don't care, and it's irrelevant, plus YouTuber bullshit
Well, it doesn't beat the 4090 in any other game... Valhalla is just a heavily AMD-favored game. Your refusal to admit it beats it in even one game, though, is troubling.
Posted on Reply
#154
Vayra86
Space LynxWell, it doesn't beat the 4090 in any other game... Valhalla is just a heavily AMD-favored game. Your refusal to admit it beats it in even one game, though, is troubling.
It doesn't. The 4090 is limited by the game. This is like starting Minesweeper and saying your Intel iGPU is as fast as that 4090. Come on, man.
Posted on Reply
#155
enb141
G777Your issues may be limited to the RX 6400, which is a pretty middling card. Even something like a RX 6600 would've given you a much better impression.
My old 1050 Ti didn't have those issues and was way slower; my 1030 (my backup card) doesn't have those issues either.

So saying that the 6600 won't have those issues is nuts.
Posted on Reply
#156
ARF
AusWolfWhat are you talking about? The 6600 is a perfectly fine 1080p card.
Cards with 8 GB are really bad, and everyone tells you this. Listen:



The RX 6600 is a low-end, poorly performing card for today's games, only good for yesterday's games, which require less VRAM, maybe 4 or 6 GB.
Posted on Reply
#157
Dr. Dro
Space Lynxnot sure how you can logically say that, when my 7900 XT goes toe to toe with your 4080 for half the price (if you factor in the most recent Prime Day sales). 14-19% fps gains since launch, drivers that are rock solid (for me anyway), all my games smooth as butter. Considering the price I paid, I can't complain, and kudos to AMD for the driver improvements; I expect more will come.

www.techspot.com/review/2717-amd-radeon-7900-xt-again/
Until it doesn't. And that's fundamentally the problem with AMD's GPUs. By buying a Radeon, you forgo your right to the front seat. Nvidia currently supports - and supports well - all of the technologies that make a modern graphics card what it is. By owning a Radeon, you give up on each and every one of those. Case in point: the hyperfixation AMD fans have on legacy raster graphics should make it painfully obvious that Radeon is lacking in the other departments. Being 2% faster than a reference 4080 at a 10% higher power target in W1zzard's review suite can't even be called a win for the 7900 XTX.

The hardware just isn't up to snuff: you don't have access to matrix multiplication units; the hardware video encoder, while no longer completely awful, doesn't support full-chroma or 4:2:2 video, which hinders its usefulness for video production since the GPU is incompatible with the high-quality codecs used by modern cameras; you lose out on practically all of the niceties and eye candy that Nvidia has developed and maintained over the years; you relegate yourself to last-gen ray tracing performance... and if we go by MSRP, congrats, you got 200 bucks off a 20-24 GB GPU that can't run anything that would make that video memory worthwhile. In the meantime, Nvidia has figured out how to do on-the-fly shader execution reordering, and even has an early implementation of motion interpolation, which, while it increases latency, can be countered somewhat with Nvidia Reflex - well, I promise I won't tell anyone about the mess that Radeon's Anti-Lag thing is. Oops, I guess I did :oops:

Then there's the other thing: you got 19% more fps since launch, and that's pretty great! The problem is, Nvidia is also constantly improving its own software's performance. Reviews are best referenced when the hardware is closer to its launch, or when you can find a newer review with newer software - for example, I use W1zz's Taichi White XTX review as my reference for that reason.

At the end of the day, what matters is that you and you alone are happy. But if you carefully evaluate my reasoning, you'll see that for all the things I get - the difference in MSRP, those $300 that separate a 7900 XT from a 4080, accounting for all the performance gains, the far richer feature set, the constant stream of certified drivers plus the Studio drivers for content creation that are made available to all RTX GPUs - it all adds up enough for me, personally, to lean heavily towards Nvidia's offering. Strictly as a Windows user, anyway... Linux is the green team's Achilles' heel, mostly because everything that makes a GeForce deliver the experience it can is closed source.
Posted on Reply
#158
pR0m3tH3u$
Maybe AMD should lower the prices and sell their XTX for $799 MSRP instead of $999... then they would sell more.
Posted on Reply
#159
Tek-Check
rv8000It's the card to buy at its price range imo, unless you really need RT for some reason.
I don't. First pure raster, good hardware, and better prices; then software perks. Once Nvidia gets this combination right, I will buy an Nvidia card again.
Posted on Reply
#160
Dr. Dro
Tek-CheckI don't. First pure raster, good hardware and better prices, then software perks. Once Nvidia gets this combination right, I will buy Nvidia card again.
It'll never happen. It's the opposite of the direction the market is headed. We achieved enough raster performance back with Pascal, and AMD caught up with RDNA; you'll find the GTX 1080 can still run practically any game exceptionally well if you leave RT, modern accurate lighting and occlusion algorithms, high-precision soft shadows, etc. - you know, the newer technologies - off. And I'll double down on my point:


This dude ran a 2023 AAA test battery on the vanilla 1070, which has slower 8 Gbps GDDR5 (reducing memory bandwidth from 320 to 256 GB/s) and has 25% of GP104's SMs disabled (15 of 20 units present). The newest games with more sophisticated rendering techniques only begin to drop to a "passable" rank once you get to Cyberpunk 2077. Warzone, Days Gone, God of War, Apex... they're all highly playable even on this gimped card from 2016.
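For what it's worth, those bandwidth figures follow directly from bus width times per-pin data rate; a quick sketch to check the arithmetic (the 256-bit bus width is GP104's spec, not stated in the post above):

```python
def gddr_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbit/s) / 8 bits-per-byte."""
    return bus_width_bits * data_rate_gbps / 8

# GTX 1070: 256-bit bus, 8 Gbps GDDR5
print(gddr_bandwidth_gbs(256, 8))   # -> 256.0
# GTX 1080: 256-bit bus, 10 Gbps GDDR5X
print(gddr_bandwidth_gbs(256, 10))  # -> 320.0
```

So the 1070's deficit versus the 1080 comes purely from the memory's data rate, not a narrower bus.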
Posted on Reply
#161
ARF
pR0m3tH3u$Maybe AMD should lower the prices and sell their XTX for $799 MSRP instead of $999... then they would sell more.
I wouldn't vote against. That'd definitely be a welcome change.
Posted on Reply
#162
AusWolf
ARFCards with 8 GB are really bad, and everyone tells you this. Listen:



The RX 6600 is a low-end, poorly performing card for today's games, only good for yesterday's games, which require less VRAM, maybe 4 or 6 GB.
Yeah, basically the whole internet is loudly claiming that 8 GB is crap, but I haven't run into a single situation where it's really a limiting factor at 1080p. It's like everybody telling me the sky is red and the grass is purple, and I'm the idiot for not believing it.

Similarly, I haven't run into any situation where my 6750 XT can't deliver a stable 60 FPS while using only half of its power limit. Yes, I have other 8 GB (and even 4 GB) GPUs, and they're fine.
Posted on Reply
#163
bug
ARF200$ is RX 6600 8GB, 500$ is RX 6800 XT three ! ! years after its release.
200$ is RTX 3050 8GB, 500$ is RTX 3070 Ti 8GB more than two !! years after its release.

What are you going to do with these cards ? Slide show "gaming" at 1080p? :banghead:

I don't think the 1000+$ cards must go, actually the opposite - everyone should focus on them and try to buy only them. Instead of upgrading every year or two, just buy that 1000$ beast and stay with it for the next five-seven ! ! years with ease.
Yes, but by increasing competition, I would expect more capable cards to be squeezed in that price range.
Plus, Nvidia's $400 4060 Ti is really a $200-250 card; look at the PCB pictures. AMD is probably no better.
Posted on Reply
#164
Dan.G
pR0m3tH3u$Maybe AMD should lower the prices and sell their XTX for $799 MSRP instead of $999... then they would sell more.
They might. But a lot of people will still buy nVidia for features and better drivers.
nVidia is even working on neural textures to reduce VRAM usage / improve texture quality.
AMD really needs to come up with something spectacular, if they want more market share.
Example: you're living in Central / Western Europe, you earn 2500 Euros / month, and rent + food gets you close to 1500 Euros. You still have 1000 Euros for other expenses. You can easily save up for a 4070 in 2 months... It's not the end of the world, and nVidia KNOWS that! That's why nVidia has more market share: people can still afford their products.
It's Apple reloaded: "Apple is expensive!" but I see that 1 out of 4 phones is an iPhone, where I live (even older generations).
It is what it is... People want "the best of the best" of everything: phone, car, wife / husband. :laugh: But you should be aware that you can't always have the best of the best. Can't afford the RTX 4090? Go for the RTX 4080 instead. Never understood the need to have the highest level of performance - you rarely need it. I know a guy that only plays CS: GO and LoL on a RTX 4070... :wtf: Hell, even my RTX 4060 is overkill for WoW (got it mostly for Warcraft 3 Reforged, Diablo 4 and God of War Ragnarök).
Maybe it's time to read more and play less... Just saying...
Posted on Reply
#165
enb141
Dan.GThey might. But a lot of people will still buy nVidia for features and better drivers.
nVidia is even working on neural textures to reduce VRAM usage / improve texture quality.
AMD really needs to come up with something spectacular, if they want more market share.
Example: you're living in Central / Western Europe, you earn 2500 Euros / month, and rent + food gets you close to 1500 Euros. You still have 1000 Euros for other expenses. You can easily save up for a 4070 in 2 months... It's not the end of the world, and nVidia KNOWS that! That's why nVidia has more market share: people can still afford their products.
It's Apple reloaded: "Apple is expensive!" but I see that 1 out of 4 phones is an iPhone, where I live (even older generations).
It is what it is... People want "the best of the best" of everything: phone, car, wife / husband. :laugh: But you should be aware that you can't always have the best of the best. Can't afford the RTX 4090? Go for the RTX 4080 instead. Never understood the need to have the highest level of performance - you rarely need it. I know a guy that only plays CS: GO and LoL on a RTX 4070... :wtf: Hell, even my RTX 4060 is overkill for WoW (got it mostly for Warcraft 3 Reforged, Diablo 4 and God of War Ragnarök).
Maybe it's time to read more and play less... Just saying...
I mostly play LoL. I have a 3070 Ti and a 13th-gen Core i9, and I play at 4K. If I open Discord, I get random FPS drops. So just because we play a game that's supposed to run on a microwave doesn't mean it can run on anything.

Before that I had a 1070 with an 8th-gen Core i7; that machine wasn't able to play LoL at 4K without drops. I tried a 3060 with 12 GB VRAM, and it was way better, but still not perfect. So anything higher than 1080p will require a good video card.
Posted on Reply
#166
Assimilator
Dr. DroIt'll never happen. It's the opposite of the direction the market is headed. We achieved enough raster performance back with Pascal, and AMD caught up with RDNA; you'll find the GTX 1080 can still run practically any game exceptionally well if you leave RT, modern accurate lighting and occlusion algorithms, high-precision soft shadows, etc. - you know, the newer technologies - off. And I'll double down on my point:


This dude ran a 2023 AAA test battery on the vanilla 1070, which has slower 8 Gbps GDDR5 (reducing memory bandwidth from 320 to 256 GB/s) and has 25% of GP104's SMs disabled (15 of 20 units present). The newest games with more sophisticated rendering techniques only begin to drop to a "passable" rank once you get to Cyberpunk 2077. Warzone, Days Gone, God of War, Apex... they're all highly playable even on this gimped card from 2016.
Amen.

So many so-called technology enthusiasts simply don't understand that rasterisation is dead. The fact that games, even new ones, still use it is entirely down to the fact that the console GPUs are simply not capable of acceptable RT performance. Assuming AMD manages to mostly address that shortcoming in the next console generation (2027-2028 timeline), we will then finally see the end of rasterisation as the primary graphics rendering technology.
Posted on Reply
#167
Dan.G
enb141If I open Discord, I get random FPS drops
Discord is known for causing performance issues on any system, regardless of hardware - it's not graphics related.
Posted on Reply
#168
rv8000
AssimilatorAmen.

So many so-called technology enthusiasts simply don't understand that rasterisation is dead. The fact that games, even new ones, still use it is entirely down to the fact that the console GPUs are simply not capable of acceptable RT performance. Assuming AMD manages to mostly address that shortcoming in the next console generation (2027-2028 timeline), we will then finally see the end of rasterisation as the primary graphics rendering technology.
Until there's a GPU and engine capable of full path tracing at 60 fps minimum, rasterization and/or hybrid rendering will never be dead. Unless either company can magically quintuple RT performance gen-to-gen, we're years away from that being any sort of reality.
Posted on Reply
#169
enb141
Dan.GDiscord is known for causing performance issues, on any system, regardless - not graphics related.
When I had my 1070, it was worse; now it's much less of an issue. It's still there, but at least it's not as annoying as it was with my previous setup.
Posted on Reply
#170
Tek-Check
pR0m3tH3u$Maybe AMD should lower the prices and sell their XTX for $799 MSRP instead of $999... then they would sell more.
You are asking AMD to sell you their top card for the price of a 4070 Ti. It's a joke.
Perhaps you could ask Nvidia to sell you a 4080 for $850, and then ask AMD to sell the 7900 XTX for $799.
Posted on Reply
#171
TheoneandonlyMrK
Dan.GThey might. But a lot of people will still buy nVidia for features and better drivers.
nVidia is even working on neural textures to reduce VRAM usage / improve texture quality.
AMD really needs to come up with something spectacular, if they want more market share.
Example: you're living in Central / Western Europe, you earn 2500 Euros / month, and rent + food gets you close to 1500 Euros. You still have 1000 Euros for other expenses. You can easily save up for a 4070 in 2 months... It's not the end of the world, and nVidia KNOWS that! That's why nVidia has more market share: people can still afford their products.
It's Apple reloaded: "Apple is expensive!" but I see that 1 out of 4 phones is an iPhone, where I live (even older generations).
It is what it is... People want "the best of the best" of everything: phone, car, wife / husband. :laugh: But you should be aware that you can't always have the best of the best. Can't afford the RTX 4090? Go for the RTX 4080 instead. Never understood the need to have the highest level of performance - you rarely need it. I know a guy that only plays CS: GO and LoL on a RTX 4070... :wtf: Hell, even my RTX 4060 is overkill for WoW (got it mostly for Warcraft 3 Reforged, Diablo 4 and God of War Ragnarök).
Maybe it's time to read more and play less... Just saying...
You sort of prove yourself wrong,

You didn't need the best.
Everyone doesn't buy the best.
Nvidia 4060 isn't the best.

Life goes on still no surprise.

You prove people are fickle and buy favourite names.

@Assimilator Full path-traced RT being THE way is years off IMHO, and even then, indie raster games will still happen, so I disagree.
Posted on Reply
#172
Tek-Check
Dr. DroIt'll never happen. It's the opposite direction the market is headed.
Tough, then I will not be their customer.
A 5080 for $1,200 can pass my test only if it has 24 GB VRAM, a 50% uplift in 4K over the 4080, and DisplayPort 2.1 ports (including one USB-C).
Posted on Reply
#173
ARF
bugYes, but by increasing competition, I would expect more capable cards to be squeezed in that price range.
Plus, Nvidia's $400 4060Ti is really a $200-250 card, look at the PCB pictures. AMD is probably no better.
I know. This entire generation, both from AMD and nvidia, is rebranded at least a tier up the product stack.

RX 7900 XTX should be 7900 XT
RX 7900 XT should be 7800 XT
RX 7600 should be 7400 XT

RTX 4090 should be RTX 4080 Ti
RTX 4080 should be RTX 4070
RTX 4070 Ti should be RTX 4060 Ti
RTX 4070 should be RTX 4060
RTX 4060 Ti should be RTX 4050 Ti
RTX 4060 should be RTX 4050
Posted on Reply
#174
Colddecked
Dr. DroIt'll never happen. It's the opposite of the direction the market is headed. We achieved enough raster performance back with Pascal, and AMD caught up with RDNA; you'll find the GTX 1080 can still run practically any game exceptionally well if you leave RT, modern accurate lighting and occlusion algorithms, high-precision soft shadows, etc. - you know, the newer technologies - off. And I'll double down on my point:


This dude ran a 2023 AAA test battery on the vanilla 1070, which has slower 8 Gbps GDDR5 (reducing memory bandwidth from 320 to 256 GB/s) and has 25% of GP104's SMs disabled (15 of 20 units present). The newest games with more sophisticated rendering techniques only begin to drop to a "passable" rank once you get to Cyberpunk 2077. Warzone, Days Gone, God of War, Apex... they're all highly playable even on this gimped card from 2016.
I'll see your 1070 and raise you a Nintendo Switch. It's crazy what devs can run on that thing. It has the specs of a flagship phone from 2013!
Posted on Reply
#175
Tek-Check
AssimilatorAmen.
So many so-called technology enthusiasts simply don't understand that rasterisation is dead. The fact that games, even new ones, still use it is entirely down to the fact that the console GPUs are simply not capable of acceptable RT performance. Assuming AMD manages to mostly address that shortcoming in the next console generation (2027-2028 timeline), we will then finally see the end of rasterisation as the primary graphics rendering technology.
There are serious doubts about it, unless both companies substantially improve RT performance across all classes of GPUs, and fast, without breaking customers' piggy banks.
Is the 4060 Ti capable of "acceptable RT performance" for $500? No. Even the 4070 chokes with RT in more demanding titles and becomes a stuttering mess. So mainstream GPUs' RT performance is still in its infancy. Raster is dead - long live raster.
ARFI know. This entire generation, both from AMD and nvidia, is rebranded at least a tier up the product stack.

RX 7900 XTX should be 7900 XT
RX 7900 XT should be 7800 XT
RX 7600 should be 7400 XT

RTX 4090 should be RTX 4080 Ti
RTX 4080 should be RTX 4070
RTX 4070 Ti should be RTX 4060 Ti
RTX 4070 should be RTX 4060
RTX 4060 Ti should be RTX 4050 Ti
RTX 4060 should be RTX 4050
Names matter less. Both companies' marketing departments use them to confuse people and make comparisons harder. We need to take official names at face value for what they are, and simply keep a healthy distance from them by comparing performance and features.
Posted on Reply