
AMD Retreating from Enthusiast Graphics Segment with RDNA4?

The microstuttering was solved when AMD switched to frame pacing in its CrossFire drivers, so if you still had stuttering, you either had frame pacing off somehow, or the game wasn't properly supported.
Could be. I just remember how the difference in playability was like night and day when I went from R9 290 CF to a 980 Ti in BF1, even though the FPS was lower.
 
Yeah, that's typical multi-GPU stuttering for you (game not supported / no frame pacing produces this issue): fps is high, but it's not producing the quality you'd expect, i.e. the fluidity that usually goes hand in hand with high fps.
 
Anyway, can't confirm your point since that was years ago and I don't have any newer dual cards than two HD 4890s currently. :/
 
I didn't play BF1, but with the games I tried back then (7800 XT's in CF) it ran flawlessly; they were fully supported, though. Before they introduced frame pacing into the driver, it was a mess and not worth using (for me), as it was stuttery unless your fps was sky high.
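For anyone wondering what frame pacing actually changes, here's a rough toy sketch in Python (made-up numbers, nothing to do with AMD's actual driver code): AFR can report a high average fps while the frame-to-frame gaps swing wildly, and pacing the presents out at the average cadence is what restores the fluidity you'd expect from that fps number.

Code:
# Toy model of AFR micro-stutter and frame pacing. Purely illustrative --
# not AMD's driver code. Render times are made-up numbers.
from statistics import pstdev

# Two GPUs alternate frames (AFR), so raw present times bunch up:
# made-up pattern of alternating 6 ms / 22 ms gaps, averaging ~14 ms (~71 fps).
raw_present = []
t = 0.0
for i in range(60):
    t += 6.0 if i % 2 == 0 else 22.0
    raw_present.append(t)

def intervals(times):
    return [b - a for a, b in zip(times, times[1:])]

def paced(times):
    # Frame pacing: delay each present so frames go out at (roughly) the
    # average cadence instead of whenever a GPU happens to finish.
    avg = (times[-1] - times[0]) / (len(times) - 1)
    out = [times[0]]
    for tr in times[1:]:
        out.append(max(tr, out[-1] + avg))  # never present before it's rendered
    return out

for name, series in (("unpaced", raw_present), ("paced", paced(raw_present))):
    iv = intervals(series)
    print(f"{name:8s} avg={sum(iv)/len(iv):5.1f} ms  jitter(std)={pstdev(iv):5.1f} ms")

The average frame time barely moves, but the jitter collapses; pacing trades a tiny bit of latency for even frame delivery, which is what your eyes actually notice.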
 
Perhaps if AMD, and Nvidia as well, lowered the prices of their high-end cards out of the stratosphere, down to a level where many more people could actually afford them, a lot more people would be buying them.
 
The high end has never been about sales numbers. Most people don't need a high end graphics card anyway.
 
As an RX 6600 XT owner and budget-build user, I honestly think we need more budget-level and mid-range GPUs. Intel can handle the low end (which it kinda already does), AMD can have the mid-range (like it did with the RX 480), and NVIDIA can remain in the enthusiast market. This is only theoretical, of course.
 
Smart move, as the mid-range is where the market is anyway, and only the rich or the dumb would buy any 4000-series GPU from Nvidia. AMD gives a lot more VRAM, which is already proven to be a must these days with modern games, at a lower overall cost, especially in AUS. If I wanted an NVIDIA card now with over 10 GB of VRAM, it would literally cost me twice as much as an AMD card, even if the AMD card didn't get as much FPS but was still good enough for 1440p gaming or under. I think this is also why the 1080 Ti was such a good card, even today: good performance and a good amount of VRAM. RIP my 1080 Ti.
 
Let's focus on what matters. AMD is also rumoured to be developing a halo RDNA4 SKU. It could look like this.
[Attached image: SmartSelect_20231201_134704_Firefox.jpg]
 
I was thinking of something similar (with just two mid-range GCDs running together). I can't fathom how they'd pull it off, but it would indeed be an impressive feat.

It would replicate the successful multi-chip strategy used in Ryzen, EPYC, TR, etc.
 
Keep it on topic.
Stop the personal arguing.
 
OP.

After much consideration, this story, true or not, is offensive.

Untrue.

Weirdly biased.

I and many, many others were Enthusiasts way, way before most of you, way before I ever had money, and way before the PRESS.

You, OP, suggest you have to pay X amount to be an Enthusiast.

HIGH END is the appropriate phrase.

Stop creating fake narratives that make people feel they should be selling kidneys for a GPU.

That's why we're here.
 
My man, that's just what the industry has called the ultra high-end bracket for some time now. Don't shoot the messenger.
 
I was thinking of something similar (with just two mid-range GCDs running together). I can't fathom how they'd pull it off, but it would indeed be an impressive feat.
It would replicate the successful multi-chip strategy used in Ryzen, EPYC, TR, etc.
Two GCDs could be the N42 package; entirely possible. It looks like they're gunning for flexible modularity, i.e. two dies clocking high, then combining those dies in different packages. It will be fascinating to see. They already have a similar concept implemented in MI300.
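Just to illustrate the kind of modularity being described (purely hypothetical numbers; only the "N42" name comes from the post above, everything else is a placeholder), the appeal is the same as with Ryzen/EPYC: design one small, high-clocking GCD and ship it in different package configurations instead of taping out a separate big monolithic die.

Code:
# Hypothetical sketch of the "flexible modularity" idea, Ryzen/EPYC-style.
# All names and numbers are placeholders, not leaks or real specs.
from dataclasses import dataclass

@dataclass(frozen=True)
class GCD:
    name: str
    compute_units: int
    clock_ghz: float

@dataclass(frozen=True)
class Package:
    name: str
    gcds: tuple   # same die design, just more copies of it
    mcds: int     # memory/cache chiplets, as on Navi 31 / MI300

    def total_cus(self):
        return sum(d.compute_units for d in self.gcds)

# One small, high-clocking die...
die = GCD("hypothetical-GCD", compute_units=64, clock_ghz=3.0)

# ...reused across different packages.
midrange = Package("single-GCD part", (die,), mcds=4)
halo     = Package("dual-GCD part (the N42-style idea above)", (die, die), mcds=8)

for p in (midrange, halo):
    print(f"{p.name}: {len(p.gcds)}x GCD, {p.total_cus()} CUs, {p.mcds} MCDs")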
 