
AMD Retreating from Enthusiast Graphics Segment with RDNA4?

No, it isn't. It's about the same as it was when DX11 first got started. Currently 2.2% of games support mGPU in total. The total number of DX12 games is only around 10% of the total for DX11 games that are out. Also, half of those DX12 games support some type of ray tracing, whereas all DX11 games supported tessellation.
I'm talking about SLI/Crossfire. Both vendors have already killed support for those in the last few GPU generations.

I had an R9 290 Crossfire setup about four years ago and I was surprised how many newer games already lacked support for it.
 
Even if true, that doesn't appear to be happening until RDNA5, so three years away at least, and I wouldn't bet on there ever being an RDNA5. AMD will struggle next year against Intel's Battlemage IMO and drop to third in a few years. They won't even bother trying to compete other than with APUs.
More than a year after release, Arc is very hit or miss. Most of the time, it struggles to beat the RX 7600 despite a die that's twice as large while consuming 50% more power. I don't believe Battlemage will overcome such a large deficit in performance per square mm and per watt.
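
Taking the poster's figures above at face value (roughly equal performance, about twice the die area, about 50% more board power; these are the post's rough numbers, not measured data), the implied efficiency gap works out as a simple ratio. A minimal sketch:

```python
# Back-of-the-envelope ratios implied by the post above.
# All inputs are the poster's rough figures, not measured data.
perf_ratio = 1.0    # Arc performance relative to the RX 7600 (assumed parity)
area_ratio = 2.0    # Arc die area relative to the RX 7600
power_ratio = 1.5   # Arc board power relative to the RX 7600

perf_per_mm2 = perf_ratio / area_ratio    # ~0.50x: half the performance per mm^2
perf_per_watt = perf_ratio / power_ratio  # ~0.67x: two-thirds the performance per watt

print(f"Relative perf/mm^2: {perf_per_mm2:.2f}x")
print(f"Relative perf/W:    {perf_per_watt:.2f}x")
```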
 
They prefer to sell other products or scale back most of their stuff because they realize not everyone is defending their shitty decisions and complacency with their wallets or opinions anymore.
 
I highly doubt this. Between AMD having learned how to do MCM GPUs, on top of now having Xilinx FPGAs supposedly being integrated with AMD's next-gen datacenter GPUs to provide a boost to AI learning, I see it more as AMD staying the course with lessons learned from RDNA3 and using FPGAs to maybe improve ray-tracing or provide AI/modeling calculations while improving upon their MCM GPU process.

More so when their GPUs are still pretty popular alternatives to NVIDIA and they're capable of reaching anywhere from 90 to 100% of the non-ray-traced performance of NVIDIA's XX80 equivalent at a lower cost, depending on the game and the secondary features activated (DLSS/RIS, for example).
 
More than a year after release, Arc is very hit or miss. Most of the time, it struggles to beat the RX 7600 despite a die that's twice as large while consuming 50% more power. I don't believe Battlemage will overcome such a large deficit in performance per square mm and per watt.
But putting events on the scale of the GPU history timeline, Intel has put up an impressive showing so far. Of course they will still be playing catch-up, but the moment it sparks serious competition with AMD, the dreaded duopoly will end. I don't think AMD will give Intel free market share, and the competition between them should see NV stop its anti-everyone attitude.

I mean, personally I don't mind Intel spending money on GPU R&D. It only serves to potentially benefit me down the line. I do hold 10 shares but don't really care about that potential loss.

I highly doubt this. Between AMD having learned how to do MCM GPUs, on top of now having Xilinx FPGAs supposedly being integrated with AMD's next-gen datacenter GPUs to provide a boost to AI learning, I see it more as AMD staying the course with lessons learned from RDNA3 and using FPGAs to maybe improve ray-tracing or provide AI/modeling calculations while improving upon their MCM GPU process.

More so when their GPUs are still pretty popular alternatives to NVIDIA and they're capable of reaching anywhere from 90 to 100% of the non-ray-traced performance of NVIDIA's XX80 equivalent at a lower cost, depending on the game and the secondary features activated (DLSS/RIS, for example).
I just wonder what their xx70 XT offering involves this time.
It would probably be a 256-bit memory controller? I think it would make no sense to kneecap yourself at 6/7700XT level of die size. And 5700XT was a 256-bit card.
I think it's a sensible choice, but time will tell. RTG is not exactly on a "not dissatisfying everyone" streak with their recent performance.
Imma "Let RTG cook".
 
I highly doubt this. Between AMD having learned how to do MCM GPUs, on top of now having Xilinx FPGAs supposedly being integrated with AMD's next-gen datacenter GPUs to provide a boost to AI learning, I see it more as AMD staying the course with lessons learned from RDNA3 and using FPGAs to maybe improve ray-tracing or provide AI/modeling calculations while improving upon their MCM GPU process.

More so when their GPUs are still pretty popular alternatives to NVIDIA and they're capable of reaching anywhere from 90 to 100% of the non-ray-traced performance of NVIDIA's XX80 equivalent at a lower cost, depending on the game and the secondary features activated (DLSS/RIS, for example).
Yepp, same here. It makes no sense, other than to increase the attractiveness of current high-end AMD offerings (the 7900 XTX will be more "future proof"). It just looks like somebody misinterpreted the information that AMD will stop making large dies (because of MCM), or it's some guerrilla marketing tactic. I'm waiting for a confirmation or dismissal from a leaker who has multiple sources inside AMD.

But putting events on the scale of the GPU history timeline, Intel has put up an impressive showing so far. Of course they will still be playing catch-up, but the moment it sparks serious competition with AMD, the dreaded duopoly will end. I don't think AMD will give Intel free market share, and the competition between them should see NV stop its anti-everyone attitude.

I mean, personally I don't mind Intel spending money on GPU R&D. It only serves to potentially benefit me down the line. I do hold 10 shares but don't really care about that potential loss.


I just wonder what their xx70 XT offering involves this time.
It would probably be a 256-bit memory controller? I think it would make no sense to kneecap yourself at 6/7700XT level of die size. And 5700XT was a 256-bit card.
I think it's a sensible choice, but time will tell. RTG is not exactly on a "not dissatisfying everyone" streak with their recent performance.
Imma "Let RTG cook".
Intel has less money to spend on GPUs than AMD does, and it has a longer way to go. I would love for Intel to come out with great performance and value GPUs in the future, but I think the best we can expect from them is just 1 or 2 mid-range GPUs per generation while they polish up their drivers.
 
…that means that the Nvidia 5000 series will just be an Ada refresh with more/a different type of VRAM and bandwidth.

The 5080 may be a full AD103 with 10240 CUDA cores, a 352-bit bus and 22 GB of GDDR7.
 
That would be long-term suicide. Putting things on the back burner for a year while the economy collapses and the AI craze dies down is more like it.
 
And as usual, the original "source" is Twitter "leaks".
 
This rumor makes little sense. Two major reasons AMD switched to chiplets were the cost savings on combined total die size and scalability. Not utilizing either of those advantages seems extremely unlikely. AMD can produce high-end GPUs at a fraction of the cost Nvidia can because its individual dies are much, much smaller and thus the amount of wasted wafer is drastically reduced. On the flip side, Nvidia's costs increase exponentially as they increase die size.
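
To make that wafer-economics point concrete, here is a minimal sketch using a simple Poisson yield model (yield ≈ exp(−defect density × die area)). The defect density and die areas below are illustrative assumptions, not actual AMD/Nvidia figures, and the model ignores edge losses, binning, and packaging costs:

```python
import math

# Toy comparison: one large monolithic die vs. two chiplets with the same
# total silicon area, under a simple Poisson yield model.
# All numbers are illustrative assumptions, not real foundry or vendor data.

WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2  # 300 mm wafer, edge losses ignored
DEFECT_DENSITY = 0.001                     # assumed defects per mm^2 (0.1 per cm^2)

def good_dies_per_wafer(die_area_mm2: float) -> float:
    """Approximate yielded dies per wafer for a given die area."""
    candidates = WAFER_AREA_MM2 / die_area_mm2             # ignores scribe/edge losses
    yield_rate = math.exp(-DEFECT_DENSITY * die_area_mm2)  # Poisson yield model
    return candidates * yield_rate

monolithic_gpus = good_dies_per_wafer(500)   # one 500 mm^2 die per GPU
chiplet_gpus = good_dies_per_wafer(250) / 2  # two 250 mm^2 dies per GPU

print(f"GPUs per wafer, 500 mm^2 monolithic:   {monolithic_gpus:.0f}")  # ~86
print(f"GPUs per wafer, 2 x 250 mm^2 chiplets: {chiplet_gpus:.0f}")     # ~110
```

Even with the same total silicon, the smaller dies yield noticeably more usable GPUs per wafer in this toy model, and the real RDNA3 split is more favorable still, since the memory/cache dies sit on a cheaper node than the compute die.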

It doesn't make sense from a business standpoint for AMD to retreat from the high end when that's where chiplets specifically lend their benefits. Even if you assume that AMD allocates most of its wafer supply to AI or other segments, it still makes sense for AMD to have a card for the high end.

This is just another rumor at the end of the day and 99% of them turn out to be false.
 
Gamers want prices to stay the same even if costs are going up.
Completely untrue. All we want is for prices to be sane for what's on offer. That's not "toxic" in any way, shape or form, unless you're one of the greedy parasites in charge at AMD or NVIDIA.
 
This rumor makes little sense. Two major reasons AMD switched to chiplets were the cost savings on combined total die size and scalability. Not utilizing either of those advantages seems extremely unlikely. AMD can produce high-end GPUs at a fraction of the cost Nvidia can because its individual dies are much, much smaller and thus the amount of wasted wafer is drastically reduced. On the flip side, Nvidia's costs increase exponentially as they increase die size.

It doesn't make sense from a business standpoint for AMD to retreat from the high end when that's where chiplets specifically lend their benefits. Even if you assume that AMD allocates most of its wafer supply to AI or other segments, it still makes sense for AMD to have a card for the high end.

This is just another rumor at the end of the day and 99% of them turn out to be false.

And still nvidia posts huge profits, while AMD posts losses. Why is that? AMD's strategies don't work?
I mean it's cool all those chiplets and things, but do they actually make a difference?


[attached: Nvidia and AMD revenue charts]

 
Subscription-based GPUs? Seems like an industry trend, to be honest.
Take BMW, for example: subscription-based heated seats!
They might launch a mid-range GPU, and for a "measly" $20/month you can turn it into a graphics monster!
Don't know if that's possible, but it wouldn't surprise me if it happened. Though I don't think many (if any) people would pay for that...
Dark times are coming...
 
And still nvidia posts huge profits, while AMD posts losses. Why is that? AMD's strategies don't work?
I mean it's cool all those chiplets and things, but do they actually make a difference?


[attached: Nvidia and AMD revenue charts]
Both companies show a nearly identical drop in gaming segment revenue. That's to be expected, seeing how last year both companies were still riding the mining wave and booking it as "gaming" segment revenue.
 
Finally. All those who worshiped Nvidia no matter what, all those years, will be asked to pay twice the price for their next GPU. They will be blaming AMD, obviously, while begging Intel to come and rescue them.
Going to buy some champagne to celebrate.


Obviously I'm headed for the madhouse with this comment, but understand this: 15 years of reading the same crap. "Don't buy AMD. This AMD model is faster than the Nvidia one and cheaper too, but AMD this and AMD that and AMD the other... and excuses."
 
Reminds me of Polaris.


38x0 wasn't that far from 8800 GT(S) and they had 3870X2 as the flagship. Though multi-GPU is dead and buried so no X2 this time.
Thanks for reminding me, I forgot the X2 existed.
 
I don't think "it worked". How many people do you know (or think) have that mid-range RX 5700 XT?
Not many now, but it was a decent competitor to the 2070.
 
What do "enthusiast" and "performance" segments even mean ?
 
What do "enthusiast" and "performance" segments even mean ?
For me:
nVidia lineup:
xx50 - entry level;
xx60 - low-end;
xx70 - mid-range;
xx80 - high-end (performance);
xx90 - enthusiast.
AMD lineup:
x400, x500 - entry level;
x6x0 - low-end;
x7x0 - mid-range;
x8x0 - high-end (performance);
x9x0 - enthusiast.
 
They prefer to sell other products or scale back most of their stuff because they realize not everyone is defending their shitty decisions and complacency with their wallets or opinions anymore.
How the eff do you explain Nvidia's record profits then? OK, most of it wasn't from gaming, but even if JHH sells his t*** at a discount, the Nvidia zealots will buy it at a premium :nutkick:

It's only lose-lose for AMD at this point :shadedshu:

And still nvidia posts huge profits, while AMD posts losses. Why is that? AMD's strategies don't work?
I mean it's cool all those chiplets and things, but do they actually make a difference?
I wouldn't say they don't work; otherwise they wouldn't be making the inroads they have with Zen into servers, and now with Xilinx as well. Nvidia has an inherent advantage with CUDA and spends gazillions on that; even Intel vastly outspends AMD on the software support front, but that is changing, albeit slowly.
 
Keep in mind this is a rumor.

That said, this is routine by now, coming from AMD. When they lag behind, they claim something like this. If memory serves me well, they did it with the Radeon HD 3000 or 4000 series, they did it with the Radeon 200 series, and they did it with the initial RDNA. It never stuck.
 
Reminds me of Polaris.


38x0 wasn't that far from 8800 GT(S) and they had 3870X2 as the flagship. Though multi-GPU is dead and buried so no X2 this time.
The 38x0 was considered mostly a fix for the power-hungry 2900 cards, nothing more. While the 8800 GTS cards weren't so great and the 38x0 had some chance against them, Nvidia had released the 8800 GT and that card was clearly ahead. Only with the 4800 series did AMD put real pressure on Nvidia.
 
So AMD waits a year-ish to release their midrange product line, only to, according to the rumors, create an entire generation of just midrange products to come out next year and replace it?

The market is saturated right now. You've got two generations competing against each other.

I mean, it's great value for everyone, but there's no need to push out a new bunch of cards every six months.
 