
AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share

This works for me. If they can match the longevity that the RX 5700 XT and the RX 6000 series in general are enjoying while bringing the price down, I'm all for it. A 16 GB GDDR6 card with decent midrange performance (so, somewhere between the current 7700 XT and 7800 XT) for around $380 would be a sharp kick in the rear to actually start sliding the price scale back down. I just hope they pick up the high end with the Radeon PRO line so they're not losing out on professional clients. I remember they were doing an awesome job of filling in the gaps left by Quadro Ada.
 
This works for me. If they can match the longevity that the RX 5700 XT and the RX 6000 series in general are enjoying while bringing the price down, I'm all for it. A 16 GB GDDR6 card with decent midrange performance (so, somewhere between the current 7700 XT and 7800 XT) for around $380 would be a sharp kick in the rear to actually start sliding the price scale back down. I just hope they pick up the high end with the Radeon PRO line so they're not losing out on professional clients. I remember they were doing an awesome job of filling in the gaps left by Quadro Ada.

What longevity does a card have when it released 5 years ago with incomplete DirectX API support, absolutely no matrix multiplication/tensor/ray acceleration functionality, buggy drivers, deficient encoding hardware, reliability issues severe enough that it received literally 5 steppings while it was being manufactured, and limited VRAM size from day one? The 5700 XT is a good case study in how not to make and maintain a GPU.
 
"Do I want to go after 10% of the TAM [Total Addressable Market] or 80%? I'm an 80% kind of guy because I don't want AMD to be the company that only people who can afford Porsches and Ferraris can buy."

Didn't know buying a $1200 GPU was the equivalent of buying a $120k car (911 Carrera base model) o_O; I'll gladly trade my 4080 for a used 911 if anyone is out there! :D
 
Somewhat understandable
[attachment: Steam hardware survey chart]
According to the Steam hardware survey, you have to go down to 16th place before you find the first card that could be considered high-end.
This is not new; with only a few exceptions, most of the profit comes from mid- to low-tier cards.


What?
When was the last time Nvidia had competition in the top-tier graphics card market?

Well, this is the problem, isn't it...

Where are AMD's current mid-range cards on that list? It's still dominated by Nvidia.

I'd say it's still going to be difficult for AMD to break into that mid-range tier when Nvidia will still have an opposing product next gen.

I am wishing Radeons well. We need it!
 
What longevity does a card have when it released 5 years ago with incomplete DirectX API support, absolutely no matrix multiplication/tensor/ray acceleration functionality, buggy drivers, deficient encoding hardware, reliability issues severe enough that it received literally 5 steppings while it was being manufactured, and limited VRAM size from day one? The 5700 XT is a good case study in how not to make and maintain a GPU.
"Released 5 years ago"
And still viable and in support, ergo still around.

"Incomplete DirectX API support"
Ah yes, lack of support for feature level 12_2, which includes... things absent from a design drafted in 2018, when DX12 Ultimate only came out in 2020. Be real.

"No matrix multiplication/tensor/RT support"
When the competition barely had any of that in the card's heyday? When the people who have, or are interested in, a 5700 XT probably won't be considering workloads like that?

"Buggy drivers"
This I'll concede, but that applies more to the card's earlier years than it does now.

"Deficient encoding hardware"
I will also concede that VCN kind of sucks, but that hasn't changed relative to NVENC/QSV at any point in the last decade or so. It didn't kneecap its sales, nor the sales of the RX 6000 and RX 7000 series.

"Reliability issues requiring several revisions"
Much of what I could even dig up is solved by now or had workarounds at the time, and cards that are still around can be and have been fixed.

"Limited VRAM size."
Lemme grip you by the ear and rattle off some models you might be familiar with: 2070, 2070 SUPER, 2080, 2080 SUPER, 3060 Ti, 3070, 3070 Ti, 4060, 4060 Ti 8G, A580, A750, A770 8G. Released around the same time or newer, some far newer, all hampered by the same 'lack of VRAM' you rest the blame on AMD for, as if they were supposed to have some moment of divine providence and realize that The Last of Us Part 1 Remastered: Extra Shitty Port Edition (2023) would need more than 8 GB. Meanwhile, the cards that roughly match it in performance have the same memory sizes and are still being used TODAY.

I highlighted how long the 5700 XT has lasted as a card that you can slap into a PC and still use within its means, specifically in the segment that consumer Radeon targets: value-conscious gaming. Even its geriatric Polaris predecessor, the RX 480/580, is still seeing use. Much of what you cite to call it a 'bad example' are issues that were either relevant only in its youth or a result of the card, shocker, being old. Get a grip.
 
Some AMD fans are AMD's worst enemy. All of the mind-numbing hate just puts people off from wanting to be associated with the brand. AMD deserves better fans than that imo, because they really do accomplish a hell of a lot with limited resources.
It's ALWAYS AMD's mistake, or bad AMD fans, or whatever else has an AMD reference on it. NEVER EVER EVER Nvidia's or Intel's fault.

That's why AMD is limiting their investment in the PC gaming market. Why invest in a market where the consumers will go with any logical or illogical excuse and buy the competitor's product? Why invest in a hostile market?
 
I think AMD can already compete with Nvidia.
What they need to adjust is price: fix it at $500.
Otherwise the whole interview is wishful thinking.
[attachment: price comparison chart]
 
This is extremely bad news for us enthusiasts. Without competition, Ngreedia's prices will skyrocket. Prepare to pay at least $1499 for the 5080 and $1999 for the 5090 :mad:

Unfortunately I don't even think you can call that skyrocketing, it's just normal at this point.

It's ALWAYS AMD's mistake, or bad AMD fans, or whatever else has an AMD reference on it. NEVER EVER EVER Nvidia's or Intel's fault.

That's why AMD is limiting their investment in the PC gaming market. Why invest in a market where the consumers will go with any logical or illogical excuse and buy the competitor's product? Why invest in a hostile market?

Yep, I remember when people accused AMD of forcing FSR-only in Starfield (a sponsored title) with zero proof, and the game ended up getting DLSS anyway. I don't remember people making the same complaint about games that implemented DLSS and only added FSR later. People were outraged that AMD might want to promote its own features in a game it's paying to sponsor, meanwhile they praise Nvidia for doing the same thing.

People also complained that AMD was forcing its sponsored games to use a large amount of VRAM, again with zero proof.

Meanwhile, when Nvidia has a bug or an issue, like the terrible 12VHPWR adapter, the 3000 series transient spikes and noise feedback on the 12V sense pin, New World bricking cards, or the Discord bug that lowered clocks, people blamed everyone but Nvidia.

Well, this is the problem, isn't it...

Where are AMD's current mid-range cards on that list? It's still dominated by Nvidia.

I'd say it's still going to be difficult for AMD to break into that mid-range tier when Nvidia will still have an opposing product next gen.

I am wishing Radeons well. We need it!

Yep, AMD needs a Ryzen moment for their GPUs. They need to provide enough of a value advantage to make customers take notice, because most aren't even considering AMD.

That said, I'm not sure they could have a Ryzen moment, because Nvidia has been very aggressive with its pricing in the past to prevent AMD from gaining market share. Nvidia could lower mid- to low-end GPU prices temporarily just to crush AMD, and then things would return to normal the gen after. As we've seen with their AIBs and the AI market, they aren't afraid of coercion and other illegal tactics either.
 
Well, this is the problem, isn't it...

Where are AMD's current mid-range cards on that list? It's still dominated by Nvidia.

I'd say it's still going to be difficult for AMD to break into that mid-range tier when Nvidia will still have an opposing product next gen.

I am wishing Radeons well. We need it!
I was thinking about this a week ago.
My last GeForce was the NVIDIA GeForce 8800 GTS 320 in 2007.
I loved it, but I kept buying ATi, then AMD, because it supported 3 displays (Eyefinity), and sometimes it was better or cheaper.
Now I'm starting to hate it, since AMD is doing nothing to make the cool stuff accessible with their GPUs (mostly AI, which I sometimes use), and the drivers are incredibly s*#t.
They just don't deserve my money anymore...

I think AMD can already compete with Nvidia.
What they need to adjust is price: fix it at $500.
Otherwise the whole interview is wishful thinking.
[attachment: price comparison chart]
I don't have recent experience with Nvidia drivers... but I can tell AMD is not doing well in that department.
 
In other words: we can't compete on the high end next generation, so we will try to compete in the low and mid range.
Nothing wrong with that, but as rumors have it, RDNA 4 will top out at RX 7900 XT to RX 7900 XTX performance, with RT performance between the 4070 and the 4070 SUPER, at a price of $500-600. At best it will be an OK performance-per-dollar improvement over last gen with stronger RT performance. All Nvidia has to do is price their card within 10-15% of the AMD equivalent and people will pay the extra.
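To put rough numbers on that 10-15% point, here's a minimal sketch. The $500-600 price points are the rumored RDNA 4 figures above; the rest is just the premium arithmetic, not anything confirmed:

```python
# Sketch of the pricing argument above: if Nvidia keeps its equivalent
# card within a 10-15% premium of AMD's rumored $500-600 RDNA 4 pricing,
# the absolute dollar gap stays small enough that many buyers pay it.
# All figures are illustrative, based on the rumors cited above.

def nvidia_price_window(amd_price: float, premium_low: float = 0.10,
                        premium_high: float = 0.15) -> tuple[float, float]:
    """Return the (low, high) price Nvidia could charge while staying
    within the premium range the post argues people will tolerate."""
    return amd_price * (1 + premium_low), amd_price * (1 + premium_high)

for amd_price in (500, 550, 600):  # rumored RDNA 4 price points
    low, high = nvidia_price_window(amd_price)
    print(f"AMD at ${amd_price}: Nvidia equivalent at ${low:.0f}-{high:.0f} "
          f"(only ${low - amd_price:.0f}-{high - amd_price:.0f} extra)")
```

At $550, for example, the Nvidia equivalent lands at $605-632, a gap of $55-82, which is the kind of difference the post argues buyers will simply absorb.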
 
That's fine. My typical budget is $250-350 CAD anyway for a GPU.

In other words: we can't compete on the high end next generation, so we will try to compete in the low and mid range.
Nothing wrong with that, but as rumors have it, RDNA 4 will top out at RX 7900 XT to RX 7900 XTX performance, with RT performance between the 4070 and the 4070 SUPER, at a price of $500-600. At best it will be an OK performance-per-dollar improvement over last gen with stronger RT performance. All Nvidia has to do is price their card within 10-15% of the AMD equivalent and people will pay the extra.
Nvidia can price their cards 40% higher and people will still buy them. Just like iPhones...
 
The issue for me is that someone who sees a company release a 4090 is more likely to purchase a GPU from them.

Those products (the 4090) not only sell relatively well, but mainly serve as a flagship and a technology demonstrator, and that works really well for Nvidia.

Now, tbh, a 2000-2500 EUR RTX 5090, is that really relevant? Only a very restricted class of consumers can purchase one; it's way more than a month of minimum wage (the "smic", around 1300 EUR) in France, just to give a reference.

...but you see an RTX 5090, a marvel of engineering, and it's gigantic soft power, a gigantic ad aimed at many consumers.

I think they should land in between: no need to pursue the 4090/5090, but a 5080 Ti would be the perfect spot for an 8950 XTX, something with a smaller delta than 7900 XTX vs 4090. Not a direct 5090 contender, but a cheaper in-between card that shows you can still put up a fight.
 
I think AMD can already compete with Nvidia.
What they need to adjust is price: fix it at $500.
Otherwise the whole interview is wishful thinking.
[attachment: price comparison chart]

Hm, let's see.

4070 Ti Super's 16 GB of memory is not a hindrance to its performance, even at the highest resolutions, which go beyond this hardware class's capabilities in today's most VRAM-hungry titles

[chart: average FPS at 3840×2160]


4070 Ti Super will match the 7900 XT's raster performance

[chart: relative performance at 3840×2160]


4070 Ti Super should be more than 20% faster at RT than the 7900 XT

[chart: relative RT performance at 2560×1440]


4070 Ti Super is more power efficient than the 7900 XT, even in its strongest factory-overclocked models, while pushing the heaviest ray tracing workloads and generating less heat doing so. In this specific case the 7900 XT is likely power limited, but even at full tilt it's still a ~20 watt difference under full load

[chart: power consumption under ray tracing]



4070 Ti Super features AD103's dual-engine NVENC, which will record 4:4:4 and HDR at high resolutions and frame rates with near-zero latency


4070 Ti Super has access to NVIDIA Studio drivers, which are built with stability and productivity application compatibility in mind


4070 Ti Super supports the full breadth of functionality in NVIDIA's RTX ecosystem, including Canvas, RTX AI, DLSS frame generation and ray reconstruction technologies, as well as full support for Intel's XeSS and AMD FidelityFX

I could go on, but it's obvious that AMD cannot compete, and this gap will only widen as RDNA 4 targets the RTX 4080's performance while lacking most of its feature set, with Blackwell's release imminent. The return isn't worth the investment, Radeon buyers' mindshare doesn't line up, the wafers can be used in more profitable segments, and the software-side investment is almost as much of a Herculean effort to get up to par as the hardware, if not more... it's good that they came clean. AMD just cannot do it.
 
For 4K gaming, the 7900 XTX is perfectly acceptable. Average frame rates are ~20 FPS lower than the 4090's, but you spend ~$750 less. Even with that $750 in savings, you're averaging ~90 FPS across all games tested at 4K in W1z's suite in Windows 11 (which is a hog; at some point I would love to see some mini reviews in Linux). Devote the money to a good processor and RAM. RT is still a fantasy concept at 4K without scalers or ML. I honestly hate the concept of frame generation outside of consoles (the last 2 gens). I'm looking for pure raster performance without degraded imaging.
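To put rough numbers on that value argument, here's a back-of-the-envelope sketch. The ~90 FPS average and the ~20 FPS / ~$750 deltas are from the post above; the absolute prices are my illustrative assumptions, not quotes from any review:

```python
# Cost per average 4K frame, using the deltas cited above. The 4090
# price is an assumed street price; the 7900 XTX figure follows from
# the ~$750 savings mentioned in the post.
cards = {
    "RTX 4090":    {"price": 1600, "avg_fps_4k": 110},  # assumed price; ~20 FPS above the XTX
    "RX 7900 XTX": {"price": 850,  "avg_fps_4k": 90},   # ~$750 less, ~90 FPS across the suite
}

for name, c in cards.items():
    dollars_per_fps = c["price"] / c["avg_fps_4k"]
    print(f"{name}: ${dollars_per_fps:.2f} per average 4K frame")
```

Under those assumptions the 4090 costs roughly $14.50 per average 4K frame versus roughly $9.40 for the 7900 XTX, which is the whole raster-value case in two numbers.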

In competitive games like Counter-Strike and DOTA 2, where high frame rates matter, you will see more than 200 frames at 4K max settings. I have a high-refresh 4K monitor at 240 Hz, but it's just an investment for future cards. The monitor really isn't a use case quite yet.

I honestly felt like AMD already gave this market away with their absence of competition for the 4090. Going further would be a deal breaker for me on future cards with AMD... This generation, I would have loved to have seen a 7950 XTX (similar to the ASRock version I have: https://www.techpowerup.com/review/asrock-radeon-rx-6950-xt-oc-formula/) or at minimum a binned Navi 31, similar to the XTXH (found in the PowerColor Ultimate version I have: https://www.techpowerup.com/gpu-specs/powercolor-red-devil-rx-6900-xt-ultimate.b8752). With that being said, again, they have already given up the enthusiast market. I'm already disappointed, and going further would be a bad move by AMD in my opinion.

Right now I have multiple AMD cards: 7900 XTX, 7900 XT, 6950 XT, 6900 XTXH, 6700 XT, and 6750 XT. These cards perform well for all their use cases. However, for my main PC, I'm looking for very good raster performance at 4K. AMD giving this market away loses me as a customer.

AMD is in a great position to compete, but investors want gains in AI.
 
It depends on your target resolution and settings, too, I guess. To me, a 7900 GRE/XT level GPU with improved RT for £400-450 sounds great. Add reduced video playback power consumption into the mix, and I've got my next GPU.
I'd be willing to consider that.

I've just moved to all-4K (after our monitor discussion I pulled the trigger on an MSI MPG321URX), so 4K/high refresh is the aim. I'll be playing League of Legends and Solitaire at high refresh, I guess.

I think after seeing what Wukong, Star Wars Outlaws, and now Space Marine 2 take to run at 4K without getting anywhere near 120 FPS, I'll be moderating my expectations for a generation or three.
 
I remember when people accused AMD of forcing FSR-only in Starfield (a sponsored title) with zero proof
Half of the friggen tech press accused AMD of doing this, and yeah, there was no proof, just a mountain of supporting evidence for that conclusion. They didn't come to this conclusion because they just wanted to hate on poor old AMD.
 
As long as AMD doesn't classify 7900 XT/XTX-level performance as "enthusiast GPU", it's all good in the hood!

And exactly what are we referring to as the "enthusiast GPU segment"? You've got the apex flagship, higher tier, mid, and low. Not competing with the 5090 (or not being able to) makes sense, but when did 5080/5070-level performance fall short of "enthusiast GPU"? Seems like we have to blow insane amounts of money on the fastest brick in town to get a green card for the enthusiast graphics club?
 
Half of the friggen tech press accused AMD of doing this, and yeah, there was no proof, just a mountain of supporting evidence for that conclusion. They didn't come to this conclusion because they just wanted to hate on poor old AMD.

"mountain of supporting evidence"? Where, if it's that easy to prove then provide it. I followed the issue when it was going down and the only "evidence" provided was people's observations.

People jumped to conclusions when AMD's marketing didn't explicitly say they weren't blocking DLSS, but AMD's marketing department has got to be the worst marketing department of any company in the world. You can't rule out that it's just incompetence. Frank Azor came in shortly after that and clarified that they do not mandate FSR-only in AMD-sponsored games.

Could AMD have restricted DLSS prior to the public outcry? Sure, but why haven't these people also complained on any of the occasions a game released with DLSS and not FSR? Why were these people silent about any of the numerous worse things Nvidia has done, like GameWorks, GPP, etc.? GPP never went away; all the top SKUs from Gigabyte, MSI, and ASUS are Nvidia-only. Public and press response? Nothing. I explicitly remember people giving Nvidia a pass for anti-competitive GameWorks features when there was explicit proof showing games like Crysis 2 used way too high levels of tessellation (far beyond any visual benefit, even tessellating objects outside of view; there's a video on YouTube demonstrating this, still up) or when games like The Witcher 3 were running HairWorks on models that weren't even visible.

The cope was either "Nvidia has a right to push its proprietary features" or outright denial. I pointed out to TPU multiple times that the top SKUs from MSI, Gigabyte, and ASUS weren't available on AMD cards anymore, and they said 'wait a few weeks'. Well, I let them know a few months after that that they still weren't available, but I guess TPU didn't care for the story. If we are going to hang AMD for Starfield not launching with both DLSS and FSR based on some frankly pretty weak evidence, it's a complete double standard not to do the same to Nvidia, particularly when in many cases the evidence is stronger.
 
"mountain of supporting evidence"? Where, if it's that easy to prove then provide it. I followed the issue when it was going down and the only "evidence" provided was people's observations.
If you didn't believe it then, why would you believe it now? I can provide it, but I can tell right now it won't change your mind, as it's already made up. Personally, I found AMD's inability to provide a simple yes or no answer to a very straightforward question asked of them by multiple reputable tech press outlets very telling. If they were absolutely not doing it, it would have been extremely easy and great PR to just say so. At best, their PR and marketing departments are woefully incompetent and mishandled one of the easiest (possible) perception wins of the year (in the timeline where they're innocent), and at worst they absolutely intentionally blocked competitor upscaling technology, riiight up until they either wound back the requirement or the clause expired, and then they still couldn't give a clear, unambiguous retrospective answer. It's not a criminal law case, so neither side of the argument will ever get a beyond-all-reasonable-doubt type finding; we're all free to make a judgement based on a balance of probabilities and the available evidence, and I, like several reputable tech press outlets, believe it's more likely than not that AMD did the bad thing intentionally, so they shut up for months while doing damage control. You're free to disagree and make your own judgement, naturally, but just as I doubt I can convince you they did it, you are extremely unlikely to convince me they didn't, unless you have something new to share that we didn't all see at the time. My point is that this many reputable, reasonable people didn't come up with it out of thin air because they have an AMD hate boner; they came to the conclusion because of its relative plausibility.

And no matter what the full truth is, the fallout was actually great for everyone. Since the Starfield fiasco there's been a perceptible shift: more titles are getting all 3 major competing upscalers and nobody is intentionally blocking anything. Everybody wins.
 
Prepare for $2000+ video cards from nGreedia. /facepalm.
Well, if people are dumb enough to change their phone every year for a similar one, then they are probably dumb enough to pay nGreedia's callous prices too, so...
Quite funny, bro. I like this name you gave Nvidia: nGreedia. That's great, and you're absolutely right. There are dumb gamers out there who will pay any price, and when asked why, they say they want the best performance on the GPU market. Of course they do this, no matter whether the prices run $1800, $2000, $2200, $2400, $2600, $2800, or $3000.
 
If you didn't believe it then, why would you believe it now? I can provide it, but I can tell right now it won't change your mind, as it's already made up. Personally, I found AMD's inability to provide a simple yes or no answer to a very straightforward question asked of them by multiple reputable tech press outlets very telling. If they were absolutely not doing it, it would have been extremely easy and great PR to just say so. At best, their PR and marketing departments are woefully incompetent and mishandled one of the easiest (possible) perception wins of the year (in the timeline where they're innocent), and at worst they absolutely intentionally blocked competitor upscaling technology,

Have you not made the argument that AMD's marketing department is incompetent in the past? I don't think that would be a controversial statement. I think there's sufficient evidence to apply Hanlon's razor in this instance. It seems contradictory for you to say in past comments that AMD's marketing department is bad (and I would agree) and suddenly not consider that in this scenario.

Frank Azor from AMD did clarify a few days later but obviously by then the damage was already done.

None of your comment touches on the fact that the same circumstances that started the uproar over a lack of DLSS at launch were present in reverse in many titles that supported only DLSS at launch. It's a clear double standard. Notwithstanding any of the other things Nvidia has done (as pointed out in my last comment), where an equal or greater level of evidence was provided than what you are using to spear AMD here, yet I don't see people complaining about those.

riiight up until they either wound back the requirement or the clause expired, and then they still couldn't give a clear, unambiguous retrospective answer.

You could say that of every company nowadays, Nvidia included.

It's not a criminal law case, so neither side of the argument will ever get a beyond-all-reasonable-doubt type finding; we're all free to make a judgement based on a balance of probabilities and the available evidence, and I, like several reputable tech press outlets, believe it's more likely than not that AMD did the bad thing intentionally, so they shut up for months while doing damage control. You're free to disagree and make your own judgement, naturally, but just as I doubt I can convince you they did it, you are extremely unlikely to convince me they didn't, unless you have something new to share that we didn't all see at the time. My point is that this many reputable, reasonable people didn't come up with it out of thin air because they have an AMD hate boner; they came to the conclusion because of its relative plausibility.

Tech outlets didn't come to that conclusion, nor should they have, given the lack of concrete evidence. It would be insanely bad journalism to publish an article with a definitive conclusion based on mere observation. It's one thing to report on a story; it's another to pass something off as fact.

Yes, clearly I'm unlikely to change your mind, because your own idea of what happened is mismatched with what actually happened. In your mind, every outlet confirmed AMD blocked DLSS (they didn't), and so did every reasonable person. By extension, you are dismissing people who disagree as unreasonable. Your logic doesn't leave any room for disagreement or room to change your mind.

And no matter what the full truth is, the fallout was actually great for everyone. Since the Starfield fiasco there's been a perceptible shift: more titles are getting all 3 major competing upscalers and nobody is intentionally blocking anything. Everybody wins.

You are completely assuming said titles wouldn't have had them regardless. This is classic confirmation bias; you are seeing this in a way that confirms your self-admitted non-flexible version of events.
 
Have you not made the argument that AMD's marketing department is incompetent in the past? I don't think that would be a controversial statement. I think there's sufficient evidence to apply Hanlon's razor in this instance. It seems contradictory for you to say in past comments that AMD's marketing department is bad (and I would agree) and suddenly not consider that in this scenario.
I have, and they absolutely are, but both can also be true. And I did consider it; it's one of my two options, just the option I consider less likely, or perhaps also true. Razors also aren't some mic-drop, argument-winning statement. They often have merit, but I'd even wager that should the events I believe in have occurred, AMD would be hoping people just think of this razor.
yet I don't see people complaining about those.
People can and do complain about what Nvidia does all the damn time, as do I when I feel what they do is shitty, and it was absolutely a rebuttal used in this particular fiasco too. Saying "but what about Nvidia" isn't an effective rebuttal to this assertion.
Tech Outlets both didn't come to that conclusion nor should they given the lack of concrete evidence.
Tech outlets didn't say "100% undeniably AMD did this", which is also not what I am saying. I am saying I believe AMD did do this, as it's likely, which is what they said too.
Yes, clearly I'm unlikely to change your mind, because your own idea of what happened is mismatched with what actually happened. In your mind, every outlet confirmed AMD blocked DLSS (they didn't), and so did every reasonable person. By extension, you are dismissing people who disagree as unreasonable. Your logic doesn't leave any room for disagreement or room to change your mind.
Right back at you: you don't know conclusively what happened either, yet you are stating it as if it's fact. I don't consider it to be fact; I just believe it is what occurred on a balance of everything I saw, heard, and read. I have absolutely left you room to change my mind, but you've offered nothing I didn't already know. I welcome you to present either a new viewpoint or evidence that might change my mind.
You are completely assuming said titles wouldn't have had them regardless. This is classic confirmation bias; you are seeing this in a way that confirms your self-admitted non-flexible version of events.
That's also possible, but you can't assume with 100% certainty that it would have happened regardless of the fiasco either.

Why do you insist I am non-flexible on the version of events? I told you why I believe what I believe and readily concede it isn't definitive proof; it just seems to me to be the most likely thing that happened when I consider all the evidence I have consumed. You came to a different conclusion; that doesn't make your version fact, however, and if anything, you are coming across as completely non-flexible on the topic. I would readily entertain irrefutable proof that this didn't happen. Hell, I'd have even taken AMD's word for it at the time, but they didn't even want to offer that.

So now that we've opened this can of worms, do we keep repeating ourselves, or is there something new to add that might sway you or me? If there is nothing new, well, I think we've both said our piece on why we believe what we believe to have occurred, and in the absence of something revelatory this conversation is going nowhere.
 
Hopefully they actually price these aggressively.

All I heard was "RDNA 3 is cheaper to produce", and yet the 7600/7700 series was nothing special, both performance- and price-wise. Even the 7800 XT (power consumption aside) was the same performance as the 6800 XT.
 