Monday, September 9th 2024

AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share

In an interview with Tom's Hardware, AMD confirmed that its next generation of gaming GPUs, based on the RDNA 4 graphics architecture, will not target the enthusiast graphics segment. Speaking with Paul Alcorn, Jack Huynh, head of AMD's Computing and Graphics Business Group, said that with its next generation, AMD will focus on gaining market share in the PC gaming graphics market. That means winning price-performance battles against NVIDIA in key mainstream and performance segments, similar to what it did with the Radeon RX 5000 series based on the original RDNA graphics architecture, rather than entering the enthusiast segment, which is low-margin given the die sizes at play and moves low volumes. AMD currently holds only 12% of the gaming discrete GPU market, something it sorely needs to turn around, given that its graphics IP is contemporary.

Asked pointedly whether AMD will continue to address the enthusiast GPU market, given that cutting-edge wafer allocation is better spent on data-center GPUs, Huynh replied: "I am looking at scale, and AMD is in a different place right now. We have this debate quite a bit at AMD, right? So the question I ask is, the PlayStation 5, do you think that's hurting us? It's $499. So, I ask, is it fun to go King of the Hill? Again, I'm looking for scale. Because when we get scale, then I bring developers with us. So, my number one priority right now is to build scale, to get us to 40 to 50 percent of the market faster. Do I want to go after 10% of the TAM [Total Addressable Market] or 80%? I'm an 80% kind of guy because I don't want AMD to be the company that only people who can afford Porsches and Ferraris can buy. We want to build gaming systems for millions of users. Yes, we will have great, great, great products. But we tried that strategy [King of the Hill]—it hasn't really grown. ATI has tried this King of the Hill strategy, and the market share has kind of been...the market share. I want to build the best products at the right system price point. So, think about price point-wise; we'll have leadership."
Alcorn pressed: "Price point-wise, you have leadership, but you won't go after the flagship market?" To which Huynh replied: "One day, we may. But my priority right now is to build scale for AMD. Because without scale right now, I can't get the developers. If I tell developers, 'I'm just going for 10 percent of the market share,' they just say, 'Jack, I wish you well, but we have to go with Nvidia.' So, I have to show them a plan that says, 'Hey, we can get to 40% market share with this strategy.' Then they say, 'I'm with you now, Jack. Now I'll optimize on AMD.' Once we get that, then we can go after the top."

The exchange seems to confirm that AMD's decision to withdraw from the enthusiast segment is driven mainly by the low volumes it sees relative to the engineering effort and large wafer costs of building enthusiast-segment GPUs. The company saw great success with its Radeon RX 6800 series and RX 6900 series mainly because the RDNA 2 generation benefited from the GPU-accelerated cryptomining craze, when high-end GPUs were in demand. That demand had disappeared by the time AMD rolled out its next-generation Radeon RX 7900 series powered by RDNA 3, and the lack of performance leadership against the GeForce RTX 4090 and RTX 4080 with ray tracing enabled hurt the company's prospects. News of AMD focusing on the performance segment (and below) aligns with rumors that with RDNA 4, AMD is making a concerted effort to improve its ray tracing performance and reduce the performance impact of enabling ray tracing. This, together with raster performance and efficiency, could be the company's play for gaining market share.

The grand assumption AMD is making here is that it has a product problem, not a distribution problem, and that with a product that strikes the right performance-per-watt and performance-per-dollar balance, it will gain market share.

Catch the full interview in the source link below.
Source: Tom's Hardware

271 Comments on AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share

#151
Minus Infinity
All the wannabe forum jockeys make noise like they own a 4090 and AMD's a joke for not competing, yet 99.9% own a 1060/1650/3060/3070. The hype is made with the flagship; the money is made in the mid to lower tiers. AMD has not abandoned the enthusiast market, it's just cancelled the RDNA4 version to focus on RDNA5. It was not cancelled due to AI or data centre focus. It was cancelled because it was a far more complex design than RDNA3, with 20 chiplets rather than 7, and was failing to meet performance targets. They said it would have taken too much money and resources to get it working in time and would have pushed back RDNA5. The team working on RDNA5 is proceeding very well and is on track for a late 2025, early 2026 launch for the high end. In the meantime AMD is wisely targeting an area that is crying out for a real performance lift in the mid-tier with N48. It's likely the 8800 XT will beat the 7900 XT in every metric and use less power for at least $150 less. If they launch well under $600 they'll do well. Hopefully the 8700 XT follows suit and easily beats out the 7800 XT. RT will really get a decent bump with RDNA4. Targets I've heard for the 8800 XT are 4080 raster, 4070 Ti RT.

I'm glad AMD doesn't bother with the BS 4090 market; focus on the real world and let Nvidiots hype something they will never own anyway.
Posted on Reply
#152
Godrilla
Neo_MorpheusBlender devs themselves only targeted Ngreedia and didn't do any optimization for AMD GPUs for a while. Some people even speculated it was due to being fanbois.
Last time I checked, they started using ROCm and that has improved the software.

I hate the fact that AMD used the 9 moniker, since everyone thinks that they are competing with the 4090.
That said, it's stupid to ignore the fact that in many (not all) benchmarks, the 7900 XTX is around 15 to 20% slower but can cost up to 50% less.
That's assuming the best-case scenario in pricing for AMD ($800 in my particular case) versus $2K for the 4090 (worst case for Ngreedia, when MSRP prices weren't available).

Please read my comment again.

And everyone bought…4080 Super, instead of 7900xtx.

Which brings us back to the hypocrites asking for AMD to compete. They will not buy an AMD GPU regardless of how good it is. All they want is for Ngreedia to cut prices so they can get their beloved Ngreedia GPUs cheaper.
I would buy an AMD gpu if it was better than a 5090 in rasterization, rt and efficiency. I would even pay more for it. I am pretty sure I am not the only one.
By not having the crown, AMD is losing mind share, which is costing them market share. AMD is giving free advertising to Nvidia's 4090 when showing off their 7800X3D benchmarks. Rinse and repeat with Blackwell: 5090/9800X3D.
Currently AMD has the VRAM advantage in most tiers. AMD can get developers to take advantage of this by filling it with higher quality textures. This would be an easy short-term win for them that can move the needle. Gamers would win with no performance cost. Tech media would praise AMD for having more VRAM than its Nvidia counterparts. The conversation would move away from RT to higher quality textures with no performance penalty. All those 16 GB VRAM cards, from the RX 6800 to the upcoming 8800 XT, would benefit and would destroy all those 12 GB and 8 GB Nvidia cards. I wonder why most Nvidia titles like Cyberpunk 2077 and Black Myth: Wukong have lower quality textures but rely heavily on RT features; maybe because they would get destroyed in the midrange.
At the end of the day, AMD has only itself to blame, not developers, not gamers, not the competition, especially having benefited from two crypto mining booms and now the AI craze.
While the market leader has a diversified portfolio of hardware in case of AI market saturation (AI bubble pop), AMD on the other hand wants to focus on the midrange. This could backfire with further market share loss to Intel if Battlemage has superior RT performance.
Also, Nvidia started using AI for driver development years ago. AMD needs to do the same, like, yesterday! Imo.
Posted on Reply
#153
AusWolf
Vayra86Agreed, but if the top end of RDNA4 isn't moving forward much past the 7900XT, then how much movement is the 6700/6800XT crowd really waiting for? +20%? They can already buy a discounted 7900GRE or XT now...
The 7900 XT is roughly double the performance of the 6700 XT. If we assume that you can get a 7900 XT for 500 quid (which you can't, that's more like 7800 XT price), then sure, you can go for it. Whether you do, or wait for RDNA 4, it doesn't matter, it's an AMD buy regardless. Think of how Nvidia kept low-end Turing (GTX 16-series) running alongside Ampere, and I think also Ada for a while.
Vayra86Not exactly going to be earth shattering, its going to have to come from the improved featureset if anything then. And let's consider that. Even IF AMD gains perf/feature parity with RDNA4 on RT and FSR (probably too tall an order) and they release a 500 dollar GPU that performs like a 7900XT. If you had a 6800XT now, would you pay that? Again... its not really an earth shattering jump here, and if you really wanted RT, you'd have gone for Nvidia at this point.

I'm not convinced of AMD's rationale or strategy here. They're responding, roll for initiative: critical miss. The last time they focused on their midrange they came up 2 generations short of Nvidia which is what got us into the RTX mess in the first place. AMD managed to completely stall at Vega performance for what, 5-7 years? Also, look at Intel, fighting tooth and nail to get every bit of performance out of Xe so they can finally compete in the midrange. AMD has a high end segment position in spitting distance and says 'meh, whatever'. I can't even.
If I had a 6800 XT, I probably wouldn't be looking for an upgrade, or just get a 7900 XT(X).

On a personal note, I think the biggest problem with RDNA 3 is the video playback / low usage power consumption. Nobody wants 50 W gone on a 7800 XT with the fans turning on and off intermittently just for having VLC or YouTube open. I know it's a very unique gripe, but I only expect AMD to fix this for RDNA 4, although I fear that focus will be elsewhere. If they do fix it, and we get 7900 XT level performance with improved RT, I'll call it a massive win.
Vayra86Good riddance its about time you guys stopped paying over 1k for these temporary items. You have yourself and only yourself to blame for this, jumping on every new x90 like its candy
Oof! I couldn't have said it better myself. :D
Outback BronzeWell, this is the problem isn't it...

Where is AMD's current mid-range cards on that list? It's still dominated by Nvidia.

I'd say it's still going to be difficult for AMD to break into that mid-range tier when Nvidia will still have an opposing product next gen.

I am wishing Radeons well. We need it!
The point is that those midrange Nvidia cards are an easier target to compete against for increasing market share. More gains if AMD succeeds (because it's a much bigger market than the high-end), and fewer losses if they don't (because less is spent on R&D and manufacturing).
csendesmarkI don't have recent experience with Nvidia drivers... but I can tell AMD is not doing well in that department.
What do you mean? I've been using both AMD and Nvidia concurrently in the last 8-10 years, and with the exception of the 5700 XT, all AMD drivers have been fine.
Posted on Reply
#154
AusWolf
BagerklestyneI'd be willing to consider that.

I've just moved to all 4k (after our monitor discussion I pulled the trigger on a MSI MPG321URX) so 4k/high refresh is the aim. I'll be playing league of legends and solitaire at high refresh I guess.

I think after seeing what Wukong, Star Wars outlaws and now Space Marine 2 take to run at 4k and not get even near 120fps I'll be moderating my expectations for a generation or 3.
4K high refresh is 4090 territory, there's no other card in existence which can achieve that. You should have considered this before the monitor upgrade.
wolfIf you didn't believe it then why would you believe it now? I can provide it, but I can tell right now it won't change your mind as it's already made up. Personally I found AMD's inability to provide a simple yes or no answer to a very straightforward question asked of them by multiple reputable tech press outlets very telling. If they were absolutely not doing it, it would have been extremely easy and great PR to just say so. At best their PR and marketing department are woefully incompetent and mishandled one of the easiest (possible) perception wins of the year (in the timeline where they're innocent), and at worst they absolutely intentionally blocked competitor upscaling technology, riiight up until they either wound back the requirement or the clause expired, and then they still couldn't give a clear, unambiguous retrospective answer. It's not a criminal law case, so both sides of the argument will never get a beyond-all-reasonable-doubt type finding; we're all free to make a judgement based on a balance of probabilities and available evidence, and I, like several reputable tech press outlets, believe it's more likely than not AMD did the bad thing intentionally, so they shut up for months while doing damage control. You're free to disagree and make your own judgement naturally, but like I doubt I can convince you they did it, you are extremely unlikely to convince me they didn't, unless you have something new to share we didn't all see at the time. My point is that this many reputable, reasonable people didn't come up with it out of thin air because they have an AMD hate boner; they came to the conclusion because of its relative plausibility.

And no matter what the full truth is, the fallout was actually great for everyone, since the Starfield fiasco there's been a perceptible shift and more titles are getting all 3 major competing upscalers and nobody is intentionally blocking anything, everybody wins.
I remember that people were pointing at evidence left and right, and I was looking at where they were pointing, then took out a magnifying glass, then took out a microscope, and I still couldn't see any.

I also remember pointing out that locking a game into a technology that everybody can use isn't as bad as locking it into a technology that only last-gen Nvidia card owners can use (which is what Nvidia had been doing before that without any backlash), but no one cared.

Anyway, this is quite off-topic, so let me leave it at that. :)
Posted on Reply
#155
wolf
Better Than Native
AusWolfI remember that people were pointing at evidence left and right, and I was looking at where they were pointing, then took out a magnifying glass, then took out a microscope, and I still couldn't see any.
Everyone is free to look at the evidence and make their own determination, even if that is that it doesn't constitute evidence at all (let's take a moment to remember the definition of evidence - "the available body of facts or information indicating whether a belief or proposition is true or valid"). It's well documented that reputable tech press reviewed the history of the situation (for both camps) and made the same determination I did. Neither of us are inherently right or wrong to determine what we chose to from reviewing the evidence; I just know that to me, it led to a conclusion I found to be likely. And (not that you asserted it) I reject that I or the tech press came to our conclusion because of bias against AMD.
AusWolfI also remember pointing out that locking a game into a technology that everybody can use isn't as bad as locking it into a technology that only last-gen Nvidia card owners can use (which is what Nvidia had been doing before that without any backlash), but no one cared.
This gets said a lot but honestly I call BS, loads of people care and loads of people complained about that before and during the Starfield fiasco. Perhaps another case of us both looking at the same stuff (and of course lots of different stuff/discussions too) and having different takeaways and conclusions.
AusWolfAnyway, this is quite off-topic, so let me leave it at that. :)
Happy to, I by no means am of the opinion that I am irrefutably correct and that this definitely happened, I just really think it did based on the evidence I saw. I end up back at, loads of people thought it did, and also loads of people thought it didn't, and we'll likely never know the undeniable truth of the matter, so I regard this as sort of like a cold case/unsolved one, and will continue to believe what I believe until presented with extra/new information that can alter that belief.
Posted on Reply
#156
ratirt
Vayra86There's generally an ocean between 'want' and 'will do'.

What did you pay for your 6900XT? I think that'll underline this just fine :)
Yes there is. I'm just explaining what I meant and said actually.
Way too much, but that doesn't mean anything. The prices are way lower than they used to be.
Posted on Reply
#157
Bagerklestyne
AusWolf4K high refresh is 4090 territory, there's no other card in existence which can achieve that. You should have considered this before the monitor upgrade.
monitor was 2/3 the price of the video card lol.

Despite that, I can play some less demanding titles at 4K with a high refresh experience. Reality is, unless the monitor dies, it's not getting replaced for 3-5 GPU cycles, and it's replacing a 1080p 144 Hz screen (so no merit for a 4090 there).

I've decided a compromise is 4K for esports/HDR streams and 1080p for high-refresh gaming in demanding titles, so as per our earlier post, an RDNA4 card with 7900 XT equivalence, good efficiency and power consumption, and an uptick in RT performance is likely to be a sale for me.
Posted on Reply
#158
csendesmark
AusWolfWhat do you mean? I've been using both AMD and Nvidia concurrently in the last 8-10 years, and with the exception of the 5700 XT, all AMD drivers have been fine.
My CS2 crashes with a driver error once a week, and that is the only game I play.
Really annoying.
Posted on Reply
#159
The Shield
despite all the talk about “2000 bucks flagship”, will have products at all price points
I never understood this "they cover all price points" thing.
If they offer an awful product for an unreasonable amount of money at this or that price point, they are covering nothing.
A 12 GB card for $700/€700 in 2024 does NOT mean that you are covering the 700-bucks price point.
The same goes when they say "hey, there are still $300 cards, exactly like in the past". No, there are not, because the $300 cards of the past were high-end cards, the same cards that now cost at least $800-1,200.
Posted on Reply
#160
AusWolf
wolfEveryone is free to look at the evidence and make their own determination, even if that is that it doesn't constitute evidence at all (let's take a moment to remember the definition of evidence - "the available body of facts or information indicating whether a belief or proposition is true or valid"). It's well documented that reputable tech press reviewed the history of the situation (for both camps) and made the same determination I did. Neither of us are inherently right or wrong to determine what we chose to from reviewing the evidence; I just know that to me, it led to a conclusion I found to be likely. And (not that you asserted it) I reject that I or the tech press came to our conclusion because of bias against AMD.

This gets said a lot but honestly I call BS, loads of people care and loads of people complained about that before and during the Starfield fiasco. Perhaps another case of us both looking at the same stuff (and of course lots of different stuff/discussions too) and having different takeaways and conclusions.

Happy to, I by no means am of the opinion that I am irrefutably correct and that this definitely happened, I just really think it did based on the evidence I saw. I end up back at, loads of people thought it did, and also loads of people thought it didn't, and we'll likely never know the undeniable truth of the matter, so I regard this as sort of like a cold case/unsolved one, and will continue to believe what I believe until presented with extra/new information that can alter that belief.
If any "evidence" necessitates that you draw your own subjective conclusion in the end, then it's not really evidence, is it? I asked the same question back then, but everybody was like "nah, my truth is the ultimate one, so STFU". Evidence makes the truth stand out in the spotlight, it leaves no room for personal judgement.
Bagerklestynemonitor was 2/3 the price of the video card lol.
It doesn't matter, 4K high refresh is still 4090 territory.

Personally, I like my monitors to last for a lifetime, but each to their own. I was happy with 1080p at 24", and I would have never upgraded if it wasn't for the ultrawide angle.
csendesmarkMy CS2 crashes with the driver error once a week, that is the only game I play.
Really annoying.
Fair enough. I don't play that game, and I don't have any issue in any other, so I'm fine. Are you sure it's a driver problem, though?
Posted on Reply
#161
wolf
Better Than Native
AusWolfIf any "evidence" necessitates that you draw your own subjective conclusion in the end, then it's not really evidence, is it? I asked the same question back then, but everybody was like "nah, my truth is the ultimate one, so STFU". Evidence makes the truth stand out in the spotlight, it leaves no room for personal judgement.
I mean, that's the way evidence works, it's a broad term that covers relevant facts or information, evidence absolutely allows people to draw conclusions, some evidence is effectively proof, but not all evidence is a 100% conclusive "this means x person 100% did y crime", you need to listen to all the available evidence, and depending on the circumstance, only consider beyond all reasonable doubt to be acceptable (criminal case for ex) or on probability (civil case), but even that is talking about court cases and law rather than something not technically illegal but still a bad thing to do. Evidence doesn't need to be a literal smoking gun with fingerprints and a signed admission of guilt.

If those people gave you that answer well that also sucks, and I'd say it sucks for anyone saying the same but saying AMD didn't do the thing, there is no proof, but there is evidence that they did, and I'm inclined to believe it.

If someone accused AMD of child slavery in a direct yes or no question I'd imagine, incompetent as they seem to be, they'd go on record super fast denying it. All I was looking for from AMD was a swift and concise denial of what they were accused of, yet we got (paraphrasing) "no comment" for months. Yes, that's evidence.
Posted on Reply
#162
cerulliber
Let's see a European gamer's real-life options, and please don't go under the 7900 XT for comparison. 19% VAT included.

AMD European shop:
7900 XTX "midrange" is 1,091 euro
7900 XT "midrange" is 818 euro, out of stock. Time for the 8800 XT release?
geizhals.de:
7900 XT, an AIB which is not Asus: 700 euro
7900 XTX: 910 euro

Nvidia Germany store:
4070 Ti Super: 810 euro
geizhals.de: 820 euro

For 120 euro more and no driver lottery, every single sane European who is not a fanboy will buy the 4070 Ti Super.
Now good luck with AMD's 80% market share, and report the real-life situation to AMD if you have contacts.
Posted on Reply
#163
csendesmark
AusWolfAre you sure it's a driver problem, though?
I have 24.6.1 installed, rolled back from the utterly unstable 24.8.1, where the driver crashed several times a day... haven't checked the latest though, but generally I have low trust in their new drivers.
Posted on Reply
#164
Neo_Morpheus
AusWolfFair enough. I don't play that game, and I don't have any issue in any other, so I'm fine. Are you sure it's a driver problem, though?
I had a similar issue with my 6900 XT, and the culprit was a failing PSU; like the other poster, it only happened at specific times, meaning under heavy utilization.
cerulliberand no driver lottery
Lord, this is beyond tiresome.
Posted on Reply
#165
AusWolf
wolfI mean, that's the way evidence works, it's a broad term that covers relevant facts or information, evidence absolutely allows people to draw conclusions, some evidence is effectively proof, but not all evidence is a 100% conclusive "this means x person 100% did y crime", you need to listen to all the available evidence, and depending on the circumstance, only consider beyond all reasonable doubt to be acceptable (criminal case for ex) or on probability (civil case), but even that is talking about court cases and law rather than something not technically illegal but still a bad thing to do. Evidence doesn't need to be a literal smoking gun with fingerprints and a signed admission of guilt.

If those people gave you that answer well that also sucks, and I'd say it sucks for anyone saying the same but saying AMD didn't do the thing, there is no proof, but there is evidence that they did, and I'm inclined to believe it.

If someone accused AMD of child slavery in a direct yes or no question I'd imagine, incompetent as they seem to be, they'd go on record super fast denying it. All I was looking for from AMD was a swift and concise denial of what they were accused of, yet we got (paraphrasing) "no comment" for months. Yes, that's evidence.
Fair enough. Then let's just say that there was never enough evidence for me to draw the absolute conclusion that AMD's hand was definitely in it. So with that, I let them off on this one. I like thinking on the premise of "innocent until proven guilty", and since no "evidence" served as proof... you know where I'm going. Of course, you're free to think on the contrary, but I don't think either your or my way of thinking is wrong.
csendesmarkI have 24.6.1 installed, upgraded from the utterly unstable 24.8.1- the driver crashed several times a day... did not check the latest tho, but generally I have low trust for their new drivers.
That's cool, but have you ruled everything out? Is 24.6.1 undeniably more stable than 24.8.1?
Posted on Reply
#166
wolf
Better Than Native
AusWolfFair enough. Then let's just say that there was never enough evidence for me to draw the absolute conclusion that AMD's hand was definitely in it. So with that, I let them off on this one. I like thinking on the premise of "innocent until proven guilty", and since no "evidence" served as proof... you know where I'm going. Of course, you're free to think on the contrary, but I don't think either your or my way of thinking is wrong.
I'm not drawing an absolute conclusion either, I'm on the side of likelihood and I deem it more likely than not, personally, unless I see something very compelling to the contrary. Others seem keen to also say "what about when nvidia did.. ", and I'd argue that going by the same standard of evidence, they perhaps shouldn't be vilified either (for certain things). On likelihood however I'd also wager nvidia did the many bad things that aren't conclusively proven that they've been accused of, just because on the basis of evidence it seems likely. Neither deserve a pass.
Posted on Reply
#167
csendesmark
AusWolfThat's cool, but have you ruled everything out? Is 24.6.1 undeniably more stable than 24.8.1?
The rig performs well, no other issues.
Except that LM Studio sometimes needs to be reloaded, but that's not a driver crash.
Posted on Reply
#168
Random_User
ratirtIf AMD targets the mid range, what we are supposed to take from that?
Which would be the mid range card for the next gen? Is it the 7900 xt? The 7900 xt was $799 at release and it was quite a bit for something of that caliber.
What would be the price for the mid range of that caliber? Is $400 OK? It is still a mid range.
If AMD can sell 7900 XT performance for a $300 with the RDNA 4 arch I will be willing to buy it. $350 to consider
That's how it should be. Heck, it should have been this way about two years ago, with the RDNA3 launch, instead of artificially inflating the SKU numbers to squeeze more money from shafted loyal buyers.
Such a strategy, a return to the "old days" and roots, is what everyone, including AMD themselves, would favor heavily. I mean, when the top solutions from the previous generation become cheaper, not only on the second-hand market, but manifest as a performance uplift for the new generation's mid-range cards. This is the whole meaning and definition of progress.

Though so far, sadly, this seems more like wishful thinking, false hopes, and hollow promises. I would rather believe that by "mainstream segments", AMD means 7900 XT performance for 7900 XT money, not shifting it down to sane prices. They would rather adjust the price tags and SKU monikers of the mid-range upward than lose the sweet profit margins. At least the last five or so years have proven this predatory behaviour to be a solid trend, and the most likely outcome, rather than not.
Posted on Reply
#169
AusWolf
csendesmarkThe rig performs well, no other issues
Have you tried with an Nvidia card of similar calibre?
wolfI'm not drawing an absolute conclusion either, I'm on the side of likelihood and I deem it more likely than not, personally, unless I see something very compelling to the contrary. Others seem keen to also say "what about when nvidia did.. ", and I'd argue that going by the same standard of evidence, they perhaps shouldn't be vilified either (for certain things). On likelihood however I'd also wager nvidia did the many bad things that aren't conclusively proven that they've been accused of, just because on the basis of evidence it seems likely. Neither deserve a pass.
I also like disregarding the "what about Nvidia" comments, to be honest. People like treating similar things as if they were different sides of the same coin, but they're not.

For example, the "what about GameWorks being gimped on AMD" comments... For one, it's an entirely different matter, so there's no "what about" here. Secondly, Nvidia developed GameWorks for Nvidia cards. They probably never even tested it on AMD. Why would they have? Let's be reasonable. :D

Limiting a lot of games to DLSS only in early days is kind of a similar thing, but for one, it's still a separate issue, and secondly, AMD was late with FSR, so there wasn't really another option anyway.
Posted on Reply
#170
wolf
Better Than Native
AusWolfI also. ....... anyway.
You're nothing if not reasonable my dude :toast:
Posted on Reply
#171
AusWolf
wolfYou're nothing if not reasonable my dude :toast:
Cheers, likewise. :toast:

I just wish a lot more people had the ability (and willingness) to remain impartial in matters concerning for-profit companies that they're not financially invested in.
Posted on Reply
#172
Draconis
Dr. DroYour justification is precisely why AMD is doing this. Why would they bother with a high investment, high complexity project for a high-end board if most Radeon gamers are running previous generation midrangers, tend to be precisely from the slice of the market that is all too happy to settle "conscientious consumers", don't upgrade unless there are massive gains and would rather shun and dismiss new features and advanced functionality if it means they save a buck... the whole defense of the Radeon business at this point has centered against the fact that Nvidia pushes the envelope too hard and charges too much for their products, yet at the same time, software would not advance if the hardware wasn't there to back it up. Turing was the prime example of this, it was the first of the modern architecture GPUs and it was extremely expensive with very limited software availability at the time.

A quiet, efficient chip is the best upsell they can offer at this point in time. Best to use the wafers in high-margin businesses like Epyc, after all, Radeon's core audience doesn't really care about DLSS, RT, or even day one driver optimization...
As an AMD 6750 XT owner, it's a good point. RT, upscaling etc. don't mean much to me at the moment. They probably will when they mature. And I only got the AMD card because it was the first card that came down close to MSRP at the tail end of the mining craze. I don't need day-one driver optimisation because I generally wait at least a year before I start any game, so that hopefully the kinks are ironed out.
Posted on Reply
#173
las
DavenIt’s important to point out here that we are talking mainstream performance in the NEXT gen which means high performance in the current gen.

AMD cannot release 7800xt performance levels as the max RDNA4. The highest SKU needs to tie or beat 7900XT levels which would move down a peg in the next series typically.
AMD is not competing in the enthusiast tier in the current gen either.

AMD mostly sells cards in the sub-$500 segment, nothing new really.

The most popular AMD cards of all time are like sub-$250 cards.

AMD is years behind on features and its drivers are simply way more wonky, especially when you leave the most popular titles that get benchmarked a lot (for reviews). AMD GPUs are terrible in most early access games, less popular games, and in emulators. With Nvidia you get good performance across the board, with drivers ready for new games before release date, and the best features possible. Resale value is higher because demand is higher. Watt usage is lower.

You literally save nothing by going with an AMD GPU when you factor it all in. Hence why market share keeps dropping.
Posted on Reply
#174
Draconis
lasAMD is not competing in the enthusiast tier in the current gen either.

AMD mostly sells cards in the sub-$500 segment, nothing new really.

The most popular AMD cards of all time are like sub-$250 cards.

AMD is years behind on features and its drivers are simply way more wonky, especially when you leave the most popular titles that get benchmarked a lot (for reviews). AMD GPUs are terrible in most early access games, less popular games, and in emulators. With Nvidia you get good performance across the board, with drivers ready for new games before release date, and the best features possible. Resale value is higher because demand is higher. Watt usage is lower.

You literally save nothing by going with an AMD GPU when you factor it all in. Hence why market share keeps dropping.
I disagree, see post above.

I have zero stability issues with my 6750 XT, and users like me may be in the minority, but we do exist. I needed to upgrade my GTX 970 because it was not cutting it anymore, and I didn't go with another Nvidia card because, on bang-for-buck pricing at the time, AMD was miles ahead.
Posted on Reply
#175
csendesmark
AusWolfHave you tried with an Nvidia card of similar calibre?
As I wrote above, my last Nvidia card was the awesome 8800 GTS 320 MB.
There was no preference for me between AMD and Nvidia until the Eyefinity advantage was gone.
I wouldn't replace this card, but my next one will be an Nvidia, most likely a 6080 or a 7080 when we get there.
Posted on Reply