Monday, September 9th 2024

AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share

In an interview with Tom's Hardware, AMD confirmed that its next generation of gaming GPUs, based on the RDNA 4 graphics architecture, will not target the enthusiast graphics segment. Speaking with Paul Alcorn, Jack Huynh, head of AMD's Computing and Graphics Business Group, said that with its next generation AMD will focus on gaining share in the PC gaming graphics market. That means winning price-performance battles against NVIDIA in the key mainstream and performance segments, much as it did with the Radeon RX 5000 series based on the original RDNA graphics architecture, rather than entering an enthusiast segment that is low-margin at the die sizes involved and moves low volumes. AMD currently holds only 12% of the discrete gaming GPU market, something it sorely needs to turn around, given that its graphics IP is contemporary.

On a pointed question on whether AMD will continue to address the enthusiast GPU market, given that allocation for cutting-edge wafers are better spent on data-center GPUs, Huynh replied: "I am looking at scale, and AMD is in a different place right now. We have this debate quite a bit at AMD, right? So the question I ask is, the PlayStation 5, do you think that's hurting us? It's $499. So, I ask, is it fun to go King of the Hill? Again, I'm looking for scale. Because when we get scale, then I bring developers with us. So, my number one priority right now is to build scale, to get us to 40 to 50 percent of the market faster. Do I want to go after 10% of the TAM [Total Addressable Market] or 80%? I'm an 80% kind of guy because I don't want AMD to be the company that only people who can afford Porsches and Ferraris can buy. We want to build gaming systems for millions of users. Yes, we will have great, great, great products. But we tried that strategy [King of the Hill]—it hasn't really grown. ATI has tried this King of the Hill strategy, and the market share has kind of been...the market share. I want to build the best products at the right system price point. So, think about price point-wise; we'll have leadership."
Alcorn pressed: "Price point-wise, you have leadership, but you won't go after the flagship market?" to which Huynh replied: "One day, we may. But my priority right now is to build scale for AMD. Because without scale right now, I can't get the developers. If I tell developers, 'I'm just going for 10 percent of the market share,' they just say, 'Jack, I wish you well, but we have to go with Nvidia.' So, I have to show them a plan that says, 'Hey, we can get to 40% market share with this strategy.' Then they say, 'I'm with you now, Jack. Now I'll optimize on AMD.' Once we get that, then we can go after the top."

The exchange seems to confirm that AMD's decision to withdraw from the enthusiast segment is driven mainly by the low volumes it sees for the engineering effort and large wafer costs spent on enthusiast-segment GPUs. The company saw great success with its Radeon RX 6800 and RX 6900 series largely because the RDNA 2 generation benefited from the GPU-accelerated cryptomining craze, when high-end GPUs were in demand. That demand had disappeared by the time AMD rolled out its next-generation Radeon RX 7900 series powered by RDNA 3, and the lack of performance leadership against the GeForce RTX 4090 and RTX 4080 with ray tracing enabled hurt the company's prospects. News of AMD focusing on the performance segment (and below) aligns with rumors that with RDNA 4, AMD is making a concerted effort to improve its ray tracing performance and reduce the performance impact of enabling ray tracing. That, along with raster performance and efficiency, could be the company's play for market share.

The grand assumption AMD is making here is that it has a product problem rather than a distribution problem, and that with a product that strikes the right performance-per-watt and performance-per-dollar balance, it will gain market share.

Catch the full interview in the source link below.
Source: Tom's Hardware

271 Comments on AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share

#126
Dr. Dro
yfn_ratchet: This works for me. If they can match the longevity that the RX 5700XT and the RX 6000 series in general is enjoying while bringing price down, I'm all for it. A 16GB GDDR6 card with decent midrange performance (so, somewhere between the 7700XT and 7800XT current) for like $380ish would be a sharp kick in the rear to actually start sliding the price scale back down. I just hope they pick up the high end with the Radeon PRO line so they're not losing out on professional clients. I remember they were doing an awesome job of filling in the gaps left by Quadro Ada.
What longevity does a card released 5 years ago, with incomplete DirectX API support, absolutely no matrix multiplication/tensor/ray acceleration functionality, buggy drivers, deficient encoding hardware, reliability issues to the extent it received literally 5 steppings while it was being manufactured, limited VRAM size from day one have? The 5700 XT is a good case of how to not make and maintain a GPU.
Posted on Reply
#127
xSneak
"Do I want to go after 10% of the TAM [Total Addressable Market] or 80%? I'm an 80% kind of guy because I don't want AMD to be the company that only people who can afford Porsches and Ferraris can buy."

Didn't know buying a $1200 GPU was the equivalent of buying a $120k car (911 Carrera base model) o_O; I'll gladly trade my 4080 for a used 911 if anyone is out there! :D
Posted on Reply
#128
Outback Bronze
csendesmark: Somewhat understandable

According to the Steam hardware survey
You have to "go down" to 16th place to see the first card that can be considered high-end.
This is not new; there are only a few exceptions - most of the profit comes from mid- to low-tier cards.


What?
When was the last time when nVidia had competition on the top tier graphics card market?
Well, this is the problem isn't it...

Where are AMD's current mid-range cards on that list? It's still dominated by Nvidia.

I'd say it's still going to be difficult for AMD to break into that mid-range tier when Nvidia will still have an opposing product next gen.

I am wishing Radeons well. We need it!
Posted on Reply
#129
yfn_ratchet
Dr. Dro: What longevity does a card released 5 years ago, with incomplete DirectX API support, absolutely no matrix multiplication/tensor/ray acceleration functionality, buggy drivers, deficient encoding hardware, reliability issues to the extent it received literally 5 steppings while it was being manufactured, limited VRAM size from day one have? The 5700 XT is a good case of how to not make and maintain a GPU.
"Released 5 years ago"
And still viable and in support, ergo still around.

"Incomplete DirectX API support"
Ah yes, lack of support for feature level 12_2, which includes... things that weren't in a design drafted in 2018, when DX12 Ultimate only came out in 2020. Be real.

"No matrix multiplication/tensor/RT support"
When the competition barely had all of that in the card's heyday? When the people that have/are interested in a 5700XT probably won't be considering workloads like that?

"Buggy drivers"
This I'll concede, but that applies more to the card's earlier years than it does now.

"Deficient encoding hardware"
I will also concede VCN kind of sucks, but that hasn't changed relative to NVENC/QSV at any point in the last decade or so. It didn't kneecap its sales, nor sales of the RX 6000 and RX 7000.

"Reliability issues requiring several revisions"
Much of what I could even dig up is solved by now or had existing workarounds at the time, cards that are still around can be/are fixed.

"Limited VRAM size."
Lemme grip you by the ear and rattle off some models you might be familiar with. 2070. 2070 SUPER. 2080. 2080 SUPER. 3060Ti. 3070. 3070Ti. 4060. 4060Ti 8G. A580. A750. A770 8G. Released around the same time or newer, or far newer, all hampered by the same 'lack of VRAM' you rest the blame on AMD for as if they were supposed to have some moment of divine providence to realize that The Last of Us Part 1 Remastered: Extra Shitty Port Edition (2023) will need more than 8GB. When the cards that roughly match it in performance have the same memory sizes and are still being used TODAY.

I highlighted how long the 5700XT has lasted as a card that you can slap into a PC and still use within its means, specifically in the segment that consumer Radeon targets: value-conscious gaming. Even its geriatric Polaris predecessor the RX 480/580 is still seeing use. Much of what you cite as it being a 'bad example' are issues that were either relevant only in its youth or a result of the card, shocker, being old. Get a grip.
Posted on Reply
#130
john_
64K: Some AMD fans are AMD's worst enemy. All of the mind-numbing hate just puts people off from wanting to be associated with the brand. AMD deserves better fans than that imo because they really do accomplish a hell of a lot with little resources.
It's ALWAYS AMD's mistake, or bad AMD fans or whatever with an AMD reference on it. NEVER EVER EVER Nvidia's or Intel's fault.

That's why AMD is limiting their investment in the PC gaming market. Why invest in a market where the consumers will go with any logical or illogical excuse and buy the competitor's product? Why invest in a hostile market?
Posted on Reply
#131
cerulliber
I think AMD already can compete with Nvidia
what they need to adjust is price - fix it to $500
otherwise the whole interview is wishful thinking
Posted on Reply
#132
evernessince
RedelZaVedno: This is extremely bad news for us enthusiasts. Without competition Ngreedia's prices will skyrocket. Prepare to pay at least $1499 for 5080 and $1999 for 5090 :mad:
Unfortunately I don't even think you can call that skyrocketing, it's just normal at this point.
john_: It's ALWAYS AMD's mistake, or bad AMD fans or whatever with an AMD reference on it. NEVER EVER EVER Nvidia's or Intel's fault.

That's why AMD is limiting their investment in the PC gaming market. Why invest in a market where the consumers will go with any logical or illogical excuse and buy the competitor's product? Why invest in a hostile market?
Yep, I remember when people accused AMD of forcing FSR only in Starfield (a sponsored title) with zero proof, and the game ended up having DLSS anyway. I don't seem to remember people making the same complaint for games that implemented DLSS and only added FSR later. People were outraged that AMD might want to promote its own features in a game it's paying to sponsor, meanwhile they praise Nvidia for doing the same thing.

People also complained that AMD was forcing its sponsored games to use a large amount of VRAM, again with zero proof.

Meanwhile when Nvidia has a bug or an issue like the terrible 12VHPWR adapter, 3000 series transient spikes and noise feedback in the 12v sense pin, New World bricking cards, or the discord bug that lowered clocks, people blamed everyone but Nvidia.
Outback Bronze: Well, this is the problem isn't it...

Where are AMD's current mid-range cards on that list? It's still dominated by Nvidia.

I'd say it's still going to be difficult for AMD to break into that mid-range tier when Nvidia will still have an opposing product next gen.

I am wishing Radeons well. We need it!
Yep, AMD needs a Ryzen moment for their GPUs. They need to provide enough of a value advantage to make customers take notice, because most aren't even considering AMD.

That said, I'm not sure they could have a Ryzen moment, because Nvidia has been very aggressive in the past with its pricing to prevent AMD from gaining market share. Nvidia could lower mid- to low-end GPU prices temporarily just to crush AMD, and then things would return to normal the gen after. As we've seen with their AIBs and the AI market, they aren't afraid of coercion and other illegal tactics either.
Posted on Reply
#133
csendesmark
Outback Bronze: Well, this is the problem isn't it...

Where are AMD's current mid-range cards on that list? It's still dominated by Nvidia.

I'd say it's still going to be difficult for AMD to break into that mid-range tier when Nvidia will still have an opposing product next gen.

I am wishing Radeons well. We need it!
I was thinking about this a week ago.
My last GeForce was the NVIDIA GeForce 8800 GTS 320, in 2007.
I loved it, but I kept buying ATi and then AMD because they supported 3 displays (Eyefinity), and sometimes their cards were better, sometimes cheaper.
Now I'm starting to hate it, since AMD is doing nothing to make the cool stuff accessible with their GPUs (mostly AI, which I sometimes use), plus the incredibly s*#t drivers.
They just don't deserve my money anymore...
cerulliber: I think AMD already can compete with Nvidia
what they need to adjust is price - fix it to $500
otherwise the whole interview is wishful thinking
I don't have recent experience with Nvidia drivers... but I can tell AMD is not doing well in that department.
Posted on Reply
#134
Eliad Buchnik
In other words: we can't compete at the high end next generation, so we will try to compete at the low and mid end.
Nothing wrong with that, but as rumors have it, RDNA 4 will top out at RX 7900 XT to RX 7900 XTX performance, with the RT performance of a 4070 to 4070 Super, at a price of $500-600. At best it will be an OK performance-per-dollar improvement over last gen with stronger RT performance. All Nvidia has to do is price their card within 10-15% of the AMD equivalent and people will pay the extra.
Posted on Reply
#135
mechtech
That's fine. My typical budget is $250-$350CAD anyway for a gpu.
Eliad Buchnik: In other words: we can't compete at the high end next generation, so we will try to compete at the low and mid end.
Nothing wrong with that, but as rumors have it, RDNA 4 will top out at RX 7900 XT to RX 7900 XTX performance, with the RT performance of a 4070 to 4070 Super, at a price of $500-600. At best it will be an OK performance-per-dollar improvement over last gen with stronger RT performance. All Nvidia has to do is price their card within 10-15% of the AMD equivalent and people will pay the extra.
Nvidia can price their cards 40% higher and people will still buy them. Just like iphones.............
Posted on Reply
#136
sephiroth117
The issue for me is that someone who sees a company release a 4090 is more likely to purchase a GPU from that company.

Those products (the 4090) not only sell relatively well, they mainly serve as a flagship and technological demonstrator, and that works really well for Nvidia.

Now, tbh, is a 2000-2500 EUR RTX 5090 really relevant? Only a very restricted class of consumers can purchase one; it's well over a month of minimum wage (the "SMIC," around 1300 EUR) in France, just to give a reference.

...but you see an RTX 5090, a marvel of engineering, and it's gigantic soft power, a gigantic ad, for many consumers.

I think they should be in between: no need to pursue the 4090/5090, but a 5080 Ti rival would be the perfect spot for an 8950 XTX, something with a smaller delta than 7900 XTX vs. 4090. Not a direct 5090 contender, but a cheaper in-between card that shows you can still put up a fight.
Posted on Reply
#137
Dr. Dro
cerulliber: I think AMD already can compete with Nvidia
what they need to adjust is price - fix it to $500
otherwise the whole interview is wishful thinking
Hm, let's see.

4070 Ti Super's 16 GB memory is not a hindrance to its performance, even at the highest resolutions, which go beyond this class of hardware's capabilities in today's most VRAM-hungry titles

4070 Ti Super will match the 7900 XT's raster performance

4070 Ti Super should be more than 20% faster at RT than the 7900 XT

4070 Ti Super is more power efficient than the 7900 XT even in its strongest factory-overclocked models, while pushing the heaviest ray tracing workloads and generating less heat to do so. In this specific case the 7900 XT is likely power limited, but even at full tilt it's still ~20 watts of difference under full load

4070 Ti Super features AD103 dual-engine NVENC which will record in 4:4:4 and HDR at high resolutions and frame rates at near zero latency

developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new

4070 Ti Super has access to NVIDIA Studio drivers which have stability and productivity application compatibility in mind

www.nvidia.com/en-us/studio/

4070 Ti Super supports the full breadth of functionality in NVIDIA's RTX ecosystem, including Canvas, RTX AI, DLSS frame generation and ray reconstruction technologies, as well as full support for Intel's XeSS and AMD FidelityFX

I could go on, but it's obvious that AMD cannot compete, and this gap will widen significantly as RDNA 4 targets the RTX 4080's performance, lacks most of its feature set, and Blackwell's release is imminent. The return isn't worth the investment, Radeon buyers' mindshare doesn't line up, the wafers can be used in more profitable segments, and the software-side investment is almost as Herculean an effort as the hardware, if not more so... it's good that they came clean. AMD just cannot do it.
Posted on Reply
#138
Querkle
BRING BACK CROSSFIRE! please... ;)
Posted on Reply
#139
Ravenas
For 4K gaming, the 7900 XTX is perfectly acceptable. Average frames are ~20 FPS lower than the 4090's, but you spend ~$750 less. With that $750 savings, you're still averaging ~90 FPS across all games tested at 4K in W1z's suite in Windows 11 (which is a hog; at some point I would love to see some mini reviews in Linux), and you can devote the money to a good processor and RAM. RT is still a fantasy concept at 4K without upscalers or ML. I honestly hate the concept of frame generation outside of consoles (the last 2 gens). I'm looking for pure raster performance without degraded imaging.

In competitive games like Counter-Strike and DOTA 2, where high frames matter, you will see more than 200 frames at 4K max settings. I have a 4K monitor with a 240 Hz refresh rate, but it's just an investment for future cards; the monitor really isn't a use case quite yet.
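The value argument here comes down to simple arithmetic: frames delivered per dollar spent. A minimal sketch of that calculation follows; the prices and average-FPS figures are illustrative assumptions loosely based on the numbers quoted in this post (~90 FPS average, a ~20 FPS gap, a ~$750 price gap), not benchmark data.

```python
# Hypothetical frames-per-dollar comparison. All figures below are
# illustrative assumptions, not measured benchmark results or real prices.
def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Average frames per second delivered per dollar of GPU cost."""
    return avg_fps / price_usd

# Assumed: 7900 XTX at ~$1000 averaging ~90 FPS at 4K,
# RTX 4090 at ~$1750 averaging ~110 FPS at 4K.
xtx = fps_per_dollar(avg_fps=90, price_usd=1000)
rtx_4090 = fps_per_dollar(avg_fps=110, price_usd=1750)

print(f"7900 XTX : {xtx:.4f} FPS/$")
print(f"RTX 4090 : {rtx_4090:.4f} FPS/$")
print(f"XTX advantage: {xtx / rtx_4090:.2f}x")
```

On these assumed numbers the XTX delivers roughly 1.4x the frames per dollar, which is the trade-off the post describes; swap in current street prices and your own benchmark averages to get a figure that means anything.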

I honestly felt like AMD already gave this market away with their absence in competition for the 4090. Going further, would be a deal breaker for me on future cards with AMD... This generation, I would have loved to have seen a 7950 xtx (similar to the Asrock version I have www.techpowerup.com/review/asrock-radeon-rx-6950-xt-oc-formula/) or at minimum a binned Navi 31 similar to the XTXH (found in the power color ultimate version I have www.techpowerup.com/gpu-specs/powercolor-red-devil-rx-6900-xt-ultimate.b8752 ). With that being said, again they have already given up the enthusiast market. I'm already disappointed, and going further would be a bad move by AMD in my opinion.

Right now I have multiple AMD cards: 7900 XTX, 7900 XT, 6950 XT, 6900 XTXH, 6700 XT, and 6750 XT. These cards perform well for all their use cases. However, for my main PC, I'm looking for very good raster performance in 4K. AMD giving this market away loses me as a customer.

AMD is in great position to compete, but investors want gains in AI.
Posted on Reply
#140
Bagerklestyne
AusWolf: It depends on your target resolution and settings, too, I guess. To me, a 7900 GRE/XT level GPU with improved RT for £400-450 sounds great. Add reduced video playback power consumption into the mix, and I've got my next GPU.
I'd be willing to consider that.

I've just moved to all 4K (after our monitor discussion I pulled the trigger on an MSI MPG321URX), so 4K/high refresh is the aim. I'll be playing League of Legends and solitaire at high refresh, I guess.

I think after seeing what Wukong, Star Wars Outlaws, and now Space Marine 2 take to run at 4K without getting anywhere near 120 FPS, I'll be moderating my expectations for a generation or 3.
Posted on Reply
#141
wolf
Better Than Native
evernessince: I remember when people accused AMD of forcing FSR only in Starfield (a sponsored title) with zero proof
Half of the friggen tech press accused AMD of doing this, and yeah, there was no proof, just a mountain of supporting evidence for that conclusion. They didn't come to this conclusion because they just wanted to hate on poor old AMD.
Posted on Reply
#142
wheresmycar
As long as AMD doesn't classify 7900 XT/XTX-level performance as "enthusiast GPU," it's all good in the hood!

And exactly what are we referring to as the "enthusiast GPU segment"? You've got the apex flagship, higher tier, mid, and low. Not competing (or being unable to compete) with the 5090 makes sense, but when did 5080/5070-level performance fall short of "enthusiast"? Seems like we have to blow insane amounts of money on the fastest brick in town to get a green card for the enthusiast graphics club.
Posted on Reply
#143
evernessince
wolf: Half of the friggen tech press accused AMD of doing this, and yeah, there was no proof, just a mountain of supporting evidence for that conclusion. They didn't come to this conclusion because they just wanted to hate on poor old AMD.
"mountain of supporting evidence"? Where, if it's that easy to prove then provide it. I followed the issue when it was going down and the only "evidence" provided was people's observations.

People jumped to conclusions when AMD's marketing didn't explicitly say they weren't blocking DLSS, but AMD's marketing department has got to be the worst marketing department of any company in the world. You can't rule out that it's just incompetence. Frank Azor came in shortly after that and clarified that they do not mandate FSR only in AMD-sponsored games.

Could AMD have restricted DLSS prior to public outcry? Sure, but why haven't these people also complained on any of the occasions a game released with DLSS and not FSR? Why were these people silent about any of the numerous worse things Nvidia has done, like GameWorks, GPP, etc.? GPP never went away; all the top SKUs from Gigabyte, MSI, and ASUS are Nvidia-only. Public and press response? Nothing. I explicitly remember people giving Nvidia a pass for anti-competitive GameWorks features when there was explicit proof provided showing games like Crysis 2 used way too high levels of tessellation (far beyond any visual benefit, even tessellating objects outside of view; there's a video on YouTube demonstrating this still up), or when games like The Witcher 3 were running HairWorks on models that weren't even visible.

The cope was either "Nvidia has a right to push its proprietary features" or outright denial. I pointed out to TPU multiple times that the top SKUs from MSI, Gigabyte, and ASUS weren't available on AMD cards anymore, and they said 'wait a few weeks'. Well, I let them know a few months after that that they still weren't available, but I guess TPU didn't care for the story. If we are going to hang AMD for Starfield not launching with both DLSS and FSR based on some frankly pretty weak evidence, it's a complete double standard to not do the same to Nvidia, particularly when in many cases the evidence is stronger.
Posted on Reply
#144
wolf
Better Than Native
evernessince: "mountain of supporting evidence"? Where, if it's that easy to prove then provide it. I followed the issue when it was going down and the only "evidence" provided was people's observations.
If you didn't believe it then why would you believe it now? I can provide it, but I can tell right now it won't change your mind as it's already made up. Personally I found AMD's inability to provide a simple yes or no answer to a very straightforward question asked of them by multiple reputable tech press outlets very telling. If they were absolutely not doing it, it would have been extremely easy and great PR to just say so. At best their PR and marketing department are woefully incompetent and mishandled one of the easiest (possible) perception wins of the year (in the timeline where they're innocent), and at worst they absolutely intentionally blocked competitor upscaling technology, riiight up until they either wound back the requirement or the clause expired, and then they still couldn't give a clear, unambiguous retrospective answer. It's not a criminal law case, so both sides of the argument will never get a beyond all reasonable doubt/proof type finding, we're all free to make a judgement based on a balance of probabilities and available evidence, and I, like several reputable tech press outlets, believe it's more likely than not AMD did the bad thing intentionally so they shut up for months while doing damage control. You're free to disagree and make your own judgement naturally, but like I doubt I can convince you they did it, you are extremely unlikely to convince me they didn't, unless you have something new to share we didn't all see at the time. My point is that this many reputable, reasonable people didn't come up with it out of thin air because they have an AMD hate boner, they came to the conclusion because of its relative plausibility.

And no matter what the full truth is, the fallout was actually great for everyone, since the Starfield fiasco there's been a perceptible shift and more titles are getting all 3 major competing upscalers and nobody is intentionally blocking anything, everybody wins.
Posted on Reply
#145
SailorMan-PT
Prima.Vera: Prepare for $2000+ video cards from nGreedia. /facepalm.
Well, if people are dumb enough to change their phone every year for a similar one, then they are probably dumb enough to pay nGreedia's callous prices too, so...
Quite funny, bro. I like the name you gave Nvidia: nGreedia. But you're absolutely right. There are dumb gamers out there who will pay any scale of high prices, and when they're asked why, they say they want the best performance on the GPU market. Of course they do, no matter whether the prices look like 1800, 2000, 2200, 2400, 2600, 2800, or 3000 bucks.
Posted on Reply
#146
evernessince
wolf: If you didn't believe it then why would you believe it now? I can provide it, but I can tell right now it won't change your mind as it's already made up. Personally I found AMD's inability to provide a simple yes or no answer to a very straightforward question asked of them by multiple reputable tech press outlets very telling. If they were absolutely not doing it, it would have been extremely easy and great PR to just say so. At best their PR and marketing department are woefully incompetent and mishandled one of the easiest (possible) perception wins of the year (in the timeline where they're innocent), and at worst they absolutely intentionally blocked competitor upscaling technology,
Have you not made the argument that AMD's marketing department is incompetent in the past? I don't think that would be a controversial statement. I think there's sufficient evidence to apply Hanlon's razor in this instance. It seems contradictory for you to say in past comments that AMD's marketing department is bad (and I would agree) and suddenly not consider that in this scenario.

Frank Azor from AMD did clarify a few days later but obviously by then the damage was already done.

None of your comment touches on the fact that the same suspicions which started the uproar over a lack of DLSS at launch were present in reverse in many titles that supported only DLSS at launch. It's a clear double standard, notwithstanding any of the other things Nvidia has done (as pointed out in my last comment) where an equal or greater level of evidence was provided than what you are using to spear AMD here, yet I don't see people complaining about those.
wolf: riiight up until they either wound back the requirement or the clause expired, and then they still couldn't give a clear, unambiguous retrospective answer.
You could say that of every company nowadays, Nvidia included.
wolf: It's not a criminal law case, so both sides of the argument will never get a beyond all reasonable doubt/proof type finding, we're all free to make a judgement based on a balance of probabilities and available evidence, and I, like several reputable tech press outlets, believe it's more likely than not AMD did the bad thing intentionally so they shut up for months while doing damage control. You're free to disagree and make your own judgement naturally, but like I doubt I can convince you they did it, you are extremely unlikely to convince me they didn't, unless you have something new to share we didn't all see at the time. My point is that this many reputable, reasonable people didn't come up with it out of thin air...
Tech outlets didn't come to that conclusion, nor should they have, given the lack of concrete evidence. It would be insanely bad journalism to publish an article with a definitive conclusion based on mere observation. It's one thing to report on a story; it's another to pass something off as fact.

Yes, clearly I'm unlikely to change your mind because your own idea of what happened is mismatched with what actually happened. In your mind, every outlet confirmed AMD blocked DLSS (they didn't), as did every reasonable person. By extension you are dismissing people who disagree as unreasonable. Your logic doesn't leave any room for disagreement or room to change your mind.
wolf: And no matter what the full truth is, the fallout was actually great for everyone, since the Starfield fiasco there's been a perceptible shift and more titles are getting all 3 major competing upscalers and nobody is intentionally blocking anything, everybody wins.
You are completely assuming said titles wouldn't have had them regardless. This is classic confirmation bias, you are seeing this in a way that confirms your self-admitted non-flexible version of events.
Posted on Reply
#147
Dr. Dro
evernessince: Frank Azor from AMD
I wonder if Frank ever paid that guy his $10... it's been years, surely the interest added up
Posted on Reply
#148
wolf
Better Than Native
evernessince: Have you not made the argument that AMD's marketing department is incompetent in the past? I don't think that would be a controversial statement. I think there's sufficient evidence to apply Hanlon's razor in this instance. It seems contradictory for you to say in past comments that AMD's marketing department is bad (and I would agree) and suddenly not consider that in this scenario.
I have and they absolutely are, but both can also be true. And I did consider it; it's one of my two options, just the one I consider less likely, or perhaps also true. Razors also aren't some mic-drop, argument-winning statement; they often have merit, but I'd even wager that, should the events I believe have occurred, AMD would be hoping people just think of this razor.
evernessince: yet I don't see people complaining about those.
People can and do complain about what Nvidia does all the damn time, as do I when I feel what they do is shitty, and it was absolutely a rebuttal used in this particular fiasco too. Saying but what about Nvidia isn't an effective rebuttal to this assertion.
evernessince: Tech outlets didn't come to that conclusion, nor should they have, given the lack of concrete evidence.
Tech outlets didn't say "100% undeniably AMD did this" which is also not what I am saying, I am saying I believe AMD did do this, as it's likely, which is what they said too.
evernessince: Yes, clearly I'm unlikely to change your mind because your own idea of what happened is mismatched with what actually happened. In your mind, every outlet confirmed AMD blocked DLSS (they didn't), as did every reasonable person. By extension you are dismissing people who disagree as unreasonable. Your logic doesn't leave any room for disagreement or room to change your mind.
Right back at you; you don't know conclusively what happened either, yet you are stating it as if it's fact. I don't consider it to be fact, I just believe it is what occurred on a balance of everything I saw, heard, and read. I have absolutely left you room to change my mind, but you've offered nothing I didn't already know. I welcome you to present either a new viewpoint or evidence that might change my mind.
evernessince said:
You are completely assuming said titles wouldn't have had them regardless. This is classic confirmation bias; you are seeing this in a way that confirms your self-admitted inflexible version of events.
That's also possible, but you can't assume with 100% certainty that it would have happened regardless of the fiasco either.

Why do you insist I am inflexible on the version of events? I told you why I believe what I believe, and I readily concede it isn't definitive proof; it just seems to me to be the most likely thing that happened when I consider all the evidence I have consumed. You came to a different conclusion, but that doesn't make your version fact, and if anything, you are coming across as completely inflexible on the topic. I would readily entertain irrefutable proof that this didn't happen. Hell, I'd have even taken AMD's word for it at the time, but they didn't even want to offer that.

So now that we've opened this can of worms, do we keep repeating ourselves, or is there something new to add that might sway you or me? If there is nothing new, then I think we have both said our piece on why we believe what we believe occurred, and in the absence of something revelatory, this conversation is going nowhere.
#149
tussinman
Hopefully they actually make these aggressively priced.

All I heard was "RDNA 3 is cheaper to produce," and yet the 7600/7700 series was nothing special in either performance or price. Even the 7800 XT (power consumption aside) was the same performance as the 6800 XT.
#150
Punkenjoy
There is a big pile of bullshit in that thread, that's for sure.

But since the subject is AMD/Nvidia, that is not a surprise (and fanboys on both sides are guilty).


But anyway, AMD is somewhat lying, or at least masking the truth:

- The current gen is not competitive with Nvidia at the spec level.
- They declared in other news today that they want to unify RDNA/CDNA because they have too many issues designing all these chips. They also changed their whole architecture way too much: on the Zen side, they redesign the I/O die only every two generations, but since RDNA, the whole GPU has been redesigned each generation, and they had to do something similar for CDNA. This is way too much for them and not efficient at all.

- I assume that RDNA 4 is below initial expectations (from when they did the first design). They aren't expecting to be able to compete on the high end with it no matter what they do; it would be a loss.
- They probably have low hopes for RDNA 5 too, but they need to focus quickly and get back on track to regain market share and be competitive for the next generation of consoles.
- They then made the decision to scrap what would be unprofitable anyway, the high end, and focus on the mid-range, where they can still sell some GPUs.

This just seems like a strategy to survive for a few years while they get their shit together.

That is fine; they were not on a path to winning with Radeon anyway. They do not seem to be divesting from the adventure, they just seem to be focusing. Let's hope they get back on their feet quickly. Maybe in a few years they can have a real "Ryzen moment." I do not think they currently know what needs to be done to regain market leadership.


Nvidia having no competition is great for them in the short term. But tech companies with near-monopolistic market share and little competition have always failed in the long run. They generally transform after a few years into market-milking machines instead of innovation machines, and they get caught flat-footed when competition comes back.

It has happened to Intel, IBM, etc.

Personally, I don't care much if the market slows down. I would sure like to be able to upgrade to a newer high-end GPU cheaply and frequently, but at the same time, I recall how long I kept my i7-2600K, and I wouldn't mind keeping a GPU capable of playing games well for that long.

There is a positive in everything.