
Radeon extremely low market share, only 8% in Q3 2022?

What do you want AMD to improve so that you would buy a Radeon?

  • Add maximum mining performance: 0 votes (0.0%)

  • Total voters: 157
I would say this article is sensationalist, but that's not uncommon on these "hardware" sites.
The way we should look at market share numbers is not quarter by quarter but with a moving average, i.e. by averaging some number of quarterly results.

3dcenter.org has this chart in their article: https://www.3dcenter.org/dateien/abbildungen/GPU-Add-in-Board-Market-Share-2002-to-Q3-2022.png

For example, if you want to know the market share of the previous generation of cards (Ampere vs RDNA 2), then average the data since the Ampere launch and the result is 81.33% vs 18.66% (see the quick sketch below). And this is normal; AMD's market share has been around 20% for a long time. Not really healthy from the consumer's perspective, because NV is controlling the market, but that's the way it is.
This generation (Ampere vs RDNA 2) was basically about crypto mining, so IMO this timeframe is not a good indication of AMD's "mindshare" (market share), because it wouldn't have mattered if RDNA 2 cards had had better perf/$ _in gaming_ when everything was sold out all the time.
What this timeframe shows is that AMD has much lower capacity than Nvidia when it comes to supplying GPUs.
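To make that concrete, here's a minimal sketch of averaging the quarterly add-in-board shares over a whole generation. The quarterly numbers below are made-up placeholders, not the actual JPR/3DCenter figures:

```python
# Minimal sketch: average quarterly AIB market share over a generation.
# The numbers below are hypothetical placeholders, not real survey data.
quarters = {
    "Q4 2020": (82.0, 18.0),  # (NVIDIA %, AMD %)
    "Q1 2021": (81.0, 19.0),
    "Q2 2021": (83.0, 17.0),
    "Q3 2021": (83.0, 17.0),
    "Q4 2021": (81.0, 19.0),
    "Q1 2022": (78.0, 22.0),
    "Q2 2022": (80.0, 20.0),
    "Q3 2022": (86.0, 14.0),
}

nv_avg = sum(nv for nv, _ in quarters.values()) / len(quarters)
amd_avg = sum(amd for _, amd in quarters.values()) / len(quarters)
print(f"Generation average: NVIDIA {nv_avg:.2f}% vs AMD {amd_avg:.2f}%")
```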

Nevertheless, AMD has a lot of work to do if they want a higher share: marketing, drivers, presentation. But I'm not sure if they care at all. The GPU business has low margins; they're much more likely to use the available production capacity at TSMC for Epyc CPUs. I think they view their GPU lineup as a proving ground for their semi-custom partners (Sony, Microsoft, others).

I have to add that NV is a master at manipulating people. I mean, they introduced a feature (ray tracing) which is basically a gimmick (barely anybody turns it on even on NV cards, it drastically lowers performance, and the difference is barely noticeable), but now, two generations after its introduction, it's hard to sell AMD cards because they're slower than slow in RT. And they haven't even improved RT performance since the Turing generation; what we see in games is still just improved raster performance, while turning on RT carries the same relative performance penalty on Turing, Ampere and Lovelace. Not sure if this is intentional by NV or whether their architecture has bottlenecks somewhere. So yeah.
Nvidia drivers are not flawless.
 
NVIDIA doesn't have lower image quality than Radeon; this has been debunked by just about every publication ever...

Radeon image quality is better even if it's hard to prove with regular static comparison images.
It's just there and very noticeable during gameplay.

That's also what people have been saying for years:

[attached screenshots of YouTube comments]


 
I don't need convincing to switch to AMD. I already have.
 
Radeon image quality is better even if it's hard to prove with regular static comparison images.
It's just there and very noticeable during gameplay.

That's also what people have been saying for years:

[attachments 271823 and 271824: screenshots of YouTube comments]


Let's just pretend that you're arguing in good faith here (which you're not): your evidence is an anecdote from a couple of "specialists" in YouTube's comment sections?

ATI had been "cheating" with mipmapping on Unreal and Quake before you were born. Different GPUs have different optimal texture formats; this only began changing in the age of the unified-shader GPU, and even then it is not uncommon for GPU drivers from either brand to completely replace shaders on the fly to optimize for their own architecture.

You're just trying to justify your own senseless rhetoric with even more senseless nonsense. Just stop dude.

I'm not replying to you further until you actually bring something to the table - in good faith - and with actual technical data to back it up.
 
Let's just pretend that you're arguing in good faith here (which you're not): your evidence is an anecdote from a couple of "specialists" in YouTube's comment sections?

ATI had been "cheating" with mipmapping on Unreal and Quake before you were born. Different GPUs have different optimal texture formats; this only began changing in the age of the unified-shader GPU, and even then it is not uncommon for GPU drivers from either brand to completely replace shaders on the fly to optimize for their own architecture.

You're just trying to justify your own senseless rhetoric with even more senseless nonsense. Just stop dude.

I'm not replying to you further until you actually bring something to the table - in good faith - and with actual technical data to back it up.
In my opinion, AMD's better image quality is a relic of the past, just like Nvidia's better driver quality is. The argument about Nvidia's image quality stayed with us from the GeForce FX era (YouTube link, 4:48), and the argument about AMD's driver quality remained from the 5700 series. Both arguments are obsolete today.
 
Future game engines will incorporate it directly, and you will have no option to ignore it or deactivate it.
That's not how it works, though.
A tech that isn't available to the lowest common denominator of a target audience will never be made mandatory (unless the dev is an idiot or is targeting a niche, a la VR).
That denominator isn't defined just by AMD, but also by the majority of Nvidia users who don't own a >$500 card, or whatever the minimum for a great RTRT experience is these days.
Do note that "mandatory" is equivalent to "lowest settings," and then compare that to the perf cost of things like bilinear filtering and FXAA.

Engines already "incorporate" ray tracing. See UE4 and Lumberyard. Even first-party engines have jumped in; RE Engine and Frostbite, that I know of. Yet we still don't see games that require rendering RT shaders.

AMD obviously doesn't do drivers for consoles.
Only the one who made the chip can write a driver that hopes to utilise it properly.

Projecting the PC experience onto consoles is probably unwise here. At the very least, the lack of fragmentation on consoles would make QA much easier/tighter.
 
At this rate it will be AMD representing Conslow vs Nvidia representing PC ;)
 
At this rate it will be AMD representing Conslow vs Nvidia representing PC ;)

"at this rate"?!

That's what has been happening for years now, since the PS4 era.
 
I, for one, couldn't care less about ray tracing, but I will say DLSS > FSR at this time.
 
Radeon image quality is better even if it's hard to prove with regular static comparison images.
It's just there and very noticeable during gameplay.

That's also what people have been saying for years:

[attachments 271823 and 271824: screenshots of YouTube comments]

No, it's very provably not. That is some placebo-grade horseshit right there.
 
I, for one, couldn't care less about ray tracing, but I will say DLSS > FSR at this time.
I, for one, couldn't care less about DLSS or FSR. They all worsen the quality of a native resolution image in the hopes of more FPS. No thanks, I'd rather just power through native 1080p than spend money to upgrade my monitor and look at a second-grade, scaled back 1440p or 4K image. I remember when the question used to be "who provides the better graphics" instead of "who provides less shit graphics", and for me, it still is.
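For a rough idea of how much of the image actually gets rendered before upscaling, here's a small sketch. The per-axis scale factors are just the commonly quoted preset values and are an assumption, not official numbers for every DLSS/FSR version:

```python
# Rough sketch: render resolution implied by common upscaler quality presets.
# Scale factors are assumed typical per-axis values, not official constants.
presets = {"Quality": 0.67, "Balanced": 0.58, "Performance": 0.50}
output = (3840, 2160)  # 4K output target

for name, scale in presets.items():
    w, h = int(output[0] * scale), int(output[1] * scale)
    print(f"{name}: renders ~{w}x{h}, upscaled to {output[0]}x{output[1]}")
```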
 
I, for one, couldn't care less about DLSS or FSR. They all worsen the quality of a native resolution image in the hopes of more FPS. No thanks, I'd rather just power through native 1080p than spend money to upgrade my monitor and look at a second-grade, scaled back 1440p or 4K image. I remember when the question used to be "who provides the better graphics" instead of "who provides less shit graphics", and for me, it still is.

I'd rather run native resolution too... but my monitor is 3840x1600 @ 165 Hz, so DLSS is kind of required for that resolution and FPS... unless I can snag a 4090 :)
 
No one would use FSR or DLSS if they didn't need to; that goes without saying.
No one will be turning them on when playing today's games in 10 years.
 
No one would use FSR or DLSS if they didn't need to; that goes without saying.
No one will be turning them on when playing today's games in 10 years.
Technically, you don't need to use them, the same way you don't need to buy a 4K monitor, either. I'm still okay with 1080p native with no upscaling magic. If someone isn't, that's fine too. All I'm saying is that it's a choice rather than a necessity.
 
The graphics division is reporting losses. It can't forever be subsidized by the other divisions.

It's not severe to the point where the logical conclusion is bankruptcy. The GPU division was propping up the CPU division for the entirety of the Bulldozer days, and they didn't declare bankruptcy then; did you memory-hole that for whatever reason?

Anyway, the push for AI/ML is what has been moving GPUs, and this is where Nvidia has a leg up on AMD; they need to improve there.
 
No one would use FSR or DLSS if they didn't need to; that goes without saying.
No one will be turning them on when playing today's games in 10 years.

Except when today's games get remastered with modern graphics in 10 years, just like The Witcher 3, which came out close to 8 years ago.

Besides, DLAA and DLDSR provide superior IQ to TAA, yet only 4090 owners can comfortably use those features for now ;)
 
Technically, you don't need to use them, the same way you don't need to buy a 4K monitor, either. I'm still okay with 1080p native with no upscaling magic. If someone isn't, that's fine too. All I'm saying is that it's a choice rather than a necessity.

It's a bit more complicated. Imagine all the games you play run fine at 4K, so you buy a 4K monitor. Then comes the new beast of a game, the CP77, badly optimized or just hardware-heavy, and you can no longer get the frame rate you want unless you turn them on. But all the other games still do fine. Going 1080p or 1440p on a 4K monitor is not ideal. It can be a necessity.

Or you need 4K for productivity (it does make a world of difference), and then the same monitor must also game.

That's the beautiful thing about PC's, each case is different.
 
It's a bit more complicated. Imagine all the games you play run fine at 4K, so you buy a 4K monitor.
Except that most of my favourite games would run fine at 4K with my current hardware, but I still think that upgrading to 4K is a huge waste of money. Better to render at 4K and downsample on a 1080p monitor than to render below native and upscale on a 4K one.

As you said, each case is different. :)
 
Except when today's games get remastered with modern graphics in 10 years, just like The Witcher 3, which came out close to 8 years ago.

Besides, DLAA and DLDSR provide superior IQ to TAA, yet only 4090 owners can comfortably use those features for now ;)
DLAA is usable on my 3080 Ti in many games. You absolutely do not need a 4090 to use the full RTX feature stack.

I see that, as usual, people downplay the revolution that is AI in graphics.

 
Besides, DLAA and DLDSR provide superior IQ to TAA, yet only 4090 owners can comfortably use those features for now ;)
Those technologies are totally unnecessary for most people, but since you've got a 4090 to use them, good on you! ;)
 
Those technologies are totally unnecessary for most people, but since you've got a 4090 to use them, good on you! ;)
Replacing MSAA/TAA with superior AA without lowering render resolution is unnecessary?

I guess lots of things are, by definition, "unnecessary", such as luxury foods, computers with more than 640K RAM, or Radeon GPUs with performance in areas other than raster.
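As a side note on the "without lowering render resolution" part: DLDSR actually renders above native and then downsamples. A small sketch, assuming the commonly listed 1.78x/2.25x total-pixel factors:

```python
# Sketch: DLDSR render resolutions implied by its total-pixel factors.
# The 1.78x and 2.25x factors are the commonly listed presets; treat them
# as an assumption rather than a spec for every driver version.
import math

native = (2560, 1440)
for factor in (1.78, 2.25):
    scale = math.sqrt(factor)  # per-axis scale from a total-pixel factor
    w, h = round(native[0] * scale), round(native[1] * scale)
    print(f"{factor}x DLDSR on {native[0]}x{native[1]}: renders ~{w}x{h}, downsampled to native")
```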
 
Replacing MSAA/TAA with superior AA without lowering render resolution is unnecessary?
If your monitor resolution isn't out of proportion to the computing power of your PC, then why would you lower your render resolution in the first place?

And no - it is not necessary. I'm not saying that it isn't nice, but it's not necessary for a solid gameplay experience. It's a technical extra, nothing more.

We live in an age of lots of sudden technological advancements, but we shouldn't forget that they're only devices to make a game a tiny bit better - they are not experience-defining in any way.
 
OMFG.

This thread is a pile of steaming shite. Where's my 'poo' emoji?
 
OMFG.

This thread is a pile of steaming shite. Where's my 'poo' emoji?
If you consider the title and poll, it's kind of a self-fulfilling prophecy, wouldn't you think? :laugh:
 
Everything from reviewers to the companies themselves is stalling the market, especially after the mining craze, and with the launch of Nvidia's 4090 and 4080 and AMD also set to release their 7000 series, people are waiting to see the performance gains.

Even if Nvidia said "we won't make a 4070 or below until 2024" and shot all the OEMs and more in the foot, people would look for better deals or just buy the 4090 or 4080. Even if AMD, god forbid, said the same thing, "we will only launch the 7900 XT and XTX," people would either buy those or buy last gen's high-end GPUs used; that's for the people who want or must have the newest.

I am personally always excited, but I have an RX 6800 XT and I am happy with the card. In the 30 series I tried an RTX 3090 and an RTX 3070, but I was never happy and missed the RX 6800 XT, so now I'm trying to commit to staying on this card for the next couple of years and to focus my money on something else that isn't hardware related. If an HDD or SSD dies it will be replaced, but for other components there are no plans for upgrades on my side.
 