
Nvidia's GPU market share hits 90% in Q4 2024 (gets closer to a full monopoly)

no, they aren't. maybe among the vocal minority who will always defend their brand, but I think the criticism is quite universal. if you're an amd fan looking for internet squabbles with an nvidia one, you will always find them no matter what.
There's a certain saying for this but I can't think of it right now... Well said, though! Would like to say the same thing happened back when Intel was the top dog with CPUs.
 
I think the criticism is quite universal
Criticism doesn't mean anything if there are no consequences. Nvidia has such a long history of trying to pull these sorts of things that it's hard to believe anything serious is going to happen. Even the 970 lawsuit was kind of a joke: it only applied to the US and the compensation was on a per-request basis, and this looks way worse than that.
 
no, they aren't. maybe among the vocal minority who will always defend their brand, but I think the criticism is quite universal.
I agree with Vya Domus: the criticism doesn't mean anything if Nvidia isn't punished for their actions.
Nvidia has gotten away with so much that I don't expect a class action over the missing ROPs. Since the melting 12VHPWR connectors got blamed on the user without any investigation from consumer safety groups, Nvidia will probably get a pass on selling defective cards.
 
can't judge this yet, the sales at the end of the year will show if 9070xt has indeed sold more units than 5080/5070ti, but in order to do so, amd needs to start selling the card, unless you want people to buy 7900xtx's that will cost more than 9070xt, perform worse with rt than 9070xt, and lack fsr4. deal of a freaking lifetime. you are blaming people for buying nvidia instead of amd, when amd hasn't shown their card yet. seems like you both would like nvidia to fail more than you'd like amd to do well. no wonder you're finding conflict everywhere.
 
can't judge this yet, the sales at the end of the year will show if 9070xt has indeed sold more units than 5080/5070ti, but in order to do so, amd needs to start selling the card.

Yeah, hopefully with the non-stop shit show that is Nvidia currently, AMD can get back up into the 20-30% area, which would set them up for a good run with UDNA.
 
Yeah, hopefully with the non-stop shit show that is Nvidia currently, AMD can get back up into the 20-30% area, which would set them up for a good run with UDNA.
from the leaks 9070xt looks great, we'll see about availability though. given how poor the blackwell launch looks, the most important thing is getting their cards out, which they aren't.
 
from the leaks 9070xt looks great, we'll see about availability though.

The bar hasn't been this low in over a decade, maybe even since 2008, so if they can't clear it there probably is no hope of them ever actually competing again. Honestly, this is in my opinion the worst Nvidia generation since at least the 400 series, so unless we get a 500-series-like refresh in 6 months, there is no reason AMD can't capitalize, assuming they don't have their head so far up their @$$ that they can't see how good of an opportunity they have in front of them.
 
from the leaks 9070xt looks great, we'll see about availability though. given how poor the blackwell launch looks, the most important thing is getting their cards out, which they aren't.
I doubt it will have good availability. RDNA4 is going to be competing for fab allocation with other higher-margin, higher-volume products.
there is no reason AMD can't capitalize, assuming they don't have their head so far up their @$$ that they can't see how good of an opportunity they have in front of them
I do agree with the overall idea for the long-run reputation, but I believe going all-in on this opportunity is not really the most profitable option in the short term, nor does it really set up much for UDNA.
 
I do agree with the overall idea for the long-run reputation, but I believe going all-in on this opportunity is not really the most profitable option in the short term, nor does it really set up much for UDNA.

Yes, and in all honesty we don't even know what AMD can price these at and still be profitable, but trying to start at 60-70% margins just to drop to near-zero margins after a couple of months isn't the play. I'm hoping they are aggressively smart and don't just go for short-term gains at the expense of long-term market share.

We as consumers need them to get to at least a 30/70 split, with as close to 50/50 as possible, although 40% is probably their ceiling. Even 25-30% plus PlayStation is respectable, honestly.

They've dropped from 35-40% market share over the last decade to 10% because of their own choices. Hopefully they don't make the same mistakes again while Nvidia is fumbling the 5000-series launch harder than any previous one I can remember, outside of the 400 series.
 
You didn’t address that your own source says it’s a reference card. I wonder why?

Anyway, buh-bye.
It says it's a reference PCB. Your source for the Dell card doesn't even state that. Not a reference CARD, which is the Founders Edition.

I'm sorry, but I gotta call you out on this. Path tracing and high-resolution textures on such a lightweight system... just aren't viable, even if it had 3 times as much VRAM. An R5 9600X + 4070S system is pretty much the definition of midrange. It's just good enough to give you a taste of next-gen, while being utterly incapable of keeping up with workloads such as that game.
That is exactly why no one should buy a 12 GB midrange card just because it's "better at ray tracing". Textures are super light on the core for a huge improvement in visual fidelity, as long as VRAM is available. Lack of VRAM can show up as awful 1% lows and/or textures failing to display, which is very hard to notice when the reviewer is just benchmarking to get the numbers out.

RDNA2 was very good at launch around 2020/21 IMO, but it's aging really, really poorly due to underwhelming RT and the lack of AI hardware to support FSR4. VRAM won't save a card that is technologically behind.
Oh god,

The whole competing 30xx lineup other than the 3090 is not able to run today's games with the tech you are talking about because they lack VRAM. That has aged poorly, not RDNA2.

Shame that W1zz hasn't included RDNA 2 in the RTX 50 tests; hope he does that for the 9070s. Guru3D has data on most of them, but it's probably not very up to date. Some of Hilbert's charts over there still have Kepler cards in them, and I can't see that data being too new unless he's been retesting hardware that far back... His reviews aren't putting them in a great light, in some cases, like his Witcher 3 tests, placing stuff like the 6800 XT below the 3070 Ti even at 4K, where the 8 GB is not-so-great.

You realize how old Witcher 3 is, and how little use charts are when there's no indication of whether the reviewer actually looked at the textures to spot if they are missing? That is a gigantic concern when talking about low-VRAM cards, as they could even get a slight performance boost because they don't have to render them. The 2080 Ti was also going to cost about the same used as the 6800 new, minus taxes; they were basically the same tier for many customers. They are about the same price on the used market right now, but the RDNA card has 16 GB of VRAM.

How would you know? Are you sitting on Discord listening to your mic feedback?

There are many comparison videos, just Google it on YouTube - Krisp vs Broadcast.

Wasn't the Nvidia Broadcast feature required? Now you wouldn't even know if you weren't using it. The exact opposite of required :D

can't judge this yet, the sales at the end of the year will show if 9070xt has indeed sold more units than 5080/5070ti, but in order to do so, amd needs to start selling the card, unless you want people to buy 7900xtx's that will cost more than 9070xt, perform worse with rt than 9070xt, and lack fsr4.
AMD has stated they will provide FSR4 on RDNA3, just as they eventually brought Ryzen 5800X3D support to B350 boards. You are making things up.
 
no, they aren't. maybe among the vocal minority who will always defend their brand, but I think the criticism is quite universal. if you're an amd fan looking for internet squabbles with an nvidia one, you will always find them no matter what.

We vote with our wallets, though.
(Says someone who owns a 4090. I'm a hypocrite.)
 
We vote with our wallets, though.
(Says someone who owns a 4090. I'm a hypocrite.)

Personally, I do vote with my wallet. Although this botched launch bodes rather ill for Nvidia in the future. I'll be following RDNA 4 closely...
 
Personally, I do vote with my wallet. Although this botched launch bodes rather ill for Nvidia in the future. I'll be following RDNA 4 closely...

I wish AMD hadn't just given up on the high-end market. I also hope my 4090 doesn't go simmering anytime soon cause I don't want to be paying for any more leather jackets.
 
Nvidia is the greedy company pushing the price envelope further up; they are the ones making over 60% margins on a card.

What are they supposed to be RMA'd with? The issue is there is very little supply in the first place; there are hardly any cards around, yet so many issues with the few cards sold.
With NV, a sucker is born every moment.
 
Stop the bickering and stay on topic.

Please and thanks.
 
Stop the bickering and stay on topic.

Please and thanks.
No. We bicker over who's bickering at the bickerer. /s


Seriously - AMD is dropping market share not because of RT, but because of pricing and perceived "future proofing". I wish they would invest the money they are pouring into their AI datacenter cards into the consumer side, which they are now doing. They made a mistake going to two different architectures for data center and consumer, so hopefully the next generation will fix that and bring them back into the competition, since they are unifying the architectures again.

It goes to show the longevity: the compute cores in the current Instinct cards are basically the exact same ones used in GCN5. lol
 
No. We bicker over who's bickering at the bickerer. /s


Seriously - AMD is dropping market share not because of RT, but because of pricing and perceived "future proofing". I wish they would invest the money they are pouring into their AI datacenter cards into the consumer side, which they are now doing. They made a mistake going to two different architectures for data center and consumer, so hopefully the next generation will fix that and bring them back into the competition, since they are unifying the architectures again.

It goes to show the longevity: the compute cores in the current Instinct cards are basically the exact same ones used in GCN5. lol
Ok, let's just leave it at this: it is called UDNA, and it is being taped out.
 
Ok, let's just leave it at this: it is called UDNA, and it is being taped out.
I know, lol. Looking forward to seeing the fruits of that tape-out. I just hope they go full bore on it and give it the bleeding-edge peripherals to go with it, HBM3 for example.
 
I don't think HBM is ever returning to consumer cards, until they find a way to bin stacks before assembling the processor.

This was always the biggest issue with it since Fiji: if any of the memory didn't work or the interposer had any defects, you had to trash the whole thing, even if the core was good. Once bonded, there is no removing it.
 
Yeah, hopefully with the non-stop shit show that is Nvidia currently, AMD can get back up into the 20-30% area, which would set them up for a good run with UDNA.

Yup, it's not even fun to discuss the GPU war anymore when Radeon market share is so close to non-existence that Nvidia is trying hard to suppress their own market share so as not to become a monopoly :kookoo:
 
The whole competing 30xx lineup other than the 3090 is not able to run today's games with the tech you are talking about because they lack VRAM. That has aged poorly, not RDNA2.
Again, if you can't do the research yourself, at least read the posts that respond to your made-up claims directly. Just because the 3070/4060 Ti struggle (which is true and documented) doesn't mean all Ampere cards do, like you keep insisting. The 3080 10G is fine. The 3080 10G does better with RT than the 7900 GRE, which has 16 GB, let alone its RDNA2 counterparts. If 62 fps avg. is poor, what is 48 then?
[Attachment: Screenshot 2025-02-25 080424.png]


AMD has stated they will provide FSR4 on RDNA3, just as they eventually brought Ryzen 5800X3D support to B350 boards. You are making things up.
they said it's possible but they don't have time to work on it - sounds like planned architectural obsolescence.

You realize how old Witcher 3 is, and how little use charts are when there's no indication of whether the reviewer actually looked at the textures to spot if they are missing? That is a gigantic concern when talking about low-VRAM cards, as they could even get a slight performance boost because they don't have to render them.
it'd be great if you could actually prove that 10/11/12GB cards have that problem with anything other than this old hub video, which doesn't address 10/12GB cards at all.
 
How could Nvidia still be on the rise even at those prices, with missing ROPs and design flaws?
 
How could Nvidia still be on the rise even at those prices, with missing ROPs and design flaws?
Missing ROPs are covered under warranty. It won't stop anyone from buying Nvidia, 'cause that's what the warranty is there for. What other design flaws are you referring to?
 
Missing ROPs are covered under warranty. It won't stop anyone from buying Nvidia, 'cause that's what the warranty is there for. What other design flaws are you referring to?

He's probably talking about the connector, which, while technically not made by Nvidia, is stupidly implemented at a minimum.
 
He's probably talking about the connector, which, while technically not made by Nvidia, is stupidly implemented at a minimum.
I love the connector though. If the Nitro 9070 XT has a 12VHPWR connector, I'm inclined to support the card (you know, by actually buying it, not with forum posts :p )
 