
Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

2015: 18%
2019: 18.8%
2020H2: 18%
2022: 10%
2023Q4: 19%
As @Assimilator said, those were always only one or two quarters in a row (Q2-Q3 2015, Q4 2018 rather than 2019, Q4 2020), and then since Q3 2022 it has been the last seven quarters in a row.

What is it with that random line that isn't even a real line? Did you just fail at drawing a straight line from the first to the last shown quarter, or did you connect random quarters on purpose?
 
Oh, that was just my free-hand 'straight line', don't think too much of it. The trend is clear anyway, isn't it? And yes, obviously if the trend is down, at some point you're gonna sink below your occasional bottom percentages. The point was, no matter what AMD has done over all those years (rebrands, revivals, etc.), they never consistently clawed back share.

Is AMD a story of lost and wasted potential? Damn sure, I don't disagree on that. But it is what it is, and I do believe they have a strategy now that's working for them, and there's little reason for them to completely lose the PC market; even a small presence won't kill their margins on GPU, it's just extra.
 
Hope they don't expect people to buy 8GB & 12GB cards this time around. They should really not release any 128-bit or even 192-bit cards.
 
The trend is clear anyway, isn't it? And yes, obviously if the trend is down, at some point you're gonna sink below your occasional bottom percentages.
No, it is not, and that's my point. Up until Q1 2020 there is a lot of up and down, and while the second quarter in the graphic, Q2 2014, was their highest, Q1 2020 was pretty high too. But even up to Q1 2022 it was short but heavy downs and long but slow ups.

The only trend I see is that they never get above 40% market share and that they have been below 20% for the last seven quarters - but even here, with upswings of 5 percentage points.

Is AMD a story of lost and wasted potential? Damn sure, I don't disagree on that. But it is what it is, and I do believe they have a strategy now that's working for them, and there's little reason for them to completely lose the PC market; even a small presence won't kill their margins on GPU, it's just extra.
You can talk someone to death. They have certainly done better before, but they have always come up with innovations to carve out their market share. RDNA3 hasn't been as successful as hoped, but RDNA2 showed again that innovation can beat long-term market domination to a certain point.

Since I remember the release of the Radeon 9700 Pro and HD 5870, which had NV overwhelmingly beaten for months, and also the HD 48x0 and the whole GCN 1-3 series, which didn't beat NV in every aspect but were so competitive that prices went down to an all-time low, I am certain AMD Radeon will rise up again.
 
Oh, that was just my free-hand 'straight line', don't think too much of it. The trend is clear anyway, isn't it? And yes, obviously if the trend is down, at some point you're gonna sink below your occasional bottom percentages. The point was, no matter what AMD has done over all those years (rebrands, revivals, etc.), they never consistently clawed back share.

Is AMD a story of lost and wasted potential? Damn sure, I don't disagree on that. But it is what it is, and I do believe they have a strategy now that's working for them, and there's little reason for them to completely lose the PC market; even a small presence won't kill their margins on GPU, it's just extra.

Is anybody actually buying those high margin 7900 cards? AMD’s current strategy seems to be competing with Intel. That’s not a good sign.
 
No, it is not, and that's my point. Up until Q1 2020 there is a lot of up and down, and while the second quarter in the graphic, Q2 2014, was their highest, Q1 2020 was pretty high too. But even up to Q1 2022 it was short but heavy downs and long but slow ups.

The only trend I see is that they never get above 40% market share and that they have been below 20% for the last seven quarters - but even here, with upswings of 5 percentage points.
Yeah, you could also read that out of it. But another trend along the whole graph is that their peaks consistently lowered over time, and they bottom out in an ever shorter cadence, until they arrive at consistently below 18% share. They had some not-too-shitty years, but nowhere is there consistent growth YoY.

If anything, those ups show that there is potential to regain market share. But selling a quarter or a few quarters 'better' than your usual trend of 'down YoY' isn't a positive. It just means you reacted properly to the market with either pricing or timing. And in AMD's case, it's always pricing. Pricing under the competition.
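For what it's worth, here's a rough, purely illustrative Python sketch of why the slope you see depends on which points you connect. It only uses the handful of low-point figures quoted at the top of the page (with the years encoded approximately as decimals), not the full quarterly series, so treat it as a back-of-the-envelope toy, nothing more:

# Illustrative only: the AMD share low points quoted earlier in the thread,
# with quarters/halves approximated as decimal years.
points = [(2015.0, 18.0), (2019.0, 18.8), (2020.5, 18.0), (2022.0, 10.0), (2023.75, 19.0)]

# "Free-hand" line: connect only the first and last points.
(x0, y0), (xn, yn) = points[0], points[-1]
freehand_slope = (yn - y0) / (xn - x0)

# Least-squares slope over all the points (ordinary linear regression).
n = len(points)
mean_x = sum(x for x, _ in points) / n
mean_y = sum(y for _, y in points) / n
lsq_slope = sum((x - mean_x) * (y - mean_y) for x, y in points) / sum((x - mean_x) ** 2 for x, _ in points)

print(f"first-to-last slope: {freehand_slope:+.2f} points/year")
print(f"least-squares slope: {lsq_slope:+.2f} points/year")

Point being: connect the first and last of those quoted points and the "trend" looks roughly flat; fit all of them and it tilts down. Which quarters you pick decides which story the line tells, which is pretty much the whole argument here.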
 
Hope they don't expect people to buy 8GB & 12GB cards this time around. They should really not release any 128-bit or even 192-bit cards.
People will buy what is offered. Thinking otherwise is applying the DIY tech enthusiast mindset to general consumers. And the general consumer doesn't really choose what he is buying, he is TOLD - by marketing, influencers, friends; often the general mind share of a company is enough. He has no fucking idea what a memory bus is. But he knows that NVIDIA = PC gaming. And, as such, even cards that are, from our perspective, the most crippled will absolutely sell.
 
Is anybody actually buying those high margin 7900 cards? AMD’s current strategy seems to be competing with Intel. That’s not a good sign.
I did, if that counts for anything, haha. No regrets really, it's working fine. And that's all it does, too.

But again, strategy is more than just the tiny slice that is consumer PC dGPUs. GPU is bigger than that. For AMD, it's also APUs, its console business, its laptop products, etc.
And I do think Intel is their more direct competitor, especially now that they're also doing GPUs; AMD has to stay ahead of them much more so than ahead of Nvidia, who's limited in the x86 space.
 
Oh, that was just my free-hand 'straight line', don't think too much of it. The trend is clear anyway, isn't it? And yes, obviously if the trend is down, at some point you're gonna sink below your occasional bottom percentages. The point was, no matter what AMD has done over all those years (rebrands, revivals, etc.), they never consistently clawed back share.
Yes, but there is a time at which accountants and investors start to get twitchy and ask "is this particular line of business still worth investing in, or could and should we be redirecting these resources to more profitable segments of the business?" And human nature being what it is, that time is likely going to align with a drop from double- to single-digit market share. According to the trend we're observing, that drop is likely to happen within the next year. I can very conceivably see AMD deciding to discontinue desktop GPUs if the question comes up at that point, and focusing the resources currently being spent there on consoles and CPUs.

I don't want to see AMD exit the desktop GPU market, no matter how many people call me a fanboy or how many times, because I'm scared of what that would mean for consumers. I'm just presenting data and saying: this looks really bad, and AMD needs to change something to make it not bad - and endless forum posts accusing NVIDIA of being greedy, or anticompetitive, or whatever, are not it. Nor is buying AMD products out of a misplaced sense of brand loyalty.
 
Absolutely agreed but endless forum posts about AMD needing to do better haven't worked for them either :)
It is what it is, sometimes things need to turn to absolute shit before they get better.
 
It will be a case of "the more you buy, the more you save", with GB203 priced to make GB202 more attractive.
Leatherjacket isn't going to live that quote down, is he? :)

It's like RT "just works" (except when it doesn't)

But again, strategy is more than just the tiny slice that is consumer PC dGPUs. GPU is bigger than that. For AMD, it's also APUs, its console business, its laptop products, etc.
And I do think Intel is their more direct competitor, especially now that they're also doing GPUs; AMD has to stay ahead of them much more so than ahead of Nvidia, who's limited in the x86 space.
Yeah, Nvidia's bread and butter is currently datacentre compute.

AMD have made fantastic inroads into the server market with EPYC, but their enterprise/datacentre GPU solutions lack the CUDA support they need to gain any significant traction. I think Nvidia knew this, and they've been playing the long game with the better part of two decades of investment in CUDA's software/API/ecosystem monopolisation.
 
Faster than a 4080? A 5090 needs to be "price increase percentage" faster than a 4090. A 7900 XTX is faster (TPU reviews, CP2077) than a 4080…
 
But another trend along the whole graph is that their peaks consistently lowered over time, and they bottom out in an ever shorter cadence, until they arrive at consistently below 18% share. They had some not-too-shitty years, but nowhere is there consistent growth YoY.
You are a genius. The same is true for NV, since it's a duopoly and NV's graph exactly mirrors AMD's. Only, while AMD's never goes above 40%, NV's never goes below 60%.
For AMD, it's also APUs, its console business, its laptop products, etc.
That's hopefully one big reason to continue development in GPU.

I can very conceivably see AMD deciding to discontinue desktop GPUs if the question comes up at that point, and focusing the resources currently being spent there on consoles and CPUs.
That's what I fear might happen, too, but see above.
But perhaps we are judging AMD's chances too much from the high-end point of view. After all, their market share was quite strong in the times of Polaris & Vega vs Pascal.
 
Leatherjacket isn't going to live that quote down, is he? :)

It's like RT "just works" (except when it doesn't)


Yeah, Nvidia's bread and butter is currently datacentre compute.

AMD have made fantastic inroads into the server market with EPYC, but their enterprise/datacentre GPU solutions lack the CUDA support they need to gain any significant traction. I think Nvidia knew this, and they've been playing the long game with the better part of two decades of investment in CUDA's software/API/ecosystem monopolisation.
I think people forget it's a big, big pie here, and AMD's gotten itself quite a few slices of it at this point.

We're actually talking about them competing with Intel. And let's be real here: their CPU product is now better, their GPU product is light years ahead of them too. Hasn't Nvidia ALWAYS been something they couldn't quite catch? I'm not seeing a difference here in the overall perspective. Every time, even on the rare occasions when Nvidia did objectively worse, Nvidia won.

You are a genius
Thanks! Luckily I can't detect the sarcasm here :)
 
Absolutely agreed but endless forum posts about AMD needing to do better haven't worked for them either :)
It is what it is, sometimes things need to turn to absolute shit before they get better.
I still remember people unironically saying that AMD should sell the Radeon division to Samsung when that particular rumor was floating around, back when the two companies announced that AMD's graphics IP would be coming to Exynos SoCs.

I think Nvidia knew this, and they've been playing the long game with the better part of two decades of investment in CUDA's software/API/ecosystem monopolisation.
Yes, it's called being a forward-looking tech company with a solid business plan that is focused on a holistic product - hardware, software, and all the surrounding ecosystem.

Alternatively, it can be called an EVUL move from an EVUL anti-competitive company that is led by a Dark Lord in a Leather Jacket of EVULNESS +7 who is bent on not allowing gamers to play the latest AAA slop at highest detail out of sheer spite for poor AMD and their paladins of virtue representing them on the forums.

You know, either/or.
 
I think people forget it's a big, big pie here, and AMD's gotten itself quite a few slices of it at this point.

We're actually talking about them competing with Intel. And let's be real here: their CPU product is now better, their GPU product is light years ahead of them too.

I don't think dGPU market share is a priority for AMD at this point. And you can't really say they're light years ahead of Intel when Intel already beats them in RT. I mean, what else can AMD really lean on in terms of software over Intel? At the rate Intel's GPUs are improving, AMD won't be ahead for long, assuming Intel is actually serious about this market, of course.
 
The Ngreedia fanbois are really insane.

They make it sound like every AMD GPU is at best half as fast as its Ngreedia counterpart.

Yes, bring up the RT nonsense; as stated, only influencers aka reviewers care about that.

Only 2 games (so far) are a decent sample of RT (Cyberpunk and Control), but it doesn't add anything to gameplay, nor does it justify the insane performance hit.
 
I don't think dGPU market share is a priority for AMD at this point. And you can't really say they're light years ahead of Intel when Intel already beats them in RT. I mean, what else can AMD really lean on in terms of software over Intel? At the rate Intel's GPUs are improving, AMD won't be ahead for long, assuming Intel is actually serious about this market, of course.
Where is Intel faster in RT? Relatively, perhaps... but they have yet to make a truly fast GPU. Scaling up is exactly their biggest challenge. That's why their initial product failed so hard and for so long: they couldn't scale properly, basing themselves on iGPU technology, and then even when they rewrote the blueprint, all we got was the A770.

Absolute, raw performance is the only real indicator. Because even RT performance will be based on that; you can accelerate all you want, but you can't accelerate past your raster/raw perf capability.

Also, alongside scaling their GPU up, there is the matter of die space/cost. How big is their die? How big is AMD's? And that's where chiplets come in. If AMD can improve that further, they have a technological advantage here, and the first release with RDNA3 isn't horrible. It's not perfect. But it still moved the ball forward.
 
You'd be surprised how often I've heard "Aaaaand AMD display driver just crashed" from my buddy rocking a 6600 XT on a new AM5 system while playing the same game online.
And if your buddy were the only Radeon customer in existence, I would declare AMD drivers 100% defunct. But since AMD has sold millions and millions of Radeons since their inception, I'm not going to worry about your buddy's problems too much.
 
Where is Intel faster in RT? Relatively, perhaps...
Relatively, they're even faster than Nvidia (I just looked at one test: in raster, the 4060 is 11% faster than the A770 at 1440p, with RT it's only 1%). But yeah, scaling is their problem and the reason I wasn't invested in their first GPU generation at all.
Scaling was AMD's problem more than once, too.
 
When should we expect "the future" to arrive? It's been over half a decade since the RTX line launched and RT performance still stinks.
Not only is performance lackluster, but meaningful RT implementation is limited to a handful of singleplayer titles.

Looks like Star Wars Outlaws will have some sort of RT implementation, and I look forward to seeing how that is.
 
Relatively, they're even faster than Nvidia (I just looked at one test: in raster, the 4060 is 11% faster than the A770 at 1440p, with RT it's only 1%). But yeah, scaling is their problem and the reason I wasn't invested in their first GPU generation at all.
Scaling was AMD's problem more than once, too.
I am still convinced that, besides this, RT is an overblown thing and that the end result of that technology will show up within engine technologies, not as poster-child 'muh RTX is ON' bullshit. We're still early in adopting this tech, and it could go any number of ways; the only really consistent path I see right now is stuff like Nanite, or the implementation that CryEngine shows us in Neon Noir. Visually remarkably close to 'real RT' (ahem... with denoising and numerous other tweaks, so define real...), but without a hard performance hit; I can even run that on Pascal.

Especially if you want cross-platform compatibility, which is rapidly becoming a must for anything that also releases outside the PC camp (but even within it, think handhelds!), please do explain to me how you're going to guzzle extra power to enable RT proper if you haven't even got the power to show raster graphics at their best.

It ain't happening. Literally every market movement, except the one Nvidia tries to convince us of, is heading in the opposite direction. RT will only work if you get your game from the cloud. So how exactly does that mix with Nvidia selling you $1,500 GPUs, I wonder? Where is this long-term RT perspective on dGPU?
 
The Ngreedia fanbois are really insane.

They make it sound like every AMD GPU is at best half as fast as its Ngreedia counterpart.

Yes, bring up the RT nonsense; as stated, only influencers aka reviewers care about that.

Only 2 games (so far) are a decent sample of RT (Cyberpunk and Control), but it doesn't add anything to gameplay, nor does it justify the insane performance hit.
Clearly the only DIY enthusiasts who are able to think for themselves are the ones still buying AMD cards; everyone else has been programmed. Are nVidia's influencers using MKUltra tech in their videos? Or is it LSD in the water supply that makes games with proper RT implementations look so good?
 
Sucks to be you, but Path Tracing is the future of videogame lighting; even AMD will have to optimize for it.
And they are, but the future is not the present. Tell me which GPU can properly run path tracing in modern games without any trickery, and tell me how many of those modern games have path tracing to begin with.
Besides, not only are we a ways off, I'd argue that games still look plenty good without path tracing. I can wait.
 