
Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually do anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash: the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
AMD's inability to compete is because no one will buy their chips, even though they are very competitive against Nvidia's offerings. Luckily, you, Assimilator, have just volunteered to buy AMD for your next graphics card to help drive down Nvidia prices. I will join you, and together we will show everyone that the only way to bring about a competitive market is for everyone to stop buying brand and gimmicks and start buying great performance-per-dollar tech regardless of what name is on the box.

5080 Ti/5090 here I come.

3080 Ti has been great, but it's time for an upgrade.

Hoping for around 40% better performance than Ada; more is great, of course.

3080 Ti to 5090/5080 Ti ideally around 2x faster.

Since I framecap to 237 FPS, faster/more efficient cards also mean a lower total wattage, which is always nice, unless new games/software push the GPU that much harder, which I doubt. Ada was a significant efficiency leap and very tempting, but I don't upgrade every gen of GPU.
I'll either be buying a 9950X3D and a Radeon 8900XTX for my next build or skip a generation and get Zen 6 and RDNA5. Since AMD is best for gaming in my opinion and will continue to focus equally on gaming and AI, my dollars will continue to go to them until Nvidia stops wasting resources on RT and AI.
 
Nvidia stops wasting resources on RT and AI.
AMD's already all-in on AI just like everyone else in the market, lol, and we're just one gen away from seeing if they're going to finally improve their RT. What are you going to do if they follow nvidia?
 
Yeah! More stagnation! Let's vote for stagnation!

I don't know what you're looking at, but I'm seeing a slight uptick per tier, with all things increased except the bus width. GDDR7 makes up for part of the deficit, though, so bandwidth won't be worse than Ada relatively; that's good. But capacity, still 12GB in the midrange and 8GB at the bottom end? You're saying this is a good thing now?
No, I'm saying it's good enough without sufficient competition.

Ada is already bandwidth constrained at the lower tiers. Nvidia is trying real hard to keep those tiers to what, 1080p gaming?
Every company wants to segment their products. When there's no competition that becomes a lot easier.

To each their own, but I think in 2025 people would like to move on from 1080p.
I've only played in 1440p since 2019. The 4060 Ti I switched to earlier this year has not given me any problems in this regard despite "only" 8GB VRAM and "only" a 128-bit bus. The only people you regularly see bemoaning NVIDIA GPUs are not the people who own one.

As for AMD's inability to compete... RT is where fools and their money were parted. AMD keeps pace just fine in actual gaming and raster perf and is, and often has been, cheaper. They compete better than they have in the past. Customers just buy Nvidia, and if that makes them feel 'screwed over'... yeah... a token of the snowflake generation, that also doesn't vote and then wonders why the world's going to shit.
It's nothing to do with RT and everything to do with marketing and advertising. When people read the news they see "NVIDIA" due to the AI hype, and that has had a significant and quantifiable impression on ordinary consumers' minds. AMD has completely failed to understand this basic concept; they seem to be operating on the assumption that having a slightly worse product at a slightly lower price point is good enough, and the market has very obviously shown that it absolutely is not. AMD has options to fight back against NVIDIA's mindshare, such as price cuts, but because AMD doesn't understand that they need to do this, they aren't.

Let's make it clear here: AMD is staring down the barrel regarding GPUs. The last 7 quarters are the worst for them since Jon Peddie Research started tracking this metric a decade ago; they had never dropped under 18% until Q3 2022, and with the upcoming Blackwell launch and nothing new from AMD, we can expect NVIDIA to breach 90% of the desktop GPU market. That is annihilation territory for AMD GPUs; that is territory where they consider exiting the desktop consumer market and concentrating on consoles only. That is territory where your company should start pulling out all the stops to recover, yet what is AMD doing in response? Literally nothing.

And it all compounds. If NVIDIA believes they're going to outsell AMD by 9:1, NVIDIA is going to book out 9x as much capacity at TSMC, which gives them a much larger volume discount than AMD will get, which means AMD's GPUs cost more; AIBs will have the same issue with all the other components they use, like memory chips, PCBs, ... Once you start losing economies of scale and the associated discounts, you end up in an even worse position when it comes to manipulating your prices to compete.
 
Let's make it clear here, AMD is staring down the barrel regarding GPUs. The last 7 quarters are the worst for them since Jon Peddie Research started tracking this metric a decade ago, they had never dropped under 18% until Q3 2022, and with the upcoming Blackwell launch and nothing new from AMD we can expect NVIDIA to breach 90% of the desktop GPU market. That is annihilation territory for AMD GPUs, that is territory where they consider exiting the desktop consumer market and concentrate on consoles only. And what is AMD doing in response? Literally nothing.
See, this is conjecture. Who said this? AMD isn't saying this; they're simply continuing development, and they're not trying to keep pace with Nvidia because they know they can't.

Is AMD staring down the barrel? Is this really worse here than the years they were getting by on very low cashflow/margin products, pre-Ryzen? Are we really thinking they will destroy the one division that makes them a unique, synergistic player in the market?

There are a few indicators of markets moving.
- APUs are getting strong enough to run games properly, as gaming requirements are actually plateauing; you said it yourself, that 4060 Ti can even run 1440p. Does the PC market truly need discrete GPUs for a large segment of its gaming soon? A key part of this driver is also the PC handheld market, which AMD has captured admirably and IS devoting resources to.
- Their custom chip business line floats entirely on the presence and continued development of RDNA
- Their console business floats on continued development of RDNA - notably, sub high end, as those are the chips consoles want
- The endgame in PC gaming still floats on console ports before PC-first games at this point, and with more cloud-based play and unification between platforms, that won't lessen; it will get more pronounced.
- AI will always move fastest on GPUs, another huge driver to keep RDNA.

Where is heavy RT in this outlook, I wonder? I'm not seeing it. So Nvidia will command its little mountain of 'RT aficionados' on the PC, a dwindling discrete PC gaming market with a high cost of entry, and I think AMD will be fine selling vastly reduced numbers of GPUs in that discrete PC segment, because it's just easy money alongside their other strategic business lines.

This whole thing isn't new and hasn't changed since, what, the first PS4.

AMD is fine, and I can totally see why they aren't moving. It would only introduce more risk for questionable gains; they can't just conjure up the technology to 'beat Nvidia', can they? Nvidia beats them at better integration of software and hardware.

Still, I see your other points about them and I understand why people are worried. But this isn't new to AMD. It's the story of their life, and they're still here, and their share gained 400% over the last five years.
 
Curious to see how the 5060/Ti, maybe a 5070, will end up.
I have no upgrade plans left for this year, but sometime next year I wouldn't mind upgrading my GPU, and that's the highest I'm willing to go/what my budget allows. 'those will be plenty expensive enough where I live, even second hand..:shadedshu:'
4060Ti 16GB is a 1080p card in 2023. I bought one (needed the VRAM buffer for work) and dumped it into the second PC in the living room with a 4K TV. It can barely handle 1440p without performance nosediving because there's simply not enough bandwidth.

If they're going to keep it on a 128-bit bus, GDDR7 is maybe going to turn it into a 1440p card. At 448GB/s it's still 12% less bandwidth on paper than a vanilla 4070 which is okay at 1440p, but that's with lower-latency GDDR6. I'm not 100% sure you can just compare bandwidth between GDDR6 and GDDR7 because latency will have doubled, clock for clock - which means (only a guess here) that the 5060Ti will have 88% the bandwidth of a 4070 but ~50% higher latency. That's going to make it considerably handicapped compared to a 4070 overall, so I guess the rest of it is down to how well they've mitigated that shortcoming with better cache, more cache, and hopefully some lessons learned from the pointlessness of the 4060Ti.
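The bandwidth comparison above is just bus width times per-pin data rate. A quick sketch of that arithmetic, assuming the leaked 28 Gbps GDDR7 on a 128-bit 5060 Ti (not confirmed) against the 4070's 192-bit, 21 Gbps GDDR6X; note the latency concern is a separate guess this math doesn't capture:

```python
# Theoretical memory bandwidth: (bus width in bits / 8 bits-per-byte) * per-pin Gbps.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

b_5060ti = bandwidth_gbs(128, 28.0)  # assumed 128-bit GDDR7 @ 28 Gbps -> 448.0 GB/s
b_4070 = bandwidth_gbs(192, 21.0)    # 4070: 192-bit GDDR6X @ 21 Gbps -> 504.0 GB/s

print(f"5060 Ti: {b_5060ti:.0f} GB/s, 4070: {b_4070:.0f} GB/s")
print(f"ratio: {b_5060ti / b_4070:.1%}")  # ~88.9%, i.e. the '88% of a 4070' figure
```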
 
4060Ti 16GB is a 1080p card in 2023. I bought one (needed the VRAM buffer for work) and dumped it into the second PC in the living room with a 4K TV. It can barely handle 1440p without performance nosediving because there's simply not enough bandwidth.

If they're going to keep it on a 128-bit bus, GDDR7 is maybe going to turn it into a 1440p card. At 448GB/s it's still more than 12% less bandwidth than a vanilla 4070 which is a decent 1440p offering, but that's with lower-latency GDDR6, I'm not 100% sure bandwidth comparisons between GDDR6 and GDDR7 are possible because latency will have doubled, clock for clock - which means (only a guess) that the 5060Ti will have 88% the bandwidth of a 4070 but ~50% higher latency.
They could fix the latency with cache
 
Two-generation gap? For me the 2080 Ti to 4070 Ti was a 50% jump.

Settle for nothing less! :cool:
A 50% jump is great considering you went down the stack by ~2 tiers and are only using ~30 W more power compared to the FE 2080 Ti (a 4080 Ti doesn't exist, and the 4070 Ti Super is arguably a 4080-lite, a different tier than the 4070 Ti).

I'm hoping two generations plus the same tier, or 1-2 tiers up (5090/5090 Ti?), is enough to double performance.

Fingers crossed lol. If I do go 5090/Ti I'll likely keep it three generations to recoup the extra cost.

They could fix the latency with cache
Maybe; still, I think xx60 class cards will be native 1080/DLSS 1440 for at least this next gen.

It's important to bear in mind that 1080p on PC or 1440p DLSS arguably looks better than "native" 4K on console, which is realistically the competition at the entry level.

Native in quotes because consoles typically vary resolution and make heavy use of mediocre upscaling when playing at 4K - that, or they have a 30 FPS frame target, which is pathetic.
 
Here we go again...

Should be:

5090 - 512-bit 32GB <-- Needed for 4K Max settings in all games with 64GB being overkill.
5080 - 384-bit 24GB <-- 16GB is too little for something that will be around the power of a 4090.
5070 - 256-bit 16GB <-- Sweet spot for mid range.
5060 Ti - 192-bit 12GB <-- Would sell really well.
5060 - 128-bit 8GB <-- 8GB is fine if priced right...
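There's a mechanical reason each bus width in a wish list like this pairs with that capacity: GDDR chips use a 32-bit interface, so the bus width fixes the chip count, and with the common 2 GB (16 Gb) modules the capacity follows directly (clamshell mounting would double these). A sketch:

```python
# Bus width / 32 bits-per-chip = number of memory chips; times 2 GB per chip = VRAM.
def vram_gb(bus_width_bits: int, gb_per_chip: int = 2) -> int:
    return bus_width_bits // 32 * gb_per_chip

for bus in (512, 384, 256, 192, 128):
    print(f"{bus}-bit -> {vram_gb(bus)} GB")
# 512-bit -> 32 GB, 384-bit -> 24 GB, 256-bit -> 16 GB, 192-bit -> 12 GB, 128-bit -> 8 GB
```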

And for the people slating AMD I had the ASUS 7900XTX TUF Gaming OC and it was incredible! Sure the street lights would flicker when I was 4K gaming but hey ho...
 
4060Ti 16GB is a 1080p card in 2023. I bought one (needed the VRAM buffer for work) and dumped it into the second PC in the living room with a 4K TV. It can barely handle 1440p without performance nosediving because there's simply not enough bandwidth.

If they're going to keep it on a 128-bit bus, GDDR7 is maybe going to turn it into a 1440p card. At 448GB/s it's still 12% less bandwidth on paper than a vanilla 4070 which is okay at 1440p, but that's with lower-latency GDDR6. I'm not 100% sure you can just compare bandwidth between GDDR6 and GDDR7 because latency will have doubled, clock for clock - which means (only a guess here) that the 5060Ti will have 88% the bandwidth of a 4070 but ~50% higher latency. That's going to make it considerably handicapped compared to a 4070 overall, so I guess the rest of it is down to how well they've mitigated that shortcoming with better cache, more cache, and hopefully some lessons learned from the pointlessness of the 4060Ti.
I'm not planning to upgrade my resolution/monitor so I'm fine in that regard. :)
2560x1080 21:9 is somewhere between 1080p and 1440p based on my own testing over the years, and most of the time I'm running out of raw GPU raster performance first when I crank up the settings at this resolution, so I wouldn't exactly mind 12 GB VRAM either, but 16 is welcome if it's not too overpriced. 'I'm also a constant user of DLSS whenever it's available in a game, so that helps'
Tbh if the ~mid range 5000 series fails to deliver in my budget range then I will just pick up a second hand 4070 Super and call it a day. 'plenty enough for my needs'
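For what it's worth, the raw pixel math backs up the "somewhere between" claim: 21:9 1080p pushes exactly a third more pixels than 16:9 1080p and a quarter fewer than 1440p.

```python
# Raw pixel counts for the three resolutions being compared.
px_1080p = 1920 * 1080  # 2,073,600
px_uw = 2560 * 1080     # 2,764,800 (21:9 ultrawide)
px_1440p = 2560 * 1440  # 3,686,400

print(f"ultrawide vs 1080p: {px_uw / px_1080p:.2f}x")  # 1.33x
print(f"1440p vs ultrawide: {px_1440p / px_uw:.2f}x")  # 1.33x
```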
 
They could fix the latency with cache
Yeah, that's what they said about Ada, and that didn't work - so I'll believe it when I see performance scaling without a huge nosedive!

Maybe a combination of refinements to the cache that they got wrong with Ada and the switch to GDDR7 will be enough. As always, it'll really just come down to what they're charging for it - the 4060Ti 16G would have been a fantastic $349 GPU but that's not what we got...

Tbh if the ~mid range 5000 series fails to deliver in my budget range then I will just pick up a second hand 4070 Super and call it a day. 'plenty enough for my needs'
If the major benefits to the 50-series are for AI, the 40-series will remain perfectly good for this generation of games.
 
I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually do anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash: the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
Your post definitely smells of fanboying :wtf:

Which is so laughable considering AMD has no problem competing with Nvidia's offerings outside of the RTX 4090

The RX 7900XTX trades blows with the RTX 4080 Super, mostly edging it out
The RX 7900XT beats the RTX 4070 Ti Super
The RX 7900GRE beats the RTX 4070 Super
The RX 7800XT beats the RTX 4070
etc....

All while offering much better prices


[Chart: relative performance at 2560x1440]

[Chart: relative performance at 3840x2160]
 
Your post definitely smells of fanboying :wtf:

Which is so laughable considering AMD has no problem competing with Nvidia's offerings outside of the RTX 4090
Nvidia brand loyalists are fixated on three things:
  • RT
  • DLSS
  • The internet myth that AMD has fundamental driver problems and Nvidia doesn't
Outside of those three things, the GPU market looks very even and competitive with AMD doing slightly better in performance and price as you pointed out. But even if all three of my points above didn't exist, these loyalists would still buy Nvidia. But I appreciate you and everyone else doing what they can to prevent the blind fealty to one company that threatens to ruin our DIY PC building market that we love so much.
 
What a monstrous difference from the largest chip to the level below. More than 2x bigger. :')
 
  • The internet myth that AMD has fundamental driver problems and Nvidia doesn't
You'd be surprised how often I've heard "Aaaaand AMD display driver just crashed" from my buddy rocking a 6600 XT on a new AM5 system while playing the same game online.
 
What a monstrous difference from the largest chip to the level below. More than 2x bigger. :')
4090 wasn't fully enabled, not even close.
5090 probably won't be either.

These 100% enabled die numbers aren't representative of consumer cards, but Quadro ones.
 
AMD's inability to compete is because no one will buy their chips, even though they are very competitive against Nvidia's offerings. Luckily, you, Assimilator, have just volunteered to buy AMD for your next graphics card to help drive down Nvidia prices. I will join you, and together we will show everyone that the only way to bring about a competitive market is for everyone to stop buying brand and gimmicks and start buying great performance-per-dollar tech regardless of what name is on the box.


I'll either be buying a 9950X3D and a Radeon 8900XTX for my next build or skip a generation and get Zen 6 and RDNA5. Since AMD is best for gaming in my opinion and will continue to focus equally on gaming and AI, my dollars will continue to go to them until Nvidia stops wasting resources on RT and AI.
Sucks to be you, but Path Tracing is the future of videogame lighting, even AMD will have to optimize for it.
 
Nvidia brand loyalists are fixated on three things:
  • RT
  • DLSS
  • The internet myth that AMD has fundamental driver problems and Nvidia doesn't
Outside of those three things, the GPU market looks very even and competitive with AMD doing slightly better in performance and price as you pointed out. But even if all three of my points above didn't exist, these loyalists would still buy Nvidia. But I appreciate you and everyone else doing what they can to prevent the blind fealty to one company that threatens to ruin our DIY PC building market that we love so much.
RT still isn't viable, as the performance hit is still too big without DLSS

DLSS is ok but so is FSR

And yeah, I hear that a lot. Which is funny, because I've used AMD since the HD 4000 days and haven't had driver issues since Hawaii, which was quite some time ago.
 
4090 wasn't fully enabled, not even close.
5090 probably won't be either.

These 100% enabled die numbers aren't representative of consumer cards, but Quadro ones.
That's actually an important point that people seem to miss. If the chart turns out correct (and that's a big IF), then I would wager that a full GB202 with 64 gigs will be the most expensive pro-card config. Said 64 gigs might not even be GDDR7; we have the precedent of the RTX 6000 Ada using regular GDDR6 instead of 6X. It would be interesting to see if, this time around, the yields will actually be enough to create a fully enabled card. With AD102, there never WAS a full-chip card, and the 4090 was obvious dregs sold for a ton to consumers.
 
Yeah! More stagnation! Let's vote for stagnation!

I don't know what you're looking at, but I'm seeing a slight uptick per tier, with all things increased except the bus width. GDDR7 makes up for part of the deficit, though, so bandwidth won't be worse than Ada relatively; that's good. But capacity, still 12GB in the midrange and 8GB at the bottom end? You're saying this is a good thing now? Ada is already bandwidth constrained at the lower tiers. Nvidia is trying real hard to keep those tiers to what, 1080p gaming?

To each their own, but I think in 2025 people would like to move on from 1080p. The 8GB tier is by then bottom-line useless and relies mostly on cache; the 12GB tier can't ever become a real performance-tier midrange for long, and it's in a worse position than Ada's 12GB cards are in today in terms of longevity. Sure, they'll be fine today and on release. But they're useless by or around 2026, much like the current crop of Ada 12GBs.

As for AMD's inability to compete... RT is where fools and their money were parted. AMD keeps pace just fine in actual gaming and raster perf and is, and often has been, cheaper. They compete better than they have in the past. Customers just buy Nvidia, and if that makes them feel 'screwed over'... yeah... a token of the snowflake generation, that also doesn't vote and then wonders why the world's going to shit.

You can't fix stupidity. Apparently people love to watch in apathy as things escalate into dystopia, spending money as they go and selling off their autonomy one purchase and subscription at a time.

Personally I blame AMD for not being able to compete for so long in terms of perf/watt, software (read: following in nVidia's footsteps), drivers, and *compatibility with emerging technologies such as RT and AI especially* (call/cope it how some may)... Their recent move of leaving the high end to Nvidia was basically them admitting defeat, and now prices are sky high. The fact of the matter is that integrated graphics makes a dGPU a nonessential part of a system: you technically aren't forced to buy one in the same vein that you're forced to buy DRAM (especially given that dGPUs are interchangeable, not locked to a certain vendor like you would be with a CPU socket, for example), so you really have to sell the product on merit more than anything.

And if anyone thinks AMD are innocent in all this, don’t forget, they launched their 7900XTX at $1,000. So they aren’t gonna save you either.
 
That's actually an important point that people seem to miss. If the chart turns out correct (and that's a big IF), then I would wager that a full GB202 with 64 gigs will be the most expensive pro-card config. Said 64 gigs might not even be GDDR7; we have the precedent of the RTX 6000 Ada using regular GDDR6 instead of 6X. It would be interesting to see if, this time around, the yields will actually be enough to create a fully enabled card. With AD102, there never WAS a full-chip card, and the 4090 was obvious dregs sold for a ton to consumers.
Yeah, and the 4090 Ti was likely cancelled because there was no competition for the 4090. With RDNA4 supposedly being a 7900XTX at 7800XT prices, I doubt the full-die 5090/Ti is needed either.

Why sell 90-100% enabled dies to consumers when you can sell them for 2-3x the price as Quadro cards anyway?
 
Personally I blame AMD for not being able to compete for so long in terms of perf/watt, software (read: following in nVidia's footsteps), drivers, and *compatibility with emerging technologies such as RT and AI especially* (call/cope it how some may)... Their recent move of leaving the high end to Nvidia was basically them admitting defeat, and now prices are sky high. The fact of the matter is that integrated graphics makes a dGPU a nonessential part of a system: you technically aren't forced to buy one in the same vein that you're forced to buy DRAM (especially given that dGPUs are interchangeable, not locked to a certain vendor like you would be with a CPU socket, for example), so you really have to sell the product on merit more than anything.

And if anyone thinks AMD are innocent in all this, don’t forget, they launched their 7900XTX at $1,000. So they aren’t gonna save you either.
The prices were sky high before 'AMD admitted defeat'. It has had zero impact - Nvidia released SUPER cards with better perf/$ at around the same time. Let's also not forget that AMD's RDNA3 price points were too high to begin with, so even their market presence hasn't had any impact on pricing. They happily priced up alongside Nvidia. It wasn't until the 7900GRE and 7800XT that things got somewhat sensible and competitive versus the EOL RDNA2 offerings, which were also priced high in tandem with Ampere and lowered very late in the cycle.

The real facts are that no matter what AMD has done in the past, their PC discrete share is dropping. They're just not consistent enough, and this echoes in consumer sentiment. It's also clear they've adopted a different strategy and have been betting on different horses for quite a while now.

There is nothing new here with RDNA3 or RDNA4 in terms of market movement. Granted - RDNA3 didn't turn out as expected, but what if it did score higher on raster? Would that change the world?
 
Is AMD staring down the barrel? Is this really worse here than the years they were getting by on very low cashflow/margin products, pre-Ryzen? Are we really thinking they will destroy the one division that makes them a unique, synergistic player in the market?
Yes, it has literally never been worse for their GPU division than today. Until Q3 2022 AMD had rarely dropped below 20% market share, and when they did, they pulled back above that level within at most 2 quarters... since then they have had 7 consecutive quarters below that threshold. That's nearly 2 years of failure not just to gain, but to hold market share. That's staring down the barrel.

[Chart: JPR discrete GPU market share by quarter]



Your post definitely smells of fanboying :wtf:

Which is so laughable considering AMD has no problem competing with Nvidias offerings outside of the RTX 4090

The RX 7900XTX trades blows with the RTX 4080 Super, mostly edging it out
The RX 7900XT beats the RTX 4070 Ti Super
The RX 7900GRE beats the RTX 4070 Super
The RX 7800XT beats the RTX 4070
etc....

All while offering much better prices


Thanks for demonstrating exactly the same failure of understanding that I documented for AMD's marketing department in my post.
 
Yes, it has literally never been worse for their GPU division than today. Until Q3 2022 AMD had rarely dropped below 20% market share, and when they did, they pulled back above that level within at most 2 quarters... since then they have had 7 consecutive quarters below that threshold. That's nearly 2 years of failure not just to gain, but to hold market share. That's staring down the barrel.




Thanks for demonstrating exactly the same failure of understanding that I documented for AMD's marketing department in my post.

If we're looking at trends (I don't deny their share is lowest of all time, mind)...

2015: 18%
2019: 18.8%
2020H2: 18%
2022: 10%
2023Q4: 19%

They've been 'rock bottom' many times before. And if you draw a line over this graph, isn't this just the continuation of the trend of the last decade?

[Chart: AMD discrete GPU market share trend over the last decade]


Sucks to be you, but Path Tracing is the future of videogame lighting, even AMD will have to optimize for it.
Oh? I must have missed that statement after Cyberpunk ran at sub-30 FPS on a 4090.

I think it mostly sucks for people who expect Path Tracing to be the norm. They're gonna be waiting and getting disappointed for a loooong time. Game graphics haven't stopped moving forward despite Path Tracing. Gonna be fun :)
 
Where's the 384-bit model with 24GB GDDR7, though? Seems like a big gap between the top model and the next one down.
That's going to be next year's Super Titanium Ultra Max Plus Extreme GPU releases.
Please stand by.
 