
Possible Specs of NVIDIA GeForce "Blackwell" GPU Lineup Leaked

@ARF : But only with clamshell and that would not make any sense.

GB203 with only 16GB will run out of VRAM, or will need software hacks to lower image quality / texture resolution, in a game like Microsoft Flight Simulator, which already saturates the 20GB RX 7900 XT.
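For a sense of scale, here's a back-of-envelope sketch of how fast uncompressed high-res textures eat VRAM. The numbers are illustrative only; real engines use block compression (BCn) and texture streaming, so per-texture usage is far lower, but the trend is the same:

```python
# Back-of-envelope VRAM estimate for resident game textures.
# Illustrative only: real engines compress (BCn) and stream textures.

def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Uncompressed footprint of one texture; a full mip chain adds
    roughly one third on top of the base level (1 + 1/4 + 1/16 + ...)."""
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# 150 uncompressed 4K (4096x4096) RGBA8 textures resident at once:
count = 150
total_gb = count * texture_vram_bytes(4096, 4096) / 1024**3
print(f"{total_gb:.1f} GB")  # ~12.5 GB, before framebuffers, geometry, BVH, etc.
```

That's most of a 16GB card gone from textures alone, which is why texture-heavy titles lean on streaming or quality downgrades on smaller-VRAM parts.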



 
Then it's better to wait until the 24GB (Super) version with 8x3GB arrives later.

We only just got 16GB on the xx80(S) and then on the xx70 Ti Super, after staying at 8GB on the xx80 for two generations before upgrading to 10/12GB, and at 8GB on the xx70(Ti) for three generations before upgrading to 12GB. Don't expect NV to give you another upgrade after just one generation!
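The 8x3GB figure is just bus width times per-device density. A quick sketch (the 2GB and 3GB GDDR7 per-device capacities are the rumoured figures, not confirmed specs):

```python
# GDDR capacity follows from bus width: each memory device sits on a
# 32-bit channel, so a 256-bit bus hosts 8 devices (16 in clamshell).

def vram_gb(bus_width_bits, gb_per_device, clamshell=False):
    devices = bus_width_bits // 32
    if clamshell:
        devices *= 2  # two devices share each 32-bit channel
    return devices * gb_per_device

print(vram_gb(256, 2))  # 16 -- the rumoured launch config
print(vram_gb(256, 3))  # 24 -- the speculated 8x3GB refresh
```

This is also why capacity bumps without a wider bus have to wait on denser dies: on a 256-bit part, the only other way up is clamshell, which doubles the device count.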
 
Then it's better to wait until the 24GB (Super) version with 8x3GB arrives later.


We only just got 16GB on the xx80(S) and then on the xx70 Ti Super, after staying at 8GB on the xx80 for two generations before upgrading to 10/12GB, and at 8GB on the xx70(Ti) for three generations before upgrading to 12GB. Don't expect NV to give you another upgrade after just one generation!

Someone at Nvidia must take responsibility for the fact that the products it launches lack sufficient VRAM and need to be specced according to the current games market.

GTX 980 2014 4GB
GTX 980 Ti 2015 6GB
GTX 1080 2016 8GB
GTX 1080 Ti 2017 11GB
RTX 2080 2018 8GB
RTX 2080 Ti 2018 11GB
RTX 3080 2020 10/12GB
RTX 3080 Ti 2021 12GB
RTX 4080 2022 16GB
RTX 5080 2024 or 2025 16GB?

Good luck selling that.
 
That "5060" feels like yet another 4060 bruh moment.
 
Err, nope.
GB202 = 28GB
GB203 = 24GB
GB205 = 20GB
GB206 = 16GB
GB207 = 12GB

This is ok now.

Trust me, I *WANT* to be wrong, but all evidence points towards Nvidia opting for the lower capacity.

Primarily, I don't think higher density GDDR7 dies are available yet, and when they finally are, they will come with a profit-eating higher cost. Nvidia will likely justify it on the $3000 5090, and maybe eventually on the 5080 Ti/Super/Ultra/Whatever but for the cost-effective models at xx50/60/70 we're going to get screwed because those cards are all about maximising Nvidia's profit, not pleasing end-users.

"If you want more VRAM, spend more money, you filthy peasants."
- Jacket Man, probably.

Is a kilowatt PSU already needed for a single card?
It was with the 30-series, mostly due to transient spikes tripping the protections on many high-end 850W units, rather than the average power draw of a 3080-equipped system actually exceeding 850W sustained.
 
Your post definitely smells of fanboying :wtf:

Which is laughable, considering AMD has no problem competing with Nvidia's offerings outside of the RTX 4090.

The RX 7900 XTX trades blows with the RTX 4080 Super, mostly edging it out
The RX 7900 XT beats the RTX 4070 Ti Super
The RX 7900 GRE beats the RTX 4070 Super
The RX 7800 XT beats the RTX 4070
etc....

All while offering much better prices



1 to 4% faster in raster, 20% slower in RT, higher median power consumption, hit-and-miss driver support, and no access to the Nvidia ecosystem. The 4080 will provide a better gaming experience, I can guarantee you that.

The only circumstance in which I'd consider the 7900 XTX is if you can purchase it for $300 less.
 
Yes, but those are not typical gaming loads, this is something that you would want from a workstation card, not a GeForce.

I agree that the VRAM on the lower models is not enough, and the GB203 specs are too low.

The xx90 cards are prosumer products. A huge portion of their sales, particularly the 4090's, comes from AI and professional work.

Nvidia is factoring that in when deciding how much VRAM to equip their xx90 cards with. We've had two generations now with 24GB and prosumer workloads are demanding more VRAM than that.

Games themselves could probably use more VRAM as well if Nvidia weren't intentionally holding the market back in that regard. You can't expect devs to target VRAM most people don't have, which means Nvidia has great influence over how much VRAM games will use. If they keep putting out 8GB cards, then 8GB will continue to be "enough" for as long as they are the dominant player in the market. Of course I expect it to increase, but only when Nvidia absolutely has to, in keeping with the trend of giving people as little as possible for their money, whether up front or in terms of longevity.

It's ironic: in the enterprise market, Nvidia caters to what customers want, while in the gaming market, gamers cater to what Nvidia wants. If gamers are going to argue on Nvidia's behalf that 8GB is still fine, perhaps they deserve 8GB cards ad infinitum. It's a self-perpetuating prophecy.
 
Not impressed; that 5080 is already gimped right out of the gate, and because it is a supposed xx80 card, nVidia will still charge $1000. Get screwed, nVidia.

I am done with GPUs; this machine will, in all likelihood, be my last. It's been fun, but I just can't be bothered anymore. At certain tiers, nVidia, you have priced yourself out of the market. Everyone has their limits, and you pissed on their concerns. No one likes being punked or ripped off.
 
I really wish NVIDIA had decided to increase the VRAM capacity and bus width over Ada. Not because more VRAM and a wider bus actually does anything for performance, but because it would at least stop Radeon fanboys crying about how NVIDIA is screwing buyers over. News flash, the 88% of people who own an NVIDIA GPU only feel screwed over by AMD's inability to compete.
Do you ever stop kissing Nvidia's ass?
 
Not impressed; that 5080 is already gimped right out of the gate, and because it is a supposed xx80 card, nVidia will still charge $1000. Get screwed, nVidia.

I am done with GPUs; this machine will, in all likelihood, be my last. It's been fun, but I just can't be bothered anymore. At certain tiers, nVidia, you have priced yourself out of the market. Everyone has their limits, and you pissed on their concerns. No one likes being punked or ripped off.

It's a shame, but you're not alone. I see too many PC gamers expressing your sentiments, and the road ahead only looks worse. The reality is that most PC gamers and console gamers are in the same boat: they have definite limits on what they can spend on hardware. Nvidia has pushed pricing to such an extreme that many who ordinarily would not have turned to the second-hand market, which introduces additional risks. iirc you were one of those struggling to upgrade during the mining craze. Can't blame the pricing on that anymore; now it just amounts to simple greed by Nvidia and retailers.
 
when the amount of complaints is visibly higher than among NVIDIA users. I switched to NVIDIA shortly after and have had no such episode since. Such a stark difference might sound ridiculous, but yeah, not once have I had a significant issue after updating NVIDIA drivers.
And that folks is the myth in action with the desired effect.

Would you believe me if I said I had so many Nvidia driver problems that I switched to AMD and never had any again?
 
And that folks is the myth in action with the desired effect.

Would you believe me if I said I had so many Nvidia driver problems that I switched to AMD and never had any again?
Yes.
 
I’m going to diverge from the never-ending driver quality argument and just point out that RTX Voice/Broadcast, RTX Remix, ChatRTX, and soon, G-ASSIST, ACE, etc… either don’t have any equivalent from AMD or the equivalent is half-baked in comparison. There is just so much more you can do with one team’s products than the other…

Edit: forgot to mention
Canvas
Omniverse
NVENC
 
I’m going to diverge from the never-ending driver quality argument and just point out that RTX Voice/Broadcast, RTX Remix, ChatRTX, and soon, G-ASSIST, ACE, etc… either don’t have any equivalent from AMD or the equivalent is half-baked in comparison. There is just so much more you can do with one team’s products than the other…

Edit: forgot to mention
Canvas
Omniverse
NVENC
I didn't even know about those, besides NVENC, so I can presume that a typical user doesn't know about them either.
 
I didn't even know about those, besides NVENC, so I can presume that a typical user doesn't know about them either.

Canvas basically uses generative AI to create photorealistic images out of simple MS Paint-like strokes, and Omniverse is practically why Nvidia is worth $3 trillion right now.

 
Is a kilowatt PSU already needed for a single card?
As long as GPUs only require one Gen 5 power connector, there will be no need for 1000W-and-above PSUs, though keep in mind that power supplies are typically most efficient at around 50% load.

Things may change if we ever see a GPU with two Gen 5 power connectors. In that case, we'd need at least a 1200W PSU (ideally 1500W+) to power something like the upcoming 5090, which could end up drawing 800W; if not that card, then maybe a 6090.
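As a rough sizing sketch (the 1.5x transient factor and the 200W rest-of-system figure are assumptions for illustration, not measured values), one could estimate a recommended PSU size like this:

```python
# Rough PSU sizing: cover transient spikes, not just sustained draw.
# Spikes tripping protections, not average load, is what caught out
# many 850W units with the 30-series. Assumed factors, not measured.

def recommend_psu_watts(gpu_watts, rest_of_system_watts=200,
                        transient_factor=1.5):
    # assume GPU transients peak at ~1.5x rated power
    spike = gpu_watts * transient_factor + rest_of_system_watts
    # round up to the next common 100 W size
    return int(-(-spike // 100) * 100)

print(recommend_psu_watts(320))  # 3080-class card -> 700
print(recommend_psu_watts(800))  # hypothetical 800 W card -> 1400
```

The 800W case landing at 1400W lines up with the "at least 1200W, ideally 1500W+" gut feel above; a bigger unit also keeps sustained load closer to the efficiency sweet spot.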
 
The GB203 with only a 256-bit bus & 10,752 CUDA cores looks barely better than the 4080's specs, and we all know more cache is not a silver bullet. It could be that the 5080 will be no faster, or even a bit slower, than the 4090 in rasterization at 4K and above. No competition = no progress :banghead:
You forget the key bonus points that will sell this one at the same price as a 4080

- DLSS4.x with ultra super framerate acceleration, definitely better than 3, so you can't miss it, and it can only run on Blackwell because Huang said so
- Even betterrer RT performance, which is what Nvidia is going to tout as the 'real performance gap' versus Ada. Don't mind raster, it's no longer relevant; there is always an RT game to distort reality
- Some subscription to some Nvidia service model (3 months of free GF Now?)
- A lot of marketing to drive the above home
 
Not impressed; that 5080 is already gimped right out of the gate, and because it is a supposed xx80 card, nVidia will still charge $1000. Get screwed, nVidia.

I am done with GPUs; this machine will, in all likelihood, be my last. It's been fun, but I just can't be bothered anymore. At certain tiers, nVidia, you have priced yourself out of the market. Everyone has their limits, and you pissed on their concerns. No one likes being punked or ripped off.

Unfortunately, I suspect $1,000 would be on the low end of expected pricing. Nvidia wanted to charge $1,000 for the gimped 4080 until that got cancelled due to backlash. It looks like for the 5000 series the gimped xx80 will now be the default, but I'm not so sure Nvidia will have the price follow suit. That's kind of been Nvidia's strategy anyway: if something receives backlash, just try again later under a different name or on the sly. The GPP is a good example; Nvidia fully implemented it with the 4000 series. All the top model brands are reserved for Nvidia-only cards now, and not a single ounce of outrage. Clearly Nvidia's strategy works.
 
You forget the key bonus points that will sell this one at the same price as a 4080

- DLSS4.x with ultra super framerate acceleration, definitely better than 3, so you can't miss it, and it can only run on Blackwell because Huang said so
- Even betterrer RT performance, which is what Nvidia is going to tout as the 'real performance gap' versus Ada. Don't mind raster, it's no longer relevant; there is always an RT game to distort reality
- Some subscription to some Nvidia service model (3 months of free GF Now?)
- A lot of marketing to drive the above home
Let's not forget AMD's bonus points for selling their GPUs at the same price as the Nvidia product they're competitive with in raster:

- FSR SR FG SS XXX 4.1, definitely better than FSR 1.0, and just as good as DLSS because it's open source and Lisa Su Bae said so
- Still lacking in RT performance, but more sponsored games with half-assed effects to show they're barely behind in games where you struggle to spot the difference
- +++++ VRAM so they can claim at least one spec advantage and cater to people who want more than a decade of high textures, well past the GPU's ability to push good fps in modern AAA titles
- Some ridiculous marketing or PR blunder, because it wouldn't be Radeon Technologies Group without them managing an own goal too
 
Let's not forget AMD's bonus points for selling their GPUs at the same price as the Nvidia product they're competitive with in raster:

- FSR SR FG SS XXX 4.1, definitely better than FSR 1.0, and just as good as DLSS because it's open source and Lisa Su Bae said so
- Still lacking in RT performance, but more sponsored games with half-assed effects to show they're barely behind in games where you struggle to spot the difference
- +++++ VRAM so they can claim at least one spec advantage and cater to people who want more than a decade of high textures, well past the GPU's ability to push good fps in modern AAA titles
- Some ridiculous marketing or PR blunder, because it wouldn't be Radeon Technologies Group without them managing an own goal too
Yay for stagnation!
 