
NVIDIA RTX 5000 Blackwell Memory Amounts Confirmed by Pre-Built PC Maker

I get it, though: Nvidia does not want companies using GeForce cards for professional work, and instead wants them to buy insanely high-margin professional cards. It just sucks that gamers get the short end of the stick over it. A lack of any real competition is probably a factor too, but a really distant second.
What professional cards? Quadro is dead. The x90 GeForce cards are the professional cards now.

This is why Nvidia has such a high gaming GPU market share. All of their GPUs are gaming GPUs now. Technically...
 
Cuz the 7900 XTX is the real RX 6800 replacement, duh... even though it uses a ton of silicon...
Well then, that leads to a horrible realization. How/why are we expecting the market-dominant company to "compete" when the company with a 10% and shrinking market share is selling their midrange cards as top dogs?
 
What professional cards? Quadro is dead. The x90 GeForce cards are the professional cards now.

This is why Nvidia has such a high gaming GPU market share. All of their GPUs are gaming GPUs now. Technically...


Nope, they still make them.

Well then, that leads to a horrible realization. How/why are we expecting the market-dominant company to "compete" when the company with a 10% and shrinking market share is selling their midrange cards as top dogs?

I was exaggerating... the 4080 was 50% faster and had 60% more VRAM; it was fine being called an 80-class GPU if one considers the 3080 an 80-class GPU... The problem was the price, and always has been...

I don't really care what they call a card; they could call the 4090 a 4060, charge $1,600 for it, and I would have still bought one.

I do feel like everything below the 4080 should have been shifted down a tier, but that's only because it would have made them awesome products instead of the mostly meh products we ended up with.
 
I was exaggerating... the 4080 was 50% faster and had 60% more VRAM; it was fine being called an 80-class GPU if one considers the 3080 an 80-class GPU... The problem was the price, and always has been...
Oh I know you were, but someone said the 4080 should have been named a 4070 (and it's not just one person, it's a very popular opinion).
 
Oh I know you were, but someone said the 4080 should have been named a 4070 (and it's not just one person, it's a very popular opinion).

That would have been awesome; who wouldn't want that? Imagine a 4080-like product for $600; it would have killed AMD, they'd have had to drop out of the GPU business.

They'd definitely lose money on a $500 7900 XTX...
 

Nope, they still make them.
I stand corrected. Is it a full product line catering to all price ranges, just like Quadro was?

I was exaggerating... the 4080 was 50% faster and had 60% more VRAM; it was fine being called an 80-class GPU if one considers the 3080 an 80-class GPU... The problem was the price, and always has been...
Yes. With +50% performance at +100% price compared to last gen, the 4080 is the worst value GeForce card Nvidia has ever released. Point blank.
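For anyone who wants the perf-per-dollar math spelled out, here is a rough Python sketch. The +50% performance figure is the one quoted above; the two price scenarios are the US launch MSRPs ($699 -> $1,199, i.e. the "70% upcharge" mentioned later in the thread) versus the roughly doubled prices some regions saw, so treat it as back-of-the-envelope, not a benchmark:

# Relative performance per dollar of the 4080 vs the 3080, using the figures discussed here
perf_gain = 1.50  # "~50% faster" than the 3080, per the post above

price_scenarios = {
    "US launch MSRP ($699 -> $1,199)": 1199 / 699,  # ~1.72x, the "70% upcharge"
    "Roughly double the price": 2.00,               # the "+100% price" reading
}

for label, price_gain in price_scenarios.items():
    value = perf_gain / price_gain  # performance per dollar relative to the 3080
    print(f"{label}: {value:.2f}x perf/$ ({value - 1:+.0%} vs the 3080)")

Either way the 4080 comes out delivering less performance per dollar than the card it replaced, which is the point being made here.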
 
I stand corrected. Is it a full product line catering to all price ranges, just like Quadro was?

Yes, it is, and if you look at their VRAM sizes, you will know why the gaming cards don't have that much VRAM.

As for price ranges: from an arm and a leg to several other body parts.
 
I stand corrected. Is it a full product line catering to all price ranges, just like Quadro was?


Yes. With +50% performance at +100% price compared to last gen, the 4080 is the worst value GeForce card Nvidia has ever released. Point blank.

AFAIK here is what I think is the lowest one, but I could be wrong; I honestly don't pay much attention to them.

It was a 70% upcharge here in the States, but I agree the pricing sucked; a $1,000 7900 XTX wasn't better in my book, and the $900 7900 XT was a joke.

Pretty much the whole last generation kinda sucked; only the 4070/7800 XT were at least passable, more so with the 4070 Super... I'm talking about launch prices, not what the AMD cards dropped to after all of 5 people bought them.

I just hope this generation is slightly better in the below-$1k market, but I'm not holding my breath... AMD will do something stupid, the 5060 and 5060 Ti will suck, and the 5070 will be a 12 GB card again, which was fine 2 years ago but kinda sucks now, leaving the 5070 Ti to save the under-$1k lineup on the green side... Who knows what the 9070 will be; if it loses to the 5070, wtf...
 
I just hope this generation is slightly better in the below-$1k market, but I'm not holding my breath... AMD will do something stupid, the 5060 and 5060 Ti will suck, and the 5070 will be a 12 GB card again, which was fine 2 years ago but kinda sucks now, leaving the 5070 Ti to save the under-$1k lineup on the green side... Who knows what the 9070 will be; if it loses to the 5070, wtf...
I agree. I am cautiously hopeful of the 9070 XT. Not so much of the 5060 / Ti. Nvidia admittedly doesn't give a rat's arse about mid-range cards anymore, whereas AMD will only have mid-range this time around.
 
someone said the 4080 should have been named a 4070 (and it's not just one person, it's a very popular opinion).
Maxwell:
Titan X has 24 SM (100%)
980 Ti has 22 SM (91.7%)
980 has 16 SM (66.7%)
970 has 13 SM (54.2%)

Pascal:
Titan XP has 30 SM (100%)
1080 Ti has 28 SM (93%)
1080 has 20 SM (66.7%)
1070 has 15 SM (50%)

Turing:
Titan RTX has 72 SM (100%)
2080 Ti has 68 SM (94.4%)
2080 has 46 SM (63.9%)
2070 has 36 SM (50%)

Ampere:
Titan Ampere doesn't exist
3090 Ti has 84 SM (100%..?)
3080 Ti has 80 SM (95.2%)
3080 has 68 SM (80.9%)
3070 has 46 SM (54.8%)

Ada:
Titan Ada doesn't exist
4090 has 128 SM (clearly not 100%, because a Titan Ada was supposed to have 144 SM but was never released, purely because of greed; 88.9%)
4080 has 76 SM (52.8%) [even if we consider the 4090 as 100%, it's still 59.4%, which is lower than any xx80 GPU released prior]

Based on die complexity alone, we might suggest the 4080 is a glorified 4070 Super at best, or a 4070 at 210% of the price if we're being more critical, yet not beyond reason.

The main reason the 4080 gets us almost a 100% generational uplift over the 3070 is that NVIDIA went from a very bad node (worse than what AMD used for RDNA2!) to a very good one (better than what AMD used for RDNA3) in one go. That alone allowed clocks to increase by almost 50%. Another 40-ish percent comes from architectural improvements.
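If anyone wants to sanity-check those ratios (or see where a hypothetical SKU would land), here is a rough Python sketch. The full-die SM counts are the same Titan/flagship figures used in the list above, and the ~45-50% clock and ~40% architecture numbers are the ballpark estimates from this post, not measured data:

# Full-die SM count per architecture, plus the SKUs listed above
gens = {
    "Maxwell": (24,  {"980 Ti": 22, "980": 16, "970": 13}),
    "Pascal":  (30,  {"1080 Ti": 28, "1080": 20, "1070": 15}),
    "Turing":  (72,  {"2080 Ti": 68, "2080": 46, "2070": 36}),
    "Ampere":  (84,  {"3080 Ti": 80, "3080": 68, "3070": 46}),
    "Ada":     (144, {"4090": 128, "4080": 76}),
}

for arch, (full_sm, skus) in gens.items():
    for name, sm in skus.items():
        print(f"{arch:8} {name:8} {sm:3}/{full_sm} SM = {sm / full_sm:.1%}")

# Rough multiplicative decomposition of the 3070 -> 4080 uplift described above
clock_gain = 1.45  # ~45-50% from the node/clock jump (estimate)
arch_gain = 1.40   # ~40-ish% from architecture, cache, etc. (estimate)
print(f"Estimated combined uplift: ~{clock_gain * arch_gain - 1:.0%}")  # ~103%

Running it reproduces the shares above, including the 4080's 52.8% of the full Ada die, which is squarely in historical xx70 territory.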
 
Based on die complexity alone, we might suggest the 4080 is a glorified 4070 Super at best, or a 4070 at 210% of the price if we're being more critical, yet not beyond reason.

The main reason the 4080 gets us almost a 100% generational uplift over the 3070 is that NVIDIA went from a very bad node (worse than what AMD used for RDNA2!) to a very good one (better than what AMD used for RDNA3) in one go. That alone allowed clocks to increase by almost 50%. Another 40-ish percent comes from architectural improvements.
Doesn't that clearly show that judging GPUs by SMs is faulty? Going by your numbers, the 3090 is better on paper (gen on gen, I mean) compared to the 4090, and yet the 3090 gave us way less generational performance than the 4090 did.
 
Doesn't that clearly show that judging GPUs by SMs is faulty?
No. The 3090 provided less improvement over the last generation for these reasons:
1. Bad node. Almost as bad as what they had for Turing. Ada was built on much better silicon, which allowed performance to be boosted by about 45% purely through higher clocks.
2. COVID. Per se, probably not as impactful, but NVIDIA ain't no charity, so they went into damage control and skimped on their SKUs a bit.
3. No big L2 (or L3, or whatever) cache to compensate for insufficient VRAM bandwidth in select scenarios. That cache not only directly boosts Ada GPUs, it also makes more room for the GPU die itself, since fewer VRAM modules are used, which means less power required to fuel them.

But all in all, xx70 cards were historically meant to be roughly half of what the architecture is capable of. Yesterday, that shifted to the xx80. Today, to the xx80 Super. Tomorrow... there'd better be no tomorrow at this rate.
 
No. The 3090 provided less improvement over the last generation for these reasons:
1. Bad node. Almost as bad as what they had for Turing. Ada was built on much better silicon, which allowed performance to be boosted by about 45% purely through higher clocks.
2. COVID. Per se, probably not as impactful, but NVIDIA ain't no charity, so they went into damage control and skimped on their SKUs a bit.
3. No big L2 (or L3, or whatever) cache to compensate for insufficient VRAM bandwidth in select scenarios. That cache not only directly boosts Ada GPUs, it also makes more room for the GPU die itself, since fewer VRAM modules are used, which means less power required to fuel them.

But all in all, xx70 cards were historically meant to be roughly half of what the architecture is capable of. Yesterday, that shifted to the xx80. Today, to the xx80 Super. Tomorrow... there'd better be no tomorrow at this rate.
So there are other things besides SMs that matter for performance, which is my point...? Again, looking at the data you provided, the 3090 is a beast and the 4090 mediocre, yet in reality the 4090 gave a huge performance increase, much bigger than the 3090 could ever dream of.
 
So there are other things besides SMs that matter for performance, which is my point...?
For absolute performance, yes.
For relative positioning, no. And we're ONLY talking relative. The 4080 relative to maxed-out Ada is like the 3070 relative to maxed-out Ampere, or any other xx70 relative to its maxed-out respective arch. End of story. Even if this were all hot bollocks and the 4080 were a legit xx80 GPU for whatever reason, it's still a twelve-hundred-dollar piece of nonsense. Since when is that okay for a non-halo GPU?
 