
GPU Pricing and Performance

100 euros/dollars wouldn’t even cover the BOM on a triple fan 6600XT
You must have forgotten the real cost of things; buying everything at premium prices and spending too much time online with electronic devices has skewed your judgment. Here's a question for you: does the 6600 XT really need a giant case and a triple fan? No and no, and even with such cooling the card's costs are not particularly high: a few hundred grams of aluminium, some copper, and so on. Do you really claim a BOM over 100 $/€? This cooler probably has almost no R&D in it; it was likely designed for another video card with higher power consumption, maybe even a few generations ago, and its development has long since paid for itself. Yes, with a big cooler the card looks very powerful and very productive, thanks to the marketing teams. Hehehe.
 
Semiconductor chips are an exception to the rule: R&D gets exponentially more expensive, and so do new nodes. The latest player to enter, going by the rumors, is Xiaomi, with plans to design their own SoCs, and they are not what I would call a poor upstart company. Before them we had Apple doing the same and, well, same thing. A new, unknown player stepping in and starting to make competitive CPUs/GPUs/SoCs is just not feasible these days.
If the net price of a product with the same performance baked on a smaller node is higher, then the product shouldn't exist! A shrink only has a point if it reduces the cost of producing the same performance or enables more net performance, and ideally it should do both. Otherwise, why shrink at all?
 
@Vayra86
Multiple reasons - not willing to buy multiple node allocations, more dies per wafer, additional architectural R&D - pick whichever. We don't actually get "same performance on a smaller node" products that cost more anyway; it's a bit of a far-fetched scenario. Blackwell is on the same node as Ada, and previous generations that WERE on smaller nodes DID, at least theoretically at MSRP, move the price for given performance levels down. I'm also not sure how this relates in any way to the discussion at hand?

@TumbleGeorge
Since you are tilting at windmills and ignored both the original context and my response to the other guy, I can only do the same thing I did for him and prescribe medication. I am not in the mood to wrangle you onto the proper discussion path, considering that I have already addressed the main cost of the product in a previous post, and it wasn't the cooler.
 
@Vayra86
Multiple reasons - not willing to buy multiple node allocations, more dies per wafer, additional architectural R&D - pick whichever. We don't actually get "same performance on a smaller node" products that cost more anyway; it's a bit of a far-fetched scenario. Blackwell is on the same node as Ada, and previous generations that WERE on smaller nodes DID, at least theoretically at MSRP, move the price for given performance levels down. I'm also not sure how this relates in any way to the discussion at hand?
You are saying it's inevitable that the cost of chips keeps rising. That's a fallacy, because economically it doesn't check out. The only reason prices go up today is that demand constantly outpaces supply - and that enables the abominations we see in the consumer market today. It has absolutely nothing to do with a shrink being 'the only way to improve'. This industry will be undergoing big changes soon. Something is going to give, and as a consumer I couldn't care less whether or not my GPU is made on the smallest possible node. I think it's likely we will see more diversification there.
 
Saw this just earlier. Nvidia apologists need a reality check. You can't tell me there wasn't inflation and node-shrink-related cost increases between 2012 and 2018. Past 2018, crypto and ML happened, and Nvidia (and the GPU industry) gave gamers the middle finger: simple as that. AMD are just as bad, following suit with the bad pricing. TSMC too. And now with the 50/90 series, the AIBs are joining in, taking huge cuts too and adding a large amount on top of the already-bad MSRP.

"Gamers" (what a dirty word that must seem to them) propped up the industry for 25+ years and are now treated like second-class citizens. This shouldn't be how one deals with long-term customers, but of course they can, and they will, as long as ML (on GPU) doesn't die like crypto did. Sucks.

As for the chart, I wish the creator had adjusted for inflation to make the price decrease between 2012 and 2018 more obvious. Also not taken into account is how cut chips seem to be calculated based on their full chip's size rather than the area actually utilized (which can be estimated from the CUDA core ratio of the full chip, e.g. 5080/GB203-400, vs. the cut chip, e.g. 5070 Ti/GB203-300).
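The "utilized area" estimate described above can be sketched in a couple of lines: scale the physical die size by the cut chip's CUDA core count relative to the full chip. The core counts below are the commonly reported figures for these SKUs and should be treated as assumptions:

```python
# Estimate the area a cut chip actually "uses" by scaling the physical
# die size with the CUDA core ratio (a rough proxy; cache, memory
# controllers, etc. don't scale with core count).
GB203_AREA_MM2 = 378      # physical GB203 die size
FULL_CORES     = 10752    # RTX 5080 / GB203-400 (assumed core count)
CUT_CORES      = 8960     # RTX 5070 Ti / GB203-300 (assumed core count)

utilized_mm2 = GB203_AREA_MM2 * CUT_CORES / FULL_CORES
print(f"Estimated utilized area on the 5070 Ti: {utilized_mm2:.0f} mm^2")
# -> about 315 mm^2 of the 378 mm^2 die
```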


https://mr-september.github.io/Nvidias_Core-ner_Cutting/
 
Let's just do a quick math check:

Blackwell cost :
GB202 = 750mm2 = $8565 (closest to full die "RTX Pro 6000")
GB202* = 750mm2* = $1999 (RTX 5090)
GB203 = 378mm2 = $999 (RTX 5080)
GB205 = 263mm2 = $550 (RTX 5070)
GB206 = 181mm2 = $379 (RTX 5060 Ti 8GB - the cheapest card using the full die with the whole VRAM bus)

per mm2 :
GB202 = 11,42$/mm2 (RTX Pro)
GB202* = 2,67$/mm2 (RTX 5090)
GB203 = 2,64$/mm2
GB205 = 2,09$/mm2
GB206 = 2,09$/mm2

Per 300mm wafer (perfect scaling, + full die) :
GB202 = 72 dies
GB202* = 72 dies
GB203 = 156 dies
GB205 = 224 dies
GB206 = 347 dies
^Based on : https://www.silicon-edge.co.uk/j/index.php/resources/die-per-wafer
(I square rooted die size for actual height/width for easier calculation, it is at best an ESTIMATION)

Money making per wafer :
GB202 = 72 x 8565$ = 616 680$
GB202* = 72 x 1999$ = 143 928$ (*not all die is used = higher yields)
GB203 = 156 x 999$ = 155 844$
GB205 = 224 x 550$ = 123 200$
GB206 = 347 x 379$ = 131 513$

Verdict: Assuming we drop the RTX Pro as an option, NV makes the most money per wafer on the RTX 5080 (GB203, 155 844$); by these numbers the RTX 5070 actually brings in the least.
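The arithmetic above can be reproduced in a few lines of Python. Die sizes and MSRPs are copied from the table, and the dies-per-wafer counts are the ones the post took from the silicon-edge calculator, so treat all of them as that post's assumptions:

```python
# Recompute the $/mm^2 and revenue-per-wafer figures from the post.
# (die area mm^2, card MSRP USD, dies per 300 mm wafer per the calculator)
dies = {
    "GB202 (RTX Pro 6000)": (750, 8565, 72),
    "GB202* (RTX 5090)":    (750, 1999, 72),
    "GB203 (RTX 5080)":     (378,  999, 156),
    "GB205 (RTX 5070)":     (263,  550, 224),
    "GB206 (RTX 5060 Ti)":  (181,  379, 347),
}

for name, (area, msrp, per_wafer) in dies.items():
    print(f"{name:22s} {msrp / area:5.2f} $/mm^2, "
          f"{per_wafer} dies/wafer -> ${msrp * per_wafer:,} per wafer")
```

Note this is gross revenue per wafer, not profit: it ignores the wafer price itself, packaging, VRAM, board, and yield, so it only ranks the consumer SKUs against each other under the stated assumptions.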
 
Saw this just earlier. Nvidia apologists need a reality check. You can't tell me there wasn't inflation and node-shrink-related cost increases between 2012 and 2018. Past 2018, crypto and ML happened, and Nvidia (and the GPU industry) gave gamers the middle finger: simple as that. AMD are just as bad, following suit with the bad pricing. TSMC too. And now with the 50/90 series, the AIBs are joining in, taking huge cuts too and adding a large amount on top of the already-bad MSRP.
Wait, what? At least past 2018 - which your picture shows - there was 7nm (mass production in 2018), which brought a noticeable cost increase, and then 5nm (mass production in 2020), which brought another. And while 3nm has been around for a couple of years, for some reason GPUs have not moved onto it - not even data center ones, where, within reason, cost is not an issue. There have absolutely been node-shrink-related cost increases.

Nvidia Ampere did not get price increases because it was on an older - essentially inferior - node from the other - cheaper - supplier.
 
High End card (say a 5060)
How is an upper entry-level GPU "high end"?

People don't like to hear this, but Nvidia is no longer a gaming/GPU company. They are a premium, top-of-the-line AI company that sells GB202 dies starting at $10k to different tiers of professional customers, and you get a slight binning reject with half the VRAM for $3,000 (5090). From their perspective this is actually cheap. From the consumer side, which expects good gaming GPUs, that's awful, but it is the new reality.
 
How is an upper entry-level GPU "high end"?

People don't like to hear this, but Nvidia is no longer a gaming/GPU company. They are a premium, top-of-the-line AI company that sells GB202 dies starting at $10k to different tiers of professional customers, and you get a slight binning reject with half the VRAM for $3,000 (5090). From their perspective this is actually cheap. From the consumer side, which expects good gaming GPUs, that's awful, but it is the new reality.
With GPU pricing going higher and higher, I find it possible for the first time that PC gaming could die at some point. Especially when developers don't optimize games for shit anymore because of all those scaling technologies.
 
With GPU pricing going higher and higher, I find it possible for the first time that PC gaming could die at some point. Especially when developers don't optimize games for shit anymore because of all those scaling technologies.
Complacency is the end of most things, eventually
 
How did we get here? How can a new-gen High End card (say a 5060) be worse in performance than an Enthusiast Class (say a 3080) from 2 generations prior? It just seems wrong.

Am I nuts for thinking so?
It's throughput used to make "superficial gains" and performance claims.
This is why I hate DLSS and frame generation: those are all throughput increases, not raw performance.
A test matching performance with DLSS on against something running at the same frame rate at native resolution would show what I'm talking about. There is technically a difference in ms between the two - less so with DLSS upscaling, more noticeable with frame generation.
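The ms gap alluded to above can be illustrated with a toy calculation, assuming frame generation inserts one generated frame per rendered frame (the exact latency pipeline is more complicated, so this is only a sketch):

```python
# Toy illustration, not measured data: at the same *displayed* frame
# rate, frame generation renders only half the frames, so the
# underlying render frame time - a rough proxy for input latency -
# is roughly double that of truly native output.
def frame_time_ms(rendered_fps: float) -> float:
    return 1000.0 / rendered_fps

displayed_fps = 120
native_ms   = frame_time_ms(displayed_fps)      # every frame rendered
framegen_ms = frame_time_ms(displayed_fps / 2)  # half the frames generated

print(f"native 120 fps:    {native_ms:.1f} ms per rendered frame")
print(f"frame-gen 120 fps: {framegen_ms:.1f} ms per rendered frame")
```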
 
How did we get here? How can a new-gen High End card (say a 5060) be worse in performance than an Enthusiast Class (say a 3080) from 2 generations prior? It just seems wrong.

Am I nuts for thinking so?
Things won't improve much, until PS6 comes out.
 
With GPU pricing going higher and higher, I find it possible for the first time that PC gaming could die at some point. Especially when developers don't optimize games for shit anymore because of all those scaling technologies.
Right, there is no point. Why buy a game that runs like ass on a 5090? It makes PC gaming not even worth it.

At some point I think I will go back to consoles. For 500 to 800 dollars everything just works; sitting on a couch with a controller and my buddies/wife seems all the more alluring when PC games run like total ass on the latest hardware. Junk software is what we get from the copy-and-paste GitHub bro coders now.
 
sitting on a couch with a controller

I grew up on consoles loving a controller, then moved to keyboard and mouse, and I don't think I'll be going back to a controller even if I have shit hardware inside my case. I just fell in love with KB and mouse, especially for first-person shooters.

I still use a controller, but for MAME games like Mortal Kombat, Street Fighter II and the like :)

But yes, I agree PCs are getting expensive, like when they first started. There was a time when it was fairly cheap to build - the i5 2500K era, give or take.

The 5090 is still around $7K AUD here. Waiting for prices to drop, if that ever happens...
 