In the US, the average farmer exists on subsidies, and has a gross profit margin of 12-15%. If I do the math, Nvidia is telling me they are 5 to 6.25 times as important as the average farmer.
First, you're confusing gross and net profit margins: NVidia's net margin is about 55%, and the margin on their consumer cards is less than half that -- perhaps 25%. Historically their margins have been much lower: in the 12% range a decade ago, and negative (losing money) five years before that. And a decade from now, these high margins will have attracted so much new competition that NVidia will be back down to those levels. And while the average farmer has a net margin of about 12%, the most profitable farms have operating margins that top 50%:
Large-scale family farms were more likely to have stronger financial performance than other farms, according to USDA, Economic Research Service (ERS) researchers using data from the 2011–20 Agricultural Resource Management surveys (ARMS). ERS researchers categorized farms as low risk if they had...
www.ers.usda.gov
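To make the like-for-like arithmetic explicit, here's a quick sketch using the rough figures quoted above (thread estimates, not audited numbers):

```python
# A like-for-like redo of the original post's ratio. All inputs are the
# rough estimates cited in this thread, not audited financials.

farmer_gross_margin = 0.14   # midpoint of the 12-15% gross figure quoted
farmer_net_margin = 0.12     # average US farm, net
nvidia_gross_margin = 0.75   # the figure implied by the original "5-6.25x" math
nvidia_net_margin = 0.55     # company-wide net, approx.
consumer_card_margin = 0.25  # rough margin on consumer cards, approx.

# Compare gross vs. gross and net vs. net, instead of gross vs. net:
print(nvidia_gross_margin / farmer_gross_margin)   # ~5.4x (gross vs. gross)
print(nvidia_net_margin / farmer_net_margin)       # ~4.6x (net vs. net)
print(consumer_card_margin / farmer_net_margin)    # ~2.1x for consumer cards
```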
But your biggest error is the belief that margins are somehow dictated by 'importance', rather than by a whole host of factors you should have learned in microeconomics. Since NVidia must amortize billions in R&D costs, they could make considerably more profit by lowering the price and selling more cards -- if they could get the capacity from TSMC. Since they can't, lowering the price would simply revert us to the situation in 2021, when artificially low prices led to product shortages, stockouts, and widespread scalping.
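A toy profit model makes the amortization point concrete. Every number below is invented for illustration; only the shape of the tradeoff matters:

```python
# Toy model: fixed R&D is amortized across every unit sold, so total profit
# can rise when price falls -- provided the extra units can actually be built.
# All numbers are hypothetical.

rd_cost = 8e9      # fixed R&D to recover, in dollars (hypothetical)
unit_cost = 300    # marginal cost to build one card (hypothetical)

def profit(price, units):
    return units * (price - unit_cost) - rd_cost

print(profit(price=600, units=40e6))  # $4.0B at a high price, capped volume
print(profit(price=500, units=70e6))  # $6.0B at a lower price -- IF TSMC
                                      # could supply the extra 30M units
print(profit(price=500, units=40e6))  # $0 if volume can't grow: cutting the
                                      # price just empties the shelves (2021)
```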
That registers to me as a gigantic middle finger when they want to charge $50 extra ($429 vs. $379) for less than $10 worth of DRAM chips.
Again, this reveals a deep misunderstanding not just of the GPU market, but of manufacturing in general. The sole difference between these two cards is *not* "$10 in DRAM chips". The 8GB variant has its own design costs and manufacturing-run startup costs, plus the cost to design, order, and print packaging that, while nearly identical, isn't. Many countries impose per-product registration and certification costs, and the variant requires separate inventory, tracking, and -- to a more limited degree -- sales and marketing. And since the 8GB variant will sell in lower volume than the 16GB, all those fixed costs must be amortized against a smaller number of units, so its gross margin must be set higher.
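Here's a back-of-the-envelope version of that amortization, again with invented numbers -- the structure is the point, not the figures:

```python
# Toy SKU pricing: each variant carries its own fixed overhead (design,
# certification, packaging, inventory), spread over its expected volume.
# All numbers are hypothetical.

def breakeven_price(marginal_cost, sku_fixed_costs, expected_units):
    return marginal_cost + sku_fixed_costs / expected_units

fixed_per_sku = 25e6  # hypothetical per-variant fixed overhead

# The 8GB card saves ~$10 of DRAM but sells in far lower volume:
print(breakeven_price(350, fixed_per_sku, expected_units=5e6))  # 16GB: $355
print(breakeven_price(340, fixed_per_sku, expected_units=1e6))  # 8GB:  $365
```

With these (made-up) volumes, the card with the cheaper parts actually needs the *higher* break-even price -- the gap between SKUs need not track the parts-cost gap at all.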
But even if none of that were true, the complaint ignores reality, and a concept known as the equilibrium price. NVidia has only a certain number of GPUs. If it sets a $10 differential between the 8GB and 16GB cards, then everyone wants the larger card -- the shelves run bare while 8GB boxes gather dust. What then? Scalpers step in and start reselling the 16GB at a higher price, while desperate retailers begin selling the 8GB below MSRP. And then guess what? You wind up with the exact same situation -- a large price gap between the two cards. In fact, economic theory predicts that, due to deadweight losses, that gap will be even larger than if NVidia does what it's doing now: pricing the models so that both sell out to meet production -- but only just.
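A stylized sketch of that equilibrium argument, with invented linear demand curves (the dollar figures mean nothing; the mechanism is the point):

```python
# With fixed supply of each SKU, the market-clearing gap between the cards
# is set by demand, not by the MSRP delta printed on the box.
# Demand curves below are entirely hypothetical.

def clearing_price(intercept, slope, supply):
    # Linear demand: quantity = intercept - slope * price.
    # Solve quantity == supply for the market-clearing price.
    return (intercept - supply) / slope

# Hypothetical demand: buyers value the 16GB card much more highly.
p16 = clearing_price(intercept=10e6, slope=10e3, supply=4e6)  # -> $600
p8  = clearing_price(intercept=6e6,  slope=10e3, supply=4e6)  # -> $200

print(p16 - p8)  # ~$400 street-price gap, regardless of a $10 MSRP gap
```

Whatever the MSRP delta says, street prices clear at whatever demand against fixed supply dictates; a $10 MSRP gap just hands the difference to scalpers.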