
Jensen Huang Discloses NVIDIA Blackwell GPU Pricing: $30,000 to $40,000

T0@st

News Editor
Jensen Huang has been talking to media outlets following the conclusion of his keynote presentation at NVIDIA's GTC 2024 conference, and a CNBC TV "exclusive" interview with the Team Green boss has caused a stir in tech circles. Jim Cramer's long-running "Squawk on the Street" trade segment hosted Huang for just under five minutes; the presenter labelled the latest edition of GTC the "Woodstock of AI." NVIDIA's leader reckoned that around $1 trillion worth of industry was in attendance at this year's event, with folks turning up to witness the unveiling of "Blackwell" B200 and GB200 AI GPUs. In the interview, Huang estimated that his company had invested around $10 billion into the research and development of its latest architecture: "we had to invent some new technology to make it possible."

Industry watchers have seized on a major disclosure from the televised CNBC report: Huang revealed that his next-gen AI GPUs "will cost between $30,000 and $40,000 per unit." NVIDIA (and its rivals) are not known to publicly announce price ranges for AI and HPC chips; leaks from hardware partners and individuals within industry supply chains are the "usual" sources. An investment banking company has already delved into alleged Blackwell production costs, as shared by Tae Kim/firstadopter: "Raymond James estimates it will cost NVIDIA more than $6000 to make a B200 and they will price the GPU at a 50-60% premium to H100...(the bank) estimates it costs NVIDIA $3320 to make the H100, which is then sold to customers for $25,000 to $30,000." Huang's disclosure should be treated as an approximation, since his company (normally) deals with the supply of basic building blocks.
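Taken at face value, the quoted figures imply very fat gross margins. A quick back-of-envelope check, using the Raymond James cost estimates above; note that the $27,500 and $35,000 selling prices are simply my own midpoints of the quoted $25k-30k and $30k-40k ranges, not reported numbers:

```python
# Back-of-envelope gross margins from the figures quoted in the article.
# Costs are Raymond James estimates (via Tae Kim); the selling prices
# used here are assumed midpoints of the quoted price ranges.

def gross_margin(price, cost):
    """Gross margin as a fraction of the selling price."""
    return (price - cost) / price

h100 = gross_margin(price=27_500, cost=3_320)  # midpoint of $25k-$30k
b200 = gross_margin(price=35_000, cost=6_000)  # midpoint of $30k-$40k

print(f"H100 gross margin: {h100:.0%}")  # ~88%
print(f"B200 gross margin: {b200:.0%}")  # ~83%
```

Both figures are per-chip manufacturing margins only; they ignore R&D, packaging, software, and the fact that most Blackwell units will ship inside complete systems.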



View at TechPowerUp Main Site | Source
 
If it costs $30k they are starting to give up some of their obscene profit margins.

I'm still searching for an explanation as to why they didn't produce this new chip at 3 nm. These R&D numbers seem even stranger when you consider that each chip offers minimal architectural advancement; the biggest improvement comes from gluing two chips together...
 
we had to invent some new technology to make it possible."
Yea, and it's called "The NEW Jacket financing plan" or "Fleecing da AI monster- 2024 & Beyond", hahahaha

 
If it costs $30k they are starting to give up some of their obscene profit margins.

I'm still searching for an explanation as to why they didn't produce this new chip at 3 nm. These R&D numbers seem even stranger when you consider that each chip offers minimal architectural advancement; the biggest improvement comes from gluing two chips together...
He's BS'ing everyone with that R&D cost, 70%+ margin... lol.
 
He's BS'ing everyone with that R&D cost, 70%+ margin... lol.

Given that the chip costs ~$6k to make, it's probably 80%+, lmao. I mean, if it was me, I'd be charging as much as companies are willing to spend, so I can't really blame them.
 
Going by those prices the consumer 5090 will be $2000 easily :eek:
 
Total BS from a total BS human. We are looking at another ±30% price increase over the awful current gen. $1,000 for a midrange card.
 
I'm still searching for an explanation as to why they didn't produce this new chip at 3nm.

If the chips were backported from 3nm to 5nm+, that means they failed to comply with the initial performance metrics. Maybe 3nm is broken for such large and extremely power hungry chips?

Total BS from a total BS human. We are looking at another ±30% price increase over the awful current gen. $1,000 for a midrange card.

That won't work, because there is a red line for gamers beyond which sales will be physically impossible.
You see, you can't sell a garbage RTX 5060 Ti that is 5% faster than an RTX 4070 S for $1,000 if the latter already costs around $600.
You need a very significant performance increase, which won't happen... because greed.
But greed means that either way they are approaching a wall that will make the whole GPU segment DOA.
 
Given that the chip cost 6k ish to make it's probably 80%+ lmao. I mean if it was me I'd be charging as much as companies are willing to spend so can't really blame them.
Also worth keeping in mind that, unlike us poor mortals, the people buying these are probably getting their money back and more. If I were selling something that allows people to make millions, I would charge accordingly, TBH.
 
Hi,
Leather jacket man does this at least twice a day lol
 
If it was easy to R&D and produce a chip that capable, more companies would have done it already.
nVidia can charge as much as they want as long as no one else is capable enough to compete with them.

I would really like to see what nVidia could produce if they had an x86 license.
 
If it was easy to R&D and produce a chip that capable, more companies would have done it already.
nVidia can charge as much as they want as long as no one else is capable enough to compete with them.

I would really like to see what nVidia could produce if they had an x86 license.
They have an ARM license and are supposedly going to release a very competitive CPU this year. Time will tell, I guess...
 
I think Nvidia is sort of tipping its hand with regard to technical limitations by demonstrating that they have two big dies that act as one.

It's similar to AMD's Zen 1 in that both dies are the same, but different in that Zen 1 chips are designed as parts of a whole and are modular in nature, which is why AMD was able to add additional chiplets and have them interoperate. Meanwhile, the Nvidia design appears to be fixed in nature: you get two full dies connected. By extension, Nvidia's is not a chiplet architecture.

If Nvidia had the technical knowledge to implement a chiplet architecture, it makes little sense why they'd go with two big dies over smaller dies that would be vastly cheaper to produce with higher yields. In addition, Nvidia is clearly not going to be able to use those massive dies across all product segments. By extension, this means that, like other monolithic products, you'll need different dies for the various SKUs. In a chiplet-based architecture you build your SKUs out of chiplets to address all segments of the market, which allows a lot of flexibility in terms of which parts of the market you want to allocate product to, and it allows you to bin chiplets across your entire yield and not just the yield of a specific SKU. It appears Nvidia's design lacks modularity and scalability, which fundamentally makes it two monolithic dies and not chiplets.
 
They have an ARM license and are supposedly going to release a very competitive CPU this year. Time will tell, I guess...

The same for ARM. ARM is not allowed to produce x86 CPUs.
Basically both ARM and nVidia would destroy Intel and AMD...
 
If it was easy to R&D and produce a chip that capable, more companies would have done it already.
nVidia can charge as much as they want as long as no one else is capable enough to compete them.

I would really like to see if nVidia had x86 license, what they could produce.
Nvidia's previous attempts at CPUs haven't been earth-shattering, so I doubt they would have done any better than AMD or Intel at x86 CPUs.
 
The same for ARM. ARM is not allowed to produce x86 CPUs.
Basically both ARM and nVidia would destroy Intel and AMD...
ARM on the consumer desktop won't ever happen, and probably won't go very far outside of niche use cases in the datacenter either. There is no ARM architecture or product that is as fast as a 96-core Zen 4. ARM can't even compete against the performance of x86/AMD64 with its own products.

nGreedia are trying to argue that their new ARM servers are amazing, but all I can see is an expensive platform with significantly less performance than AMD's Epyc.
 