
NVIDIA B100 "Blackwell" AI GPU Technical Details Leak Out

Well, I can think of a few models where AI can dominate or be extremely talented.

It's just in its very early years - wait until you can have AI develop your next CPU/GPU without any human intervention at all. A zillion possibilities, really.
 
Because of the cost though, these are only an option for the mega corps or mega rich.
 
I wonder what AMD's response to this will be... Well, time to get ready for whatever the MI350X brings to the table, because this is definitely going to require something new for AMD to compete with it. The current MI300X is a bit better than the H200, if I remember correctly.
Considering they are already working with chiplets, they can do 1.5 times or even twice the MI300, plus 1.5 times or twice the HBM, at probably 1000-1200 W, and call it a day.
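As a rough back-of-envelope (the 192 GB and 750 W MI300X figures are published specs; the 1.5x/2x scaling is just my speculation above, and the 0.7-0.8 power multiplier is purely an assumption):

# Back-of-envelope MI350X speculation in Python, NOT leaked figures.
MI300X_HBM_GB = 192   # published MI300X HBM3 capacity
MI300X_TBP_W = 750    # published MI300X total board power

for scale in (1.5, 2.0):
    hbm = MI300X_HBM_GB * scale
    # Assume the next process/packaging step claws back 20-30%
    # of a purely linear power increase (my assumption).
    w_lo = MI300X_TBP_W * scale * 0.7
    w_hi = MI300X_TBP_W * scale * 0.8
    print(f"{scale}x: ~{hbm:.0f} GB HBM, ~{w_lo:.0f}-{w_hi:.0f} W")

At 2x that lands right in the 1000-1200 W ballpark.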
 
Because of the cost though, these are only an option for the mega corps or mega rich.

You do know that tech in today's Formula 1, for example, will sooner or later reach consumer cars, right?

Takes a couple of years.
 
You need to work on your reading comprehension, it seems; it's a joke upon the joke in the last line of the Twitter post... come on, man.
The only time it is written here on TechPowerUp, either in the news article itself or in the images in this article, is by the staff member. I don't follow links off this site to Twitter, sorry not sorry. It may have proved wiser to include a screenshot of the post that said it if the context was to be apparent. There is nothing wrong with my reading comprehension.
 
The only time it is written here on TechPowerUp, either in the news article itself or in the images in this article, is by the staff member. I don't follow links off this site to Twitter, sorry not sorry. It may have proved wiser to include a screenshot of the post that said it if the context was to be apparent. There is nothing wrong with my reading comprehension.
What... the image of the post is right there under the article... with an added visual joke from the poster, to boot.
 
Why so scared of Jensen, running away D.Labelle style? Who knows. This leak is clearly intentional and controlled, and not so detailed at all. Multi-chip and HBM3E are well known by now. The predecessor H200 had 141 GB already. So what's new? Just pumping it up a day before for more impact.
 
What... the image of the post is right there under the article... with an added visual joke from the poster, to boot.
And where in that picture does it say "leather-jacketed one"?
 
And where in that picture does it say "leather-jacketed one"?
No, that is the joke upon the joke from the Twitter post. Idk what is going on here; are you offended that the common joke association of Jensen and his eternally unchanging wardrobe of leather jackets exists?
 
No, that is the joke upon the joke from the Twitter post. Idk what is going on here; are you offended that the common joke association of Jensen and his eternally unchanging wardrobe of leather jackets exists?
For a news post from a staff member, I find this content in the article itself to be in very poor taste.

I expect it from a considerable few in the user base, but not the staff; you guys can do better than that.
The context helps, and could/should have been included if it's a reference to a quote from the post that was not shown.
 
The context helps, and could/should have been included if it's a reference to a quote from the post that was not shown.

I mean, idk how much more reference you need; TPU reported news partly based on what the Twitter post stated, added a hyperlink to the post in the article, AND added a screenshot of the Twitter post underneath...

Apart from that, you did not answer the question: are you offended by the association joke about Jensen and his leather jackets, based on the fact that that is the only thing he ever wears (willingly)?
And is that the catalyst for the negative criticism of the reporting in this article?
 
I've already made my thoughts on the whole situation perfectly clear in my previous replies, this is going nowhere. Please stop trying to argue your point.
 
Maybe Jensen is a wannabe biker, or it's his midlife crisis jacket.
 
Considering they are already working with chiplets, they can do 1.5 times or even twice the MI300, plus 1.5 times or twice the HBM, at probably 1000-1200 W, and call it a day.
Yes... funny how efficient using technology designed to be future-proof and scalable can be, instead of just pouring countless billions into evolving a monolithic architecture, isn't it?
(This is a jab at Nvidia, not the user I'm replying to)
 
I, for one, am very reassured that the leather-jacketed one works for Nvidia. It makes them cooler.


Ayyyyyyy!

Let us not forget American history and the church of the Fonz - this is the ONLY leather-jacketed one. Amen.

Technically, we have 4x Confirmed Jackets: Brando, James Dean, The Fonz and The Terminator (Cameron got us joint custody).
The UK however, has Rob Halford, and being a Metal God nets him at least 2-3x Jackets.

As for Jensen...no, he does not get to be in the All-Time Cool Guy Leather Jacket Club.

(Schoolhouse Rock reaction GIF)
 
If I sipped a drink every time I read the word AI, I would be dead by the end of the day.
Totally. All this massive push Nvidia is trying to give this hype train makes me think it's not going as fast as they'd want it to.

By the way, is Blackwell not their new mainstream architecture? I'm confused now.
 
I wonder how many of the latest-gen consoles in series it would take to match the performance of this GPU?
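For a crude sense of scale in raw FP32 only (the PS5 figure is the published one; the B100 number below is a placeholder assumption, since the leak doesn't confirm FLOPS):

# Crude consoles-per-GPU estimate. PS5 FP32 is the published figure;
# the B100 value is a PLACEHOLDER assumption, not from the leak.
PS5_FP32_TFLOPS = 10.28
B100_FP32_TFLOPS = 60.0  # hypothetical stand-in

ratio = B100_FP32_TFLOPS / PS5_FP32_TFLOPS
print(f"~{ratio:.0f} PS5-class consoles per GPU (FP32 only; "
      "AI tensor throughput would widen the gap far beyond this)")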

For pretty much every game I can get consistently over 120 fps. Also, I don't get screen tearing or any artifacts at 999+ fps in anything, including games such as Rocket League at 4K with my RTX 4090. But when unlocked it uses 99% GPU and about 450 W.

So, cap it down to 240 fps, or at 120 fps it uses 80 watts.

That's the benefit of a 4090 over a 4080. While it has a higher TDP and potential performance, it can also use a lot less power than other GPUs if you cap it.
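If anyone wants to check numbers like these themselves, NVML exposes live board power; a minimal sketch using the pynvml bindings (pip install nvidia-ml-py), assuming the card is GPU index 0:

# Poll live GPU power draw via NVML. Assumes GPU index 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    for _ in range(10):
        draw_mw = pynvml.nvmlDeviceGetPowerUsage(handle)           # milliwatts
        limit_mw = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle)  # milliwatts
        print(f"draw {draw_mw / 1000:.1f} W / limit {limit_mw / 1000:.1f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()

Run it with the frame cap on and off and you can watch the draw drop the way I described.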

Either way, it seems too soon to release a 5090 while prices are still an extra zero too high. There are no new games coming out that require a 4090 killer, consoles need a massive refresh (or all new games will become more cartoony so consoles can have the 4K 100+ fps experience), and there is no new GTA or Cyberpunk coming out this year.

However, it has been a few years since the Nvidia Marbles demonstration was shown; if it is released to test and I don't get 120+ fps at 4K with my 4090 @ 2910-3000 MHz (typical core clock), then I'd see the point.
 