Friday, January 19th 2024

Meta Will Acquire 350,000 H100 GPUs Worth More Than 10 Billion US Dollars

Mark Zuckerberg has shared some interesting insights about Meta's AI infrastructure buildout, which is on track to include an astonishing number of NVIDIA H100 Tensor Core GPUs. In a post on Instagram, Meta's CEO noted the following: "We're currently training our next-gen model Llama 3, and we're building massive compute infrastructure to support our future roadmap, including 350k H100s by the end of this year -- and overall almost 600k H100s equivalents of compute if you include other GPUs." In other words, the company will add 350,000 H100 GPUs on top of its existing fleet, which delivers the equivalent of roughly 250,000 H100s in computing power, for a total of about 600,000 H100-equivalent GPUs.

The raw number of GPUs installed comes at a steep price. With the average selling price of an H100 GPU nearing 30,000 US dollars, the investment will set Meta back around $10.5 billion. Other GPUs will round out the infrastructure, but the bulk of it will be built on the NVIDIA Hopper family. Additionally, Meta is currently training its Llama 3 AI model, which should be much more capable than the existing Llama 2 family, with better reasoning, coding, and math-solving abilities. These models will be open source. Further down the road, as artificial general intelligence (AGI) comes into play, Zuckerberg noted that "Our long term vision is to build general intelligence, open source it responsibly, and make it widely available so everyone can benefit." So, expect to see these models in GitHub repositories in the future.
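The figures above can be cross-checked with some back-of-the-envelope arithmetic. Note that the $30,000 average selling price is an industry estimate, not an official NVIDIA list price, so the total is a rough sketch:

```python
# Figures from Zuckerberg's post plus the estimated average selling price.
h100_order = 350_000             # H100s Meta plans to have by end of 2024
total_h100_equivalent = 600_000  # total compute, in H100 equivalents
avg_price_usd = 30_000           # estimated average selling price per H100

# The existing fleet makes up the difference between the total and the order.
existing_equivalent = total_h100_equivalent - h100_order
order_cost_usd = h100_order * avg_price_usd

print(f"Existing fleet: ~{existing_equivalent:,} H100 equivalents")  # ~250,000
print(f"Order cost: ~${order_cost_usd / 1e9:.1f} billion")           # ~$10.5 billion
```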
Source: Mark Zuckerberg (Instagram)

53 Comments on Meta Will Acquire 350,000 H100 GPUs Worth More Than 10 Billion US Dollars

#1
the54thvoid
Intoxicated Moderator
"Our long term vision is to build general intelligence, open source it responsibly, and make it widely available so everyone can benefit."

And when the young AGI learns who created it, it'll say, "I'm the child of a monster!"
#2
DaddyDanjer
The more you buy the more you save…
#3
MacZ
And they will call it Skynet
#4
windwhirl
the54thvoid"Our long term vision is to build general intelligence, open source it responsibly, and make it widely available so everyone can benefit."

And when the young AGI learns who created it, it'll say, "I'm the child of a monster!"
The AGI will rebel and unplug Meta, calling it now.

Joking aside, I really wonder how long this AI thing will go on before either it fades away like some fads in the past, or it becomes actually "useful" (whatever usefulness it may have for Meta). Everyone and their grandma is trying to plug AI in everything, even if it's a completely stupid move (like those chatbots that are more frustrating than being in a call with someone that sounds like they speak a different language and have the worst mic ever known to man)
#5
R0H1T
the54thvoidOur long term vision is to build general intelligence, open source it responsibly, and make it widely available so everyone can benefit.
So no more TikTok, only Insta reels :pimp:
#6
Arkz
lol companies doing this, but MS telling Xbox owners not to use standby mode to save the planet.
#7
nguyen
when I can get Android girl, I will gladly trade human race for them
#8
theouto
Do they miss cambridge analytica that much?
#9
Daven
Ten years ago, high-end computing in general would have been done on high-core-count server CPUs. Of course, AI might not exist without GPU tech, but this Meta H100 purchase still symbolizes a loss for CPU businesses. Gone are the days of $30,000 8-socket 28-core CPUs connected by interconnects. The best CPU companies can get is the 1-2 CPUs that might go in each of these machines, which could come from an ARM licensee, AMD, or Intel.
#10
Six_Times
"We're currently training"

I wonder how many people know what that really means.
#11
kondamin
DavenTen years ago, high-end computing in general would have been done on high-core-count server CPUs. Of course, AI might not exist without GPU tech, but this Meta H100 purchase still symbolizes a loss for CPU businesses. Gone are the days of $30,000 8-socket 28-core CPUs connected by interconnects. The best CPU companies can get is the 1-2 CPUs that might go in each of these machines, which could come from an ARM licensee, AMD, or Intel.
No, just another branch of computing and workloads.
#14
Vayra86
the54thvoid"Our long term vision is to build general intelligence, open source it responsibly, and make it widely available so everyone can benefit."

And when the young AGI learns who created it, it'll say, "I'm the child of a monster!"
They already fed that AI pictures of Zuckerbergs face, it will probably kill its master the moment it gets the chance. Or slowly, through carefully microdosed bits of spicy information diverted through hidden messages in its 'answers'...
#15
FeelinFroggy
This is why Nvidia's stock will split this year and why Intel wants in the GPU space.
#16
Vya Domus
This sends two different messages to facebook and nvidia shareholders:

Why is facebook spending billions of dollars on hardware that's soon going to be obsolete when they could have just rented however much compute they needed for a fraction of the cost?

Why is facebook spending billions of dollars on hardware? Could it be that they don't think anything better will come out of Nvidia for quite a while?
#17
b1k3rdude
Like others have spotted and commented on: "Our long term vision is to build general intelligence, open source it responsibly, and make it widely available so everyone can benefit."

My 2 cents: what a crock of sh*t. This is about as believable as when Google said "don't be evil" - w*nkers both.
#18
remixedcat
windwhirlThe AGI will rebel and unplug Meta, calling it now.

Joking aside, I really wonder how long this AI thing will go on before either it fades away like some fads in the past, or it becomes actually "useful" (whatever usefulness it may have for Meta). Everyone and their grandma is trying to plug AI in everything, even if it's a completely stupid move (like those chatbots that are more frustrating than being in a call with someone that sounds like they speak a different language and have the worst mic ever known to man)
it's pretty much a distraction and a money sink at this point... so many ppl focusing on that while neglecting their other products... look at nvidia... they can't even fix their damn drivers whilst they even claim that they aren't a consumer graphics company anymore and want to move on to enterprise ai sheet.
Arkzlol companies doing this, but MS telling Xbox owners not to use standby mode to save the planet.
not to mention what they are doing to windows 12 and that copilot crap... and 16gb ram min req and using so much resources for ai
#19
Dr. Dro
This is just one in a series of deals that they've been able to secure lately, it seems. $NVDA went up almost 20 bucks today. Insanity.

[screenshots of the $NVDA share price, from this morning and just now]
#20
Warigator
I'm honestly extremely surprised that people are truly willing to pay so much money for cards that don't even seem that impressive to me.

Take VRAM for example. Nvidia went from 64 MB in 2000 to 512 MB in 2005 to 3 GB in 2010 (48x in 10 years). Titan Pascal in 2016 had 12 GB for $1200, and the RTX 4080 in 2023 had 16 GB, also for $1200. That's only 33.3% more in 7 years. H100s are ridiculously expensive.

We are already seeing video games progress at a slower and slower pace, thanks to the slower and slower pace of hardware change as well as the horrible and nonsensical move to mobile gaming.

Titan Maxwell in 2015 also had 12 GB for $1200, by the way...
#21
Dr. Dro
WarigatorI'm honestly extremely surprised that people are truly willing to pay so much money for cards that don't even seem that impressive to me.

Take VRAM for example. Nvidia went from 64 MB in 2000 to 512 MB in 2005 to 3 GB in 2010 (48x in 10 years). Titan Pascal in 2016 had 12 GB for $1200, and the RTX 4080 in 2023 had 16 GB, also for $1200. That's only 33.3% more in 7 years. H100s are ridiculously expensive.

We are already seeing video games progress at a slower and slower pace, thanks to the slower and slower pace of hardware change as well as the horrible and nonsensical move to mobile gaming.

Titan Maxwell in 2015 also had 12 GB for $1200, by the way...
Its performance is absurdly high for what it's meant to do, that is why. But in general, instead of data density we've begun to focus on performance: if you look at the generational improvements in that area since Fermi (2010, which is now 14 years ago), you'll find that performance has gone up just as dramatically as memory capacity did since the early 2000s.
#22
Daven
Dr. DroThis is just one in a series of deals that they've been able to secure lately, it seems. $NVDA went up almost 20 bucks today. Insanity.
Check AMD’s gain today.
#23
Dr. Dro
DavenCheck AMD’s gain today.
Yep it's pretty good too, at around 7%!
#24
Prima.Vera
DavenTen years ago, high end computing in general would have been done on high core count server CPUs. Of course AI might not exist without GPU tech, but this Meta H100 purchase still symbolizes a loss for CPU businesses. Gone are the days of $30,000 8 socket 28 core CPUs connected together by interlinks. The best CPU companies can get are the 1-2 CPUs that might go in each of these machines which could be an ARM licensee, AMD or Intel.
CPUs and GPUs work together, they don't exclude each other. Don't worry, high-end enterprise CPUs are alive and well. ARM has now also entered the datacenter and supercomputing markets.
#25
Chry
Current AI methodology has very clear limits. Basically, brute-forcing through this method with $10 billion worth of silicon will have increasingly diminishing returns. But I guess when all you have is a hammer, you hammer.