
OpenAI Has "Run Out of GPUs" - Sam Altman Mentions Incoming Delivery of "Tens of Thousands"

T0@st

News Editor
Yesterday, OpenAI introduced its "strongest" GPT-4.5 model. A research preview build is only available to paying customers—Pro-tier subscribers fork out $200 a month for early access privileges. The organization's CEO shared an update via social media post, complete with a "necessary" hyping up of version 4.5: "it is the first model that feels like talking to a thoughtful person to me. I have had several moments where I've sat back in my chair and been astonished at getting actual good advice from an AI." There are apparent performance caveats—Sam Altman proceeded to add a short addendum: "this isn't a reasoning model and won't crush benchmarks. It's a different kind of intelligence, and there's a magic to it (that) I haven't felt before. Really excited for people to try it!" OpenAI had planned to make GPT-4.5 available to its audience of "Plus" subscribers, but major hardware shortages have delayed a roll-out to the $20 per month tier.

Altman disclosed his personal disappointment: "bad news: it is a giant, expensive model. We really wanted to launch it to Plus and Pro (customers) at the same time, but we've been growing a lot and are out of GPUs. We will add tens of thousands of GPUs next week, and roll it out to the plus tier then...Hundreds of thousands coming soon, and I'm pretty sure y'all will use every one we can rack up." Insiders believe that OpenAI is finalizing a proprietary AI-crunching solution, but a rumored mass production phase is not expected to kick off until 2026. In the meantime, Altman & Co. remain reliant on NVIDIA for new shipments of AI GPUs. Despite being a very important customer, OpenAI is reportedly not satisfied with the "slow" flow of Team Green's latest DGX B200 and DGX H200 platforms into server facilities. Several big players are developing in-house designs in an attempt to wean themselves off prevalent NVIDIA technologies.



View at TechPowerUp Main Site | Source
 
Cool story bro.
 
gotta keep people believing otherwise it falls apart too soon.
 
Sam Altman announces new ICO... erm I mean... yeah
 
My only question is what in the hell are they doing with all of this AI stuff? I would say concocting novel ways of bending people over while making them smile & milking them for as much money as possible, but I don't think they need AI for that given the general lack of prudence among the masses.
 
Imagine how much money they're burning with all the pointless hardware purchases. They'll be lucky to keep going another 12 months.
 
Then next week another Chinese company comes up with an AI that runs on an MSX and Jensen strangles Sam with his leather jacket.
 
Just needs more money, more time, they swear they will achieve their goal... maybe they and fusion will both be ready in 50 years?
 
Imagine how much money they're burning with all the pointless hardware purchases. They'll be lucky to keep going another 12 months.

As long as M$ is willing to keep paying the bills they can last a long, long time. That won't be forever, though... Can't wait for the headlines trashing them in a couple weeks when Deepseek or any other upstart launches a model that's able to achieve 70, 80 or even 90% of what this does for a fraction of the cost.
 
In the meantime, Altman & Co. are still reliant on NVIDIA for new shipments of AI GPUs. Despite being a very important customer, OpenAI is reportedly not satisfied about the "slow" flow of Team Green's latest DGX B200 and DGX H200 platforms into server facilities.
There is AMD out there, but I guess they don't have the software skills to incorporate AMD GPUs in their systems.
 
There is AMD out there, but I guess they don't have the software skills to incorporate AMD GPUs in their systems.
You sure about that?

Maybe AMD doesn't have the skills to incorporate the hardware needed.. just throwing that out there.
 
There is AMD out there, but I guess they don't have the software skills to incorporate AMD GPUs in their systems.

You sure about that?

Maybe AMD doesn't have the skills to incorporate the hardware needed.. just throwing that out there.
It could also be both. Grifters are, by nature, lazy people, so they'll gravitate to nVidia since CUDA is already well documented and tools already exist. Deepwhale or whatever it was called showed AMD hardware can be very competitive if you build from the ground up. Of course, because only the most driven would ever do that, not having tools ready for clients puts you at a massive disadvantage.
 
CUDA is almost 20 years old. Quite the head start.
 
"Sorry, Sam, we seem to have misplaced some ROPs somewhere. Once we find them, we'll be in touch."
 
You sure about that?

Maybe AMD doesn't have the skills to incorporate the hardware needed.. just throwing that out there.
I see them begging Nvidia for GPUs, but Nvidia has probably given priority to Musk (because, well, he is in the government) and to Microsoft, Google, and Meta. Add Deepseek into the mixture and OpenAI could drop many places from its position as leader in AI.
Also, with OpenAI thinking of building its own chips, and Altman being on bad terms with Musk (who, let me repeat, is in the government), OpenAI probably gets the last place on the priority list of Nvidia's top customers. They either differentiate themselves, or keep begging.
 
They know they can use Radeon Instincts instead, right? LLMs don't need the image generation power of NVIDIA hardware, they could just put all the stuff doing inference on Instincts and reserve the NVIDIA hardware for training. Unless they need more density, then like, what? Are you training a 1T+ model or something?
OpenAI Has "Run Out of GPUs"
... And that's a GOOD thing!
 
"Sorry, Sam, we seem to have misplaced some ROPs somewhere. Once we find them, we'll be in touch."
OMFG what a nightmare. If this ROP issue is way bigger than being communicated, that's a LOT of defects.
Roughly every two dozen cards missing their 8 ROPs adds up to a whole other flagship card's worth gone missing.
That's actually expensive.
 
The More You Buy, The More You Save
 
You sure about that?

Maybe AMD doesn't have the skills to incorporate the hardware needed.. just throwing that out there.
Well, from what I know, in that segment the hardware is okay but the software support still lags behind... but yeah, it would still be smart not to rely on only one supplier. I guess many companies nowadays just rely on just-in-time and a single source of supplies. That's okay as long as everything is fine. ;)
 
gotta keep people believing otherwise it falls apart too soon.
What, you mean the "AI" scam's run its course? No freakin way :laugh:

Imagine how much money they're burning with all the pointless hardware purchases. They'll be lucky to keep going another 12 months.
Imagine the energy, then the job losses, and then maybe rehiring the lot of them because your stupid "AI" can't distinguish between sarcasm and a real question :shadedshu:
 
Imagine how much money they're burning with all the pointless hardware purchases. They'll be lucky to keep going another 12 months.
Yep I think the last figures I heard were they spent nine billion in 2024 to make four billion in revenue. Even their $200/month subscriptions are losing them money.

The main issue with the current AI stuff is the utility of it just isn't there for anything other than psychotic chatbots, pictures of big-titted elf girls, and source code analysis.
 
Yep I think the last figures I heard were they spent nine billion in 2024 to make four billion in revenue. Even their $200/month subscriptions are losing them money.

The main issue with the current AI stuff is the utility of it just isn't there for anything other than psychotic chatbots, pictures of big-titted elf girls, and source code analysis.

More importantly, competitors have shown they can make better models for a few million...

No idea how OpenAI expects to compete; their costs are 100x too high and they'll most probably be undercut into bankruptcy.

Just a matter of time before open source competitors show up trained on Western data, instead of the Chinese stuff.
 