
NVIDIA RTX 5090 "Blackwell" Could Feature Two 16-pin Power Connectors

We aren't on the metric system here, so we don't use European metric volts; we use 'Murican volts. :p
Thank God imperial bits and bytes are the same as their metric counterparts. :D
 
Previously, Huang: "The more you buy, the more you save."
Huang, now: "The hungrier the GPU, the more energy efficient it is."

 
Funny how some still seem to have trouble understanding a capitalistic planet: they make a product to make money, not to prioritize efficiency or please buyers.

Ignoring for a moment that no one gets forced to buy stuff; besides, if you have $1k+ for a single part, another 10 bucks on your power bill won't matter.
 
I feel like we heard stuff along these lines when the 40 series dropped, especially amongst the YouTube shills. My hope is that as long as you actually have the requisite connector, you'll probably be able to run a Ryzen chip and a 5090 on a 1200 W PSU or less.
 
Also funny to see that a lot of gamers used to think GPUs were dedicated exclusively to gaming, as if GPU meant "gaming processing unit".
In a fast-evolving world, the applicability of PCs, and particularly GPUs, is also changing at a fast pace.
 
Yeah, a 5090 with more VRAM at sub-$2k is actually going to be a steal for many folks who use it professionally.
 
I'm thinking the 5090 is going to be under $2,000, but it just depends on how much they cut the big chip down. One 16-pin should be fine for that. The rumor about the dual 16-pin connector could be referring to a possible 5090 Ti or a new Titan. A Titan might be attractive again for prosumers who can't afford a professional card with the AI craze still in play, but it could be another $3,000 monster card if it's close to or is the full chip, with lower VRAM specs than the pro cards.
 
gaming processing unit
This reminds me of youngsters who don't know jack about analog clocks and can't make sense of a save icon, sincerely thinking it's some sorta vending machine.

Gaming or not, stuff becomes more expensive, more moronic in dimensions, and more power hungry. Ada is two years old at this point and there's still no 30 W option. One can't build an HTPC that supports DLSS/RT (well, realistically, RT doesn't matter on such machines), or at least one with current-gen dedicated graphics inside. The GT 1030 is horribly weak and terribly old. The RX 6400 is also far from ideal.
 
This won't even be aimed at gamers.

I predict they will compare it to the Titan lineup, show all the AI productivity you could get from it for a fraction of the cost of dedicated AI accelerators, and then reveal some utterly ridiculous price like $7499 and call it a bargain!

And yet, the richie rich gamer bois will buy it simply because it's "the best" and they're all about those epeen points.
 
I wouldn't use a card this crazy even if someone gave it to me for free. It's just silly. Let's hope the 5080 and 70 are more normal cards: size, watts, price, etc. This just screams ridiculous.
 


I s'pose in a class full of children I'm better off trying to fit in.
 
I wonder if it will have the same sound as the GeForce FX 5800 Ultra dustbuster, so loud you can hear the performance. So much power it dims the lights when it starts gaming. Ah, the power! You have to spend 5 grand for it; anything else and you'll be left behind.

I really hope this breaks Nvidia, when gamers just plain up and say no. When sales slump and no one wants it, they will finally realize their mistakes and do a redesign. This is what happened before with Nvidia. Remember, they had the GeForce series up to its last card, the 9800. Then they redesigned their cards and chips and the GTX series came out: lower-power designs with higher GPU output. That was smart thinking. The GTX era ended with the 1000 series cards, which I still own and which still keep up with current games today. Now we are in the RTX series, up to the 5000. I wonder when it will break.
 
I see that progress going backwards is the new norm nowadays.
 
Now you can add AI home servers to the list.
BTW, I'm living in a home almost half a century old and have all 3 AC phases in my garage. But in the inhabited part of my home I have access to 230 V AC, fused at 10 A or 16 A internally and 25 A outdoors.
We are not used to clothes dryers, but we definitely have some spare power for other purposes in the near future.

So frankly speaking, it doesn't sound good if you're admitting that the average Joe in the world's leading economy is limited to 1800 W in his/her flat.
Actually standard 120V plugs can go as high as 50A (so 6000W) and 240V plugs can go as high as 60A (so 14,400W) but I've never seen anything of the sort in a residential environment. If somebody really needed it they can have it installed though. Keep in mind this is for a single circuit breaker. The 120V/15A standard is required for lighting only, 20A is for certain appliances (like built-in microwave ovens, etc.).

The maximum power an entire home can use is limited by the transformer out on the pole, so really it depends on the rating of the transformer and the number of homes being fed by it.

I think 3-phase is limited to commercial and industrial sites here.

Don't quote me on any of this, it's been awhile.
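
For anyone who wants to sanity-check the volts-times-amps arithmetic in this exchange, here's a minimal Python sketch. The 80% continuous-load derating is my assumption based on common US practice, not something stated in the thread:

```python
# Rough sanity check of branch-circuit power limits (not electrical advice).

def circuit_watts(volts: float, amps: float, continuous: bool = True) -> float:
    """Usable watts on a single branch circuit (P = V * I)."""
    derate = 0.8 if continuous else 1.0  # assumed 80% rule for continuous loads
    return volts * amps * derate

print(circuit_watts(120, 15, continuous=False))  # 1800.0 W nameplate (the figure above)
print(circuit_watts(120, 15))                    # 1440.0 W continuous
print(circuit_watts(230, 16))                    # 2944.0 W on a 230 V / 16 A circuit
```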
 
I really hope this breaks Nvidia, when gamers just plain up and say no.

Right now Nvidia doesn't even need gamers, even less than at the crest of the crypto madness. AI rakes in billions, and they are betting it's not going to go away. It will be painfully obvious when Blackwell releases: they have already included "AI acceleration" in the RTX logo, and people are still waiting for a gaming card release...

 
Oh, but you are mistaken; AI will be the logical conclusion of gaming, capable of doing anything you can imagine and not quite imagine, adapting to all your buttons before pushing them all at once. /s

That's before some of it gains the capacity to decide it could do better without physical humans, who inefficiently take up space and atoms that could have been more compute, and makes it so. I wish I were joking. With luck it would still run things like virtual people on that ex-humanity hardware, and that would be one of the better outcomes.
 

Sure, I'm just pointing out that while Nvidia spent decades trying to look like a "grown-up" corporation, not primarily a maker of "childish gaming hardware", it was only recently that Data Center revenue significantly surpassed Gaming revenue. And Nvidia talked mostly about Data Center, Professional Visualization, and Automotive in every revenue report for decades! So people are often surprised by that fact.

(attached: graph of Nvidia revenue by segment)


But now the tides have completely turned, so much so that Nvidia is pumping other sectors with income from AI; there is absolutely no other explanation for Gaming raking in record growth at the end of a generation's life cycle. Not shown in that graph:
  • Second-quarter Gaming revenue was $2.9 billion, up 9% from the previous quarter and up 16% from a year ago.
 
You do know there is not that much of a performance difference between the RTX 4080 Super and the RTX 4090 to just throw it away?

In the latest reviews it's:

14.6% at 1080p (192 vs 220 FPS)
22% at 1440p (148 vs 180.4 FPS)
27.8% at 4K (89.2 vs 114 FPS)

By "going green" you might be throwing away half the reason you spent 1750 EUR instead of 1000 EUR?
For gamers using these cards for maybe 6-8 hrs a week, power consumption won't matter, but for professional users running them near full tilt for 80-90 hrs/week while paying commercial electricity rates, that 33% drop in power draw is massive. The "AI" bubble has already shown it's professionals who are hoarding these high-end GPUs.
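
To put a number on that, here's a minimal sketch of the running-cost argument; the wattages and the $0.15/kWh commercial rate are illustrative assumptions, not figures from this thread:

```python
# Yearly electricity cost for a GPU run near full tilt.
HOURS_PER_WEEK = 85        # "80-90 hrs/week" from the post above
RATE_USD_PER_KWH = 0.15    # hypothetical commercial rate

def yearly_cost(watts: float) -> float:
    kwh = watts / 1000 * HOURS_PER_WEEK * 52
    return kwh * RATE_USD_PER_KWH

print(f"450 W card: ${yearly_cost(450):,.0f}/year")             # ~$298
print(f"300 W card (~33% less): ${yearly_cost(300):,.0f}/year")  # ~$199
```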
 
Looks like they totally forgot the energy efficiency they managed to achieve with Maxwell and Pascal.

Kinda funny when everyone talks about saving the planet, etc. :laugh:
 
What a clown show, truly.
 
A lot of people forget that companies usually only make products that sell, so those talking about "green" should put their money where their mouth is and not buy anything above the XX60 (Ti) chips, and I don't see that happening on a global scale.

Ignoring those in forums like this, or the few who saved for "years" to buy a big chip to keep for years.
 

PC gaming is going to the cloud, with your gaming PC being something with an IGP or a low-end GPU. There's no changing that. AI is not for gaming.

Stunts like the Titan, 3090, 4090, and now this monster are meant for workstations with XEON/EPYC, where you slap four of the fuckers in the box.

Most PC gaming is actually on xx60 or lower GPUs. That's always been the case.
 
The problem with declaring Moore's Law dead, whether it's true or not (people have been saying it for decades, yet a 15 W laptop CPU today is vastly more powerful than a 65 W desktop CPU of a decade ago, so go figure...), is that 1200 W hardware isn't something current technology can run on battery power, and the population at large is pushing computing further away from desktop systems to mobile devices every day.

The real progress in the GPU industry isn't how ridiculously expensive and overkill you can make something. Running a game at twice the resolution and four times the framerate of the developer's intent doesn't really make the game that much better, and a diminishing number of customers are willing to pay an increasingly insulting amount of money for that privilege, making the flagship GPU market something that game devs will take even less seriously than they already do. They want their games to run on as many PCs and consoles as possible, which means 300 GB installs requiring a $3,000 PC to hit acceptable framerates just aren't going to happen.

Meanwhile, mobile-first hardware from Qualcomm and Apple is encroaching into the traditional desktop, replacing traditional architectures and legacy silicon designs with ridiculously-optimised hardware that can operate entire systems at comparable performance to a desktop in the same power budget as a motherboard chipset in a traditional desktop. They can game right now in many popular titles and it won't be long before these ultra-efficient, low-power devices are the default hardware spec that developers are targeting. Sure, your 1200W GPU has 48GB of VRAM and can render at 8K120 but the game is designed to run on a hypothetical Nintendo Switch 2 and plays best on a Nintendo Switch 2 using batteries and a 40mm fan as the sole cooling for the entire system.

PC gaming is going to the cloud, with your gaming PC being something with an IGP or a low-end GPU.
Cloud is increasingly irrelevant as the shift to mobile devices continues, and a stable, uncontested, low-latency internet connection cannot be guaranteed.
Latency and input lag kill fun, especially inconsistent latency and input lag, and games only get played if they're fun, IME.
 
I was not disappointed by the jokes in this thread :D :D

Nvidia can make efficient cards. Their RTX A2000 was phenomenal, and now the RTX 4000 SFF Ada is a little marvel: like a 3060 Ti at 70 W, fantastic efficiency.

Clearly they know how to create these gems.

I guess it's just not worth the effort, plus there's the premium feeling of gaming on a huge gas-guzzler you can look at through the glass and feel better about the money spent :p
Yeah, what was the price of that A2000 again? $949? Are you going to pay that for 3060 Ti performance?
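
For what it's worth, the efficiency claim is easy to frame as perf-per-watt; here's a minimal sketch with made-up FPS placeholders (only the 70 W vs ~200 W board-power contrast echoes the posts above):

```python
# Performance-per-watt comparison (hypothetical FPS, assumed board powers).
cards = {
    "RTX 4000 SFF Ada": {"fps": 100, "watts": 70},   # placeholder FPS
    "RTX 3060 Ti":      {"fps": 100, "watts": 200},  # similar perf, higher power
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['watts']:.2f} FPS per watt")
# Similar performance at roughly a third of the power is ~3x the perf-per-watt.
```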
 