Tuesday, September 24th 2024

NVIDIA RTX 5090 "Blackwell" Could Feature Two 16-pin Power Connectors

NVIDIA CEO Jensen Huang never misses an opportunity to remind us that Moore's Law is dead, and that future generations of logic hardware will only get larger, hotter, and hungrier for power. NVIDIA's next-generation "Blackwell" graphics architecture promises certain architecture-level performance/Watt improvements, coupled with node-level performance/Watt gains from the switch to the TSMC 4NP (4 nm-class) node. Even so, the GeForce RTX 5090, the part that succeeds the current RTX 4090, will be a power-hungry GPU, with rumors suggesting the need for two 16-pin power inputs.

TweakTown reports that the RTX 5090 could come with two 16-pin power connectors, which would give the card the theoretical ability to pull 1,200 W continuous. This doesn't mean the GPU's total graphics power (TGP) is 1,200 W, but rather a number close to or greater than 600 W, which calls for two connectors. Even if the TGP is exactly 600 W, NVIDIA would want to deploy two inputs to spread the load between the connectors and improve their physical resilience. Both connectors will likely be rated for 600 W input, so end-users can't mix them up, as could happen if one were rated for 600 W and the other keyed to 150 W or 300 W.
Above is a quick Photoshop mock-up by TweakTown of what such a card could look like. The requirement of two 16-pin connectors should rule out older PSU types; NVIDIA will likely include just one adapter that converts two or three 8-pin PCIe power connectors to a 16-pin, with the other input expected to come natively from a 600 W connector on an ATX 3.0 or ATX 3.1 PSU. Most newer-generation ATX 3.0/3.1 PSUs on the market have only one native 16-pin connector, plus three or four additional 8-pin PCIe power connectors. As for the connector itself, it will very likely be a 12V-2x6 with backward compatibility for 12VHPWR.
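As a back-of-the-envelope check, the connector math above can be sketched in a few lines; the connector and slot ratings used here are the standard limits, not confirmed RTX 5090 specifications:

```python
# Theoretical power available to a card with two 16-pin (12V-2x6 / 12VHPWR)
# inputs plus the PCIe slot. Illustrative only, not an official spec.
CONNECTOR_LIMIT_W = 600   # max continuous rating of one 16-pin connector
SLOT_LIMIT_W = 75         # power a PCIe x16 slot can deliver on its own
connectors = 2

from_connectors = connectors * CONNECTOR_LIMIT_W
total_w = from_connectors + SLOT_LIMIT_W
print(f"Connectors alone: {from_connectors} W")   # 1200 W
print(f"With slot power:  {total_w} W")           # 1275 W
```

As the article notes, a 1,200 W connector budget doesn't imply a 1,200 W TGP; it simply means the board's power target sits too close to a single connector's 600 W limit for comfort.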

Some PSU manufacturers are beginning to release high-Wattage models with two native 12V-2x6 connectors, typically rated over 1,300 W. The Seasonic Prime PX-2200 (2,200 W), released earlier this week, is an extreme example of this trend: besides its high Wattage, it offers as many as four 12V-2x6 connectors. Another recent example is the MSI MEG AI1600T PCIE5 (1,600 W), with two native 600 W 12V-2x6 connectors.
Source: TweakTown

109 Comments on NVIDIA RTX 5090 "Blackwell" Could Feature Two 16-pin Power Connectors

#76
Makaveli
Beginner Macro DevicePreviously, Huang: "The more you buy, the more you save."
Huang, now: "The hungrier the GPU, the more energy efficient it is."
#77
Waldorf
funny how some still seem to have trouble understanding a capitalistic planet: they make a product to make money, not to prioritize efficiency or please buyers.

ignoring for a moment that no one is forced to buy stuff; besides, if you have $1k+ for a single part, another 10 bucks on your power bill won't matter.
#78
Baccala
I feel like we heard stuff along these lines when the 40 series dropped, especially amongst the YouTube shills. My hope is that as long as you actually have the requisite connectors, you will probably be able to run a Ryzen chip and a 5090 on a 1200 W PSU or less.
#79
pk67
Waldorffunny how some still seem to have trouble understanding a capitalistic planet: they make a product to make money, not to prioritize efficiency or please buyers.

ignoring for a moment that no one is forced to buy stuff; besides, if you have $1k+ for a single part, another 10 bucks on your power bill won't matter.
Also funny is that a lot of gamers still think GPUs are dedicated exclusively to gaming, as if GPU meant "gaming processing unit".
In a fast-evolving world, the applicability of PCs, and particularly GPUs, is also changing at a fast pace.
#80
igormp
pk67Also funny is that a lot of gamers still think GPUs are dedicated exclusively to gaming, as if GPU meant "gaming processing unit".
In a fast-evolving world, the applicability of PCs, and particularly GPUs, is also changing at a fast pace.
Yeah, a 5090 with more VRAM at sub $2k is actually going to be a steal for many folks that use it professionally.
#81
FoulOnWhite
Makaveli
More like, the hungrier the GPU, the more money we make.
#82
64K
I'm thinking the 5090 is going to be under $2,000, but it just depends on how much they cut the big chip down. One 16-pin should be fine for that. The rumor about the dual 16-pin connectors could be referring to a possible 5090 Ti or a new Titan. A Titan might be attractive again for prosumers who can't afford a professional card with the AI craze still in play, but it could be another $3,000 monster if it's close to or is the full chip, with lower VRAM specs than the pro cards.
#83
Beginner Macro Device
pk67gaming processing unit
This reminds me of youngsters who don't know jack about analog clocks and can't make sense of a save icon, sincerely thinking it's some sorta vending machine.

Gaming or not, stuff becomes more expensive, more moronic in dimensions, and more power hungry. Ada is two years old at this point and there's still no 30 W option. One can't build an HTPC that supports DLSS/RT (well, to be realistic, RT doesn't matter on such machines), or at least has next-gen dedicated graphics inside. The GT 1030 is horribly weak and terribly old. The RX 6400 is also far from ideal.
#84
Gmr_Chick
BwazeThis won't even be aimed at gamers.

I predict they will compare it to Titan lineup, show all the AI productivity you could do with it for the fraction of the cost of dedicated AI accelerators, and then reveal some utterly ridiculous price like $7499, and call it a bargain!
And yet, the richie rich gamer bois will buy it simply because it's "the best" and they're all about those epeen points.
#85
starfals
I wouldn't use a card this crazy even if someone gave it to me for free. It's just silly. Let's hope the 5080 and 70 are more normal cards: size, Watts, price, etc. This just screams ridiculous.
#86
wolf
Better Than Native

I s'pose in a class full of children I'm better off trying to fit in
#87
Lycanwolfen
I wonder if it will have the same sound as the GeForce FX 5800 Ultra dustbuster: so loud you can hear the performance, so much power it dims the lights when it starts gaming. Ah, the power; you have to spend 5 grand for it or you will be left behind.

I really hope this breaks Nvidia, when gamers just plain up and say no. When sales slump and no one wants it, they will finally realize their mistakes and do a redesign. This is what happened before with Nvidia. Remember, they had the GeForce series up to the last card, the 9800. Then they redesigned their cards and chips and the GTX series came out: lower-power design and higher GPU output. That was a good idea, smart thinking. The GTX era ended with the 1000-series cards, which I still own, and which still keep up with current games today. Now we are in the RTX series, up to the 5000 now. I wonder when it will break.
#88
Easo
I see progress going backwards is the new norm nowadays.
#89
Prime2515102
pk67Now you can add AI home servers to the list.
BTW I'm living in a home almost half a century old and have all 3 AC phases in my garage. But in the inhabited part of my home I have access to 230 V AC, fused at 10 A or 16 A internally and 25 A outdoors.
We are not used to using clothes dryers, but we definitely have some spare power for different purposes in the near future.

So frankly speaking, it doesn't sound good if you are admitting the average Joe in the leading world economy is limited to 1800 W in his/her flat.
Actually standard 120V plugs can go as high as 50A (so 6000W) and 240V plugs can go as high as 60A (so 14,400W) but I've never seen anything of the sort in a residential environment. If somebody really needed it they can have it installed though. Keep in mind this is for a single circuit breaker. The 120V/15A standard is required for lighting only, 20A is for certain appliances (like built-in microwave ovens, etc.).

The maximum power an entire home can use is limited by the transformer out on the pole, so really it depends on the rating of the transformer and the number of homes being fed by it.

I think 3-phase is limited to commercial and industrial sites here.

Don't quote me on any of this, it's been a while.
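The circuit figures quoted above all follow from P = V × I; a quick sanity check (the ratings are the ones mentioned in the discussion, not an electrical-code reference):

```python
# Power available on a circuit is simply volts times amps (P = V * I).
def watts(volts, amps):
    return volts * amps

print(watts(120, 15))   # standard US 120 V / 15 A lighting circuit -> 1800 W
print(watts(120, 50))   # largest 120 V receptacle mentioned above  -> 6000 W
print(watts(240, 60))   # heavy 240 V / 60 A circuit                -> 14400 W
print(watts(230, 16))   # common European 230 V / 16 A circuit      -> 3680 W
```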
#90
Bwaze
LycanwolfenI really hope this breaks Nvidia when Gamers just plain up and says no.
Right now Nvidia doesn't even need gamers, even less than while on the crest of the crypto madness. AI rakes in billions, and they are betting it's not going to go away. It will be painfully obvious when Blackwell releases; they have already included "AI acceleration" in the RTX logo, and people are still waiting for a gaming card release...

#91
JWNoctis
BwazeRight now Nvidia doesn't even need gamers, even less than while on the crest of the crypto madness. AI rakes in billions, and they are betting it's not going to go away. It will be painfully obvious when Blackwell releases; they have already included "AI acceleration" in the RTX logo, and people are still waiting for a gaming card release...

Oh, but you are mistaken; AI will be the logical conclusion of gaming, capable of doing anything you can imagine and not quite imagine, adapting to all your buttons before pushing them all at once./s

Before some of it gets the capacity to decide it could do better without physical humans, which inefficiently took up space and atoms that could have been more compute, and make it so. I wish I was joking. With luck it would still run things like virtual people on those ex-humanity hardware, and that would be one of the better outcomes.
#92
Bwaze
JWNoctisOh, but you are mistaken; AI will be the logical conclusion of gaming, capable of doing anything you can imagine and not quite imagine, adapting to all your buttons before pushing them all at once./s

Before some of it gets the capacity to decide it could do better without physical humans, which inefficiently took up space and atoms that could have been more compute, and make it so. I wish I was joking. With luck it would still run things like virtual people on those ex-humanity hardware, and that would be one of the better outcomes.
Sure, I'm just pointing out that while Nvidia spent decades trying to look like a "grown-up" corporation, not primarily a maker of "childish gaming hardware," it was only recently that Data Center revenue surpassed Gaming revenue significantly. And Nvidia was talking mostly about Data Center, Professional Visualization, and Automotive in every revenue report for decades! So people are often surprised by that fact.



But now the tides are completely turned - even so much that Nvidia is pumping other sectors with income from AI - there is absolutely no other explanation for Gaming to be raking in record growth at the end of life cycle of a generation. Not shown in that graph:
  • Second-quarter Gaming revenue was $2.9 billion, up 9% from the previous quarter and up 16% from a year ago.
#93
Chaitanya
BwazeYou do know there is not that much performance difference between the RTX 4080 Super and RTX 4090 to just throw it away?

In the latest reviews it's:

14.6% at 1080p (192 vs 230 FPS)
22% at 1440p (148 vs 180.4)
27.8% at 4K (89.2 vs 114)

By "going green" you might be throwing away half the reason you spent 1750 EUR instead of 1000 EUR?
For gamers using these cards maybe 6-8 hrs a week, power consumption won't matter, but for professional users running them near full tilt for 80-90 hrs/week while paying commercial electricity rates, that 33% drop in power draw is massive. The "AI" bubble has already shown it's professionals who are hoarding these high-end GPUs.
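To put rough numbers on that, here is an illustrative yearly-cost estimate; the wattages and the electricity rate below are assumptions for the sake of the example, not measured figures for any specific card:

```python
# Illustrative only: the wattages and rate are assumed, not measured.
HOURS_PER_WEEK = 85     # midpoint of the 80-90 h/week quoted above
WEEKS_PER_YEAR = 52
RATE_PER_KWH = 0.30     # assumed commercial electricity rate, EUR/kWh

def yearly_cost(card_watts):
    kwh = card_watts / 1000 * HOURS_PER_WEEK * WEEKS_PER_YEAR
    return kwh * RATE_PER_KWH

full_power = yearly_cost(450)   # e.g. a flagship card at stock power limit
reduced = yearly_cost(300)      # the same card power-limited by ~33%
print(f"{full_power:.0f} EUR vs {reduced:.0f} EUR,"
      f" saving {full_power - reduced:.0f} EUR/yr")
# -> 597 EUR vs 398 EUR, saving 199 EUR/yr
```

Even under these rough assumptions, the power-limited card saves a few hundred euros per year at professional duty cycles, which is the point being made above.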
#94
Ruru
S.T.A.R.S.
Looks like they totally forgot the energy efficiency they managed to achieve with Maxwell and Pascal.

Kinda funny when everyone talks about saving the planet etc. :laugh:
#96
Waldorf
a lot forget that companies usually only make products that sell, so those talking "green" should put their money where their mouths are and not buy anything above the XX60 (Ti) chips, and I don't see that happening on a global scale.

ignoring those in forums like this, or the few who saved for "years" to buy a big chip to keep for years.
#97
SOAREVERSOR
JWNoctisOh, but you are mistaken; AI will be the logical conclusion of gaming, capable of doing anything you can imagine and not quite imagine, adapting to all your buttons before pushing them all at once./s

Before some of it gets the capacity to decide it could do better without physical humans, which inefficiently took up space and atoms that could have been more compute, and make it so. I wish I was joking. With luck it would still run things like virtual people on those ex-humanity hardware, and that would be one of the better outcomes.
PC gaming is going to the cloud with your gaming PC being something with an IGP or a low end GPU. There's no changing that. AI is not for gaming.

Stunts like the Titan, 3090, 4090, and now this monster are meant for workstations with XEON/EPYC, where you slap four of the fuckers in the box.
Waldorfa lot forget that companies usually only make products that sell, so those talking "green" should put their money where their mouths are and not buy anything above the XX60 (Ti) chips, and I don't see that happening on a global scale.

ignoring those in forums like this, or the few who saved for "years" to buy a big chip to keep for years.
Most PC gaming is actually on xx60 or lower GPUs. That's always been the case.
#98
Chrispy_
The problem with declaring Moore's Law as dead, whether it's true or not (people have been saying it for decades, yet a 15W laptop CPU today is vastly more powerful than a 65W desktop of a decade ago, so go figure...) is that 1200W hardware isn't something current technology can run on battery power, and the population at large are pushing computing further away from desktop systems to mobile devices each day.

The real progress in the GPU industry isn't how ridiculously expensive and overkill you can make something. Running a game at twice the resolution and four times the framerate of the developer's intent doesn't really make the game that much better, and a diminishing number of customers are willing to pay an increasingly-insulting amount of money for that privilege - making the flagship GPU market something that will be taken even less seriously by game devs than it already is. They want their game to run on as many PCs and consoles as possible, which means 300GB game installs requiring a $3000 PC to run at acceptable framerates just aren't going to happen.

Meanwhile, mobile-first hardware from Qualcomm and Apple is encroaching into the traditional desktop, replacing traditional architectures and legacy silicon designs with ridiculously-optimised hardware that can operate entire systems at comparable performance to a desktop in the same power budget as a motherboard chipset in a traditional desktop. They can game right now in many popular titles and it won't be long before these ultra-efficient, low-power devices are the default hardware spec that developers are targeting. Sure, your 1200W GPU has 48GB of VRAM and can render at 8K120 but the game is designed to run on a hypothetical Nintendo Switch 2 and plays best on a Nintendo Switch 2 using batteries and a 40mm fan as the sole cooling for the entire system.
SOAREVERSORPC gaming is going to the cloud with your gaming PC being something with an IGP or a low end GPU.
Cloud is increasingly irrelevant as the shift to mobile devices continues and the dependence on a stable, uncontested, low-latency internet connection cannot be guaranteed.
Latency and input lag kills fun, especially inconsistent latency and input lag - and games only get played if they're fun, IME.
#99
TheinsanegamerN
PLAfillerI was not disappointed by the jokes in this thread :D :D

Nvidia can make efficient cards. Their RTX A2000 was phenomenal and now the RTX A4000 SFF is a little marvel out there, like a 3060ti @70W, fantastic efficiency.

Clearly they know how to create these gems.

I guess it's just not worth the effort plus the premium feeling you are gaming on a huge, gas-guzzler you can look through the glass and feel better about the money spent :p
Yeah, what was the price of that A2000 again? $949? Are you going to pay that for 3060 Ti performance?
#100
Solid State Brain
The A2000 has a 1200 MHz boost clock, while the 3060 Ti's default boost clock is 1665 MHz.
It's almost guaranteed that if you lower your 3060/3060 Ti clocks to A2000 levels you will obtain similar efficiency figures, minus the inconvenience of the additional PCIe power cable.

Meanwhile, if I wanted 48 GB of VRAM at sub-3090 performance, I wouldn't have any choice other than spending $5000 on the A6000 or getting an additional GPU.
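For anyone curious about trying the downclocking experiment above, NVIDIA's stock tooling can do it; a rough sketch (requires admin rights, and the supported clock and power ranges vary per card, so treat the example values as placeholders):

```shell
# Inspect the card's current draw and supported power-limit range first.
nvidia-smi -q -d POWER

# Lower the board power limit (watts; must be within the min/max range
# reported above). Example: cap a 3060 Ti well below its stock limit.
sudo nvidia-smi -pl 140

# Alternatively, lock the GPU core clock to a lower range (MHz),
# roughly mimicking the A2000's ~1200 MHz boost behaviour.
sudo nvidia-smi -lgc 300,1200

# Undo the clock lock when done.
sudo nvidia-smi -rgc
```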