
Upcoming PCIe 12VHPWR Connectors Compatible with NVIDIA RTX 30-series Founders Edition Graphics Cards

TheLostSwede

News Editor
As you're most likely aware, NVIDIA introduced a new power connector with its RTX 30-series Founders Edition graphics cards, and at the time it was something of a controversy, especially as none of its AIB partners adopted the connector. As it turned out, the connector was largely accepted by the PCI-SIG, with a few additions that led to the 12VHPWR connector. The main difference between the two is the addition of a small set of sense pins, making it a 12+4-pin type connector. It has now been confirmed that the 12VHPWR connector will work with NVIDIA's Founders Edition cards. This isn't a huge surprise as such, but it is good news for those who happen to own a Founders Edition card and are looking to invest in a new PSU.

However, what's more interesting in the news about the 12VHPWR connector is that it will operate in two distinct modes. If the 4-pin sense connector isn't connected to the GPU, the PSU will only deliver 450 Watts to the GPU, presumably as some kind of safety precaution. If the sense connector is used, on the other hand, the same cable can deliver up to 600 Watts, which would allow for a combined card power draw of up to 675 Watts for next-generation GPUs. It's possible we'll see cards with multiple power thresholds that are negotiated on the fly with the PSU, and we might also see PSUs that can force the GPU into a lower power state if the overall system load gets too high. It'll be interesting to see what the new standard delivers, since so far few details have been released about how the sense function actually works.
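To make the two modes concrete, here's a minimal Python sketch of the behaviour described above. The function names are made up for illustration, and the 75 W slot figure is the standard PCIe slot budget rather than anything spelled out in the sense-pin announcement:

[CODE]
# Toy model of the two 12VHPWR modes described in the article.
# Names and structure are illustrative only, not taken from the spec.

PCIE_SLOT_W = 75  # standard power a PCIe x16 slot can supply on its own


def cable_limit_w(sense_connected: bool) -> int:
    """Power the 12VHPWR cable may deliver, per the two modes above."""
    return 600 if sense_connected else 450


def board_limit_w(sense_connected: bool) -> int:
    """Cable limit plus slot power = maximum combined card draw."""
    return cable_limit_w(sense_connected) + PCIE_SLOT_W


print(board_limit_w(True))   # 675 W with the 4-pin sense connector wired
print(board_limit_w(False))  # 525 W without it
[/CODE]

The real negotiation presumably happens electrically through the sense pins rather than in software, but the budget arithmetic works out the same way.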



View at TechPowerUp Main Site
 
That sense connector seems like a way to limit performance: take two identical next-gen (40xx) cards, one with this connector and one without, and the one without ends up with significantly lower clocks and limited overclocking.
 
That sense connector seems like a way to limit performance: take two identical next-gen (40xx) cards, one with this connector and one without, and the one without ends up with significantly lower clocks and limited overclocking.
As this new sense connector is part of the PCI-SIG spec, I guess we won't be seeing any new cards without it.
 
As this new sense connector is part of the PCI-SIG spec, I guess we won't be seeing any new cards without it.
I'll stick with my magic 8-ball on this one.
 
675W for a graphics card is already ridiculously stupid. We need 3nm GPUs as soon as possible.
 
Moar powah seems like the wrong way to go.
 
675W for a graphics card is already ridiculously stupid. We need 3nm GPUs as soon as possible.
It's the upper limit, not lower...
 
I know. But the thought that there can be video cards that suck more than 600W is beyond stupid.
Maybe we should ask the game developers to write better code?
 
Maybe we should ask the game developers to write better code?
Is it only the code that matters here?

New connector, new possibilities, probably.
 
Is it only the code that matters here?

New connector, new possibilities, probably.
Well, obviously games are getting more advanced, but no-one really asked for RT support, yet we got it and it seems to be part of the reason why new cards need even more power.
 
Well, obviously games are getting more advanced, but no-one really asked for RT support, yet we got it and it seems to be part of the reason why new cards need even more power.
Oh, you meant RT. Either way, I don't think game coding can do much to reduce the hardware requirements, and thus the power consumption that goes with them. The GPU chip architecture has more influence on that.
 
Oh, you meant RT. Either way, I don't think game coding can do much to reduce the hardware requirements, and thus the power consumption that goes with them. The GPU chip architecture has more influence on that.
It was just one example. Some games seem to be poorly coded these days as well, as they rely more and more on the hardware people have and, as such, can get away with "sloppier" code.
Because of this, we need more and more powerful hardware all the time. All the bad console ports are a great example, as many of them run great on console but suck on PC.
 
It was just one example. Some games seem to be poorly coded these days as well, as they rely more and more on the hardware people have and, as such, can get away with "sloppier" code.
Because of this, we need more and more powerful hardware all the time. All the bad console ports are a great example, as many of them run great on console but suck on PC.
I'm not saying no to this, though. The way I see it, poorly coded games just run slower and use more resources. Resource utilization is poor, so in order to achieve better performance the card has to run at its maximum. That still correlates with power draw nonetheless.
 
geesh...600w + for da GPU, plus whatever the CPU & other components need, if this trend of moar increased powah continues much longer, we will all need to install a small warp core and Cryochamber chiller next to the house just to use a friggin newish 'putin box, hahaha :)
 
Maybe we should ask the game developers to write better code?
While true, it shows how experienced developers are, or are not. The bigger hit is those same developers not being able to integrate APIs optimally, and sometimes the API needs to be edited to 'fit in'. It's quite obvious most developers don't know how to optimize and, if the rumors are true, many use a third party to do so.
 
Will NVIDIA be claiming royalties on all modern power supplies now? :rolleyes:
 
I just installed an FE card with the 12-pin connector in a computer with an ATX12VO power supply. I can't imagine there are many out there with this combo right now.

If a power supply came with a 12VHPWR connector, could it be adapted to the old 6/8-pin? If so, I don't see any reason not to make this the standard.
 
Correlation does not imply causation, children. Just because the 12VHPWR connector can provide up to 600W of power does not mean that next-gen GPUs will consume that much power. Engineers always build in extra capacity when designing new power connectors; otherwise said connectors would be obsolete within a year, which would entirely negate the point of standardising them.

The RTX 3080 Ti draws a maximum of 359W, which means its 12-pin connector provides under 300W of power, less than half of the proposed 12VHPWR limit. If you really think the RTX 4080 Ti is going to somehow magically consume double the power, when no graphics card has ever done that generation-on-generation, then you're not thinking; you're a moron.
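For anyone who wants to check the arithmetic behind that claim, here's a quick Python version of it; the 75 W slot allowance is assumed from the regular PCIe spec rather than stated in the post:

[CODE]
# Back-of-the-envelope numbers from the post above. The 359 W figure is
# the quoted RTX 3080 Ti maximum; the 75 W slot allowance is the usual
# PCIe x16 value, assumed here rather than stated in the post.
board_power_w = 359
slot_power_w = 75
connector_w = board_power_w - slot_power_w

print(connector_w)                  # 284 W drawn through the 12-pin connector
print(round(connector_w / 600, 2))  # ~0.47 of the 600 W 12VHPWR ceiling
[/CODE]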
 
Trying to be proprietary along with intel, f em both
I suggest you open a dictionary and look up the definition of "proprietary", because it does not mean what you think it means.
 