
Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

They're going to turn off a lot of buyers with this, so I don't see how someone as money-hungry as Jensen would allow it to happen.
 
They're going to turn off a lot of buyers with this, so I don't see how someone as money-hungry as Jensen would allow it to happen.
Hi,
The only thing I've noticed that really "turns off" buyers is GPU price, not a power adapter or a new cable, which is small potatoes by comparison.
 
Hi,
The only thing I've noticed that really "turns off" buyers is GPU price, not a power adapter or a new cable, which is small potatoes by comparison.

Apparently people like to complain about the price, yet they keep buying these GPUs at tripled prices. Nvidia didn't get here alone.
 
I have a feeling the 3090 exists only because RDNA2 is going to tie or beat a 2080 Ti, so Nvidia felt they had to go whole hog on the 3090 just so benchmarks will show Nvidia is still king. But in reality most people will only buy the 3080 due to price. Meh, we will see.

I'm still going with a 4800X and Big Navi on their respective release dates. I love my current all-AMD system; as long as I don't OC the GPU I have no issues. And fair enough, I'm too old and tired to care about that crap anymore.
 
Relax, people: the recommended PSU number means nothing; it's there to help people with bad or double-rail PSUs. I'll be perfectly able to use an RTX 3090 with a gold 550 W supply. The only thing that really trips power supplies is Intel turbo (65 W suddenly jumping to 250 W and back to 65 W within a few seconds). If you limit your max Intel turbo power draw to 100 W (which makes no difference in gaming; I've never gotten close to 100 W in gaming), then it is easy to use a 550 W supply (400 W available for the GPU). You just need knowledge and quality.
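The budget behind that claim can be sketched in a few lines (illustrative numbers only; the `rest_of_system_watts` figure for drives, fans, and the board is my assumption, not something stated in the post):

```python
# Rough PSU headroom sketch for the claim above: cap the CPU, subtract the
# rest of the system, and see what a quality 550 W unit leaves for the GPU.

def gpu_headroom(psu_watts, cpu_cap_watts, rest_of_system_watts=50):
    """Watts left over for the graphics card (assumed ~50 W for the rest)."""
    return psu_watts - cpu_cap_watts - rest_of_system_watts

# Intel CPU power-limited to 100 W, on a 550 W supply:
print(gpu_headroom(550, 100))  # -> 400
```

That 400 W figure matches the poster's "400 W available"; whether a given card stays under it is of course the open question.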
 
Part of the reason is they didn't need to sacrifice efficiency for absolute performance (leadership), especially after the GTX 480/580 infernos...
The GTX 580 (v2 Fermi) was a 'corrected version' of the 480 and solved the heat issue. Later came Hawaii (290/290X) on the other side, basically AMD's own 'inferno', which was just as bad on the heat/noise front. Odd how people seem to forget that.

The 3090, IMO, is basically Nvidia's 'doomsday weapon': an all-out effort to ensure it keeps the top spot in every review site's benchmark charts and solidifies its mind share in case Big Navi is better than expected. And I believe Nvidia is taking a no-holds-barred approach with it; price, power, and heat be damned.

I believe there was supposed to be a 3080 Ti, but they may have temporarily set it aside to focus on something that will virtually guarantee a win over Big Navi. That leaves too big a gap between the 3090 and the 3080, though, so I'm pretty sure a 3080 Ti will come out at some point after Big Navi is released.


 
The GTX 580 (v2 Fermi) was a 'corrected version' of the 480 and solved the heat issue. Later came Hawaii (290/290X) on the other side, basically AMD's own 'inferno', which was just as bad on the heat/noise front. Odd how people seem to forget that.

The 3090, IMO, is basically Nvidia's 'doomsday weapon': an all-out effort to ensure it keeps the top spot in every review site's benchmark charts and solidifies its mind share in case Big Navi is better than expected. And I believe Nvidia is taking a no-holds-barred approach with it; price, power, and heat be damned.

I believe there was supposed to be a 3080 Ti, but they may have temporarily set it aside to focus on something that will virtually guarantee a win over Big Navi. That leaves too big a gap between the 3090 and the 3080, though, so I'm pretty sure a 3080 Ti will come out at some point after Big Navi is released.

I thought in the 290/290X case it was because of the crap cheapo reference cooler, not because it was an actual inferno like the 480/580.
 
If your Intel Comet Lake furnace and your Nvidia Ampere furnace are both pulling maximum amperes at the same time, you may very well need an 850 W power supply. The 660 W PSUs I've used for most of my family's PCs over the past decade might not be adequate for that load.

What this means for me is that I will swap my Focus PX-850 power supply into the new PC I'm building to house a yet-to-be-released graphics card, and put the spare SS-660XP2 Platinum in the old PC, since it "only" has to power a Vega 64. :D I'm willing to pay for the extra electricity for the graphics card, and for the air conditioning while gaming. I'm much more concerned about getting adequate case ventilation to handle that much heat from a graphics card without resorting to dustbuster noise levels.

Going forward, I will also revise my usual PSU recommendation of a Prime PX-750, which has the same compact 140 mm depth as the entire Focus series. The Prime PX-850 and PX-1000 take up an additional 30 mm.
 
If your Intel Comet Lake furnace and your Nvidia Ampere furnace are both pulling maximum amperes at the same time, you may very well need an 850 W power supply. The 660 W PSUs I've used for most of my family's PCs over the past decade might not be adequate for that load.

What this means for me is that I will swap my Focus PX-850 power supply into the new PC I'm building to house a yet-to-be-released graphics card, and put the spare SS-660XP2 Platinum in the old PC, since it "only" has to power a Vega 64. :D I'm willing to pay for the extra electricity for the graphics card, and for the air conditioning while gaming. I'm much more concerned about getting adequate case ventilation to handle that much heat from a graphics card without resorting to dustbuster noise levels.

Going forward, I will also revise my usual PSU recommendation of a Prime PX-750, which has the same compact 140 mm depth as the entire Focus series. The Prime PX-850 and PX-1000 take up an additional 30 mm.

No. Single-card systems don't need nearly that much power for gaming. Vega 64 is the worst card I've ever seen for power draw, and I'm not sure yet whether Nvidia will exceed it. You'd have to do very heavy CPU OCing to run out of power with a 660 W Platinum.
 
I thought in the 290/290X case it was because of the crap cheapo reference cooler, not because it was an actual inferno like the 480/580.

The power consumption was right up there, so the heat output would have been just the same:


Things went truly mental on the rebranded 390X cards:


But hey, Hawaii was a good name because it's hot there too.
 
I cannot see any problem here. This connector is almost as small as a single 8-pin. If you can get a full new cable for a modular PSU instead of an adapter, then this is an upgrade all around.

I have a feeling the 3090 exists only because RDNA2 is going to tie or beat a 2080 Ti, so Nvidia felt they had to go whole hog on the 3090 just so benchmarks will show Nvidia is still king. But in reality most people will only buy the 3080 due to price. Meh, we will see.

I'm still going with a 4800X and Big Navi on their respective release dates. I love my current all-AMD system; as long as I don't OC the GPU I have no issues. And fair enough, I'm too old and tired to care about that crap anymore.

Well, at least you admit that you are complaining because you are old and tired; that's respectable :P It would be great if all the malcontents here admitted that they are complaining because they want the new cards to be bad, since it fits their story and state of mind.
 
Because if the rumors about this are true, and it really offers something ridiculous like 50% more performance at half the power of Turing, that would mean it'll be like 100% faster per watt than a 2080 Ti, using way more watts...
No, that would be 3x more efficient, which, as I posted in the other rumor threads, is not going to happen!
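The correction follows directly from the rumor's own numbers; as a quick sanity check (purely the arithmetic, no claim about the actual cards):

```python
# Perf-per-watt implied by the rumor: 50% more performance at half the power.
perf = 1.5   # performance relative to Turing
power = 0.5  # power draw relative to Turing

efficiency_gain = perf / power  # perf-per-watt multiplier vs. Turing
print(efficiency_gain)  # -> 3.0, i.e. 3x efficiency, not "100% faster per watt"
```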
 
I thought in the 290/290X case it was because of the crap cheapo reference cooler, not because it was an actual inferno like the 480/580.
As mentioned, the 580 was a corrected version of Fermi.
[image: 33856.png]


You mean the "crap cheapo reference cooler" of Hawaii vs the "crap cheapo reference cooler" of Fermi? Got it.

[image: 59323.png]


Bottom line: in terms of reference coolers, AMD has been more associated with hot, loud, and power-hungry over the last 10 years than Nvidia has.
 
No, that would be 3x more efficient, which, as I posted in the other rumor threads, is not going to happen!
You're right, that's one of the rumors I read. There were also more conservative rumors saying this card is 60-90% faster than a 2080 Ti, which is still an extremely large difference even at the low end of that range. Just 60% faster would be insane.
 
I wonder how much of that 850 W recommendation is down to the 12-pin supposedly being able to carry up to 600 W. It's probably not smart to run such a connector on a 600 W power supply.
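The ~600 W figure is plausible from per-pin ratings alone. A back-of-envelope sketch, assuming a Micro-Fit-class rating of roughly 8.5 A per pin (an assumption on my part; check the actual connector datasheet):

```python
# A 12-pin connector has six 12 V pins plus six grounds; the power limit
# is set by how much current each 12 V pin is rated to carry.
pins_12v = 6
amps_per_pin = 8.5   # assumed Micro-Fit-class per-pin rating
volts = 12.0

max_watts = pins_12v * amps_per_pin * volts
print(max_watts)  # -> 612.0, right around the rumored 600 W ceiling
```

So the connector's ceiling alone could swallow an entire 600 W PSU, which is presumably why the recommendation lands well above it.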
 
The hottest card I ever had was the ATI dual GPU on a single card... I think it was the 29xxx series. You really don't need a space heater with that card.
 
Later came Hawaii (290/290X) on the other side, basically AMD's own 'inferno', which was just as bad on the heat/noise front. Odd how people seem to forget that.
The R9 390 I had ran hot, but I don't remember it catching fire. Just saying.
 
So motherboards are going 12 V, and now video cards as well; interesting.
Mobos going 12 V? I was sure it was the 5 V and 3.3 V rails Intel was trying to push out.
 
You're right, that's one of the rumors I read. There were also more conservative rumors saying this card is 60-90% faster than a 2080 Ti, which is still an extremely large difference even at the low end of that range. Just 60% faster would be insane.

60% is not that unreasonable, considering that Ampere succeeds a generation that didn't make big leaps in raw performance and didn't get a node shrink or much better memory. Ampere gets massive upgrades all around; if it doesn't bring massive gains, nothing ever will.
 
60% is not that unreasonable, considering that Ampere succeeds a generation that didn't make big leaps in raw performance and didn't get a node shrink or much better memory. Ampere gets massive upgrades all around; if it doesn't bring massive gains, nothing ever will.

And it has also been two years, the longest time between upgrades ever.
 
Yeah, interesting, but it won't make much difference to those who always require the highest-performing gaming PCs.

...unless those who required the fastest gaming PCs suddenly don't.
 
I've heard rumors that the 3090 will use 390+ watts; guess that's shaping up to be true.

No. Single-card systems don't need nearly that much power for gaming. Vega 64 is the worst card I've ever seen for power draw, and I'm not sure yet whether Nvidia will exceed it. You'd have to do very heavy CPU OCing to run out of power with a 660 W Platinum.
Like I said, I've read rumors that the 3090 will use 390+ watts.

Nvidia, for the last decade, went the less-power, more-performance route and has been highly successful with this approach. So in general it does not make sense that they would sacrifice power consumption figures for more performance. But we will know for sure when reviews come up.
It does make sense if RDNA2 is going to be good... I've heard rumors that while RDNA2 might not take the performance crown, it will win the lower high-end/upper mid-tier segment on efficiency and performance, and that the second-biggest Navi could be the real star of the next-gen cards.
 