
Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

There are no 4K cards and there never will be. There still isn't a 1080p card. The goalposts move, and viewport resolution is already far from the only major influence. What really matters is how you render within the viewport. We already have many forms of pseudo 4K with internal render resolutions that are far lower, and even dynamic scaling, on top of all the usual LOD stuff, etc. On top of even that, we get tech like DLSS and obviously RT.

Basically, with all that piled up, anyone can say he's running 4K, or 1080p, whatever seems opportune at the time :)
The way I view a real 4K card (and I think it's the way most people view it) is that it can run the newest AAA games in 4K today. None of that nonsense you mentioned that some people may view as 4K. I could tell you I'm running 4K in Minecraft with a reduced field of view on my old R9 280X; that doesn't make my 280X a true 4K card. Sure, AAA games get better with time, and people will also argue about the perfect frame rate a card should output for them, but all in all, hitting at least 60 fps on high or ultra settings in all the new AAA titles that are out today is where it's at with 4K for most people. I think the 3090 could be the first card that actually offers that, which is why I called it the first true 4K card.
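To put rough numbers on the "internal render res" point above: with resolution scaling, a game can present a 3840×2160 image while shading far fewer pixels. A minimal sketch in Python (the 0.7 scale factor is a hypothetical illustration, not any particular game's setting):

```python
# Rough illustration of internal render resolution vs. output resolution
# under resolution scaling. The 0.7 factor is a hypothetical example.

OUTPUT_W, OUTPUT_H = 3840, 2160   # "4K" output resolution
render_scale = 0.7                # hypothetical dynamic-res scale factor

internal_w = int(OUTPUT_W * render_scale)
internal_h = int(OUTPUT_H * render_scale)

output_pixels = OUTPUT_W * OUTPUT_H
internal_pixels = internal_w * internal_h

print(f"Output:   {OUTPUT_W}x{OUTPUT_H} = {output_pixels / 1e6:.1f} MP")
print(f"Internal: {internal_w}x{internal_h} = {internal_pixels / 1e6:.1f} MP")
print(f"Shading only {internal_pixels / output_pixels:.0%} of the output pixels")
```

At a 0.7 scale the card shades only about half the pixels of native 4K, which is why "running 4K" can mean very different workloads.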

It's not really the right perspective, perhaps.

If Nvidia has deemed it necessary to make a very big, power-hungry GPU that even requires new connectors, it royally steps outside their very sensible product stack. If that is something they iterate on further, it only spells that Nvidia can't get a decent performance boost from node or architecture any more while doing a substantial RT push. It means Turing is the best we'll get on the architecture side, give or take some minor tweaks. I don't consider that unlikely, tbh. As with CPUs, there is a limit to the low-hanging fruit.

This is not good news. It is really quite bad, because it spells stagnation more than it does progress. The fact that it is called the 3090 and is supposed to have a $2K price tag tells us they want to ride that top end for quite a while, and that this is already their big(gest) chip. None of that is good news if you ask me.

Another option, though, is that they could not secure the optimal node for this, or the overall state of nodes isn't quite up to what they had projected just yet. After all, 7nm has been problematic for quite some time.
CPUs are much different from GPUs when it comes to parallel processing. You can do a lot more with GPUs just by increasing core counts and improving memory bandwidth. Graphics is the best-known use case for parallelism, and increasing the bandwidth to memory is something they are clearly doing; that's actually what they have been doing forever. So yeah, at some point, if you keep increasing cores, the cards have to become bigger no matter what, and they have been gradually getting bigger throughout the years. Nothing new. The new part is this massive jump in size from the last generation, which means they are going to increase the cores by more than usual and clock them much higher. They may even bake in some new stuff that takes advantage of all that bandwidth (over 1 TB/s). This is also a huge jump in bandwidth from the 2080 Ti (616 GB/s).
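For a sense of where these bandwidth figures come from: peak memory bandwidth is just the per-pin data rate times the bus width. A quick sketch (the 2080 Ti numbers are its known specs; the 21 Gbps GDDR6X on a 384-bit bus is an assumption based on the rumors, not a confirmed spec):

```python
# Peak memory bandwidth = data rate per pin * bus width / 8 (bits -> bytes).
# 2080 Ti figures are known specs; the Ampere line below uses rumored,
# unconfirmed values (21 Gbps GDDR6X, 384-bit bus) purely for illustration.

def bandwidth_gb_s(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

print(f"2080 Ti (14 Gbps GDDR6, 352-bit):  {bandwidth_gb_s(14, 352):.0f} GB/s")  # 616
print(f"Rumored (21 Gbps GDDR6X, 384-bit): {bandwidth_gb_s(21, 384):.0f} GB/s")  # 1008
```

Under those assumed numbers, "over 1 TB/s" checks out, a roughly 60% jump over the 2080 Ti.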

But none of this really tells you anything about the architectural improvement of Ampere. None of the info on power consumption or size shows you anything about that, and the price means very little here, either. It could still be some totally crazy performance-per-watt increase; we don't know. People just like to speculate and be negative without actual info.
 
Then buying a new AC should probably be a higher priority than gaming on max settings.
LOL. Let's see: replace a $6,000(+) system that is not broken, versus playing on a computer, just to keep you happy, so you can dictate to us how to spend our money? LOL, what hubris!

How presumptuous of you to decide that my AC, which meets all the specs for the house size and is not broken, should be replaced. I didn't complain; I merely told you that it can be felt in the summer months.
 
LOL. Let's see: replace a $6,000(+) system that is not broken, versus playing on a computer, just to keep you happy, so you can dictate to us how to spend our money? LOL, what hubris!

How presumptuous of you to decide that my AC, which meets all the specs for the house size and is not broken, should be replaced. I didn't complain; I merely told you that it can be felt in the summer months.
Dude, 20 years ago I lived on the border of Africa for 3 years and I didn't have AC, and having a PC in my room was the least of my problems. Chill.

If it's that important to you, go write a complaint to Nvidia lol. Maybe they'll write you a letter back apologizing that their GPUs don't cool your room yet. Your expectations are way too overblown. Do you want them to make this GPU 50% slower just so your room stays cooler? I mean, what kind of point are you even trying to make?
 
Do you want them to make this GPU 50% slower just so your room stays cooler?

Since when did power requirements double for generational improvements?

Or why blow 300 extra watts on RTRT that many people find unappealing, or not even noticeable?
 
Dude, 20 years ago I lived on the border of Africa for 3 years and I didn't have AC, and having a PC in my room was the least of my problems. Chill.

If it's that important to you, go write a complaint to Nvidia lol. Maybe they'll write you a letter back apologizing that their GPUs don't cool your room yet. Your expectations are way too overblown. Do you want them to make this GPU 50% slower just so your room stays cooler? I mean, what kind of point are you even trying to make?
AGAIN, I didn’t say it was important or a major issue. I simply told you that a high performing system DOES make a difference inside the house, contrary to your stated view. That is it, nothing more.

You are the one who assumed it meant more than it did, or that I was complaining and presumed to tell me what to do with my house.
 
Your expectations are way too overblown.
That's your opinion, and considering
having a PC in my room was the least of my problems.
I suggest you consider that not everyone has the same standards and priorities that you do. I just spent $4,000 USD on an Apple laptop. Where do you think I fall in that spectrum? Without knowing anything about me or what I do, would you judge me just as you're judging @rtwjunkie?

Have some perspective, my friend. The world is a diverse place and I'd be careful about judging others unless you want to be judged yourself.
 
Still, the idea that my PC will make my room hotter, and that that's why I should buy a lower-tier PC, never crossed my mind.

Either you have AC or a big room. My PC can heat up my room within 5 minutes, which is why I run it remotely in a dedicated closet.

YPMD

(Your Priorities May Differ)
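For what it's worth, a back-of-the-envelope estimate shows why the space-heater effect is real. Everything below is an illustrative assumption: a 500 W system, a 30 m³ room, heating the air only, and no losses through walls or ventilation:

```python
# Back-of-the-envelope: how fast does a PC warm the air in a sealed room?
# All inputs are illustrative assumptions, not measurements.

power_w = 500.0         # assumed total system draw under load (W = J/s)
room_volume_m3 = 30.0   # assumed ~3.5 m x 3.5 m x 2.5 m room
air_density = 1.2       # kg/m^3 at room temperature
air_cp = 1005.0         # specific heat of air, J/(kg*K)
minutes = 5.0

air_mass = room_volume_m3 * air_density   # ~36 kg of air
heat_j = power_w * minutes * 60           # energy dumped in 5 minutes
delta_t = heat_j / (air_mass * air_cp)    # temperature rise of the air, K

print(f"~{delta_t:.1f} °C rise in {minutes:.0f} min (ignoring walls/ventilation)")
```

Real rooms leak heat into walls and furniture, so the actual rise is smaller, but a few degrees in minutes is entirely plausible for a small, poorly ventilated room.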
 
I feel like I'm watching White House press releases ...

2010: Ah, you can fry an egg on [GeForce 400 series], ha ha ha

2013 - 2020: (290X through Vega) heat doesn't matter

2020: Ah, ya can fry an egg on [card that has not been released]
 
I feel like I'm watching White House press releases ...

2010: Ah, you can fry an egg on [GeForce 400 series], ha ha ha

2013 - 2020: (290X through Vega) heat doesn't matter

2020: Ah, ya can fry an egg on [card that has not been released]

Heck, it is pretty ironic. But that happens on both sides, really. They are called fanboys and are best ignored.

On that note, heat does not matter to me MUCH (seeing as my computer lives in a data closet). But it's not an encouraging development either.
 
Can't wait for this; it's going to be nice to just have a single bundle of wires going to the GPU.
I can't wait for an electrostatic induction card where no wires are visible. ;)
 
Then buying a new AC should probably be a higher priority than gaming on max settings.

How.... American. Why not work to reduce the heat output instead? Lower electrical cost from the PC and the AC unit, more comfort.

Brute force wastefulness is not a good ideal.
 
Heck, it is pretty ironic. But that happens on both sides, really. They are called fanboys and are best ignored.

On that note, heat does not matter to me MUCH (seeing as my computer lives in a data closet). But it's not an encouraging development either.

I live with what I have to live with, but celebrating potentially ridiculous wattage because we'll get two RT rays instead of one is not my cup of tea. If we were doubling the 2080 Ti's raster and RTRT performance, then I might be inclined to agree...
 
Technically, 6-pin and 8-pin are the same; the two extra pins are just a sense pin and a ground pin. And the sense pin is a ground too.

Technically, an 8-pin cable is "officially" rated for 150 watts and a 6-pin is "officially" rated for only 75 watts... so if making a warranty claim, it's a good idea not to mention you used a 6-pin cable beyond its official rating. One reason it may not matter is that quite often the 8 pins are not really necessary and are provided just in case of heavy overclocking. Another issue to be concerned with is that cheap cable manufacturers use undersized wiring. Some quotes from the pinout sites:

"Some video cards come with the 8 pin PCI express connector to support higher wattage than the 6 pin PCI Express connectors. It's okay to plug a 6 pin PCI Express power cable into an 8 pin PCI Express connector. It's designed to work that way but will be limited to the lower wattage provided by the 6 pin version of the cable. The 6 pin cable only fits into one end of the 8 pin connector so you can't insert it incorrectly but you can sometimes force the 6 pin cable in the wrong way if you try hard enough. Video cards can sense whether you have plugged a 6 pin or 8 pin cable into an 8 pin connector so the video card can impose some kind of restriction when running with only a 6 pin power cable. Some cards will refuse to run with only a 6 pin cable in an 8 pin socket. Others will work with a 6 pin cable at normal speeds but will not allow overclocking. Check the video card documentation to get the rules. But if you don't have any other information then just assume that if your video card has an 8 pin connector then you must plug in an 8 pin cable. "

"The PCI Express 2.0 specification released in January 2007 added an 8 pin PCI Express power cable. It's just an 8 pin version of the 6 Pin PCI Express connector power cable. Both are primarily used to provide supplemental power to video cards. The older 6 pin version officially provides a maximum of 75 watts (although unofficially it can usually provide much more) whereas the new 8 pin version provides a maximum of 150 watts. It is very easy to confuse the 8 pin version with the very similar-looking EPS 8 pin power cable "

That sense pin, when used, lets the card detect which cable is connected and limit power as described above.
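To see how much headroom those official ratings leave, here's the per-wire current they imply, assuming the common pinout of two live 12 V wires on a 6-pin and three on an 8-pin (some 6-pin cables populate all three 12 V positions):

```python
# Current per 12 V wire implied by the official PCIe connector ratings.
# Wire counts assume the common pinout: 2 live 12 V wires on a 6-pin,
# 3 on an 8-pin. Some cables wire more, which only lowers per-wire current.

def amps_per_wire(watts: float, live_12v_wires: int) -> float:
    return watts / 12.0 / live_12v_wires

print(f"6-pin,  75 W over 2 wires: {amps_per_wire(75, 2):.2f} A per wire")   # ~3.1 A
print(f"8-pin, 150 W over 3 wires: {amps_per_wire(150, 3):.2f} A per wire")  # ~4.2 A
```

Typical 18 AWG wire is rated for well above those currents per conductor, which is why a quality 6-pin often survives being pushed past 75 watts; the real risk is the undersized wiring on cheap cables mentioned above.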

There's good info here on the subject


Another thing worth looking at is this video:

Around the 18-minute mark, they talk about the RX 480... and what happened with that card was that, with the 6-pin connector, power delivery through the cable was limited to 75 watts. Because the card was drawing more than 150 watts, the extra wattage was not being drawn through the cable; it was pulled from the PCIe slot, which they explain is a very bad thing. So it's not so much a matter of whether the cable can stand it without melting; it's whether the PSU will deliver more than 75 watts. They show how the 6-pin 480 pulls way more than 75 watts through the slot, and then later, with an 8-pin connection, more powerful cards and bigger loads don't do that.

So yes, a 6-pin cable isn't going to melt under normal circumstances. However...
Your PSU may not deliver 150 watts to the card... it may be limited to 75 watts, and this will impact performance.
You could wind up in a situation like with the 480, whereby the mobo and/or card vendor did not provide adequate protections and the 75-watt limit of the card slot could be exceeded.

Advice?

1. Don't throw away your power cables
2. If you do, spend the $4 on an OEM replacement cable
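The RX 480 case above is simple arithmetic: whatever the 6-pin doesn't supply has to come through the slot. A sketch with a hypothetical ~165 W board draw (roughly what reviewers measured at the time):

```python
# Why the reference RX 480 overloaded the PCIe slot: illustrative numbers only.

total_draw_w = 165.0     # assumed total board power under load
six_pin_supply_w = 75.0  # official 6-pin cable limit
slot_limit_w = 75.0      # PCIe slot's official 75 W limit

slot_draw_w = total_draw_w - six_pin_supply_w  # what's left must come from the slot
print(f"Slot must supply {slot_draw_w:.0f} W vs. its {slot_limit_w:.0f} W limit "
      f"({slot_draw_w - slot_limit_w:+.0f} W over spec)")
```

That ~15 W overdraw through the slot, sustained, is exactly the "very bad thing" the video describes, and it's why an 8-pin connector (or a firmware power cap) fixes it.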
 
I feel like I'm watching White House press releases ...

2010: Ah, you can fry an egg on [GeForce 400 series], ha ha ha

2013 - 2020: (290X through Vega) heat doesn't matter

2020: Ah, ya can fry an egg on [card that has not been released]

Bingo!

A mod told me this was REALLY NAUGHTY:

That looks like a good idea, but AMD fans want lots of connectors so they can be upset.

Welcome to the AMDsuperTechpowerUp forums. Your access may be denied.
 
You can definitely heat a room with a power hungry computer. My PC doubles as a space heater for my room. The rest of the house is kept very cold... but not in here. My crappy secondary computer kicks out a little bit of heat too, but not nearly as much. I run WCG on my CPUs and mine with my graphics cards. This keeps my teeth from chattering, gives me some fake internet money, and does a little good for everyone else, too.
 
Since when did power requirements double for generational improvements?

Or why blow 300 extra watts on RTRT that many people find unappealing, or not even noticeable?
How do you know that these 300 extra watts are only going to ray tracing?
AGAIN, I didn’t say it was important or a major issue. I simply told you that a high performing system DOES make a difference inside the house, contrary to your stated view. That is it, nothing more.

You are the one who assumed it meant more than it did, or that I was complaining and presumed to tell me what to do with my house.
I just never heard or read anyone ever making that claim; that is all I was saying. Apparently I was wrong, and most people here actually do experience this, and everybody experiences physical discomfort from it. Now 100 people want to correct me. I'll just keep worrying about other things.
That's your opinion, and considering

I suggest you consider that not everyone has the same standards and priorities that you do. I just spent $4,000 USD on an Apple laptop. Where do you think I fall in that spectrum? Without knowing anything about me or what I do, would you judge me just as you're judging @rtwjunkie?

Have some perspective, my friend. The world is a diverse place and I'd be careful about judging others unless you want to be judged yourself.
Ok, you're really gonna read me a sermon against judging others (which I wasn't even really doing if you turn on your brain and not just pile on) in a forum that judges everything all the time based on some shady leaks and has actual internet wars over their favorite GPU company over said leaks.

There's a really weird "Nvidia bad" vibe going around lately. And AMD hasn't even leaked anything yet... How does that even make sense? It's like some people just don't want 4K gaming and want to live in the past. Look, I just want 4K gaming to come as soon as possible, but hey that's just me. This card has a prohibitive price for a reason. It's about pushing the envelope and about pushing 4K gaming, not about saving your electricity bill or the planet (although the price will be in its favor). It's not about keeping your room cool, either. I just hope people can understand that.
Either you have AC or a big room. My PC can heat up my room within 5 minutes, which is why I run it remotely in a dedicated closet.

YPMD

(Your Priorities May Differ)
Ok, if it's really that fast I'll think about replacing my heating system with a 3090. By the way, that joke is hilarious, let's never stop making this joke. At least I thought it was a joke all these years, but now I'm starting to take this whole sausage grill thing seriously.
 
in a forum that judges everything all the time based on some shady leaks and has actual internet wars over their favorite GPU company over said leaks.

This is not what we consider a positive quality of the forum, so yeah.

Ok, if it's really that fast I'll think about replacing my heating system with a 3090.

It only works if you live in a small space with poor ventilation. Guess where poor froggy lives? We call it the "gamer hive"

Some day, I'll escape.

but now I'm starting to take this whole sausage grill thing seriously.

You could certainly manage it with some clever engineering, but I'd rather just use it as a GPU...
 
I find this whole power cable thing that NVIDIA has done very odd; not sure what to think of it. Maybe in time it will make sense. If custom AIB GPUs come with the traditional connectors, it will be a non-issue, I guess. I'm currently using Corsair HX750i units (with 10-year warranties) on two systems and won't be changing anytime soon.
 
Around the 18-minute mark, they talk about the RX 480... and what happened with that card was that, with the 6-pin connector, power delivery through the cable was limited to 75 watts. Because the card was drawing more than 150 watts, the extra wattage was not being drawn through the cable; it was pulled from the PCIe slot, which they explain is a very bad thing. So it's not so much a matter of whether the cable can stand it without melting; it's whether the PSU will deliver more than 75 watts. They show how the 6-pin 480 pulls way more than 75 watts through the slot, and then later, with an 8-pin connection, more powerful cards and bigger loads don't do that.

So yes, a 6-pin cable isn't going to melt under normal circumstances. However...
Your PSU may not deliver 150 watts to the card... it may be limited to 75 watts, and this will impact performance.
You could wind up in a situation like with the 480, whereby the mobo and/or card vendor did not provide adequate protections and the 75-watt limit of the card slot could be exceeded.

Advice?

1. Don't throw away your power cables
2. If you do, spend the $4 on an OEM replacement cable

I can hardly take their wattage analysis seriously; the methods and tools they used are poor.
There are special current clamp probes for oscilloscopes with the necessary bandwidth (500 kHz) that can measure the current draw directly at the wires.
Either way, heat and lots of amperes will only further puzzle all those planning on 4K gaming.
 
There is a nice video out about thermal design, crosstalk, and other technical issues... and yes, a good part of it is about the 12-pin power connector, which appears to have been included because it's significantly smaller than two 6-pins.


Don't go looking for performance numbers or something to crow about for or against... but it provides a good tool for a beginner to start understanding how heat and other design considerations are addressed...

Thermal Design - 0:00 ==> 3:00

Mechanical Design - 3:00 ==> 4:45 (Cooling System Hold Down)

Power Handling - 4:46 ==> 6:05 (Crosstalk, PCB)

Power Connector - 6:06 ==> 6:25

Form and Function Blah Blah - 6:26 ==> end

I can hardly take their wattage analysis seriously; the methods and tools they used are poor.


You should. This site is one that half of the YouTube hosts use to parrot "their findings".

 