
NVIDIA GeForce RTX 4090/4080 to Feature up to 24 GB of GDDR6X Memory and 600 Watt Board Power

AleksandarK

News Editor
After the data center-oriented Hopper architecture launch, NVIDIA is slowly preparing to transition the consumer segment to new, gaming-focused designs codenamed Ada Lovelace. Thanks to Igor's Lab, we have some additional information about the upcoming lineup, including a sneak peek at a few features of the top-end GeForce RTX 4080 and RTX 4090 SKUs. For starters, the source claims that NVIDIA is using the upcoming GeForce RTX 3090 Ti as a test run for the next-generation Ada Lovelace AD102 GPU: according to Igor, NVIDIA is testing the PCIe Gen5 power connector and wants to see how it fares with the biggest GA102 SKU, the GeForce RTX 3090 Ti.

Additionally, we find that the AD102 GPU is supposed to be pin-compatible with GA102, meaning the pin layout on AD102 matches that of GA102. There are 12 places for memory modules on the AD102 reference design board, resulting in up to 24 GB of GDDR6X memory. As many as 24 voltage converters surround the GPU; NVIDIA will likely implement the uP9512 controller, which can drive eight phases, resulting in three voltage converters per phase and ensuring proper power delivery. The total board power (TBP) is likely rated at up to 600 Watts, meaning that the GPU, memory, and power delivery combined dissipate up to 600 Watts of heat. Igor notes that board partners will bundle 12+4-pin (12VHPWR) to four 8-pin (legacy PCIe) adapters to enable PSU compatibility.
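To put rough numbers on the power-delivery claim, here is a quick back-of-the-envelope sketch in Python. The GPU-core share of the board power and the core voltage are illustrative assumptions, not figures from the leak:

# Back-of-the-envelope VRM math from the figures above.
# Assumed (not from the leak): ~450 W of the 600 W TBP feeds the GPU core,
# at roughly 1.0 V. Real rails and loads will differ.

CONVERTERS = 24       # voltage converters counted on the board
PHASES = 8            # phases the uP9512 controller can drive

core_power_w = 450.0  # assumed GPU-core share of the 600 W TBP
core_voltage = 1.0    # assumed core voltage

stages_per_phase = CONVERTERS / PHASES            # 3 converters per phase
core_current_a = core_power_w / core_voltage      # ~450 A of core current
current_per_phase = core_current_a / PHASES       # ~56 A per phase
current_per_stage = current_per_phase / stages_per_phase  # ~19 A per stage

print(f"{stages_per_phase:.0f} stages/phase, ~{current_per_phase:.0f} A/phase, "
      f"~{current_per_stage:.0f} A/stage")

Splitting each phase across three parallel power stages keeps the per-stage current well within what typical DrMOS parts handle.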


View at TechPowerUp Main Site | Source
 
102 at 600W!? Wow. So they doubled their TDP overnight (-gen) even counting a shrink. Nice. Is this Nvidia doing a lil' AMD here? Moar moar moar because our core tech is really EOL with bandaids? Or is this the new reality... and will AMD follow suit. Either way, Nvidia is giving away a LOT of room to play for AMD to come up with something silly. I'm just trying to let it sink in here. Six. Hundred. Watts. I mean, two generations back we had the same SKU at 280W..
 
Why are GPU manufacturers pushing higher and higher TDPs?
Are there any laws or regulations worldwide limiting this? Semiconductor makers and designers should be subject to the same regulations that apply to general household appliances and car manufacturers, where efficiency is prioritized; we need more FLOPS per Watt.
 
The 4090 has already been tested:


Dunno about the 4080 ...
 
I’d be concerned if I thought I’d ever find one at a price I was willing to pay.
 
The 4090 has already been tested:


Dunno about the 4080 ...
Ironically, he is closer than the average leak from random YouTubers like MLID :)
 
600W? Thanks, but no thanks. Summers here are hot AF; the last thing I need is an additional room heater. Just imagine the electricity bill: 800 W for the gaming PC, 300 W for the air conditioner... that's an extra 1.1 kWh consumed per hour of gaming. Jensen has totally lost it.
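For what it's worth, the arithmetic roughly checks out (the electricity price is an assumed figure; rates vary a lot by region):

# Cost of the scenario above: an 800 W gaming PC plus ~300 W of extra air
# conditioning load. Note 1.1 kW is draw; it becomes kWh once multiplied
# by hours of use.

pc_draw_kw = 0.8
ac_extra_kw = 0.3
total_kw = pc_draw_kw + ac_extra_kw        # 1.1 kW while gaming

hours_per_day = 4                          # assumed gaming time
price_per_kwh = 0.15                       # assumed price in $/kWh

monthly_kwh = total_kw * hours_per_day * 30
print(f"~{monthly_kwh:.0f} kWh/month, ~${monthly_kwh * price_per_kwh:.0f}/month")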
 
102 at 600W!? Wow. So they doubled their TDP overnight (-gen) even counting a shrink. Nice. Is this Nvidia doing a lil' AMD here? Moar moar moar because our core tech is really EOL with bandaids? Or is this the new reality... and will AMD follow suit. Either way, Nvidia is giving away a LOT of room to play for AMD to come up with something silly. I'm just trying to let it sink in here. Six. Hundred. Watts. I mean, two generations back we had the same SKU at 280W..

Makes me think: is a die shrink really worth it? Probably better off going back to 28 or 14 nm.
 
Hopper leaks proved to be inaccurate, so I would rather wait and see.
 
600 watts? Pff. My electric radiator is more powerful.


How're they going to NEARLY double power into that chip when the pins are exactly the same?

I'd be surprised if this thing tops 450 W stock! Those extra watts have always been aimed at insane overclocker dreamers (because 50% higher power consumption for 15% faster performance == GUD)!
 
I'm curious how much of this will hold true and, if it does, what kind of heat it generates, how it's properly cooled, and what kind of performance it delivers... oh, and also what kind of outlandish price tag they slap on it.
 
Why are GPU manufacturers pushing higher and higher TDPs?
Are there any laws or regulations worldwide limiting this? Semiconductor makers and designers should be subject to the same regulations that apply to general household appliances and car manufacturers, where efficiency is prioritized; we need more FLOPS per Watt.

Because unless manufacturing makes a miraculous discovery, we face diminishing returns on smaller processes, and the GPU companies apparently aren't able to hugely boost the efficiency of their architectures anymore. Yet they still need to add new capabilities, so to stay ahead they resort to extremes.
 
Is this Nvidia doing a lil' Intel here?
Fixed that for you. ;)

Man, and people said the R9 390 and Vega 64 were power hungry. Damn.
 
The 4090 has already been tested:


Dunno about the 4080 ...
Haha, damn, I love this guy. His April Fools' video was too early this year tho! :laugh:
edit: Ah, it was last year's one, missed that.
 
~500 watts already can't drive Cyberpunk at 4K/60 fps, so yeah...

50% higher power consumption for 15% faster performance
sounds about typical. :ohwell:
 
Is there any laws or regulations world wide limiting this? Semiconductor makers and designers should be subject to the same regulations that apply for general household appliances and car manufacturers,
Seem to recollect that California is/was going to bring in regs regarding power consumption for electronics
 
300 W is the maximum I think is tolerable for my PC. I would need a better case to make sure it does not affect all the other components, I would need to buy another PSU because a Platinum 850 W might not be enough for the spikes, etc...

But there will be people who want that, and I don't see the big deal if Nvidia makes it. Nobody will be forced to use those cards. I suspect they will be very expensive, as the card will need a really beefy power section to manage and deliver all this power cleanly to a GPU that boosts up and down.

I think the problem would be if those cards were mainstream, but the Titans showed both Nvidia and AMD that some people will pay any amount just to have the best of the best. Those cards are for those people, and I am not sure that is a significant portion of the gaming market.

If Nvidia delivers a high-end (not top-end) card that is way better than what we currently have and runs at 300 W, I don't see that as a problem.

The other thing is how much power AMD will use to achieve the same performance. It remains to be seen, but the rumor is that RDNA3 will have an efficiency advantage, and that is the main reason Nvidia is pushing these GPUs so hard. If AMD's cards use much less power to deliver the same performance, they should also be cheaper to produce, since they will require less cooling, fewer power stages, etc.

Nvidia has had series of GPUs that were not that competitive and got pushed really hard, but they always fought back quickly. Time will tell.
 
Does the official maximum of 300 W still apply for cards to stay within the official PCIe spec? Though I guess Nvidia and AMD don't care about that anymore.

It kinda sucks that they just put out more and more power-hungry cards instead of increasing efficiency. You guys remember how efficient Maxwell and Pascal were? Damn.
 
sooner or later, this meme is going to be reality....

 
Seem to recollect that California is/was going to bring in regs regarding power consumption for electronics
Did in 2016, enacted in 2019. It prohibits prebuilts, i.e. Alienware, but not the DIY market. It's also law in Hawaii and a few other green-friendly states - Oregon, Washington, and Colorado, the usual hippie states. :p
 
Lovelace won't be anywhere near 600W, except the craziest overclocking SKUs. I will repeat, just because the 12VHPWR allows up to 600W power draw, does not in any way, shape or form mean that Lovelace will draw that. The entire purpose of that connector is to future-proof the just-released ATX 3.0 specification for at least a decade.

24 phases for voltage is hardly anything to write home about on a high-end SKU, considering the 3090 FE has 19 and 3090 overclocking SKUs can have even more (e.g. Galax RTX 3090 Hall of Fame packs in 25).
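For reference, the arithmetic behind that 600 W ceiling (pin count per the ATX 3.0 connector definition; actual per-pin ratings depend on the terminals used):

# The 12VHPWR connector: six 12 V pins and six ground pins carry the power,
# plus four sideband/sense pins. At the 600 W maximum, the per-pin current
# works out as follows.

max_power_w = 600
rail_voltage = 12
power_pins = 6

total_current_a = max_power_w / rail_voltage   # 50 A on the 12 V rail
per_pin_a = total_current_a / power_pins       # ~8.3 A per power pin

print(f"{total_current_a:.0f} A total, ~{per_pin_a:.1f} A per power pin")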

AleksandarK said:
According to Igor's claims, NVIDIA is testing the PCIe Gen5 power connector and wants to see how it...

There is no such thing as a "PCIe Gen5 power connector". There is only the 12VHPWR connector which is part of Intel's ATX 3.0 specification.

AleksandarK said:
... fairs with the biggest GA102 SKU - GeForce RTX 3090 Ti.

It's "fares".

AleksandarK said:
Igor notes that board partners will bundle 12+4 (12VHPWR) to four 8-pin (PCIe old) converters to enable PSU compatibility.

Igor does not "note" anything. He does, however, make a large number of very bold claims.
 
Fixed that for you. ;)

Man, and people said the R9 390 and Vega 64 were power hungry. Damn.
What goes around... lol. Every company falls prey to it at some point.
 
Does America have the same energy crisis as the UK? A guy I know of stopped mining when he got a £500-a-month electric bill ($660).

If I ran the GPU in the UK without an undervolt and no FPS cap, I think I would pay more in electricity than the card cost within the first year. :)
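A rough sanity check of that claim, assuming something like the 2022 UK price cap of about £0.28/kWh (both the unit price and the usage patterns are assumptions):

# Annual electricity cost of a 600 W card in the UK, uncapped and without
# an undervolt. Both scenarios and the unit price are assumed figures.

card_draw_kw = 0.6          # 600 W under full load
price_per_kwh = 0.28        # assumed £/kWh

gaming_hours = 4 * 365      # heavy gaming: 4 h/day, every day
mining_hours = 24 * 365     # mining: around the clock

print(f"Gaming: ~£{card_draw_kw * gaming_hours * price_per_kwh:.0f}/year")
print(f"Mining: ~£{card_draw_kw * mining_hours * price_per_kwh:.0f}/year")

At a 24/7 mining load, that works out to roughly £1,470 a year in electricity, which can indeed exceed the card's purchase price.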
 
Lovelace won't be anywhere near 600W, except the craziest overclocking SKUs. I will repeat, just because the 12VHPWR allows up to 600W power draw, does not in any way, shape or form mean that Lovelace will draw that. The entire purpose of that connector is to future-proof the just-released ATX 3.0 specification for at least a decade.

24 phases for voltage is hardly anything to write home about on a high-end SKU, considering the 3090 FE has 19 and 3090 overclocking SKUs can have even more (e.g. Galax RTX 3090 Hall of Fame packs in 25).



There is no such thing as a "PCIe Gen5 power connector". There is only the 12VHPWR connector which is part of Intel's ATX 3.0 specification.



It's "fares".



Igor does not "note" anything. He does, however, make a large number of very bold claims.
You say that, but Nvidia is releasing a 450 W 3090 Ti as we speak. And the 3090 was already stretching the limits quite handsomely. I like your disclaimer 'except the craziest overclocking SKUs'... refer to the beginning of this sentence, and now consider that 95% of all overclocks on Ampere involve an undervolt. Who are we kidding here?
 