
NVIDIA GeForce RTX 4090/4080 to Feature up to 24 GB of GDDR6X Memory and 600 Watt Board Power

AD102 at 600W!? Wow. So they doubled their TDP in a single generation, even counting a node shrink. Nice. Is this Nvidia doing a lil' AMD here? Moar moar moar because the core tech is really EOL with bandaids? Or is this the new reality... and will AMD follow suit? Either way, Nvidia is giving AMD a LOT of room to come up with something silly. I'm just trying to let it sink in here. Six. Hundred. Watts. I mean, two generations back we had the same SKU at 280W.
RDNA1 and 2 have been power efficient, and RDNA2 is more efficient than Nvidia's 3000 series, so what are you referring to exactly?

Ironically, he is closer than the average leak from random YouTubers like MLID :)
Why do so many people have it out for MLID on here? His leaks have been more correct than not.
 
What's the problem? Intel ADL is only 240 W, so this is only a tad more. /s
 
hmmmmm this or new car

tough call
 
*Flashbacks of GTX 400 and GTX 500 times*
 
600 watts = 4x 8-pin connectors from a PSU, so you probably need a PSU upgrade just to run one.
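For reference, here's a rough sketch of that connector math (assuming the usual PCIe spec limits of 150 W per 8-pin and 75 W from the slot; these numbers are illustrative, not from the article):

[CODE]
# Rough PCIe power-budget math (Python) -- illustrative spec assumptions:
# 150 W per 8-pin PCIe connector, 75 W from the x16 slot itself.
PIN8_WATTS = 150
SLOT_WATTS = 75

def connectors_needed(board_power_watts: int) -> int:
    """How many 8-pin connectors a card needs once the slot's 75 W is counted."""
    from_cables = max(0, board_power_watts - SLOT_WATTS)
    return -(-from_cables // PIN8_WATTS)  # ceiling division

print(connectors_needed(600))  # -> 4, matching the "4x 8-pin" figure above
[/CODE]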
 
I wonder if we'll start seeing auxiliary GPU PSUs that fit into a 120 mm x 120 mm or 140 mm x 140 mm fan mount and are as thick as a 45-60 mm radiator, in the same vein as the old 5.25" bay auxiliary PSUs from VisionTek, FSP, and Thermaltake (off the top of my head) that were used to power the GPUs of the time because consumer PSUs couldn't keep up.



Or if dual-PSU cases become a thing again, because dual systems are so 2018 and it's cheaper to buy two 800 W Bronze PSUs than one 1600 W Gold PSU. Assuming dual-PSU cases do return, Phanteks will need an updated version of their PSU combo adapter with the new 12+4-pin GPU connector instead of the older 8-pins. Or we go back to using slaved-PSU adapters that delay turning on the second PSU. At least we now have the smaller SFX PSUs that could be squeezed into cases next to the big ATX PSUs.
 
600 watts = 4x 8-pin connectors from a PSU, so you probably need a PSU upgrade just to run one.
1000 W+ should do... That is, if we don't see power spikes on the newer cards, which are hopefully gone, but if they are still there you'd probably need at least 1200 W or more...

Oh, and don't forget that's only for the top-end SKUs; the 4070 should be two 8-pin and the 4080 three 8-pin, and if you consider undervolting (which is probably much more common on those cards) you probably wouldn't need a PSU upgrade.

I do wonder, though, how the newer PSUs will look. Say 1000 W: will they keep having 4-5 8-pin connectors, or just one or two 450/600 W-capable 16-pins?
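As a back-of-the-envelope sizing sketch for those 1000 W / 1200 W estimates (the spike multiplier, system draw, and headroom figures here are illustrative assumptions, not from the article):

[CODE]
# Back-of-the-envelope PSU sizing (Python).
# Assumed figures (not from the article): ~250 W for CPU + rest of system,
# a transient multiplier for GPU power spikes, and ~20% headroom.
def psu_recommendation(gpu_watts, system_watts=250, spike_factor=1.5, headroom=1.2):
    """Very rough PSU wattage estimate; spike_factor models transient GPU spikes."""
    peak = gpu_watts * spike_factor + system_watts
    return round(peak * headroom, -1)  # round to the nearest 10 W

print(psu_recommendation(600))                    # ~1380 W if Ampere-style spikes persist
print(psu_recommendation(600, spike_factor=1.0))  # ~1020 W if the spikes are gone
[/CODE]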
 
600W? Thanks, but no thanks. Summers here are hot AF; the last thing I need is an additional room heater. Just imagine the electricity bill: 800 W for the gaming PC, 300 W for the air conditioner... that's over 1 kW of draw (1 kWh for every hour played) just to game. Jensen has totally lost it.
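For scale, a quick cost sketch of that draw (the electricity rate and hours per day are made-up placeholders; plug in your own):

[CODE]
# Quick gaming-electricity cost sketch (Python).
# The 0.30/kWh rate and 4 h/day are illustrative assumptions, not from the thread.
draw_kw = 1.1          # 800 W PC + 300 W air conditioner
hours_per_day = 4
rate_per_kwh = 0.30    # substitute your local rate/currency

monthly_kwh = draw_kw * hours_per_day * 30
print(f"{monthly_kwh:.0f} kWh/month -> {monthly_kwh * rate_per_kwh:.2f} per month")
# 132 kWh/month -> 39.60 per month
[/CODE]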
If I owned hell and a room with a system running a top-end 4000-series card, I would rent the room out and live in hell. :laugh: -Chronicles of Nvidia
 
I'd be surprised if this thing tops 450 W stock!
Same. 600 W makes for great shock-and-awe headlines, but it's very unlikely to be what stock GPUs call for this gen.
People said the R9 390 and Vega 64 were power hungry.
They certainly were.
Lovelace won't be anywhere near 600 W, except for the craziest overclocking SKUs. I will repeat: just because the 12VHPWR connector allows up to 600 W of power draw does not in any way, shape, or form mean that Lovelace will draw that.
Very much this. The new power delivery spec allows 600 W; I massively doubt even the 4090 rolls out needing anywhere close to that.
 
I remember all the "modern PCs are much more efficient than older ones", "use less voltage", "improved efficiency per watt" talk, etc. etc. Meanwhile, back in 2005, with 2.4 V/3 V DDR, 2 V+ CPUs, and "inefficient" GPUs that required.... wait for it.... external PCIe 6-pin/4-pin Molex POWA :eek: you could run a top-of-the-range gaming PC on a 300 W PSU. But in 2022: save the planet, buy a platinum 1 kW PSU to run your 450 W GPU and 250 W CPU.... PROGRESS :rolleyes:
 
Does America have the same energy crisis as the UK? A guy mining stopped when he got a £500/month electric bill ($660).

If I ran the GPU in the UK without an undervolt or an FPS cap, I think I would pay more for electricity than for the card itself in the first year. :)
Yeah, in some cases there have been electricity price rises here in Finland too. Though in my case, the provider actually informed me that my pricing will stay the same as before.

I remember all the "modern PCs are much more efficient than older ones", "use less voltage", "improved efficiency per watt" talk, etc. etc. Meanwhile, back in 2005, with 2.4 V/3 V DDR, 2 V+ CPUs, and "inefficient" GPUs that required.... wait for it.... external PCIe 6-pin/4-pin Molex POWA :eek: you could run a top-of-the-range gaming PC on a 300 W PSU. But in 2022: save the planet, buy a platinum 1 kW PSU to run your 450 W GPU and 250 W CPU.... PROGRESS :rolleyes:
Yeah, good point! I remember when efficiency was almost the top concern back then. Especially when the Core 2 Duo came out, it was indeed hella efficient compared to the previous P4/PD CPUs.
 
Yeah, in some cases there have been electricity price rises here in Finland too. Though in my case, the provider actually informed me that my pricing will stay the same as before.


Yeah, good point! I remember when efficiency was almost the top concern back then. Especially when the Core 2 Duo came out, it was indeed hella efficient compared to the previous P4/PD CPUs.
Double the performance at the same or less power draw. Now we have 250 W CPUs and 450 W GPUs; have we really moved on? 95 W blazing-hot, high-clocked P4s lol
 
Double the performance at the same or less power draw. Now we have 250 W CPUs and 450 W GPUs; have we really moved on? 95 W blazing-hot, high-clocked P4s lol
The funniest thing is that back then the hottest P4s felt like a stove and everyone joked about them. Modern Intel chips draw way more power, but it's not as bad as in the P4 days.
 
I wonder how long before governments step in and place limits on stupidly high-wattage PC parts. I know at least one place did.
 
Next time I'll turn it into something I can cook my meals on.
 
The latest rumors say the 4090 can be 2-2.5x faster than the 3090 at 4K; that would be insane for a one-generation leap.
 
AD102 at 600W!? Wow. So they doubled their TDP in a single generation, even counting a node shrink. Nice. Is this Nvidia doing a lil' AMD here? Moar moar moar because the core tech is really EOL with bandaids? Or is this the new reality... and will AMD follow suit? Either way, Nvidia is giving AMD a LOT of room to come up with something silly. I'm just trying to let it sink in here. Six. Hundred. Watts. I mean, two generations back we had the same SKU at 280W.
Nvidia prioritizes RT core and Tensor core performance improvements over CUDA core performance per watt; this is why, since Turing, Nvidia cards consume more and more power each gen. People want their rAyTrAcInG and dLsS.... well, here you go: 600 W GPUs, son!

The latest rumors say the 4090 can be 2-2.5x faster than the 3090 at 4K; that would be insane for a one-generation leap.
They say that, but they didn't clarify if it's RT core performance or CUDA core performance. They made similar claims with Ampere, only for reviews to show it was ray tracing performance and not that substantial a gain in actual GPU performance.

In 2022: save the planet, buy a platinum 1 kW PSU to run your 450 W GPU and 250 W CPU.... PROGRESS :rolleyes:
Nobody cares about the planet; corporations just say that to gain karma points, all while releasing countless products year after year. Smartphones, for example: just look at how many phones Samsung released in 2020 and 2021 alone, all while stamping "save the planet" on their packaging and removing chargers lol
 
If this is less than 2x the performance, it's a flop... because AMD will take the performance crown with the 7900 XT.
 
Nvidia prioritizes RT core and Tensor core performance improvements over CUDA core performance per watt; this is why, since Turing, Nvidia cards consume more and more power each gen. People want their rAyTrAcInG and dLsS.... well, here you go: 600 W GPUs, son!


They say that, but they didn't clarify if it's RT core performance or CUDA core performance. They made similar claims with Ampere, only for reviews to show it was ray tracing performance and not that substantial a gain in actual GPU performance.


Nobody cares about the planet; corporations just say that to gain karma points, all while releasing countless products year after year. Smartphones, for example: just look at how many phones Samsung released in 2020 and 2021 alone, all while stamping "save the planet" on their packaging and removing chargers lol

Just stick to a TDP figure you are comfortable with and let others enjoy their 500 W GPU :D (not me, I would just undervolt it down to ~300 W). I'm sure the AD104/106/107 chips will be ultra efficient and more suitable for your gaming needs.

Ada is just a die shrink of Ampere, so RT/Tensor perf should scale linearly with raster compared to Ampere.

There are AI supercomputers doing all the planet-saving stuff, so in the end capitalism saves itself from the problem it created :D. This might be a bit surprising to you, but capitalist countries have healthier air quality than non-capitalist countries.
 
This might be a bit surprising to you, but capitalist countries have healthier air quality than non-capitalist countries.
Of course, when you offload all the misery of production to either third-world countries or those 'non-capitalist' ones (that are still capitalist). That being said, please do not continue this conversation; there's no point in yet another thread going political.
 
Of course, when you offload all the misery of production to either third-world countries or those 'non-capitalist' ones (that are still capitalist). That being said, please do not continue this conversation; there's no point in yet another thread going political.
You mean jobs?
 
The upcoming RTX 4080/90 lineup is obviously directed at enthusiast 4K gamers. I would assume these cards will deliver a very significant performance leap over the previous generation high-end GPUs at the cost of a markedly higher TDP, and they will be very expensive of course. The bulk of the customers however will buy mid-range cards, and so will I, provided they significantly outperform my current GTX 1080 Ti FE, have a good cooling solution with a two-slot design, and a TDP of around 300 W. I'll have my eyes on the RTX 4070 series, and my hope is some custom designs (or maybe an FE type of card) will be brought to market that fit the bill.
 
Does America have the same energy crisis as the UK? A guy mining stopped when he got a £500/month electric bill ($660).

If I ran the GPU in the UK without an undervolt or an FPS cap, I think I would pay more for electricity than for the card itself in the first year. :)

That's just dumb miners being dumb.

The price of electricity is literally in every mining calculator.

As far as gamers go, apparently UK electricity is up to 28p/kWh, which is insane, but even so, at 600 W that's about £20/month gaming 4 hours a day with a £3000 (?) GPU.

Given that people pay some stupid amount of money (north of £100/month) to watch Sky Sports, even when you deliberately buy the 'massively overpriced halo edition' GPU, the electricity, even in the post-Putin inflation hellscape, is not that expensive.
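That £20/month figure checks out; a quick sketch of the math, using the rate and hours from the post above:

[CODE]
# Verifying the ~GBP 20/month figure from above (Python).
gpu_kw = 0.6              # 600 W card
hours_per_day = 4
rate_gbp_per_kwh = 0.28   # 28p/kWh, as quoted above

monthly_kwh = gpu_kw * hours_per_day * 30
print(f"{monthly_kwh:.0f} kWh -> GBP {monthly_kwh * rate_gbp_per_kwh:.2f}/month")
# 72 kWh -> GBP 20.16/month
[/CODE]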
 