
NVIDIA GeForce RTX 4090/4080 to Feature up to 24 GB of GDDR6X Memory and 600 Watt Board Power

I am still waiting for that nice Guru who will make a utility for us, so we can easily swap between DLSS versions for the games that support them. :)

I'm using DLSS Swapper, which does exactly what you want ("swap between DLSS versions for the games that support them with ease"), although it only works with Steam.
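For anyone curious what a tool like that actually does under the hood: it's basically just backing up and replacing the game's nvngx_dlss.dll. A minimal sketch, where the paths and the game folder name are only examples you'd adjust for your own setup:

```python
from pathlib import Path
import shutil

# Example paths only - point these at your own Steam library and your stash of DLSS DLLs.
STEAM_COMMON = Path(r"C:\Program Files (x86)\Steam\steamapps\common")
NEW_DLSS_DLL = Path(r"C:\dlss\2.5.1\nvngx_dlss.dll")

def swap_dlss(game_folder: str) -> None:
    """Back up a game's bundled nvngx_dlss.dll and replace it with the chosen version."""
    for dll in (STEAM_COMMON / game_folder).rglob("nvngx_dlss.dll"):
        backup = dll.with_name(dll.name + ".bak")
        if not backup.exists():
            shutil.copy2(dll, backup)    # keep the original so you can roll back
        shutil.copy2(NEW_DLSS_DLL, dll)  # overwrite with the chosen DLSS DLL
        print(f"Swapped {dll}")

swap_dlss("Cyberpunk 2077")  # example game folder name
```

The real tool does more than this (finding your installed games, fetching DLL versions for you); the above is just the core swap step.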

This is why the price will be much higher, but you could get a 4070 that will still beat a 3090 at 4K in pure rasterisation and easily beat it for RT. It's said the 4060 Ti will match the 3090 for RT.

People will need to drop down a tier this gen if they don't want sticker shock, but you'll still get much stronger performance.

I guess haters like to find things to complain about. The 4060 Ti could very well beat the 3090 @ 4K while using <250 W, offering superior efficiency. But hey, AD102 using 70% more power than GA102 while offering 100-150% more FPS means the whole uarch is bad according to some :rolleyes:.

Meanwhile I'm just chilling with my 3090 using < 300W @ 4K RT DLSS Balanced in new AAA games, but hey DLSS is bad according to some too.
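To put rough numbers on those rumoured figures (taking the 70% power / 100-150% FPS claims above at face value, nothing confirmed):

```python
# Back-of-the-envelope perf/W from the rumoured figures above - not confirmed numbers.
power_increase = 1.70                              # AD102 rumoured to use 70% more power than GA102
fps_increase_low, fps_increase_high = 2.00, 2.50   # "100-150% more FPS"

low = fps_increase_low / power_increase    # ~1.18x perf/W
high = fps_increase_high / power_increase  # ~1.47x perf/W
print(f"Rumoured perf/W gain: {low:.2f}x to {high:.2f}x")
```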
 
Bah, I have an Enermax MAX REVO 1500W PSU that should be able to handle this, though I ain't looking at any RTX 4000 series cards. I'd be looking at an RX 7000 series card; we'll see the difference in performance and price between the RX 7800 XT and the RX 7900 XT.
1.5 kW should be enough for a 4000 series. I have a 1 kW unit with 4x 8-pin, so I could run one, but I'm not looking forward to it.
 
My 550 W Seasonic Platinum PSU still has 8 years of its warranty left. I'm not going to bin it just because nvidia can't keep TDPs in check. Low spec gaming awaits! :cool:
 
Given the price of electricity is probably only going up, I don't see this 600 W feature being good for users... or nVidia.
 
There are purists here who think Native is the only way to enjoy games.

Yeah, good for them.... I turn on DLSS in pretty much every game that supports 2.0 or later..... On a 48 inch oled from a normal sitting distance it's impossible to tell it apart from native.... I don't mind it at 1440p either though. It's really only at 1080p where I would only use it if I had a potato 2060 trying to play Cyberpunk....

Not a huge fan of the high end getting close to 600 W though; my 3080 Ti at 400 W is already semi-obnoxious when it comes to keeping the room it's in cool as it is. Guessing they will end up actually being 450-500 W cards, just like the custom 3090s though.
 
I don't see why the power consumption is a big deal. I mean, in the past people were running 4-way SLI/CFX with OC'd 480s/580s or 5870s/5970s sucking down over 1 kW for the cards alone to get the most performance possible. Now that multi-GPU is sadly dead, I don't see how this is any different.
If you're wanting a 4080/4090 to get the best performance possible, the cost to run it likely doesn't matter to you.
 
Yeah, in some cases there's been a rise in the price of electricity here in Finland too. Though in my case, the provider actually informed me that my pricing will stay the same as before.


Yeah, good point! I remember when efficiency was almost the top concern back then. I especially remember when Core 2 Duo came out; it was indeed hella efficient compared to the previous P4/PD CPUs.
Yeah, all that efficiency talk stopped with Fermi, then came back in vogue briefly during Pascal, then was yeeted out the window soon after, in time for Turing. People in this space really only cared about efficiency when it was Nvidia that had it. Other brand is more efficient? Cool beans, they're still second place. Other brand seized the crown? Look at how power-hungry/inefficient that card is! You'd need a nuclear reactor.

People in this thread are griping about how much power it sucks down because in their heads they know it's going to cost more than they'd care to pay. If it was priced sanely, the conversation would be more along the lines of the usual "efficiency, who tf cares about efficiency? If you can afford the card you can afford to power an extra light bulb, or it's only a few cents on the electric bill". And it was even staff and moderators on this site saying things like that.
 
Yeah, all that efficiency talk stopped with Fermi, then came back in vogue briefly during Pascal, then was yeeted out the window soon after, in time for Turing. People in this space really only cared about efficiency when it was Nvidia that had it. Other brand is more efficient? Cool beans, they're still second place. Other brand seized the crown? Look at how power-hungry/inefficient that card is! You'd need a nuclear reactor.

People in this thread are griping about how much power it sucks down because in their heads they know it's going to cost more than they'd care to pay. If it was priced sanely, the conversation would be more along the lines of the usual "efficiency, who tf cares about efficiency? If you can afford the card you can afford to power an extra light bulb, or it's only a few cents on the electric bill". And it was even staff and moderators on this site saying things like that.
Utter rubbish.
 
And here I am, setting the GPU power limit to 75% and the frame limit to 58 FPS on my 3080 Ti, waiting to upgrade

Hopefully AMD can do it with 400 W so I can switch back.
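For what it's worth, the power-cap half of that can be scripted; a minimal sketch, assuming a 350 W stock limit on the 3080 Ti (the exact stock limit varies by board, and nvidia-smi needs admin/root for this):

```python
import subprocess

STOCK_LIMIT_W = 350                 # assumed 3080 Ti stock power limit
target = int(STOCK_LIMIT_W * 0.75)  # 75% cap as in the post above -> 262 W

# nvidia-smi -pl sets the board power limit in watts (requires admin/root).
subprocess.run(["nvidia-smi", "-pl", str(target)], check=True)
```

The 58 FPS cap itself is the sort of thing you'd set in the driver control panel or RTSS rather than from a script.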
 
So in other words the point of the article is "buy 3090s now while they're the closest they've been to their original MSRP in nearly two years' time and will work with your current PSU."
 
Who remembers when Nvidia said they were going to make video cards with more graphics power while using less electrical power? That was the pitch 5 years ago, and their roadmap showed less and less power-hungry video cards with more graphics output. Strange how that all went into the trash can.
 
I don't see why the power consumption is a big deal. I mean, in the past people were running 4-way SLI/CFX with OC'd 480s/580s or 5870s/5970s sucking down over 1 kW for the cards alone to get the most performance possible. Now that multi-GPU is sadly dead, I don't see how this is any different.
If you're wanting a 4080/4090 to get the best performance possible, the cost to run it likely doesn't matter to you.
Yes, but those were more the exception than the rule. I ran CF and SLI setups through the years (the last being 2x GTX Titan and 2x Vega 64, in two different rigs that I was running at the same time), and have been using a 1 kW PSU since getting a SilverStone OP1000 a decade or more back.
1.5 kW should be enough for a 4000 series. I have a 1 kW unit with 4x 8-pin, so I could run one, but I'm not looking forward to it.
Heck, it's possible all the PSUs I have are able to do it - Seasonic X-1250, Corsair HX1050, Corsair HX1000 (Platinum) and my spare Enermax MAX REVO 1500W. These have at least 4x 8-pin PCIe power outputs each; I think the MAX REVO has 6x PCIe (not sure though).
 
600 watts = 4x 8-pin connectors from a PSU, so you probably need a PSU upgrade just to run one.
That's one whole PSU right there. They should just bundle it with an external power adapter like a laptop's.
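Quick sanity check on that connector math, using the PCIe spec limits (150 W per 8-pin connector, 75 W from the slot):

```python
# PCIe power budget sanity check - these are spec limits, not what a card actually draws.
EIGHT_PIN_W = 150   # PCIe spec limit per 8-pin connector
SLOT_W = 75         # PCIe x16 slot limit

connectors = 4
print(f"{connectors}x 8-pin = {connectors * EIGHT_PIN_W} W "
      f"(+{SLOT_W} W from the slot = {connectors * EIGHT_PIN_W + SLOT_W} W total)")
```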
 
The 4090 has already been tested:


Dunno about the 4080 ...

LMAO, love this guy. He's also already leaked the unknown RTX 4090 Ti card.
damn, we really need a private nuclear silo right in front of our house.
 
LMAO, love this guy. He's also already leaked the unknown RTX 4090 Ti card.
damn, we really need a private nuclear silo right in front of our house.
That isn't the rumored RTX 4090 Ti. I'm sorry, but that's totally wrong! I've heard rumors that the leather man himself is gonna call it the RTX 4090 Nuclear Edition.
 
The latest rumors say the 4090 can be 2-2.5x faster than the 3090 at 4K; that would be insane for a one-generation leap.
Those rumours and early results are always about stupid scenarios: oh yes, it's twice as fast - at 8K with RTX on and DLSS (2 FPS to 4 FPS).


This is also going to be something stupid like a 600 W-capable power connector, not that the GPUs actually use all of it.
 
600 W. That's a nice toaster. I like toast, but not made of silicon. Double the cores and double the power. Not much of an improvement to me.
Btw, it's just a rumor. I'm gonna make popcorn, sit tight and see how many dead bodies fall out of the closet :)
 
600 watt capable is not 600 watt usage, we all agree on that.
However, why would one make it 600 watt capable if not to at least come close to it?

It is getting too high, and I've said it before: it's barely technological progress if, yes, we can do more, but it also costs more energy to achieve.
Sure, the performance per watt might go up... but clearly not enough if the wattage has to scale up like this constantly.
 
No hidden surprises? I see I need to install a certificate, and the program uses a silent updater; not sure how I feel about that. :)

I'm using DLSS Swapper, which does exactly what you want ("swap between DLSS versions for the games that support them with ease"), although it only works with Steam.
 
The 4090 has already been tested:


Dunno about the 4080 ...
What a great YouTube channel! I really like the retro Voodoo-inspired cards. He must put a lot of labor into these replicas.
 
600 watt capable is not 600 watt usage, we all agree on that.
However, why would one make it 600 watt capable if not to at least come close to it?

It is getting too high, and I've said it before: it's barely technological progress if, yes, we can do more, but it also costs more energy to achieve.
Sure, the performance per watt might go up... but clearly not enough if the wattage has to scale up like this constantly.
When I read the OP's article about the rumor, or whatever it is, it says maximum board power, not cable requirements.
 
Does America have the same energy crisis as the UK? A guy who was mining stopped when he got a £500-a-month electric bill ($660).

If I ran the GPU in the UK without an undervolt and with no FPS cap, I think I would pay more in electricity than the card cost in the first year. :)
Electricity is cheap and plentiful here on the eastern side of the US. I pay well under 10c per kWh. My usage is significantly higher than other homes in my development and I've never heard a peep from the supplier or line provider.
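If anyone wants to put rough numbers on the running cost, here's a back-of-the-envelope sketch; the 4 hours/day of gaming is an assumption, the ~$0.10/kWh is from the post above, and the UK rate is only an illustrative figure:

```python
# Rough running-cost estimate for a 600 W card; hours/day and the UK rate are assumptions.
board_power_kw = 0.600
hours_per_day = 4        # assumed gaming time
us_rate = 0.10           # $/kWh, roughly the figure from the post above
uk_rate = 0.34           # £/kWh, illustrative only

kwh_per_month = board_power_kw * hours_per_day * 30
print(f"~{kwh_per_month:.0f} kWh/month")
print(f"US: ~${kwh_per_month * us_rate:.2f}/month")
print(f"UK: ~£{kwh_per_month * uk_rate:.2f}/month")
```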
 
Yeah, all that efficiency talk stopped with Fermi, then came back in vogue briefly during Pascal, then was yeeted out the window soon after, in time for Turing. People in this space really only cared about efficiency when it was Nvidia that had it. Other brand is more efficient? Cool beans, they're still second place. Other brand seized the crown? Look at how power-hungry/inefficient that card is! You'd need a nuclear reactor.

People in this thread are griping about how much power it sucks down because in their heads they know it's going to cost more than they'd care to pay. If it was priced sanely, the conversation would be more along the lines of the usual "efficiency, who tf cares about efficiency? If you can afford the card you can afford to power an extra light bulb, or it's only a few cents on the electric bill". And it was even staff and moderators on this site saying things like that.

Efficiency is, and always was, a thing every single day that we still had new nodes ahead of us. Efficiency, at its core, determines how far you can push a node without deviating from whatever the norm in the market is, across numerous aspects of the product: price, performance, heat, power, etc. Efficiency is also IPC. It is die size. It is feature set. Etc.

Now that the gain from a node shrink is diminishing, suddenly efficiency is out the window and we ride the marketing wave to convince ourselves we're progressing, even though there was never a day and age where hardware could last longer than today, simply because we're down to baby steps at best.

The reality is, we're not progressing, we're regressing when GPUs need substantially more power to produce playable frames in gaming. What's underneath that is really not all that relevant. The big picture is that in a world where efficiency is key to survival (literally), we're buying into products that counter that idea. It won't last, it can't last, and the deeper we go down that rabbit hole, the bigger the hurdle to get out of it again. RT is diving straight into that hole. DLSS/FSR are clawing us back out a little bit, but the net result is still that we're in need of more power to drive content.

It's not progress. What is progress is the fact that GPUs keep getting faster, sure. Another bit of progress is that you can still get or create an efficient GPU; I can understand that notion too. But the commercial reality is none of those things, as @Assimilator correctly states, because quite simply commerce is still all about more, bigger, faster, stronger, fuck all consequences, as long as we keep feeding the top 3%. The question is how far your lemming mind wants to go before diving off the cliff.

The bottom line and fundamental question here is: are you part of the 3%? If you're not, you're an idiot for feeding them further. Vote with your wallet, or die - or present a horrible future to your children.

And that principle stands in quite a few of our current commercial adventures. The responsibility to change is on us, no one else.
 