
NVIDIA GeForce RTX 5090 and RTX 5080 Reach Final Stages This Month, Chinese "D" Variant Arrives for Both SKUs

Oh boy, so the 4090 will remain king of the jungle for more than 24 months; the previous record was the 2080 Ti, at 24 months.

Grabbing the 5090 will get me set for 2+ years :cool:

My 2080ti had way more class than your 4090. :D

(I'm sure you also had a 2080ti).
 
Everyone will go ape over the TDP but forget how crazy efficient NVIDIA is.

I run my RTX 4070 Ti at 175 W (default: 265 W) and only lost 5-6% performance.
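Plugging those numbers in, a rough back-of-the-envelope sketch (using ~5.5% as the midpoint of the 5-6% loss above, not a benchmark):

```python
# Rough perf-per-watt comparison for a power-capped RTX 4070 Ti,
# using the figures quoted above: ~5.5% performance loss at 175 W vs 265 W stock.
stock_power = 265.0   # W, default power limit
capped_power = 175.0  # W, manual power cap
capped_perf = 0.945   # relative performance after capping (stock = 1.0)

stock_eff = 1.0 / stock_power           # performance per watt at stock
capped_eff = capped_perf / capped_power # performance per watt when capped

gain = capped_eff / stock_eff - 1.0
print(f"perf/W improvement from capping: {gain:.0%}")  # roughly 43%
```

So a ~34% power cut costs only ~5.5% performance but buys roughly 43% better perf/watt.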

You are conflating theoretical maximum efficiency with actual real-world efficiency. Most people run stock, which means their 5090 will be guzzling 600 W+ in games.

I run my 4090 power capped but I'm under no illusion that it's something most customers will do.
 
You are conflating theoretical maximum efficiency with actual real-world efficiency. Most people run stock, which means their 5090 will be guzzling 600 W+ in games.

I run my 4090 power capped but I'm under no illusion that it's something most customers will do.
Even the 4090, which people like to call a 600 W card since the power connector can handle up to 600 W, uses 350 W in games at stock.



I doubt the 5090 will be much different, maybe 450 W.
 
Even the 4090, which people like to call a 600 W card since the power connector can handle up to 600 W, uses 350 W in games at stock.


I doubt the 5090 will be much different, maybe 450 W.

Exactly. Most of us will rarely (if ever) even get close to peak power consumption. I, for example, have a 43" ASUS ROG monitor at 4K/144 Hz, and I limit my fps in games to 120 whenever possible. Otherwise, I have V-Sync enabled globally, which is NVIDIA's recommended setting for G-Sync (compatible) displays, i.e. I always stay within the sync range of 144 Hz max.

The RTX 5090 will reach those 120 or 144 fps even more easily than the RTX 4090, so it will be a beast in terms of both performance and efficiency. The theoretical 600 W peak power consumption, which is already possible on some RTX 4090 custom designs (not mine; the MSI Suprim X has a 520 W limit), is of very little practical relevance.
 
Half a year. Closer to a third, really; CES is early January. And this is in line with what we've heard about the 5000 series being planned for early 2025. It makes some amount of sense: NV doesn't have real competition to rush them along; the 4000 series still sells well, and clearing existing inventory before launching new cards is prudent; and for now, NV is probably more keen on using fab allocation for enterprise Blackwell accelerators to fully capitalize on that demand.
You're right, I had the wrong event in my head when I wrote that, so it's likely sometime in Q1.

Even the 4090, which people like to call a 600 W card since the power connector can handle up to 600 W, uses 350 W in games at stock.


I doubt the 5090 will be much different, maybe 450 W.
I mean, sure, but that's still pretty high, and it's also going to vary based on the game. 450 watts is still a lot when you think about it, especially when you need to cool it. Then again, I think power consumption on the top end gets overblown constantly; if you're buying this card, you probably only care about the extra 5 FPS you can get over everyone else, not how few watts it can sip. I only ever complain about power consumption when it becomes too hard to keep the card cool and it gets held back because of it.
 
You're right, I had the wrong event in my head when I wrote that, so it's likely sometime in Q1.


I mean, sure, but that's still pretty high, and it's also going to vary based on the game. 450 watts is still a lot when you think about it, especially when you need to cool it. Then again, I think power consumption on the top end gets overblown constantly; if you're buying this card, you probably only care about the extra 5 FPS you can get over everyone else, not how few watts it can sip. I only ever complain about power consumption when it becomes too hard to keep the card cool and it gets held back because of it.
"5 FPS".

Approx 50% faster than the (more expensive) 3090 Ti.



That's while using 100 W less than the 3090 Ti BTW.



I don't expect quite so much of a jump from the 4090 to the 5090, since it's not getting a major node improvement, but there's still a lot to gain from architecture. And I do get tired of the "600 W" label when stock gaming draw is closer to half that, and of the 'minuscule differences you can't notice' criticism of halo products.
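To put numbers on that gen-over-gen jump, here's a quick sketch using the round figures discussed in this thread (~50% faster, with the 4090 at ~350 W and the 3090 Ti at roughly 450 W in gaming; the 450 W figure is my assumption from the charts, not an exact measurement):

```python
# Implied perf-per-watt jump from 3090 Ti to 4090, assuming the round
# numbers above: ~50% faster while drawing ~100 W less in games.
perf_ratio = 1.50                        # 4090 gaming performance vs 3090 Ti
power_3090ti, power_4090 = 450.0, 350.0  # W, approximate stock gaming draw

eff_ratio = perf_ratio * (power_3090ti / power_4090)
print(f"4090 perf/W vs 3090 Ti: {eff_ratio:.2f}x")  # ~1.93x
```

In other words, under those assumptions the 4090 nearly doubled perf/watt over the 3090 Ti.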
 
The gap between the 4090 and the 4080 was already quite big. But 24K vs 10K? That's insane.
 
Sure, but it's still not "600 W".

The gap between the 4090 and the 4080 was already quite big. But 24K vs 10K? That's insane.
Half the memory bus too, 256-bit vs 512-bit, but these are just rumours. I strongly suspect those are full-die numbers and the actual chips will be cut down, so the difference won't be as significant.

Same as 4090 vs 4080: the 4080 Super uses the full die, but the 4090 isn't even close.
 
I don't think even the factory overclocked 4090 variants use 600 W at stock.
Yes, despite the hyperbole people like to use to emphasise a non-existent point about what they consider "efficiency". The waterblocked ROG Matrix, a $3k card, uses 440 W instead of the FE's 411 W as an example, which isn't even 10% more power, for about 4% more speed, and that's about as extreme as it gets for OC'd cards.
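Running those Matrix numbers through, a quick sketch (figures as quoted above) shows the OC actually trades efficiency for speed:

```python
# Perf-per-watt of the OC'd ROG Matrix vs the 4090 FE,
# using the figures above: 440 W vs 411 W for ~4% more speed.
matrix_power, fe_power = 440.0, 411.0  # W
matrix_perf = 1.04                     # relative to FE = 1.0

power_increase = matrix_power / fe_power - 1.0       # fraction more power drawn
eff_ratio = matrix_perf / (matrix_power / fe_power)  # perf/W relative to FE

print(f"power increase: {power_increase:.1%}")  # ~7.1%
print(f"perf/W vs FE: {eff_ratio:.3f}")         # ~0.971, i.e. ~3% worse
```

So even the most extreme factory OC costs only ~3% perf/watt, nowhere near the "600 W" caricature.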
 
"5 FPS".

Approx 50% faster than the (more expensive) 3090 Ti.


That's while using 100 W less than the 3090 Ti BTW.


I don't expect quite so much of a jump from the 4090 to the 5090, since it's not getting a major node improvement, but there's still a lot to gain from architecture. And I do get tired of the "600 W" label when stock gaming draw is closer to half that, and of the 'minuscule differences you can't notice' criticism of halo products.
I never said 600 W or even mentioned it. The 5 FPS remark was just saying that people generally want the extra performance and aren't expecting best-in-class efficiency when purchasing a halo product. Still, we won't know how much it draws at peak while gaming until it comes out.
 
I never said 600 W or even mentioned it. The 5 FPS remark was just saying that people generally want the extra performance and aren't expecting best-in-class efficiency when purchasing a halo product. Still, we won't know how much it draws at peak while gaming until it comes out.
If you scroll up you'll see 600 W mentioned quite a few times.
 
The more frames, the higher the consumption. I've been turning on V-Sync in every game for a long time now. When the frame rate runs over 200 or 300, GPU power draw is over 300 W; with V-Sync on, it stays below 100 W. So what does that tell us? Is there really a need for more than 60 fps? How many frames per second is the human eye even capable of processing?
 
The more frames, the higher the consumption. I've been turning on V-Sync in every game for a long time now. When the frame rate runs over 200 or 300, GPU power draw is over 300 W; with V-Sync on, it stays below 100 W. So what does that tell us? Is there really a need for more than 60 fps? How many frames per second is the human eye even capable of processing?
More than you think. The eye/brain system works somewhat in parallel: each cluster of "sensors" within the eye can only see up to a certain "framerate", but the brain processes many clusters at the same time, so to the brain the effective framerate can be much higher than what the eye "hardware" can strictly do, because it's working with more than a single input.

But you're right about locking the max frame rate to your monitor's max refresh rate; it makes for much lower input lag anyway.
 
Yes, despite the hyperbole people like to use to emphasise a non-existent point regarding what they consider "efficiency". The waterblocked ROG Matrix $3k card as an example uses 440 W instead of 411 W (FE), which isn't even 10% more power, for about 4% more speed, and that's about as extreme as it gets for OC'd cards.

I'm not sure who's arguing that they're 600 W cards, but yeah, they aren't. The 5090 might be, though; we'll have to wait and see.

The more frames, the higher the consumption. I've been turning on V-Sync in every game for a long time now. When the frame rate runs over 200 or 300, GPU power draw is over 300 W; with V-Sync on, it stays below 100 W. So what does that tell us? Is there really a need for more than 60 fps? How many frames per second is the human eye even capable of processing?

You should cap your FPS at whatever you feel comfortable with. There's a lot of person-to-person variation in sensitivity to motion smoothness: some people are fine with 60 FPS while others want 500+.
 
HMMMMMM..... I find this MOST interesting on this sort of market speak... I wonder this has anything to do with the decline of stock prices OF LATE... SINCE NGREEDIA IS A A....I.... COMPANY NOW....
Yup I can see the GREENIES who drank the green coolaid going for the biggest and bestest and AWESOMER NEWER CARD, BASICALLY they fail fail on the basic needs of life. But they sure get their new shiny!


These are the people that give NGREEDIA the money to continue the mantra of....
THE MOAR YOU BUY.... THE MOAR YOU SAVE...SAVE...SAVE...SAVE...SAVE...
 
HMMMMMM..... I find this MOST interesting on this sort of market speak... I wonder this has anything to do with the decline of stock prices OF LATE... SINCE NGREEDIA IS A A....I.... COMPANY NOW....
Yup I can see the GREENIES who drank the green coolaid going for the biggest and bestest and AWESOMER NEWER CARD, BASICALLY they fail fail on the basic needs of life. But they sure get their new shiny!


These are the people that give NGREEDIA the money to continue the mantra of....
THE MOAR YOU BUY.... THE MOAR YOU SAVE...SAVE...SAVE...SAVE...SAVE...
Ah, hello there No-Bark. Good to see you're still kicking.

In any case, I can concede that the newer cards are going to be more 'efficient' in terms of joules per unit of work, but re: the argument above, it shouldn't be the consumer's responsibility to closely manage what actually matters for their living expenses and indoor comfort: raw power draw/heat output. The further GPU manufacturers push the power ceiling on their products in this slow, creeping fashion, the worse it gets. That's still a bad thing.

In an ideal world, a new arch + new node would mean a significant (if merely 'generational') bump in performance on the same power budget at stock, card-for-card, not a HUGE jump in performance for a slightly less huge jump in power draw. And that would be the selling point. This isn't a datacenter with industrial chillers and 240 V/100 A wall outlets. This is a small indoor room on a tiny sliver of a 3-ton AC unit's capacity and a 120 V circuit that pops if you edge above 15 A total per room. The ceiling is far lower.

I remember when people made fun of the GTX 480 and called it a space heater/George Foreman grill. That thing drew 250 W max. The 4070 Super is 30 W below that on tech a decade newer.

I'm very much for UV, but that's because I find it interesting how low you can push the silicon before it starts to drag its feet and because I live in a place that is very hot for half the year. I shouldn't be hearing complaints from my Michigander friend about how his 5080 makes him sweat in March. I shouldn't be telling him 'kick rocks, make your computer slower'.
 
The RTX 4080 is $1200. Even without the AI craze, Jensen told us there is no Moore's Law any more; any increase in performance will bring an increase in price.

You can take this as a rough guide:

$1200 + 50% = $1800, and that doesn't even cover the inflation!
So you think the 5090 will be $7200?
 
HMMMMMM..... I find this MOST interesting on this sort of market speak... I wonder this has anything to do with the decline of stock prices OF LATE... SINCE NGREEDIA IS A A....I.... COMPANY NOW....
Yup I can see the GREENIES who drank the green coolaid going for the biggest and bestest and AWESOMER NEWER CARD, BASICALLY they fail fail on the basic needs of life. But they sure get their new shiny!


These are the people that give NGREEDIA the money to continue the mantra of....
THE MOAR YOU BUY.... THE MOAR YOU SAVE...SAVE...SAVE...SAVE...SAVE...

Stop yelling. We're not blind. :p
 
Exactly. Most of us will rarely (if ever) even get close to peak power consumption. I, for example, have a 43" ASUS ROG monitor at 4K/144 Hz, and I limit my fps in games to 120 whenever possible. Otherwise, I have V-Sync enabled globally, which is NVIDIA's recommended setting for G-Sync (compatible) displays, i.e. I always stay within the sync range of 144 Hz max.

The RTX 5090 will reach those 120 or 144 fps even more easily than the RTX 4090, so it will be a beast in terms of both performance and efficiency. The theoretical 600 W peak power consumption, which is already possible on some RTX 4090 custom designs (not mine; the MSI Suprim X has a 520 W limit), is of very little practical relevance.
Well, yeah, lock it at 120 fps and it will almost certainly use less power than a 4090. But a lot of people will just let it rip, and at 200 fps it'll be power hungry.

So, would people actually buy a 5090 that matched the 4090's performance at, say, half the power? No; IMO they want to see headlines reading "40% faster" and will ignore power consumption. AMD is getting hammered for basically doing this with Zen 5.
 
My 2080ti had way more class than your 4090. :D

(I'm sure you also had a 2080ti).
Class in GPUs died together with the Titan name, IMO. Even an x90 GeForce card is just a consumer product like any other.
 