Thursday, September 5th 2024

NVIDIA GeForce RTX 5090 and RTX 5080 Reach Final Stages This Month, Chinese "D" Variant Arrives for Both SKUs

NVIDIA is on the brink of finalizing its next-generation "Blackwell" graphics cards, the GeForce RTX 5090 and RTX 5080. Sources close to BenchLife indicate that NVIDIA is targeting September to finalize the design specifications of both models. This timeline hints at a possible unveiling at CES 2025, with a market release shortly after. The RTX 5090 is rumored to carry a staggering 550 W TGP, a significant 22% increase over its predecessor, the RTX 4090. Meanwhile, the RTX 5080 is expected to draw 350 W, a more modest 9.4% bump over the current RTX 4080. Interestingly, NVIDIA appears to be developing "D" variants of both cards, likely tailored for the Chinese market to comply with export regulations.

Regarding raw power, the RTX 5090 is speculated to feature 24,576 CUDA cores paired with 512-bit GDDR7 memory. The RTX 5080, while less mighty, is still expected to pack a punch with 10,752 CUDA cores and 256-bit GDDR7 memory. As NVIDIA prepares to launch these powerhouses, rumors suggest the RTX 4090D may be discontinued by December 2024, paving the way for its successor. We are curious to see how power consumption is handled and how efficiently these cards perform within the higher power envelope. Some rumors indicate that the RTX 5090 could reach 600 watts at its peak, with the RTX 5080 reaching 400 watts; however, that is just a rumor for now. As always, until NVIDIA makes an official announcement, these details should be taken with a grain of salt.
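For reference, the quoted increases follow directly from the official TGPs of the current cards (450 W for the RTX 4090, 320 W for the RTX 4080). A quick sketch of the arithmetic, with the rumored Blackwell figures plugged in as unconfirmed assumptions:

```python
# Rumored Blackwell figures from this article (unconfirmed).
rumored = {
    "RTX 5090": {"tgp_w": 550, "cuda_cores": 24576},
    "RTX 5080": {"tgp_w": 350, "cuda_cores": 10752},
}
# Official TGPs of the current Ada Lovelace cards.
current_tgp_w = {"RTX 4090": 450, "RTX 4080": 320}

for new, old in (("RTX 5090", "RTX 4090"), ("RTX 5080", "RTX 4080")):
    bump = (rumored[new]["tgp_w"] - current_tgp_w[old]) / current_tgp_w[old]
    print(f"{new}: {rumored[new]['tgp_w']} W (+{bump:.1%} vs. {old})")

ratio = rumored["RTX 5090"]["cuda_cores"] / rumored["RTX 5080"]["cuda_cores"]
print(f"Rumored core-count gap: {ratio:.2f}x")
# RTX 5090: 550 W (+22.2% vs. RTX 4090)
# RTX 5080: 350 W (+9.4% vs. RTX 4080)
# Rumored core-count gap: 2.29x
```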
Sources: BenchLife, via Wccftech

88 Comments on NVIDIA GeForce RTX 5090 and RTX 5080 Reach Final Stages This Month, Chinese "D" Variant Arrives for Both SKUs

#1
TheDeeGee
Everyone will go ape about the TDP, but forget how crazy efficient NVIDIA is.

I run my RTX 4070 Ti at 175 W (default 265 W) and only lose 5-6% performance.
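For anyone wanting to replicate that kind of cap: a minimal sketch using the NVML Python bindings (the nvidia-ml-py package), assuming a single GPU at index 0; setting the limit requires administrator/root rights, and the 175 W target simply mirrors the figure above. The common CLI equivalent is nvidia-smi -pl 175.

```python
# Minimal sketch: cap a GeForce card's power limit via NVML (nvidia-ml-py).
# Assumes GPU index 0; setting the limit requires admin/root rights.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementLimitConstraints,
    nvmlDeviceSetPowerManagementLimit,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)
    min_mw, max_mw = nvmlDeviceGetPowerManagementLimitConstraints(handle)
    target_mw = 175_000  # 175 W, the cap quoted in the comment above
    if min_mw <= target_mw <= max_mw:
        nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        print(f"Power limit set to {target_mw // 1000} W")
    else:
        print(f"Supported range is {min_mw // 1000}-{max_mw // 1000} W")
finally:
    nvmlShutdown()
```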
#2
Vayra86
TheDeeGee: Everyone will go ape about the TDP, but forget how crazy efficient NVIDIA is.

I run my RTX 4070 Ti at 175 W (default 265 W) and only lose 5-6% performance.
Ada was fine, Ampere was certainly not efficient, and Blackwell seems to be working along the lines of Ada.

But it does have a TDP bump. So it uses more power. Stop kidding yourself. The fact you can undervolt a GPU does not mean it's more efficient, you just limited it. I'm running my 7900 XT efficiently too, but it can still guzzle north of 300 W, and you're dreaming if you think you only lose 5-6% in worst-case scenarios. It depends more on the workload and where game X or Y ends up on your V/F curve. You just don't notice it much.
#3
dgianstefani
TPU Proofreader
Vayra86: Ada was fine, Ampere was certainly not efficient, and Blackwell seems to be working along the lines of Ada.
Compared to what? The competition that could draw over 600 watts and was still slower? And that uses 30-50 W more in V-Sync, video playback, and multi-monitor (still not fixed).
Vayra86: But it does have a TDP bump. So it uses more power. Stop kidding yourself. The fact you can undervolt a GPU does not mean it's more efficient, you just limited it. I'm running my 7900 XT efficiently too, but it can still guzzle north of 300 W, and you're dreaming if you think you only lose 5-6% in worst-case scenarios.
More power is irrelevant; work done per power used is the only metric for efficiency. Ada is also extremely efficient at stock, but undervolts well too.
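As an illustration of that metric: efficiency here is just work (frames) per watt, so a higher-TDP card can still come out ahead. A toy example with made-up numbers (hypothetical, not measurements):

```python
# Toy illustration of "work done / power used": FPS per watt.
# The numbers are hypothetical, purely to show how the metric behaves.
def fps_per_watt(avg_fps: float, avg_power_w: float) -> float:
    return avg_fps / avg_power_w

big_card = fps_per_watt(avg_fps=160, avg_power_w=350)   # ~0.46 FPS/W
small_card = fps_per_watt(avg_fps=45, avg_power_w=150)  # ~0.30 FPS/W

# Despite drawing more than twice the power, the bigger card does
# more work per watt, i.e. it is the more efficient of the two.
print(f"Big card: {big_card:.2f} FPS/W, small card: {small_card:.2f} FPS/W")
```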
#4
Vayra86
dgianstefani: More power is irrelevant; work done per power used is the only metric for efficiency. Ada is also extremely efficient at stock, but undervolts well too.
Of course it's not irrelevant: power is heat and $$$; efficiency is how it translates to FPS. The fact is, if you have more performance under the hood you will use it and therefore use more power. But you're still just gaming.

And that's fine. But stop living in denial. TDP go up, your power bill go up, simple.
#5
dgianstefani
TPU Proofreader
Vayra86: Of course it's not irrelevant: power is heat and $$$; efficiency is how it translates to FPS. The fact is, if you have more performance under the hood you will use it and therefore use more power. But you're still just gaming.
It's completely irrelevant when you're talking about efficiency. The 350 W 4080S is much more efficient than the 150 W 6600XT. A 1000 W GPU could be the most efficient in the world.
#6
Vayra86
dgianstefani: It's completely irrelevant when you're talking about efficiency. The 350 W 4080S is much more efficient than the 150 W 6600XT. A 1000 W GPU could be the most efficient in the world.
But I'm not, I'm clearly talking about thermal design power and how the bar is raised from gen to gen, and how this results in effectively using more power to do a similar task, regardless of increased efficiency.

It comes down to the simple idea that you do not upgrade to play the same content at the same settings as you used to. You upgrade to play content at higher settings or FPS than you could before. But you're still just gaming. Ergo, the increased efficiency almost never leads to power saving, and far more likely leads to using more power.
#7
dgianstefani
TPU Proofreader
Vayra86: But I'm not, I'm clearly talking about thermal design power and how the bar is raised from gen to gen, and how this results in effectively using more power to do a similar task, regardless of increased efficiency.
I count efficiency several times in your wording, not thermal design power.

Power for the same task is the metric here. Let's not move the goalposts to similar.

Not to mention, a more efficient GPU will use less power in a V-Sync 60 Hz situation, which is the most directly comparable, regardless of TDP.
#8
Vayra86
dgianstefani: I count efficiency several times in your wording, not thermal design power.

Power for the same task is the metric here. Let's not move the goalposts to similar.
I'll set my own goalposts, as I have since my first response. Read carefully. We don't disagree. You just gloss over the fact that better GPUs are not bought to save power, but rather to play games at higher settings.
#9
dgianstefani
TPU Proofreader
Vayra86: The fact is, if you have more performance under the hood you will use it and therefore use more power.
This is categorically false.

[power draw comparison charts]

As you can see, (much) slower GPUs often use (much) more power.
#10
FoulOnWhite
People who buy these don't care about efficiency; they care about having the best and gaming at the highest settings they can. I don't get people buying a 4090 and undervolting it: you bought the best GPU NVIDIA makes and then strangle it so it uses less power. If you cared about power, you'd have a GTX 1060 in your rig, not one that sucks 600 W. It's like vegans saying "I don't use any part of an animal" while wearing leather boots.

If you can afford a 5090, you can surely afford to pay the power bill. Ultra gaming PCs are not built to be efficient; they are supposed to be powerhouses meant to monster the latest games at very high resolutions and refresh rates.
#11
AusWolf
What's with the big gap? I mean, the 5090 will have 2.3x as many cores as the 5080? Why?
#12
dgianstefani
TPU Proofreader
FoulOnWhite: People who buy these don't care about efficiency; they care about having the best and gaming at the highest settings they can. I don't get people buying a 4090 and undervolting it: you bought the best GPU NVIDIA makes and then strangle it so it uses less power. If you cared about power, you'd have a GTX 1060 in your rig, not one that sucks 600 W. It's like vegans saying "I don't use any part of an animal" while wearing leather boots.
Believe it or not, even at stock you can have the fastest AND the most efficient; the two tend to go hand in hand in TDP-limited scenarios, which is most of them. The mark of a good architecture is efficiency.

But yes, this is why Raptor Lake sold well: it's extremely fast, and people don't really care about CPU power draw unless they're overheating, the power draw is getting in the way of performance, or they're using a laptop.
#13
TheDeeGee
Vayra86: Ada was fine, Ampere was certainly not efficient, and Blackwell seems to be working along the lines of Ada.

But it does have a TDP bump. So it uses more power. Stop kidding yourself. The fact you can undervolt a GPU does not mean it's more efficient, you just limited it. I'm running my 7900 XT efficiently too, but it can still guzzle north of 300 W, and you're dreaming if you think you only lose 5-6% in worst-case scenarios. It depends more on the workload and where game X or Y ends up on your V/F curve. You just don't notice it much.
Tested in various games (even RTX Remix mods with Path Tracing), and it's always 5-6%.
#14
dgianstefani
TPU Proofreader
TheDeeGee: Tested in various games (even RTX Remix mods with Path Tracing), and it's always 5-6%.
In line with my own testing. Ampere doesn't UV as well as Ada though, since GDDR6X is hungry and the Samsung node isn't as good as TSMC's. I suspect the RDNA 3 cards don't UV well either, due to the multi-chiplet design.
#15
Legacy-ZA
AusWolf: What's with the big gap? I mean, the 5090 will have 2.3x as many cores as the 5080? Why?
Well, because there will be in-betweens: the 5080 Ti, the 5080 Super, the 5080 Super Duper, the 5080 Hyper Speed, the 5080 GTFO, etc. :roll:
#16
Caring1
FoulOnWhite: People who buy these don't care about efficiency; they care about having the best and gaming at the highest settings they can. I don't get people buying a 4090 and undervolting it: you bought the best GPU NVIDIA makes and then strangle it so it uses less power. If you cared about power, you'd have a GTX 1060 in your rig, not one that sucks 600 W. It's like vegans saying "I don't use any part of an animal" while wearing leather boots.

If you can afford a 5090, you can surely afford to pay the power bill. Ultra gaming PCs are not built to be efficient; they are supposed to be powerhouses meant to monster the latest games at very high resolutions and refresh rates.
Why not both?
#17
Bwaze
I'm not worried about power consumption - the way this is going, the RTX 50X0 generation won't be targeted at gamers at all, except for maybe the lowest-tier cards - this is all for the new saviour of revenue, AI! I expect prices to be even worse than during the crypto madness.
#18
Dr. Dro
That seems like such a wide gulf between the 5090 and 5080; unless the 5080 is a quarter of the price, it won't seem worth it.
#19
64K
FoulOnWhite: People who buy these don't care about efficiency; they care about having the best and gaming at the highest settings they can. I don't get people buying a 4090 and undervolting it: you bought the best GPU NVIDIA makes and then strangle it so it uses less power. If you cared about power, you'd have a GTX 1060 in your rig, not one that sucks 600 W. It's like vegans saying "I don't use any part of an animal" while wearing leather boots.

If you can afford a 5090, you can surely afford to pay the power bill. Ultra gaming PCs are not built to be efficient; they are supposed to be powerhouses meant to monster the latest games at very high resolutions and refresh rates.
Agreed. I'm wanting a 5090 and electricity usage isn't even on my list of concerns. Even if it draws 500 watts constantly in gaming (which it won't) I game about 20 hours a week on average. At 14 cents per kWh for me that comes to $6 a month. Probably the real cost will be much less but even worst case I'm looking at buying a $2,000 card so $4-$6 a month is trivial all things considered. I may have to deny myself 1 cup of Starbucks coffee a month to cover that extra expense. :p
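The arithmetic checks out. A quick reproduction of that back-of-envelope estimate, using the figures from the comment above:

```python
# Reproduce the back-of-envelope electricity estimate above.
draw_w = 500            # assumed worst-case gaming draw, in watts
hours_per_week = 20
price_per_kwh = 0.14    # USD
weeks_per_month = 52 / 12

kwh_per_month = (draw_w / 1000) * hours_per_week * weeks_per_month
cost_per_month = kwh_per_month * price_per_kwh
print(f"{kwh_per_month:.1f} kWh/month -> ${cost_per_month:.2f}/month")
# 43.3 kWh/month -> $6.07/month
```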
#20
Caring1
64K: Agreed. I'm wanting a 5090 and electricity usage isn't even on my list of concerns. Even if it draws 500 watts constantly in gaming (which it won't) I game about 20 hours a week on average. At 14 cents per kWh for me that comes to $6 a month. Probably the real cost will be much less but even worst case I'm looking at buying a $2,000 card so $4-$6 a month is trivial all things considered. I may have to deny myself 1 cup of Starbucks coffee a month to cover that extra expense. :p
You have extremely cheap electricity; mine is triple that at least. Oops, I meant over double, not quite triple yet, despite the constant increases.
#21
Shou Miko
I also saw on X an RTX 4090D and an RTX 4080 variant with double the amount of memory.

I didn't think NVIDIA allowed board partners to do that, but I don't remember if it was a "custom" Chinese job.
#22
close
Dr. Dro: That seems like such a wide gulf between the 5090 and 5080; unless the 5080 is a quarter of the price, it won't seem worth it.
You need to know both prices before determining which one is worth it.
#23
AusWolf
Legacy-ZA: Well, because there will be in-betweens: the 5080 Ti, the 5080 Super, the 5080 Super Duper, the 5080 Hyper Speed, the 5080 GTFO, etc. :roll:
Why not just call them 5081, 5082, 5083, etc.? Zero is not the only number in existence, surely. :roll:
#24
64K
Caring1: You have extremely cheap electricity; mine is triple that at least.
Damn, well I guess the electricity bill is a sore spot for you anyway. Here in the US the average is 16.4 cents per kWh. Mine is cheaper due to hydroelectric power plants in the region.
#25
Legacy-ZA
AusWolf: Why not just call them 5081, 5082, 5083, etc.? Zero is not the only number in existence, surely. :roll:
I am sure leather jacket man Jensen can't wait to upsell a 4050 (3050) as a 5080, a 4060 (3060) as a 5081, etc. etc. xD