
NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

AleksandarK

News Editor
Staff member
Joined
Aug 19, 2017
Messages
3,081 (1.09/day)
According to two of the most accurate leakers, kopite7kimi and hongxing2020, NVIDIA's GeForce RTX 5090 and RTX 5080 will feature 575 W and 360 W TDPs, respectively. Previous rumors pointed to 600 W and 400 W TGPs for these SKUs. TGP (total graphics power) covers the entire card, including memory and everything else on the board, whereas TDP (thermal design power) is a more specific value attributed to the GPU die alone. According to the latest leaks, 575 Watts are dedicated to the GB202-300-A1 GPU die in the GeForce RTX 5090, while 25 Watts are reserved for GDDR7 memory and other components on the PCB.

For the RTX 5080, the GB203-400-A1 chip is supposedly drawing 360 Watts alone, while 40 Watts are set aside for GDDR7 memory and other components on the PCB. The lower-end RTX 5080 reserves more power for memory than the RTX 5090 because its GDDR7 modules reportedly run at 30 Gbps, while the RTX 5090's run at 28 Gbps. The RTX 5090 does use more (or higher-capacity) modules, but first-generation GDDR7 could require extra power to reach the 30 Gbps mark, hence the larger allocation. Future GDDR7 iterations could reach higher speeds without much additional power.
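To put the rumored split in one place, here is a minimal sketch (the wattages are the leaked figures above, not confirmed specifications) showing how the die TDP plus the memory/board share adds back up to the previously rumored TGP:

```python
# Leaked/rumored figures only; treat them as placeholders, not confirmed specs.
rumored = {
    "RTX 5090 (GB202-300-A1)": {"die_tdp_w": 575, "memory_board_w": 25},
    "RTX 5080 (GB203-400-A1)": {"die_tdp_w": 360, "memory_board_w": 40},
}

for sku, parts in rumored.items():
    tgp = parts["die_tdp_w"] + parts["memory_board_w"]
    print(f"{sku}: {parts['die_tdp_w']} W die + "
          f"{parts['memory_board_w']} W memory/board = {tgp} W TGP")
```

Run as-is, this prints 600 W for the RTX 5090 and 400 W for the RTX 5080, matching the earlier TGP rumors.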



View at TechPowerUp Main Site | Source
 
So the 5090 has 25 W for the VRAM and other components, but the 5080 has 40 W? That doesn't compute, even with the 28/30 Gbps speed difference. The 5090 has double the VRAM chips (or density?), a much more complex PCB and a much beefier power delivery.
 
Haven't the official/public TDP numbers been technically TGPs - as in whole-card consumption - for a while now? For both AMD and Nvidia, the power consumption numbers measured in reviews are within measuring error of the power limit, which is set to the TDP. There was a point where GPU manufacturers tried to make things complicated, but that did not last long.
 
You know...

My Sharp II Carousel MicroWave Runs on 400 Watts...

Just saying...
But you don't game on your Sharp II Carousel microwave for several hours at a time. It also doesn't dump any extra heat into your PC case.

Just saying... ;)

Haven't the official/public TDP numbers been technically TGPs - as in whole-card consumption - for a while now? For both AMD and Nvidia, the power consumption numbers measured in reviews are within measuring error of the power limit, which is set to the TDP. There was a point where GPU manufacturers tried to make things complicated, but that did not last long.
I don't know how it is on Nvidia now. The last card I had from them was a 2070. On that, TDP was GPU chip power only.

On AMD, TDP is total card power.

The difference between the men and the boys is the price of their toys.
Oh, what a bleak, materialistic view of the world! I'm astonished.
 
My Maytag Microwave is good for 1000 watts. My Lian-Li Edge PSU is good for 1300 watts. We will fear no GPU.
The difference between the men and the boys is the price of their toys. Saddle up kids. It's gonna be a rough ride.

I miss the days when we were a small handful of "lepers" where nerd/geek was a curse word, I don't want to share our GPUs with people that treated us badly at one point or another. :P
 
I miss the days when we were a small handful of "lepers" where nerd/geek was a curse word, I don't want to share our GPUs with people that treated us badly at one point or another. :p
I miss the days when buying PC parts was cool among a few, and not just a device of some sick (penis) wallet measuring contest for the masses. :(
 
Oh, what a bleak, materialistic view of the world! I'm astonished.
That saying is older than me. Maybe to a Brit but not an American. Capitalism rules, illegally if I can get away with it. Just ask the orangeman.
 
The 4090 also had a max TGP of 600 W, so the 5090 will probably draw the same amount of power or slightly more. Maybe an AIB 5090 will use around 520 W at stock.
 
That saying is older than me. Maybe to a Brit but not an American. Capitalism rules, illegally if I can get away with it. Just ask the orangeman.
Just because it's old, it doesn't make it less stupid. Capitalism might rule the business world and/or politics, but it doesn't rule me. I buy what I want/need, not what I'm told.

What you own ends up owning you. Think about it.
 
I don't care about the power draw, since I'll limit it to whatever power I want it to run at (takes 10 seconds), but I firmly believe that power draw should be kept steady between tiers (meaning the 5080 should be at a similar power draw to the 4080, etc.). First of all, because it makes it easier to compare gen on gen, and second of all, it creates a "rule" about what kind of equipment (PSU, case) is required to run a specific tier. Flip-flopping around power targets and naming is just a way to confuse the consumer about what they are actually buying. Both AMD and Nvidia are guilty of this.
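For anyone wondering what that 10-second step looks like in practice, here is a minimal sketch that wraps nvidia-smi's power-limit switch from Python. The 320 W value and GPU index 0 are just example inputs; the command needs administrator/root rights, and the driver clamps the request to the card's supported power-limit range:

```python
import subprocess

target_watts = 320  # example cap; pick whatever your card and cooling allow

# Set the software power limit on GPU 0 via nvidia-smi (-pl = --power-limit).
subprocess.run(
    ["nvidia-smi", "-i", "0", "-pl", str(target_watts)],
    check=True,
)

# Read back the enforced limit to confirm the change actually took effect.
result = subprocess.run(
    ["nvidia-smi", "-i", "0",
     "--query-gpu=power.limit", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print("Current power limit:", result.stdout.strip())
```

The same thing can of course be done with a single nvidia-smi command in a terminal; the wrapper is only there to keep the example self-contained.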
 
Now you're getting weird dude. This electric junk doesn't rule a GD thing.
It does if you feel obliged to buy the latest one every single time whether you need it or not. You're a slave to buying things. Sorry, but it's true.
 
It does if you feel obliged to buy the latest one every single time whether you need it or not. You're a slave to buying things. Sorry, but it's true.
Or it's just curiosity. Like I bought a 13900K, a 14900K and the 9800X3D just because I wanted to test them and see the performance improvements compared to 12th gen. It's a tech forum; people do that kind of thing around here.
 
Just because it's old, it doesn't make it less stupid. Capitalism might rule the business world and/or politics, but it doesn't rule me. I buy what I want/need, not what I'm told.

What you own ends up owning you. Think about it.

Correct: I still classify myself as a customer, not a consumer. I have nothing to prove to anyone; I buy what I "need" and live my life simply. I don't need new clothes every week, or the latest car, or whatever.

I just hate being taken advantage of when the time comes and I do need to make these purchases. What annoys me is the realization that I have to fork out money at idiotic price points because others have no self-control.
 
I don't care about the power draw, since I'll limit it to whatever power I want it to run at (takes 10 seconds), but I firmly believe that power draw should be kept steady between tiers (meaning the 5080 should be at a similar power draw to the 4080, etc.). First of all, because it makes it easier to compare gen on gen, and second of all, it creates a "rule" about what kind of equipment (PSU, case) is required to run a specific tier. Flip-flopping around power targets and naming is just a way to confuse the consumer about what they are actually buying. Both AMD and Nvidia are guilty of this.
I agree. I want to see speed increases due to advancements in GPU architecture, like I did in the Pascal years, and not due to cramming more parts into a chip and increasing power (what I call brute forcing).
 
I agree. I want to see speed increases due to advancements in GPU architecture, like I did in the Pascal years, and not due to cramming more parts into a chip and increasing power (what I call brute forcing).
It's also practical; I'm running my 4090 locked to 320 W. If the review shows the 5090 being 50% faster than the 4090 while pulling 30% more power, I really have no idea how that translates to my use case. Is it going to be 10% faster at the same 320 W? Is it going to be 40%? How am I supposed to spend money wisely if I don't know where the performance is coming from (brute-forcing power, etc.)?
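As a rough illustration of why the stock numbers alone don't answer that, here is a back-of-the-envelope sketch with entirely hypothetical figures (a card 50% faster at stock while drawing 30% more power): the only thing you can extract is the change in perf per watt, which hints at, but does not predict, behaviour at a fixed 320 W cap:

```python
# Hypothetical numbers for illustration only.
old_perf, old_power_w = 1.00, 450   # normalized performance, stock watts
new_perf, new_power_w = 1.50, 585   # 1.5x performance at 1.3x power

efficiency_gain = (new_perf / new_power_w) / (old_perf / old_power_w)
print(f"Perf/W change at stock: {efficiency_gain:.2f}x")  # ~1.15x

# A naive iso-power estimate just scales the old performance by the
# efficiency gain; real GPUs don't scale linearly when power-limited,
# so treat this as a ballpark, not a prediction.
naive_iso_power_gain = (efficiency_gain - 1) * 100
print(f"Naive estimate at the same power: ~{naive_iso_power_gain:.0f}% faster")
```

In practice a power-limited card usually lands somewhere between that raw efficiency figure and the full stock uplift, which is exactly why iso-power numbers would be so useful.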
 
Or it's just curiosity. Like I bought a 13900K, a 14900K and the 9800X3D just because I wanted to test them and see the performance improvements compared to 12th gen. It's a tech forum; people do that kind of thing around here.
If you're genuinely interested in how things work, that's cool. I do that myself. :)

But I don't need the latest and greatest to feel good about myself. With my gaming habits, I'm fine on mid-range. I don't buy stuff just to be able to say that I have it.

Whoever says that the 4090 is not enough, and that you definitely, 100% need to swap it for a 5090 as soon as it's out, is lying to himself.

Correct: I still classify myself as a customer, not a consumer. I have nothing to prove to anyone; I buy what I "need" and live my life simply. I don't need new clothes every week, or the latest car, or whatever.

I just hate being taken advantage of when the time comes and I do need to make these purchases. What annoys me is the realization that I have to fork out money at idiotic price points because others have no self-control.
Our whole society is built around individuals with no self-control, unfortunately. I completely agree with you, though.
 
If you're genuinely interested in how things work, that's cool. I do that myself. :)

But I don't need the latest and greatest to feel good about myself. With my gaming habits, I'm fine on mid-range. I don't buy stuff just to be able to say that I have it.

Whoever says that the 4090 is not enough, and that you definitely, 100% need to swap it for a 5090 as soon as it's out, is lying to himself.
I think it depends on your gaming habits, honestly. For the games I'm playing, the 4090 is overkill, but for people who play the latest unoptimized triple-A stuff at 4K and want 120 fps or something, I think even the 5090 won't be enough :D
 
It's also practical; I'm running my 4090 locked to 320 W. If the review shows the 5090 being 50% faster than the 4090 while pulling 30% more power, I really have no idea how that translates to my use case. Is it going to be 10% faster at the same 320 W? Is it going to be 40%? How am I supposed to spend money wisely if I don't know where the performance is coming from (brute-forcing power, etc.)?
You can only compare stock power-to-performance ratios and make an educated guess. No review is gonna test any card at your individually set power level.
 
He says over 575 W. The 12VHPWR connector has a 600 W limit, plus 75 W from the PCIe slot. Perhaps they need two or three 12VHPWR connectors.
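For reference, the budget being added up looks roughly like this; the 575 W die figure and 25 W board share are the rumored numbers from the article, while 600 W per 12VHPWR/12V-2x6 connector and 75 W from the PCIe slot are the nominal connector/slot ratings:

```python
# Rough power-budget check using the rumored figures from the article.
die_tdp_w = 575          # rumored GB202 die TDP
memory_board_w = 25      # rumored memory/board share (600 W TGP total)
connector_limit_w = 600  # one 12VHPWR / 12V-2x6 connector, nominal rating
slot_limit_w = 75        # PCIe x16 slot, nominal rating

card_draw_w = die_tdp_w + memory_board_w
available_w = connector_limit_w + slot_limit_w
print(f"Rumored card draw: {card_draw_w} W")
print(f"Available from one connector + slot: {available_w} W")
print(f"Headroom: {available_w - card_draw_w} W")
```

On those numbers a single connector plus the slot covers the rumored 600 W with 75 W to spare, but that leaves little margin for factory-overclocked AIB cards, which is presumably why some expect more than one connector.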
 
I think it depends on your gaming habits, honestly. For the games I'm playing, the 4090 is overkill, but for people who play the latest unoptimized triple-A stuff at 4K and want 120 fps or something, I think even the 5090 won't be enough :D

Oh, how I wish developers would start optimizing their games again. Just look at how fantastically DOOM ran on Vulkan; it can be done. But they use these technologies that NVIDIA offers as a crutch, and I absolutely despise them for it. :banghead:
 
I think it depends on your gaming habits, honestly. For the games I'm playing, the 4090 is overkill, but for people who play the latest unoptimized triple-A stuff at 4K and want 120 fps or something, I think even the 5090 won't be enough :D
Need is one thing. If you have the money, go for it. But coming to a forum and boasting "mwahaha, I'll buy this thing because I'm so awesome and also 'Murica" is just plain dumb.
 
You can only compare stock power-to-performance ratios and make an educated guess. No review is gonna test any card at your individually set power level.
I'm not asking them to test at my individually set power level, but if the tiers remained at the same power, I could make an educated guess. E.g. the 4090 FE was pulling 360 W in TPU's review; if the 5090 FE also pulls 360 W, I can make an educated guess about my use case. If, on the other hand, the 5090 pulls 550 W, I'll have absolutely no clue how it would compare at iso power to the 4090.
 