
Will GPUs continue to have crazy TDPs?

  • Thread starter: Deleted member 234478
It's not that crazy to be honest, and it's not unprecedented either. The Fury X, for example, had huge spikes up to 500W, and plenty of GPUs show much higher transient spikes than their rated power.

Generally speaking, 300W is not a big deal: it's three light bulbs. Your hair dryer uses much more energy, generally between 800W and 2000W; your vacuum cleaner generally consumes 200W, etc.

So 300W is nothing, really, and each generation has its ups and downs. There is no trend showing higher power usage; for it to be a trend, there would have to be at least three consecutive generations of increases, and one generation is not enough.
 
So 300W is nothing, really, and each generation has its ups and downs. There is no trend showing higher power usage; for it to be a trend, there would have to be at least three consecutive generations of increases, and one generation is not enough.
Erm... you may want to look at that. Maximum power draws seen (not including spikes):

1080 Ti: ~270 W
2080 Ti: ~290 W
3090 Ti: ~470 W
4090: ~470 W

Data taken from TPU FE reviews.
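
For a quick back-of-the-envelope check of that trend, here's a small Python sketch using only the approximate figures quoted above (nothing more precise is assumed):

Code:
# Generation-over-generation change in measured maximum power draw (W),
# using the approximate figures quoted above from TPU FE reviews.
max_draw = {"1080 Ti": 270, "2080 Ti": 290, "3090 Ti": 470, "4090": 470}

cards = list(max_draw)
for prev, cur in zip(cards, cards[1:]):
    change = (max_draw[cur] - max_draw[prev]) / max_draw[prev] * 100
    print(f"{prev} -> {cur}: {change:+.0f}%")
# Prints roughly +7%, +62%, +0% -- most of the jump came with Ampere.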

I also wonder if this is why we aren't seeing a Ti/Super variant of the 4090. One, because there isn't any direct competition. Two, because the power draw under worst-case scenarios would probably be north of 500 watts...
 
From the TPU GPU database:

780Ti: 250W
980Ti: 250W
1080Ti: 250W
2080Ti: 250W
3080Ti: 350W
3090: 350W
3090Ti: 450W
4080: 320W
4090: 450W

It's the 3000- and 4000-series cards that broke the pattern.

Generally speaking, 300W is not a big deal: it's three light bulbs. Your hair dryer uses much more energy, generally between 800W and 2000W; your vacuum cleaner generally consumes 200W, etc.
3 light bulbs from last century.
My whole apartment's lights add up to 300W now, yet my GPU can draw 460+W… lol
 
So 300W is nothing, really, and each generation has its ups and downs. There is no trend showing higher power usage; for it to be a trend, there would have to be at least three consecutive generations of increases, and one generation is not enough.
Nothing in home power usage is a big deal, until you start adding it up.

Every little helps, they say. You can deny that, but that's not being realistic. The fact is, we keep using more power while we keep saving more power on 'better' electronics. Strange, hm?
Fact is, since Ampere, power usage on GPUs has exploded. It's not some 20-50W here or there; we're talking a 50% increase and more. That's ridiculous. You didn't save that elsewhere in the meantime, so your gaming got 50% more expensive beyond the initial purchase cost, unless you tweak something back down.
 
Yeah, those cards use a bit of power, but a small undervolt can cut that. I have a 4070 Super, which pulls 220W at stock settings, around 2740MHz at the stock 1.1V. I've seen people run a stable 2700MHz at 0.925V; sadly mine needs 0.960V to be stable, but even then it drops from 220W to roughly 160-170W. Only about 2% lower clock in the end, but around 20-25% lower power draw. I would expect similar wattage savings on a 4080/4090 using roughly the same approach.
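
For what it's worth, those numbers line up with the usual first-order approximation that dynamic power scales with clock times voltage squared; this ignores static leakage, so treat it as a rough sanity check rather than a precise model:

Code:
# Rough estimate of undervolting savings using the first-order
# dynamic-power approximation P ~ f * V^2 (ignores static leakage).
stock_clock, stock_volt, stock_power = 2740, 1.100, 220  # MHz, V, W at stock
uv_clock, uv_volt = 2700, 0.960                          # undervolted point

scale = (uv_clock / stock_clock) * (uv_volt / stock_volt) ** 2
est_power = stock_power * scale
print(f"Estimated draw after undervolt: {est_power:.0f} W "
      f"({(1 - scale) * 100:.0f}% lower)")
# Gives roughly 165 W (~25% lower), in line with the observed 160-170 W.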
 
Basically, you can get base clocks at 50% power at 0.7V, depending on how much headroom is available at stock settings.

The 3080 Ti runs 0.950V to fit in 350W vs the 4090 running 1.05V, both by default; at equal voltage, both are 350W.

This means the 4090 uses 25% more power for no good reason, just because it can.

My 2080 Ti, for example, has a 350W BIOS, and it can run with 100W of overhead while nothing is gained; it just pumps more voltage through the GPU.
 
The current idiot on the block is RT.
What a take. Needless to say, I disagree, as does the long list of RT games I've played and enjoyed visually. Time will tell, but I suppose by then you'd reserve the right to change your opinion.

From where I sit, two GPU vendors sell complete solutions allowing excellent visual experiences with RT on; one has not, maybe only just barely with their latest and best.

The current idiot on the block isn't RT.
Yeah, I think it's hard to deny we definitely made massive progress since the first GPUs. And that progress (every time...) has reinforced our belief that it can't ever stop, but reality is starting to knock on the door at this point.
Are you a visionary GPU and visual technology designer/expert? Consumers don't innovate; they want more, better, faster, and only exceptionally rarely innovate themselves. We don't know what we don't know, and I'd love to circle back in another 20 years.
I don't know; I know some who enjoy their $300 4050 (60). :laugh: We had $200 8GB GPUs eight years ago, lmao. That RT and DLSS sure is worth it on it, though. :slap: I actually hope it gets even more expensive; then people will cry, and I will sip the tears, since the same people enabled these prices.

Look at this shit and start thinking; maybe some will finally wake up from the marketing brainwash:

(attachment: chart comparing the percentage of the full die enabled vs MSRP)
You've picked one metric in a sea of ways we can compare GPUs, and you hang your hat on the percentage of cores from a full die vs MSRP? Uhh, well done?
Let's be real, most people, especially in the US, don't give a shit about TDP. FPS is all that matters. Electricity is what, below 10 cents? Who cares?
I care, as do many. Electricity isn't cheap everywhere, and there are multiple other problems associated with high TDP. This argument flip-flops a heck of a lot depending on who is most efficient.
Erm... you may want to look at that. Maximum power draws seen (not including spikes):

1080 Ti: ~270 W
2080 Ti: ~290 W
3090 Ti: ~470 W
4090: ~470 W
Expert avoidance here of multiple cards from both camps that ruin this little exaggerated narrative. In general, yes, we're trending higher, but you've cherry-picked to make the... eh, point?
 
From where I sit, two GPU vendors sell complete solutions allowing excellent visual experiences with RT on; one has not, maybe only just barely with their latest and best.

The current idiot on the block isn't RT.

I have pretty good visuals too with my idiot 6950 XT... But this is down to preference, I think. Visuals are a low priority for me, and I don't use the GPGPU stuff Nvidia has, so it made sense for me to buy AMD for the improved performance across all titles vs a 4070. I really wanted a 4080, but that would have cost an arm and a leg.
 
Generally speaking, 300W is not a big deal: it's three light bulbs. Your hair dryer uses much more energy, generally between 800W and 2000W; your vacuum cleaner generally consumes 200W, etc.
300 W is a big deal when you're talking about extended periods of time. I don't suppose you use your hair dryer or your vacuum cleaner for 2-4 hours a day, do you?

Using 300 W more when you're gaming for 4 hours a day is an extra £11 on top of your monthly bill, or £132 a year here in the UK. Wouldn't you rather spend that money on something else?

Even if this is pocket money territory for you, I'm sure you care about all that heat dumped inside your chassis.
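
For anyone who wants to plug in their own tariff, the arithmetic is simple enough to script; the ~£0.30/kWh unit rate below is an assumption that roughly reproduces the figures above, so substitute your own:

Code:
# Estimated cost of extra GPU power draw over time.
# The 0.30 GBP/kWh rate is an assumed UK-ish tariff; adjust to your own.
extra_watts = 300        # additional draw while gaming, in W
hours_per_day = 4
price_per_kwh = 0.30     # GBP per kWh

kwh_per_month = extra_watts / 1000 * hours_per_day * 30
monthly_cost = kwh_per_month * price_per_kwh
print(f"{kwh_per_month:.0f} kWh/month -> "
      f"£{monthly_cost:.2f}/month, £{monthly_cost * 12:.0f}/year")
# ~36 kWh/month -> about £10.80/month, roughly £130/year.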
 
I simply don't buy AMD because they don't know how to manage idle power consumption, coming from the Netherlands where the cost of electricity is the highest in the world.
 
I simply don't buy AMD because they don't know how to manage idle power consumption, coming from the Netherlands where the cost of electricity is the highest in the world.
That was with the early drivers. AMD has come a long way since then. My 7800 XT sits at 12 W on the Windows desktop with a 3440x1440 144 Hz screen.
 
Expert avoidance here of multiple cards from both camps that ruin this little exaggerated narrative. In general, yes, we're trending higher, but you've cherry-picked to make the... eh, point?
It was mainly just to show that the argument that "there is no trend showing higher power usage" is false.

I know AMD is no better, and in certain cases is actually the worse offender, but I thought showing the power increase across the highest-end non-Titan-class card (even though a 4090 SHOULD be one) would be a good way to disprove the idea that "power consumption isn't bad at all".

I run a 6950 XT, but I have the power limit and voltage turned down a little to reduce the outrageous power consumption. Yes, I lose a little performance, but going the other way gains me minuscule performance increases while the power draw goes insane.
 
I simply don't buy AMD because they don't know how to manage idle power consumption, coming from the Netherlands where the cost of electricity is the highest in the world.
I'm looking at 7W idle, my friend :)
 
I simply don't buy AMD because they don't know how to manage idle power consumption, coming from the Netherlands where the cost of electricity is the highest in the world.

You can't manage AMD's idle consumption; that's a bug in their design, by their own choice.

What you can do is to become an independent electricity producer using solar panels or the like.
Another option is to move abroad.

 
You can't manage AMD's idle consumption; that's a bug in their design, by their own choice.

What you can do is to become an independent electricity producer using solar panels or the like.
Another option is to move abroad.

What do you mean? 2 previous posts directly contradict your random claim with actual data.

I'll add that my 5600 XT is sitting at 8W right now. My 7700 XT usually hangs at about 9W and 6800 XT at 11W.
 
Sure, but it's still better to have "free" healthcare funded by your tax than to pay tax and not have free healthcare, in my opinion.

Let's not stray too far from the topic, though. :)
Nah, we can talk about all things. I'm glad we have social healthcare. My friend has had surgeries and other issues with his spinal cord; he had Addison's disease. He'd never get health insurance in the USA. There's also so much bureaucracy: will they pay, will they deny? On the other hand, the USA has the best prices for things; it's like 50% of what we pay.
 
You missed where the person mentioned IDLE power consumption.

They didn't mention whether it's multi-monitor or not, but yes, the 4080 uses about 19W less if using 2+ monitors. Don't forget that those monitors will be using quite a bit more than 19W each, so these are small though measurable wattage differences.

Around $20 a year if the monitors run 8 hours a day. If they run 24 hours a day, that's $60 a year in additional expenditure.
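
Those figures imply an electricity rate of roughly $0.36/kWh, which is an assumption back-calculated from them rather than anything official; a quick sketch if you want to use your own rate:

Code:
# Yearly cost of a ~19 W idle-power difference.
# The 0.36 $/kWh rate is an assumption that roughly reproduces the figures above.
extra_watts = 19
price_per_kwh = 0.36  # assumed $/kWh

def yearly_cost(hours_per_day: float) -> float:
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(f"8 h/day:  ${yearly_cost(8):.0f}/year")    # about $20
print(f"24 h/day: ${yearly_cost(24):.0f}/year")   # about $60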
 
Around $20 a year if the monitors run 8 hours a day. If they run 24 hours a day, that's $60 a year in additional expenditure.

True. And with those extra monitors, that $20-60 a year will be swamped by the additional running costs of the monitors themselves and the attached PC. Yes, it's an expense, but a small one compared to the total when you're buying $1000 GPUs with the intent to use their power for gaming or productivity.
 
Around $20 a year if the monitors run 8 hours a day. If they run 24 hours a day, that's $60 a year in additional expenditure.
With CPUs like Intel's 14th gen, I think one has much bigger fish to fry than saving a couple of bucks on GPU idle (which is already saved by new drivers anyway).
 
Basically, you can get base clocks at 50% power at 0.7V, depending on how much headroom is available at stock settings.

The 3080 Ti runs 0.950V to fit in 350W vs the 4090 running 1.05V, both by default; at equal voltage, both are 350W.

This means the 4090 uses 25% more power for no good reason, just because it can.

My 2080 Ti, for example, has a 350W BIOS, and it can run with 100W of overhead while nothing is gained; it just pumps more voltage through the GPU.
Um, no. Maybe you should learn how things like GPUs and CPUs work. When they test the chips, they have to settle on a voltage that is safe for the chip AND stable across the board, so that even the worst-binned chip is still 100% unquestionably stable at it. You're also missing that those GPUs use different nodes, 8nm vs 5nm; the smaller node allows better clocks at the same voltage. What you are asking for is every GPU to be tested individually to figure out what voltage it should be set to, which would be very expensive.
 
Well, I won't, as I am pretty sure I also lowered the power limit :p But I don't own the game, so I can't test. You mean the Enhanced Edition, I assume? Probably a version that's had RT nonsense added to it to really work the card.
Ironically, I hit 250W yesterday when, during testing, I bumped FF15 up to 60fps. It was nice to see my hardware can now handle that game at a consistent 60 with max water reflections, lighting, 4K rendering, etc., but wow, the power usage, lol. This is also with the 3080 undervolted, so at spec it probably clears 300W.

My 13700K was still only around 45W, so the GPU was by far the big guzzler.
 
Around $20 a year if the monitors run 8 hours a day. If they run 24 hours a day, that's $60 a year in additional expenditure.
The real question then is why you would let a monitor and a PC sit there idling for an entire work day.

If you're set on saving 'unnecessary' power usage, you wouldn't let hardware sit idle. And you can simply fix that by letting the monitor turn itself off after X minutes, letting the PC sleep, etc. Zero effort; just two brain cells are needed here.

You're making a mountain out of a molehill, and your calculation isn't very realistic.
 
The real question then is why you would let a monitor and a PC sit there idling for an entire work day.

If you're set on saving 'unnecessary' power usage, you wouldn't let hardware sit idle. And you can simply fix that by letting the monitor turn itself off after X minutes, letting the PC sleep, etc. Zero effort; just two brain cells are needed here.

You're making a mountain out of a molehill, and your calculation isn't very realistic.

Yep, I do care about power saving/efficiency. Well, my entire family does, hence why I had to buy a power meter for my PC to prove that it's not the power-drain hog they thought it was. (I'm not the one paying the bills, so it was fair, I guess.)
I do have the monitor turn-off enabled in case I AFK too much.
As for GPUs, yeah, I'm not allowing/interested in anything higher than 200-230W in my PC at stock settings, and even those I will undervolt right away. (For the record, our household pays around 20 eurocents per kWh, so it does add up.)
 