
Will GPUs continue to have crazy TDPs?

It seems out of hand. Let's not focus on the monstrous sizes they've become and the sag, but why aren't they doing anything about TDP? 300-450W for a GPU is out of hand. CPUs have stayed pretty stable with theirs (maybe the higher Intel chips have gotten high, but those are just a few), so if CPUs can stay pretty low, why do GPUs keep getting higher and higher? The jumps are quite large for the higher-end ones. Will this trend continue? At the pace we're at now, we'll be well over 600W for a card in the next few years, and with electricity prices jumping, that's paying for the card twice over its lifetime, or at least a good chunk of it. It seems like they put no effort into efficiency. Hell, I remember a time when I thought 180W was a lot for a GPU.
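As a rough sanity check on the electricity point, here is a minimal back-of-the-envelope sketch in Python. The numbers (300W average draw, 4 hours of gaming a day, 5 years of ownership, $0.30/kWh) are purely illustrative assumptions, not anyone's actual usage or tariff:

    # Rough GPU electricity cost estimate; every input here is an illustrative assumption.
    card_watts = 300        # average draw under load, W
    hours_per_day = 4       # gaming hours per day
    years = 5               # ownership period
    price_per_kwh = 0.30    # electricity price, $/kWh (varies a lot by region)

    kwh_total = card_watts / 1000 * hours_per_day * 365 * years
    print(f"{kwh_total:.0f} kWh over {years} years -> ${kwh_total * price_per_kwh:.0f}")
    # ~2190 kWh -> roughly $657 with these assumptions

Whether that amounts to "paying for the card twice" obviously depends on the card's price and your local rates.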
 
It ebbs and flows.

Some gens sip power, others push things to the limit and gulp it down.
I don't worry about it too much; I bought an 850W power supply for that reason.

Boy Scout rules essentially - always be prepared.
 
Looking for a GPU to learn editing, and the watts are ridiculous. Having an 850W PSU is not the solution; those electric bills are unnecessarily high. From the 6650 onward in the hierarchy, the TDP is crazy high, so your "more efficient" isn't true. I'd consider 200W efficient for a GPU; anything over that, I'd say they just didn't give a rat's ass, and you can see it. The jumps in TDP aren't linear with past models.
 
Looking for a GPU to learn editing, and the watts are ridiculous. Having an 850W PSU is not the solution; those electric bills are unnecessarily high. From the 6650 onward in the hierarchy, the TDP is crazy high, so your "more efficient" isn't true. I'd consider 200W efficient for a GPU; anything over that, I'd say they just didn't give a rat's ass, and you can see it. The jumps in TDP aren't linear with past models.
The 4070 is the best GPU you can get at 200W, though the 4070 Super is a better buy at only 230 watts; then you have the 7700 XT/7800 XT at 230/250W.

Those are all gaming max loads btw
 
When it begins, it never stops.

More seriously, since Intel CPUs pump 350W... why would GPU makers care? Double or even triple the CPU consumption has always been the case, right?
 
As long as there is heavy demand for high performance GPUs regardless of efficiency, NVIDIA (for example) will keep on pushing them out. If NVIDIA gets 200% more MSRP for a xx90 compared to the xx80, which is 20% faster but 40% less efficient, they'll keep doing it. The numbers are made up, but the point is not. For 2 generations, consumers have been sending the message that they want performance at any cost. Until that demand changes, cards will continue to push the performance limit at the cost of efficiency. Remember when you used to be able to overclock quite a lot at the cost of efficiency? See what happened? We are now getting the inefficient version out of the box with little room for additional overclocking.
 
Looking for a GPU to learn editing, and the watts are ridiculous. Having an 850W PSU is not the solution; those electric bills are unnecessarily high. From the 6650 onward in the hierarchy, the TDP is crazy high, so your "more efficient" isn't true. I'd consider 200W efficient for a GPU; anything over that, I'd say they just didn't give a rat's ass, and you can see it. The jumps in TDP aren't linear with past models.
I'm not entirely sure what you are saying here.

In 2013 a GTX 780 was a 250W TDP.

In 2016 a GTX 1080 was a 180W TDP. Yeah Pascal was golden age.

In 2018 an RTX 2080 was a 225W TDP.

Since the 30xx series we've seen over 300W. It's higher, but visibly NVIDIA has been stretching as much power as it can out of the current chips. However, the average power in use is around 250W. They do play with the numbers a bit, and I haven't looked into the typical draw of the old cards, but there you are. Now add into the mix that the 30xx and 40xx were the first xx80 cards to go to 12GB/16GB VRAM, and it might explain things.
 
I'm not entirely sure what you are saying here.

In 2013 a GTX 780 was a 250W TDP.

In 2016 a GTX 1080 was a 180W TDP. Yeah Pascal was golden age.

In 2018 an RTX 2080 was a 225W TDP.

Since the 30xx series we've seen over 300W. It's higher, but visibly NVIDIA has been stretching as much power as it can out of the current chips. However, the average power in use is around 250W. They do play with the numbers a bit, and I haven't looked into the typical draw of the old cards, but there you are. Now add into the mix that the 30xx and 40xx were the first xx80 cards to go to 12GB/16GB VRAM, and it might explain things.
Over the years there were many cards with horrible power efficiency, but mostly the growth was linear. Look at the scale over the last three years or so, though; it just went berserk. If you put all the cards on a graph, you'll see the last few years jumped like crazy.

The 4070 is the best GPU you can get at 200W, though the 4070 Super is a better buy at only 230 watts; then you have the 7700 XT/7800 XT at 230/250W.

Those are all gaming max loads btw
For me, a card under 220W is "decent", not great by any stretch of the imagination. Just look at the 5700 XT and then how the 6600 XT dropped down hard, so it's possible. But if that's the case, then screw that; I'll wait to buy used and at least save some of the money for the electricity bill. And it's not a small increase, though some here kind of dismiss it. Maybe you're not the one paying the bills, maybe some get help, I don't know. But even my brother, who has his own business and can buy what he wants, always considers all aspects, including power efficiency. And let's be honest, wasting less electricity is better for the world; we just seem to use more and more.
 
I mean, a 20-30W difference on an already 200W card is not really going to cost you a fortune in energy. You can also reduce the power limit yourself, undervolt, etc. I don't see it as a big issue when the whole system is likely going to be pulling 450W+ anyway.
 
I mean, a 20-30W difference on an already 200W card is not really going to cost you a fortune in energy. You can also reduce the power limit yourself, undervolt, etc. I don't see it as a big issue when the whole system is likely going to be pulling 450W+ anyway.
Every little bit counts. I look at these things ahead; I set a line and try not to pass it. How can they make one card decently efficient like the 6650 XT, then go bonkers with the 6700 XT? NVIDIA is far worse with TDP. I was thinking of getting the 5700 XT in the past, but that TDP is trash, and then the 6600 XT showed the mistake they made; it's evident they realized this. Since COVID, all hell broke loose with TDP. Efficient cards are not a thing, and it's the same thing with buying PC hardware in general: you can always just add another $10 and jump a little bit higher, a little more power, a little more speed, etc. But I set criteria and try to find products that fit within them.
 
Undervolting being a thing underlines and proves there is absolutely no customer demand to use more power.

The demand is for more performance, and we are simply stretching things as far as they go. The efficiency of GPUs rises, but our demands somehow... strangely... always rise just beyond what's reasonably possible.

The real answer here is commerce and its constant search to give people incentive to buy. Look at cars. SUVs nullified the efficiency gains ICE engines have gained over the years. Overall emission of the ICE car park is up globally - per car. It is all more of the same.

Humanity has no self restraint. We are lemmings, happily going off the cliff.
 
I hear what you're saying, but you're ignoring most of what I said. You mention the energy bill, but less than a light bulb's worth of power is not going to make any significant difference to it. Either get one that fits your criteria (as I said, the best is the 4070 at bang-on 200W gaming load), or if you want better performance, buy one with a higher TDP and limit the clocks, power and voltage. Just picking a random wattage and basing your buying decision on that one metric is crazy, and there are many options under 200W. Unfortunately, the higher-end GPUs are always likely to start around the 200W mark and above, and I don't think that will be changing any time soon.
 
Top-end cards, I suspect, will only rise in TDP demands for the near future. I can see 600 watts with the awful 12VHPWR connector being the intention for flagship-level cards in the next gen or two. Looking at the datacenter, 800+ watt behemoths are currently the flavour of the month.
 

@Marcus L

Yeah, I won't buy NVIDIA anymore. With EVGA and all that has happened, never again.

The whole market is pushed toward wanting something, not needing something. I don't understand this stupidity of needing 200 fps to play a game or it's not worth playing. This is some bullshit pushed on the younger folks. Only a select few may need 120 fps, maybe, and only in a handful of games, but 99.9% don't need more than 75 fps. A guy just posted in a forum saying he wants to play some games, not top games, and someone pushed him to a $1000 GPU so he can get over 150 fps. I'm certain there are PR reps and influencers who have brainwashed people into thinking you must get the best cards to play at 150 fps and more. All my life, since playing the Atari 2600, I have never felt I needed more than 75 fps, and I'm perfectly fine with 60 fps.

@Vayra86
"Humanity has no self restraint. We are lemmings, happily going off the cliff."

Bravo, well said. Especially the last three years or so, shit has gone crazy.
 
Wait a few years and you will have a middle-high end card like a 6070 that has 4090 performance for 200W.
Other than that it's a nothingburger; the 0.1% of gamers with 4090 and 7900 XTX cards are not even a drop in the ocean. If you want to save the planet, stop using electricity and petrol-based stuff; welcome back to the Stone Age.

Also, you can always undervolt and/or frame-cap the game to save even more trees :)) My 4070 undervolts to 920mV @ 2700MHz and draws about 150-160W maximum, and my 12600K with -100mV draws about 40-50W in more demanding games.
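If you want to check what an undervolt like that actually draws while you play, a minimal sketch along these lines can log the board power reported by nvidia-smi once per second (assuming an NVIDIA card with nvidia-smi on the PATH; the 30-second window is arbitrary):

    import subprocess, time

    # Poll nvidia-smi for the reported board power draw (watts), once per second.
    samples = []
    for _ in range(30):  # arbitrary 30-second window
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        samples.append(float(out.splitlines()[0]))  # first GPU only
        time.sleep(1)

    print(f"min {min(samples):.0f} W, avg {sum(samples)/len(samples):.0f} W, max {max(samples):.0f} W")

Run it in the background during a demanding scene and compare against your stock numbers.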
 
Looking for a GPU to learn editing, and the watts are ridiculous. Having an 850W PSU is not the solution; those electric bills are unnecessarily high. From the 6650 onward in the hierarchy, the TDP is crazy high, so your "more efficient" isn't true. I'd consider 200W efficient for a GPU; anything over that, I'd say they just didn't give a rat's ass, and you can see it. The jumps in TDP aren't linear with past models.
In the early 2000s people were editing on desktop computers. It's technique you need to learn, not jumping straight into editing 4K/8K video. Get a decently specced machine and practice your cuts/cutaways and story development. When you are any good, maybe then look towards a more powerful, efficient machine tailored to production; a decent modern multi-threaded Xeon/Threadripper processor with built-in graphics would do for editing anyway, and there are more specialist cards for editing which are not specifically for gaming.
 
If you are concerned about 30 cents per month extra... how can you afford a GPU or a PC in the first place?
 
Just look at the 5700 XT and then how the 6600 XT dropped down hard, so it's possible.
Then again, this is misleading, man.

The proper comparison to a 5700XT is the 6700XT, and the TDP for that is also 225W.

The 160W 6600XT has 2048 cores instead of 2560 cores (5700xt - 6700xt) and 128 texture units instead of 160 (5700xt - 6700xt), which I'm guessing explains the TDP difference.

On a general note as well, yeah, I was a bit surprised to see that while the broader market for just about everything has been driven by lowering energy consumption and looking proactively ecologically concerned, high-end PC does not seem to be part of it. Then again, not that surprised; high-end performance was never a domain of efficiency, more of "complacency" (I know, I know, we NEED that performance... do we?).

As others pointed out, if that is a deterrent for your electricity bill, how can you afford the PC or the card?

If you'd rather save the world, well, as I just said, high performance is not something you should indulge in among your life's choices.

Not saying this against you; I just don't see the point of this. We're speaking mostly of the last two NVIDIA generations; the rest were always treading the same waters and still do. If Blackwell continues to rely on 300W+ high-end SKUs, well, I guess we'll really have at least a good reason to point a finger at it and say: is that the best you can do?
 

@Marcus L

Yeah, I won't buy NVIDIA anymore. With EVGA and all that has happened, never again.

The whole market is pushed toward wanting something, not needing something. I don't understand this stupidity of needing 200 fps to play a game or it's not worth playing. This is some bullshit pushed on the younger folks. Only a select few may need 120 fps, maybe, and only in a handful of games, but 99.9% don't need more than 75 fps. A guy just posted in a forum saying he wants to play some games, not top games, and someone pushed him to a $1000 GPU so he can get over 150 fps. I'm certain there are PR reps and influencers who have brainwashed people into thinking you must get the best cards to play at 150 fps and more. All my life, since playing the Atari 2600, I have never felt I needed more than 75 fps, and I'm perfectly fine with 60 fps.

@Vayra86
"Humanity has no self restraint. We are lemmings, happily going off the cliff."

Bravo, well said. Especially the last three years or so, shit has gone crazy.
A lot of it is also just part of the hobby. I mean, PC gaming has all these options, so it makes sense to explore them at some point. And it's hard to go back sometimes: if you're getting used to >100 FPS, you will definitely frown a little at games running at 30-50 FPS, and 60 FPS is still a noticeable gap. But high FPS isn't really the reason people buy expensive graphics cards. You can get high FPS on cheaper cards today; just lower your resolution or details. If you stick to 1080p you're going to have a lot of frames.

People who buy into the top end are really just chasing the 'cutting edge', so to speak, in their minds. They want the experience to be as good as possible, so that translates to having the maximum power available, within whatever budget or reach each individual has; it's not even just in the high end that this happens. Not a single GPU is 'just a gaming GPU'; it's a different beast from, say, CPUs, where you can pick a tiny handful of best options for gaming at any given point in time.

Car analogies kinda work here. Some people love to drive the biggest, baddest thing around, even though it's utterly pointless on 99% of all roads on the planet. Top-end GPUs are a little bit like that. You don't need one to play games, and it probably won't pay off in most games either. The key difference is the factor of time, and the 'roads of gaming' are constantly accepting faster car(d)s.

Top-end cards, I suspect, will only rise in TDP demands for the near future. I can see 600 watts with the awful 12VHPWR connector being the intention for flagship-level cards in the next gen or two. Looking at the datacenter, 800+ watt behemoths are currently the flavour of the month.
I've already got my popcorn ready tbh. 4090 was a good start.
 
I don't get the complaints about power consumption for GPUs. Within a family of GPUs, all of them have about the same efficiency, yes, some a bit more than others. You want more performance? It will cost you proportionately more power. You want less power? It will perform proportionately slower.

There's no problem with excessive power draw nowadays unless you're looking for a GPU using under 100W. Sticking to NVIDIA, the 4090 delivers a freakton of performance and uses a similar amount of power per unit of performance delivered as the 4060. Don't like the wattage? Don't buy it! Buy a 4060. Or a 4070 or 4080. You have choices. Use them.

Want even better efficiency? Buy a bit above the performance you need and undervolt and even underclock to find your GPU's max efficiency.

Does this logic not make sense to some people?
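For the "buy a bit above what you need and dial it down" idea, one way to find a card's sweet spot is to step the power limit down and benchmark at each step. A sketch of the idea, assuming an NVIDIA card (nvidia-smi -pl needs admin rights, the wattage steps are arbitrary examples, and run_benchmark_fps is a placeholder for whatever benchmark you actually use):

    import subprocess

    def run_benchmark_fps() -> float:
        # Placeholder: run your benchmark of choice here and return its average FPS.
        raise NotImplementedError

    # Step the board power limit down and record performance per watt at each point.
    for limit_w in (220, 200, 180, 160, 140):  # stay within your card's supported range
        subprocess.run(["nvidia-smi", "-pl", str(limit_w)], check=True)  # needs admin/root
        fps = run_benchmark_fps()
        print(f"{limit_w} W: {fps:.1f} fps, {fps / limit_w:.2f} fps/W")

The fps/W column usually keeps improving as you lower the limit until performance starts falling off a cliff; that knee is the card's efficiency sweet spot.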
 
It seems out of hand. Let's not focus on the monstrous sizes they've become and the sag, but why aren't they doing anything about TDP? 300-450W for a GPU is out of hand. CPUs have stayed pretty stable with theirs (maybe the higher Intel chips have gotten high, but those are just a few), so if CPUs can stay pretty low, why do GPUs keep getting higher and higher? The jumps are quite large for the higher-end ones. Will this trend continue? At the pace we're at now, we'll be well over 600W for a card in the next few years, and with electricity prices jumping, that's paying for the card twice over its lifetime, or at least a good chunk of it. It seems like they put no effort into efficiency. Hell, I remember a time when I thought 180W was a lot for a GPU.

No, I think there are physical limitations. Around 250W is the upper limit for sane high-end users, while extreme overclockers can go far beyond that, taking responsibility themselves for case choice, cooling solutions, noise, electricity bills, etc.

Look at the history. The Rage 128 Pro was an 8W card.
Today the Radeon RX 7900 XTX is a whopping, crazy, abnormal 355W.

Radeon RX 7900 XTX 355W TBP 2022
Radeon RX 6900 XT 300W
Radeon RX 5700 XT 225W
Radeon VII 300W
Radeon RX Vega 64 295W
Radeon RX 580 185W
Radeon RX 480 150W
Radeon Pro Duo 350W
Radeon R9 Fury X 275W
Radeon R9 390X 275W
Radeon R9 295X2 500W
Radeon RX 290X 250W
Radeon HD 7990 375W
Radeon HD 7970 250W
Radeon HD 6990 375W
Radeon HD 6970 250W
Radeon HD 5970 294W
Radeon HD 5870 188W 2009
Radeon HD 4870 X2 286W
Radeon HD 4870 150W
Radeon HD 3870 X2 165W
Radeon HD 3870 106W
Radeon HD 2900 XT 215W
Radeon X1950 XTX 125W
Radeon X850 XT 69W
Radeon X800 XT 54W
Radeon 9800 XT 60W
Radeon 7500 23W
Rage Fury MAXX 13W
Rage 128 Ultra 8W
Rage 128 Pro 8W 1999
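To see whether the jump really is recent, you can put the single-GPU flagships from that list into a few lines of Python and look at the step between generations (dual-GPU cards like the R9 295X2 and the HD x990s are skipped since they double everything up; release years are added from memory for context):

    # Single-GPU flagship board power, taken from the list above; years added for context.
    flagships = [
        ("Rage 128 Pro",       1999,   8),
        ("Radeon 9800 XT",     2003,  60),
        ("Radeon X1950 XTX",   2006, 125),
        ("Radeon HD 2900 XT",  2007, 215),
        ("Radeon HD 5870",     2009, 188),
        ("Radeon R9 Fury X",   2015, 275),
        ("Radeon RX Vega 64",  2017, 295),
        ("Radeon RX 6900 XT",  2020, 300),
        ("Radeon RX 7900 XTX", 2022, 355),
    ]

    prev = None
    for name, year, watts in flagships:
        step = f"{watts - prev:+d} W" if prev is not None else "baseline"
        print(f"{year}  {name:<20} {watts:>4} W  ({step})")
        prev = watts

Printed out like that, you can see exactly where the big steps happened and judge for yourself whether the recent ones are out of line.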
 
It ebbs and flows.

Some gens it will be super efficient, other gens they're pushing things to the limit and it's a fireball.
I don't worry about it too much; I bought an 850W power supply for that reason.

Boy Scout rules essentially - always be prepared.
The latest gen is the most efficient.

Power draw ≠ efficiency.

It seems out of hand. Let's not focus on the monstrous sizes they've become and the sag, but why aren't they doing anything about TDP? 300-450W for a GPU is out of hand. CPUs have stayed pretty stable with theirs (maybe the higher Intel chips have gotten high, but those are just a few), so if CPUs can stay pretty low, why do GPUs keep getting higher and higher? The jumps are quite large for the higher-end ones. Will this trend continue? At the pace we're at now, we'll be well over 600W for a card in the next few years, and with electricity prices jumping, that's paying for the card twice over its lifetime, or at least a good chunk of it. It seems like they put no effort into efficiency. Hell, I remember a time when I thought 180W was a lot for a GPU.
If you want a 300 W or xxx W card, limit the power budget and/or undervolt. You're still going to have a card that significantly outperforms previous generations.

Even at stock these cards are significantly more efficient.

Look at V-Sync 60 Hz results from reviews to find out what kind of power draw you're getting for the same performance. Smaller dies naturally draw less power; the 4080 is actually the most efficient die here under heavier loads.

[Chart: power draw at V-Sync 60 Hz]


This also shows the issues with AMD's architecture and the chiplet design for efficiency.

Maxed out, the efficiency changes slightly.

[Chart: energy efficiency at maximum load]
 