
4080 vs 7900XTX power consumption - Optimum Tech

I'm not taking anything personally, lol; this is the Internet after all. If people want to get upset, they will.

Not all testing is perfect, and it's been pointed out that there are some obvious and simple ways to add more context to this video. Still, I find simple tests can offer insight.
Good to hear. And I don't disagree; the problem is you're building history coated green. Your opinion is fine, just state it as such. We have a first page full of caveats you failed to mention or even pick up on in the OP.
 
Good to hear. And I don't disagree; the problem is you're building history coated green.
From my observations, things are pretty one-sided at the moment GPU-wise; last generation was more competitive between the two. Even pricing tiers are pretty similar between the two main brands, although pricing is the one area where AMD has an advantage.

My opinions aren't really going to change history; I'm just trying to present content I found interesting, simple as that. People may be reading into things a little too much.

We have a first page full of caveats you failed to mention or even pick up on in the OP.
Good thing this is a public forum, then, where people are free to add details. Discussion is a good thing and generally adds value.
 
The testing wasn't frame-capped; if you actually watched the video, you would have seen him mention that. Also, frame capping is a good thing for any tier of GPU.

Some game engines have FPS limits; Overwatch's, AFAIK (I don't play it), is 600 FPS.
That is HIS sentiment. I couldn't care less about gimping my card, but I guess I like getting high natural frames at 4K rather than 1440p low (really?). I am not kidding; it seems daft to me. I could give you a scenario where the cost of living should demand that people don't spend that much on a GPU, and power saving would be nice, but the 7900 XTX used here has three 8-pins and the 4080 has the 600 W 12-pin, so the very design of the cards implies how they are meant to be used. If you want efficiency, buy a 4060 Ti; if you want efficiency per cost (for what it can do), get a 6750 XT.

Here we go with someone buying one of these cards to play Overwatch, when that can be played just fine with an RX 580 8 GB card. I can also tell that you don't play at 4K, as driving 144 Hz at 4K demands pure horsepower (and lots of VRAM). It just seems ridiculous to set my 144 Hz monitor to run at 60, and I would say the same about frame capping. TWWH3 would probably feel like molasses with that, and in no way does that seem enjoyable. The days of 60 Hz monitors for PC gamers are last decade.
 
Good thing this is a public forum, then, where people are free to add details. Discussion is a good thing and generally adds value.
That's exactly right, but you didn't add value; @evernessince did, replying to you.
 
That is HIS sentiment. I couldn't care less about gimping my card, but I guess I like getting high natural frames at 4K rather than 1440p low (really?). I am not kidding; it seems daft to me. I could give you a scenario where the cost of living should demand that people don't spend that much on a GPU, and power saving would be nice, but the 7900 XTX used here has three 8-pins and the 4080 has the 600 W 12-pin, so the very design of the cards implies how they are meant to be used. If you want efficiency, buy a 4060 Ti; if you want efficiency per cost (for what it can do), get a 6750 XT.

Here we go with someone buying one of these cards to play Overwatch, when that can be played just fine with an RX 580 8 GB card. I can also tell that you don't play at 4K, as driving 144 Hz at 4K demands pure horsepower (and lots of VRAM). It just seems ridiculous to set my 144 Hz monitor to run at 60, and I would say the same about frame capping. TWWH3 would probably feel like molasses with that, and in no way does that seem enjoyable. The days of 60 Hz monitors for PC gamers are last decade.
You've been linked these before, by myself and others, but perhaps you didn't get a chance to read up on input lag.

In short, there's a good argument for capping frame rate about 3 FPS below the monitor's refresh rate.
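If it helps make the rule concrete, here's a trivial sketch of the arithmetic. The 3 FPS margin is the commonly cited figure, not a universal constant, and the actual cap would be set in RTSS or the driver, not in code:

```python
# Illustration only: the "cap a few FPS below refresh" rule and the
# resulting frame-time budget. The 3 FPS margin is an assumption taken
# from the commonly cited guidance.

def recommended_cap(refresh_hz: int, margin: int = 3) -> int:
    """FPS cap slightly below refresh, to keep VRR engaged."""
    return refresh_hz - margin

def frame_time_ms(fps: int) -> float:
    """Per-frame time budget in milliseconds."""
    return 1000.0 / fps

for hz in (60, 144, 165, 240):
    cap = recommended_cap(hz)
    print(f"{hz} Hz panel -> cap at {cap} FPS ({frame_time_ms(cap):.2f} ms/frame)")
```

So a 144 Hz panel would be capped at 141 FPS, about a 7.09 ms frame budget.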


 
From my observations, things are pretty one-sided at the moment GPU-wise; last generation was more competitive between the two. Even pricing tiers are pretty similar between the two main brands, although pricing is the one area where AMD has an advantage.

My opinions aren't really going to change history; I'm just trying to present content I found interesting, simple as that. People may be reading into things a little too much.
One-sided? You are right, though, that for gamers AMD makes more sense at all tiers except the highest one. Pricing is similar? How could a card that is $1,299 be close to a card that is $1,799? That is $500 apart. There is no denying that you are heavily biased towards Nvidia, as when people post threads about the 6800 XT vs the 7900 XTX, you always seem to push Nvidia onto the OP (the 4070 Ti seems like your fave). It is no different from last week's PC World podcast, where one of the presenters said that in the 5+ years he had been at PC World he had never used an AMD GPU. Does that seem objective when someone asks him whether they should get a 6800 XT or a 3070? This is the same energy you project.

Maybe the reason I am so incredulous is that my PC is the fastest machine I have ever had, and I am not talking about Crysis, but Silpheed, Space Racer, and my favourite DOS game, Zeliard.
 
One-sided? You are right, though, that for gamers AMD makes more sense at all tiers except the highest one. Pricing is similar? How could a card that is $1,299 be close to a card that is $1,799? That is $500 apart. There is no denying that you are heavily biased towards Nvidia, as when people post threads about the 6800 XT vs the 7900 XTX, you always seem to push Nvidia onto the OP (the 4070 Ti seems like your fave). It is no different from last week's PC World podcast, where one of the presenters said that in the 5+ years he had been at PC World he had never used an AMD GPU. Does that seem objective when someone asks him whether they should get a 6800 XT or a 3070? This is the same energy you project.

Maybe the reason I am so incredulous is that my PC is the fastest machine I have ever had, and I am not talking about Crysis, but Silpheed, Space Racer, and my favourite DOS game, Zeliard.
You're going off topic again.

Yes, I'm sure you have a nice PC.
 
Actually, that's just a $150 USD difference going by my calculations. I'd buy that Zotac 4080 in a heartbeat over that MSI 7900 XTX or that MSI 4080.

Here in the States there can be a much larger difference in price, though.
Yeah, Zotac over MSI, with one of the best GPU coolers on the market. No surprise.

You're going off topic again.

Yes, I'm sure you have a nice PC.
It's not off topic. I own a 7900-series GPU, so if this thread is about it, then it applies. No?
 
People know him and respect him for the SFF stuff, watercooling, and high-refresh esports product reviews. Methinks he should stick to that.
Exactly. First come, first served. That's what you get from an obvious clickbait title. He fails to deliver to the masses and should stick to his niche audience of SFF and high-end (competitive, overpriced) FPS-genre gear. It's been a while since I saw any useful video besides the ones specifically targeted at that crowd. Might be a good time to reconsider my subscription.

@dgianstefani Admitting that you're wrong is mostly considered a positive personality trait, especially when timed right. Another meaningless red vs. green thread is just pathetic, even more so considering it's coming from a staff member. If you feel this is a personal attack, please report it, but please, dude, get your things right for once at least.
 
It's not off topic. I own a 7900-series GPU, so if this thread is about it, then it applies. No?
Are you talking about power use?
 
The clock speeds are missing. The XTX at 67% usage delivering the same framerate as the 4080 at 32% means the latter has dropped to a lower power state; that, combined with voltage, is exactly what you see.

edit: actually no, it means the opposite. I'm confused.
It is a behaviour I do recognize on my 7900 XT. RDNA3 doesn't clock down as aggressively; there is some kind of plateau around 50-70% utilization. It doesn't seem related to how the chiplet-based products idle, either; in lighter game loads you can see it clock down further.

Example: I run Diablo 4 with an FPS cap at half my monitor's refresh rate (72 FPS) so it remains butter smooth. The GPU will never drop below 60% utilization. Darkest Dungeon 2: never below 60% either, even if some scenes are clearly lighter to run.
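For anyone who wants to log that plateau themselves on Linux, a rough sketch like this, polling the amdgpu sysfs interface, is enough; card0 and the one-second poll are assumptions, and on Windows HWiNFO exposes the same readings:

```python
# Rough sketch: log GPU utilization and the active core clock state over
# time via the Linux amdgpu sysfs interface. Assumes the GPU is card0.
import time

BUSY = "/sys/class/drm/card0/device/gpu_busy_percent"  # utilization in %
SCLK = "/sys/class/drm/card0/device/pp_dpm_sclk"       # DPM core clock states

def read_busy() -> int:
    with open(BUSY) as f:
        return int(f.read().strip())

def read_active_sclk() -> str:
    # The currently active DPM state is marked with a trailing '*',
    # e.g. "1: 2145Mhz *".
    with open(SCLK) as f:
        for line in f:
            if line.strip().endswith("*"):
                return line.split(":")[1].strip().rstrip("*").strip()
    return "unknown"

while True:
    print(f"util={read_busy():3d}%  sclk={read_active_sclk()}")
    time.sleep(1.0)
```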
 
That is HIS sentiment. I couldn't care less about gimping my card, but I guess I like getting high natural frames at 4K rather than 1440p low (really?). I am not kidding; it seems daft to me. I could give you a scenario where the cost of living should demand that people don't spend that much on a GPU, and power saving would be nice, but the 7900 XTX used here has three 8-pins and the 4080 has the 600 W 12-pin, so the very design of the cards implies how they are meant to be used. If you want efficiency, buy a 4060 Ti; if you want efficiency per cost (for what it can do), get a 6750 XT.

Here we go with someone buying one of these cards to play Overwatch, when that can be played just fine with an RX 580 8 GB card. I can also tell that you don't play at 4K, as driving 144 Hz at 4K demands pure horsepower (and lots of VRAM). It just seems ridiculous to set my 144 Hz monitor to run at 60, and I would say the same about frame capping. TWWH3 would probably feel like molasses with that, and in no way does that seem enjoyable.

Sure, I respect your opinion. If I had found an FV43U at the time and had been gaming at 4K all this time, I would share much of your sentiment. It doesn't diminish the way many others run their cards, though, just because that's not how you play. Have you run a higher-power GPU on air in something sub-20L? You feel every watt of power draw. Given the popularity of the NCASE/NR200P, I hope I don't have to debunk the myth that SFF is "niche" and not the way one "should" play.

I'm not calling this efficiency thing a bug or a problem, and I don't believe it is. But this whole "maybe that's because you're not playing like you SHOULD" is pretty stupid, counterproductive, and even obstructive to drawing attention to getting real bugs fixed (high-refresh multi-monitor power draw, which thankfully is finally getting on the right track). And it's even kind of hypocritical? I thought Nvidia elitism was what got us into this pit in the first place ("omg, real gamers care about RTX").

The days of 60 Hz monitors for PC gamers are last decade.
 
Sure, I respect your opinion. If I had found an FV43U at the time and had been gaming at 4K all this time, I would share much of your sentiment. It doesn't diminish the way many others run their cards, though, just because that's not how you play. Have you run a higher-power GPU on air in something sub-20L? You feel every watt of power draw. Given the popularity of the NCASE/NR200P, I hope I don't have to debunk the myth that SFF is "niche" and not the way one "should" play.

I'm not calling this efficiency thing a bug or a problem, and I don't believe it is. But this whole "maybe that's because you're not playing like you SHOULD" is pretty stupid, counterproductive, and even obstructive to drawing attention to getting real bugs fixed (high-refresh multi-monitor power draw, which thankfully is finally getting on the right track). And it's even kind of hypocritical? I thought Nvidia elitism was what got us into this pit in the first place ("omg, real gamers care about RTX").
Plus, there are only extremely minimal latency advantages to going higher than your monitor's refresh rate; sometimes it even causes worse input lag.

It is a behaviour I do recognize on my 7900 XT. RDNA3 doesn't clock down as aggressively; there is some kind of plateau around 50-70% utilization. It doesn't seem related to how the chiplet-based products idle, either; in lighter game loads you can see it clock down further.
I wonder if that's the reason for the power draw disparity, besides the "40 W" AIB model issue, which doesn't account for the rest of the percentage difference.
 
It is a behaviour I do recognize on my 7900 XT. RDNA3 doesn't clock down as aggressively; there is some kind of plateau around 50-70% utilization.

I'm not sure how accurate those reported utilization numbers really are. I haven't trusted them for a while, across RTX 30, Vega iGPU, RDNA3, RTX 40, and now RDNA2 iGPU and dGPU.

If anything, I've personally observed clocks to be a better window into what RDNA3 GPUs are doing. IMO, a lot of it doesn't actually appear to be hardware-constrained, just laziness on the driver's part. The hardware seems capable of much more, if it weren't being treated as the dumb GCN blockhead the drivers seem to think it is.

By comparison, clocks on Nvidia are at least outwardly pretty disconnected from load and power.
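On the Nvidia side, something like this is enough to watch the clocks move independently of reported utilization (a sketch assuming nvidia-smi is on PATH; it reads GPU 0 by default):

```python
# Sketch: poll SM clock, utilization and power draw via nvidia-smi's
# query interface, once per second.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=clocks.sm,utilization.gpu,power.draw",
         "--format=csv,noheader,nounits"]

while True:
    sm_mhz, util_pct, watts = subprocess.check_output(
        QUERY, text=True).strip().split(", ")
    print(f"sm={sm_mhz} MHz  util={util_pct}%  power={watts} W")
    time.sleep(1.0)
```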
 
Sure, I respect your opinion. If I had found an FV43U at the time and had been gaming at 4K all this time, I would share much of your sentiment. It doesn't diminish the way many others run their cards, though, just because that's not how you play. Have you run a higher-power GPU on air in something sub-20L? You feel every watt of power draw. Given the popularity of the NCASE/NR200P, I hope I don't have to debunk the myth that SFF is "niche" and not the way one "should" play.

I'm not calling this efficiency thing a bug or a problem, and I don't believe it is. But this whole "maybe that's because you're not playing like you SHOULD" is pretty stupid, counterproductive, and even obstructive to drawing attention to getting real bugs fixed (high-refresh multi-monitor power draw, which thankfully is finally getting on the right track). And it's even kind of hypocritical? I thought Nvidia elitism was what got us into this pit in the first place ("omg, real gamers care about RTX").
I think we share the same sentiments because we have the exact same panel. Did you turn the Contrast and Colour up? I am not a fan of SFF. I once did a build in a Raven02 and don't need compromises. I am currently housing my PC in a Corsair 7000D Airflow. Big cases are there because people buy them.

I agree that the clocks on the 7000 series are more aggressive than on the 6000 series, but that is OK with me; it is part of the reason I bought the card in the first place.
 
I think we share the same sentiments because we have the exact same panel. Did you turn the Contrast and Colour up? I am not a fan of SFF. I once did a build in a Raven02 and don't need compromises. I am currently housing my PC in a Corsair 7000D Airflow. Big cases are there because people buy them.

I agree that the clocks on the 7000 series are more aggressive than on the 6000 series, but that is OK with me; it is part of the reason I bought the card in the first place.

I wish I had the FV43U - right as I was about to pull the trigger, it went out of stock everywhere and the sale ended. I've also finally left the SFF camp (at least for my main), but I appreciate that there are many concerns that just would not exist in bigger cases. RDNA3 offers a theoretical wealth of precision and control (even if chiplets cause a somewhat higher power floor), but AMD didn't do a whole lot with it.
 
I wish I had the FV43U - right as I was about to pull the trigger, it went out of stock everywhere and the sale ended. I've also finally left the SFF camp (at least for my main), but I appreciate that there are many concerns that just would not exist in bigger cases. RDNA3 offers a theoretical wealth of precision and control (even if chiplets cause a somewhat higher power floor), but AMD didn't do a whole lot with it.
I never really heard anything after the initial review about the air temperature sensors on the fan intakes. Are those visible in the driver software (Adrenalin), or do you need to use more advanced monitoring software to see them?
 
I never really heard anything about the air temperature sensors on the fan intakes. Are those visible in the driver software (Adrenalin), or do you need to use more advanced monitoring software to see them?

IIRC, the ambient temp sensor/daughterboard is on only one model (the XTX reference). Maybe they worked it in later on; it definitely didn't factor into much at release. It shows up under HWiNFO.
 
IIRC, the ambient temp sensor/daughterboard is on only one model (the XTX reference). Maybe they worked it in later on; it definitely didn't factor into much at release. It shows up under HWiNFO.

I thought it was a neat idea... I really liked the EVGA cards that included all the extra sensors. I wish more cards did that.
 
I really enjoy watching you all converse; you are awesome :cool:

I made a thread a while back; I bought that Zotac and then cancelled.

I looked into the 4080/XTX and I will be buying a 4080 tonight :toast:
 
I really enjoy watching you all converse; you are awesome :cool:

I made a thread a while back; I bought that Zotac and then cancelled.

I looked into the 4080/XTX and I will be buying a 4080 tonight :toast:

You can do it!!!! All night long!!!!
 
My question is somewhat relevant to the topic: how much power is each card capable of drawing? If the 7900 XTX has a typical draw of, say, 400 watts under full load, do any models exist which will allow you to operate it at 520-550 watts? Are there RTX 4080/RTX 4070 Ti GPUs which are also capable of letting the user juice their card? I hate throttling, and it's the one thing I curse the most about my 3090. It genuinely makes for a poor experience, IMHO.

So, to rephrase it better perhaps: which 7900 XTX and RTX 4080/4070 Ti models have the highest power limits?
 
My question is somewhat relevant to the topic: how much power is each card capable of drawing? If the 7900 XTX has a typical draw of, say, 400 watts under full load, do any models exist which will allow you to operate it at 520-550 watts? Are there RTX 4080/RTX 4070 Ti GPUs which are also capable of letting the user juice their card? I hate throttling, and it's the one thing I curse the most about my 3090. It genuinely makes for a poor experience, IMHO.

So, to rephrase it better perhaps: which 7900 XTX and RTX 4080/4070 Ti models have the highest power limits?

There doesn't really seem to be a significant difference in performance between alternative models. You should go for the card that is quiet, fits your budget, and has the form factor you like.

[Image: tdp-adjustment-limit.png]

[Image: Screenshot_20230711_010125.png]

As you can see, despite the Zotac having the biggest power limit, it isn't the fastest, and they're all pretty close.
 

There doesn't really seem to be a significant difference in performance between alternative models. You should go for the card that is quiet, fits your budget, and has the form factor you like.


Quiet isn't even something I look at, to be honest with you. The louder the better, as long as the air is flowing... I'm going to set the fans to 100% and use the card that way for as long as I have it. When you live in a hot climate with no AC, quiet parts are a luxury. The reason I want a card with a very high power limit allowance is to eliminate the clock speed dips that occur in some workloads while allowing myself to flatline the GPU clocks. I've mostly achieved this on my 3090 by reducing the maximum frequency, but under extreme RT workloads it will still dip to the low 1300 MHz range, and I want to eliminate this phenomenon. The only way is allowing the card to chug what it wants.

Looks like the 4080 Strix has a pretty healthy power limit. That's good. I've gotta look up the 7900 XTX TUF's as well...
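For what it's worth, on Nvidia cards the power limit and a clock lock can be set from the command line, within whatever ceiling the card's BIOS allows. A minimal sketch with illustrative values (needs admin rights; the watt and MHz numbers are assumptions, and AMD cards would use the Adrenalin tuning page or, on Linux, the hwmon power1_cap file instead):

```python
# Sketch: raise the board power limit and lock the GPU clock range via
# nvidia-smi (requires admin/root). nvidia-smi clamps requested values
# to what the card's BIOS permits; 450 W and 2500-2800 MHz are examples.
import subprocess

subprocess.run(["nvidia-smi", "-pl", "450"], check=True)
subprocess.run(["nvidia-smi", "--lock-gpu-clocks=2500,2800"], check=True)
# Undo the clock lock later with:
# subprocess.run(["nvidia-smi", "--reset-gpu-clocks"], check=True)
```

The same -pl call works in the other direction too, e.g. capping a card at 350 W.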
 
Quiet isn't even something I look at, to be honest with you. The louder the better, as long as the air is flowing... I'm going to set the fans to 100% and use the card that way for as long as I have it. When you live in a hot climate with no AC, quiet parts are a luxury. The reason I want a card with a very high power limit allowance is to eliminate the clock speed dips that occur in some workloads while allowing myself to flatline the GPU clocks. I've mostly achieved this on my 3090 by reducing the maximum frequency, but under extreme RT workloads it will still dip to the low 1300 MHz range, and I want to eliminate this phenomenon. The only way is allowing the card to chug what it wants.

Looks like the 4080 Strix has a pretty healthy power limit. That's good. I've gotta look up the 7900 XTX TUF's as well...

Not sure how the RDNA cards or the 4080 behave, but my 4090 fluctuates between 2600-2800 MHz depending on the game; power means almost nothing. Yeah, I can blast the fans and set a 600 W limit and it'll stay close to 3000 MHz, but the performance difference is less than 5%, so it's not worth it... It's actually better in my experience to cap it around 350 W, lose almost no performance, and run very cool and quiet at around 2550 MHz.

The 40-series cards are more limited by voltage than by power, likely due to longevity concerns. Most of them are already pushed beyond the efficiency curve's sweet spot.
 