
4080 vs 7900XTX power consumption - Optimum Tech

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,560 (2.05/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga/S25U-1TB
Processor Ryzen 9800X3D @ 5.4 GHz AC 1.18 V, TG AM5 High Performance Heatspreader/1185 G7/Snapdragon 8 Elite
Motherboard ASUS ROG Strix X870-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 x2, D5/Res, 4x Noctua A12x25, 1x A14G2, Conductonaut Extreme
Memory 64 GB Dominator Titanium White 6000 MT, 130 ns tRFC, active cooled, TG Putty Pro
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 40 W/mK 3D Graphite pads, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 34" 240 Hz 3440x1440 34GS95Q LG MLA+ W-OLED, 31.5" 165 Hz 1440P NanoIPS Ultragear, MX900 dual VESA
Case Sliger SM570 CNC Alu 13-Litre, 3D printed feet, TG Minuspad Extreme, LINKUP Ultra PCIe 4.0 x16 White
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & Leather LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF1000 Plat, 13 A transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White w/Pulsar Supergrip tape, Razer Atlas, Razer Strider Chroma
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerL60 V2, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
Interesting that in less demanding games, the % difference is much higher.

I knew that the 4080 was a bit more efficient in general, but wasn't aware of how much the efficiency gap changes when utilization isn't 100%.

[screenshots from the video: per-game power draw comparison charts]


 
Three major problems with this video:

1) System specs are not listed. This is the basics of the basics, and he should really know better (and I say this as a long-time sub to his channel). The driver version is also important here, as AMD recently released an update addressing idle power consumption.

2) He compares a reference 4080 to an aftermarket OC'd 7900 XTX. One from ASUS, no less. It's not an apples-to-apples comparison; it's entirely possible ASUS has done a horrible job with the voltage curve during middling loads.

3) His numbers don't align with any other review in the same games. Outlets like HWUB and GamersNexus do a sanity check wherein, if their numbers deviate significantly from what others are getting, they will do additional testing to ensure their numbers are valid. Optimum's numbers deviate far beyond what you'd expect compared to other reviews, so I have to assume he completely forgot to check his numbers. TPU has the 7900 XTX using 50 W more on average, and that includes games that are CPU-limited, while in the video he's getting more than twice that. Tom's Hardware and HWUB have the 7900 XTX using only 38 W more on average when gaming.
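Rough numbers on that deviation, as a back-of-the-envelope sketch (Python, using only the averages quoted above; per-game deltas obviously vary):

```python
# Average gaming power gap, 7900 XTX minus 4080, as quoted above (watts).
reported_gaps = {"TPU": 50, "Tom's Hardware": 38, "HWUB": 38}
video_gap = 2 * 50  # "more than twice" TPU's 50 W figure, per the video

consensus = sum(reported_gaps.values()) / len(reported_gaps)
print(f"Consensus gap: ~{consensus:.0f} W")  # ~42 W
print(f"Video gap: ~{video_gap}+ W ({video_gap / consensus:.1f}x consensus)")  # ~2.4x
```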

Overall, this video is not up to the quality I expect from his channel. Of course the 7900 XTX does consume more power, but this video does more harm than good and contradicts every other review done on these products.
 
Three major problems with this video: […]
1) Idle isn't what's being tested.
2) TPU tested both cards; max power draw is 400 W for the ASUS TUF OC 7900 XTX and 320 W for the 4080 FE, and the ASUS didn't stand out from an average XTX in power draw.

More harm than good?
 
1) Idle isn't what's being tested.

Which doesn't change the fact that the driver version and the rest of the system specs are missing.

2) TPU tested both cards; max power draw is 400 W for the ASUS TUF OC 7900 XTX and 320 W for the 4080 FE, and the ASUS didn't stand out from an average XTX in power draw.

Which perfectly proves my point. Max power draw of the reference 7900 XTX is 360 W, a whopping 40 W less, and hence why comparing an OC card to a reference card for the purpose of broadly demonstrating the power efficiency of a given generation is highly misleading:

[screenshot: TPU maximum gaming power draw chart]


The difference from other reviews is even less:

[screenshot: average power draw figures from other reviews]

More harm than good?

The point of content pieces like this is to inform people. If your video fails to do that and in fact misinforms them, it is detrimental to the market. It would be fine if it were labeled 'ASUS 7900 XTX TUF power issues', which would be accurate. As it stands, though, the video implies that this is a much broader issue when all other reviews indicate otherwise.
 
Which doesn't change the fact that the driver version and the rest of the system specs are missing.



Which perfectly proves my point. Max power draw of the reference 7900 XTX is 360 W, a whopping 40 W less, and hence why comparing an OC card to a reference card for the purpose of broadly demonstrating the power efficiency of a given generation is highly misleading:
I don't think a "whopping" 40 W power limit difference explains an up-to-200 W difference between the two cards in this testing, especially since neither card is being run at 100%.

The point of this testing is to highlight efficiency differences at partial load, so these cards weren't even drawing their full power limit in many of the tested games.

Could the testing be better by listing system specs? Sure, but I don't see how those would change things, since the cards were being measured independently of the rest of the system.

Could the testing be better by listing the driver version? Sure, but again, these aren't idle numbers we're comparing.
 
the ASUS didn't stand out from an average XTX in power draw.
Not really, there's about a 40 W difference in both average and maximum power draw between the TUF and a reference model:

[charts: average and maximum gaming power draw, TUF OC vs reference]


I also question the practicality of Optimum Tech's tests, which show both cards running at 350-600 fps. Gaming at such ridiculous frame rates is, and always has been, a huge waste of power. A far more interesting comparison would be with a common fps limit, like 120 or 144.
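To put a rough number on the waste (a sketch with made-up placeholder wattages, and the simplifying assumption that power scales with frame rate, which it only loosely does):

```python
# Rough estimate of power spent rendering frames a 144 Hz display never shows.
# Wattage is a made-up placeholder, not a measured value.
uncapped_fps = 500
uncapped_watts = 300.0
display_hz = 144

displayed_fraction = display_hz / uncapped_fps            # ~29% of frames shown
wasted_watts = uncapped_watts * (1 - displayed_fraction)  # crude upper bound
print(f"{displayed_fraction:.0%} of frames displayed; "
      f"up to ~{wasted_watts:.0f} W spent on undisplayed frames")
```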
 
Not really, there's about a 40 W difference in both average and maximum power draw between the TUF and a reference model:


I also question the practicality of Optimum Tech's tests, which show both cards running at 350-600 fps. Gaming at such ridiculous frame rates is, and always has been, a huge waste of power. A far more interesting comparison would be with a common fps limit, like 120 or 144.
Yeah, a frame limiter on/off test would be good.

Still, 300 fps at 1440p is a similar load to 120 at 4K, and he's testing esports games, so these aren't abnormal frame rates for the people who play those games (a lot of people).
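The rough equivalence checks out on raw pixel throughput at least (a quick sketch; it ignores that 4K leans harder on memory and bandwidth per frame):

```python
# Pixels per second = resolution * frame rate.
px_1440p_300 = 2560 * 1440 * 300  # ~1.11e9 px/s
px_4k_120 = 3840 * 2160 * 120     # ~1.0e9 px/s
print(f"Ratio: {px_1440p_300 / px_4k_120:.2f}")  # ~1.11, i.e. within ~11%
```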
 
I don't think a "whopping" 40 W power limit difference explains an up-to-200 W difference between the two cards in this testing, especially since neither card is being run at 100%.

40 W is absolutely significant, but that's beside the point. The point was that he is comparing an OC-model 7900 XTX to a reference-model 4080 when we know for a fact there's a notable difference in power draw between the OC and reference designs. Regardless of your own definition of what constitutes a notable power difference, that's an inherently flawed comparison.

And yes, something is definitely up with those power consumption numbers. He should have investigated further to see if other 7900 XTXs exhibit the same issue, or if it's just down to classic ASUS overpumping the voltage during middling loads with a poorly optimized voltage curve.

The point of this testing is to highlight efficiency differences at partial load, so these cards weren't even drawing their full power limit in many of the tested games.

Which would depend on the card being used. The ASUS 7900 XTX TUF is going to have a different voltage curve than the 7900 XTX reference card. I'm not sure if AMD still allows board partners extra leeway in their GPU designs, but this may be another case of that, akin to the ASUS AM5 issues where the voltages used were unnecessarily high. The problem is that no one really knows, because the video itself doesn't answer these questions. It just implies that this is an issue despite having tested only one card, a card we already know consumes a decent bit more than the reference model.

Could the testing be better by listing system specs? Sure, but I don't see how those would change things, since the cards were being measured independently of the rest of the system.

Could the testing be better by listing the driver version? Sure, but again, these aren't idle numbers we're comparing.

Listing the system specs and software versions used is important so that others, and the reviewer themselves, can reproduce the results. It's impossible to say whether any given result is valid if such information is excluded. There's no reason not to include this information either, unless of course you don't have confidence in your data.
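For example, even a minimal disclosure block would make a run checkable; here's a sketch of the kind of fields that matter (all values are hypothetical placeholders):

```python
# Hypothetical example of the minimum info needed to reproduce a power test.
test_setup = {
    "cpu": "<model and clocks>",
    "motherboard_bios": "<board / BIOS version>",
    "memory": "<capacity / speed / timings>",
    "gpu_driver": "<exact version>",  # matters: AMD's recent update changed power behaviour
    "os_build": "<OS and build number>",
    "games": "<titles, versions, settings, scenes>",
    "power_method": "<software sensor, shunt, or external meter>",
}
for key, value in test_setup.items():
    print(f"{key}: {value}")
```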
 
The clock speeds are missing. The XTX at 67% usage delivering the same frame rate as the 4080 at 32% means the latter has dropped to a lower power state, which, combined with voltage, is exactly what you see.

Edit: actually no, it means the opposite. I'm confused.
 
Last edited:
Sadly, the price is in favor of NASCAR (RX 7900 XTX) and not Mercedes (RTX 4080).
 
Great, fair testing there.

I won't repeat, but @evernessince has my opinion covered, so I don't need to.

Still, nice to see staff putting out quality threads (:/), while other staff burn others.
 
Great, fair testing there.

I won't repeat, but @evernessince has my opinion covered, so I don't need to.

Still, nice to see staff putting out quality threads (:/), while other staff burn others.
It's becoming increasingly hard to take this joker staff member seriously. The bias is flowing freely. The pattern is clear.

In short, it is disgusting.

/thread

Still, 300 fps at 1440p is a similar load to 120 at 4K,
This doubly confirms it. One huge pile of nonsense. Very high FPS loads are not comparable to 120@4K at all. The excessive coil whine in the 300 fps run should give that away... You're stressing the GPU in a radically different way. Where at 4K you might simply fully tax certain GPU bits, leaving the rest not fully utilized even if the GPU shows 100% utilization, at a lower res you might cap out an entirely different part of the system. 4K pushes harder on memory and bandwidth, for example.

I guess the OP has managed to lower himself to the gutter-trash YT crowd rushing for clicks and headlines. No need to be right! We're clicking, so goal achieved. Sad indeed.
 
Last edited:
It's becoming increasingly hard to take this joker staff member seriously. […]

I guess the OP has managed to lower himself to the gutter-trash YT crowd rushing for clicks and headlines. […]
Oh well.

Riled-up emotions, I see.

:D

Would be fun to see a member with both a 7900 XTX and a 4080 do similar testing.

Apparently sharing some results and commenting is a crime, though, so maybe not.
 
While seeing the AMD fanboys riot over power consumption is amusing, this sort of testing is almost useless to me.

Ada can be super efficient in some games; my 4090 system, even with similar GPU usage, draws 100-150 W less at the wall.

Forza Horizon 5 is one example where it doesn't even hit 300 W.

There are clear reasons why someone would buy either of these two GPUs, especially in the United States, where the 7900 XTX is up to 300 USD cheaper at times.

My bigger concern with AMD currently is poor 1% low performance in some games, even with a 7800X3D, and FSR still being meh, not that it uses more power; both are likely fixable.
 
Last edited:
Very high FPS loads are not comparable to 120@4K at all. You're stressing the GPU in a radically different way. […]

I chuckled this morning when I saw Optimumtech's video. Not because it's wrong, but because there's no way to bring up this topic without drawing the ire of the internet Radeon crowd. It's not even news; it's not as if RDNA3 suddenly decided to become less efficient in these circumstances after 6 months.

Look, discounting some of the power consumption and multi monitor quirks, the XTX is a great product. It's even coming down to an attractive price now, while Nvidia prices never budge. What's not to like?

You can framecap or undervolt all you want. RDNA3 simply does not get down to the same efficiency levels as Ada, when under light loads or when framecapped. All the "gains" that desperate youtube commenters or forum users point at to justify "omg optimumtech doesn't understand how to run RDNA3!!!" are pretty moot, considering Ada pulls even farther ahead when you apply those same undervolts or framecaps to Geforce.

I don't agree with OT's conclusion though. I don't think it's something to be fixed in the driver (dude, just look at any XT or XTX's power consumption behaviour across the memory rails), but I also don't think it's something that necessarily needs to be fixed. There are plenty of reviews all around; most people don't buy the XTX for outright low power or efficiency, and anyone who potentially feels betrayed by AMD after discovering this after purchase clearly didn't do much research. I think if Ali spent a little more time with the XTX staring at HWiNFO or measuring with Elmor(?)/PCAT he'd know that it's not really a "problem". Though, with the especially space-efficient builds he's been preoccupied with in recent months, I can understand why, since, waterblocked, the FEs offer a much smaller footprint than Navi31 and are therefore more suitable.

The 4080 running fanless is pretty familiar to most 40 series users - the XT and XTX simply can't get to that level, or under 100 W in most of those scenarios. And that's just fine; it comes with the [fanout link, chiplet] territory. It's not a deal-breaker for most, but that also doesn't mean you get to dismiss it as "niche" or "useless". Navi31 is a product that likes to run balls to the wall when it gets the chance; contrast the XT and the 4070 Ti, the latter *in practice* being a very different product than its 285 W "TDP" suggests.

As expected it is pretty funny to see people trashing Optimumtech on here though. Tell me you have 0 experience with SFF, without telling me you have 0 experience with SFF. Ali may not get into 100% depth or understanding in every product he reviews (especially when it comes to niche things like PBO2, mem OC, and some under the hood GPU stuff like here), but you'll be hard pressed to find anyone else with this level of SFF and watercooling experience. Some things only become clear when all of a sudden you have to pay attention to power and thermal constraints that don't exist in a massive 50L tower.

At least, that's all I can say, really. Don't wanna be dragged into another red vs green war that didn't need to exist because people don't watch whole videos or fully read all the text on the page.
 
Last edited:
You can framecap or undervolt all you want. RDNA3 simply does not get down to the same efficiency levels as Ada when under light loads or when framecapped.
There was never a contest on that point, nor is it warfare... that's why reposting something like this attracts fire. Even TPU's reviews confirm Ada's efficiency, and no one ever battled it either...

It's bad enough we have the YT crowd as it is. If the OP had run some of his own testing you wouldn't have heard me for a second, but all I see in earnest is a person baiting with someone else's content. It's just a hair short of 'OMG outrage, your opinions pls'.

It's sad and doesn't belong here.
 
There are clear reasons why someone would buy either of these two GPUs, especially in the United States, where the 7900 XTX is up to 300 USD cheaper at times. […]
Yeah, performance is similar: a bit better for AMD in raster, much better for NVIDIA in RT, but you have to decide if better efficiency and features like DLAA etc. are worth the premium to you.
I chuckled this morning when I saw Optimumtech's video. Not because it's wrong, but because there's no way to bring up this topic without drawing the ire of the internet Radeon crowd. […]

Yep, it's not perfect testing by any means, but I also don't think it's out of line with what other reviewers and TPU have found. I get the feeling Ali is venturing onto new ground here, outside his usual case/build review content. It's something to be encouraged; hopefully he'll improve in this area to the level of his general production quality and builds.

His production style is definitely on the minimal side though, so that could explain the lack of details re: drivers and other specs.

I'm also not too hopeful for RDNA efficiency fixes in drivers; I'd imagine AMD would be more focused on the dual-issue shaders, if that's possible to fix. It could also be down to the chiplet design; maybe efficiency doesn't scale with load, similar to how idling Zen CPUs still draw a lot of power.
 
Back to the advert reel: DLSS, DLAA...

Sigh.

You don't need to be hopeful about RDNA 3, 4, or 5.

You are not buying anything other than Nvidia.
 
There was never a contest on that point, nor is it warfare... that's why reposting something like this attracts fire. Even TPU's reviews confirm Ada's efficiency, and no one ever battled it either...

Better to let sleeping dogs lie. Still, to the people who get angry over seeing stuff like this (not addressed to you, of course): if it gets you riled up, think about how I felt when I picked up the XT and discovered a whole bunch of dank stuff that the reviews never told me about...

Yep, it's not perfect testing by any means, but I also don't think it's out of line with what other reviewers and TPU have found. […]

People know him and respect him for the SFF stuff, the watercooling, and the high-refresh esports product reviews. Methinks he should stick to that. This deep-dive stuff isn't his style, nor is he that good at it, and it takes a lot of time and effort to go in depth (think der8auer, GN, Igor's Lab).
 
Wow, talk about a biased post. You will probably call me an AMD fanboy, but in my opinion one does not buy a $1000 GPU to framecap it. I am also calling this out for being exactly what we don't need in this hyper-focused Hot Wheels vs Matchbox theme that is currently so prevalent. Then there is the fact that the cards are not even close in price.

This is the lowest-priced 7900 XTX on Newegg:

[product link]

Here is the budget MSI 4080:

[product link]
Again, though, this thread makes it seem like you would pair a 7900 XTX or a 4080 with a 10400F or an AMD 3600, and again it ignores the most important factor: price.
 
Better to let sleeping dogs lie. Still, to the people who get angry over seeing stuff like this (not addressed to you, of course): if it gets you riled up, think about how I felt when I picked up the XT and discovered a whole bunch of dank stuff that the reviews never told me about...
Well, videos like this could have provided some insight then :).

Wow, talk about a biased post. You will probably call me an AMD fanboy, but in my opinion one does not buy a $1000 GPU to framecap it. […]
The testing wasn't framecapped; if you actually watched the video, you would have seen that he mentioned that. Also, framecapping is a good thing for any tier of GPU.

Some game engines have FPS limits; Overwatch's, AFAIK (I don't play it), is 600 FPS.
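For what it's worth, a frame cap is conceptually just a render loop sleeping away its leftover frame-time budget; a minimal sketch of the idea, not any particular limiter's actual implementation:

```python
import time

TARGET_FPS = 144
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~6.94 ms per frame

def render_frame():
    pass  # stand-in for the real rendering work

for _ in range(1000):  # bounded loop for the sketch
    start = time.perf_counter()
    render_frame()
    # Sleep off the unused budget instead of rendering extra frames;
    # the GPU idles during the sleep, which is where the power saving comes from.
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```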
 
Better to let sleeping dogs lie. Still, to the people who get angry over seeing stuff like this (not addressed to you, of course): if it gets you riled up, think about how I felt when I picked up the XT and discovered a whole bunch of dank stuff that the reviews never told me about...
You and me both. Still, I would love to see staff approach forum topics with more consciousness than this one.

@dgianstefani it's not a personal thing. If you make sense, you get my thumbs up. The above should be taken as advice, not critique.
 
This is the lowest-priced 7900 XTX on Newegg: […]
Actually, that's just a 150 USD difference going by my calculations. I'd buy that Zotac 4080 in a heartbeat over that MSI 7900 XTX or that MSI 4080.

Here in the States there can be a much larger difference in price, though.
 
You and me both. Still, I would love to see staff approach forum topics with more consciousness than this one.

@dgianstefani it's not a personal thing. If you make sense, you get my thumbs up. The above should be taken as advice, not critique.
I'm not taking anything personally lol, this is the Internet after all. If people want to get upset, they will.

Not all testing is perfect, and it's been pointed out that there are some obvious and simple ways to add more context and detail to this video. Still, I find simple tests can offer insight.

My opinions don't tend to change based on what is popular or not.
 