
4080 vs 7900XTX power consumption - Optimum Tech

Well, it's what I've been trying to say lol, these OT results aren't exactly out of line with what we already know, it's just TPU doesn't test low load at framerates higher than 60 (yet).


More efficient than RDNA3 you mean ;).

I'd like to also point out that people frequently ask me "why do you always recommend the 4060/Ti over the 6700 XT" or bring up "muh 8 GB VRAM" (how's that working out now that the 16 GB variant shows no performance difference?). This kind of chart is a big reason.

Twice or even three times the energy efficiency in frame-capped games (of which there are many locked at 60 FPS, e.g. Bethesda games or certain console ports), or in other associated low-load situations, isn't irrelevant. People take 100% load efficiency and raw raster performance and compare the options, but there are a lot more states the GPU will be in, often more frequently. Another thing to remember is that TPU tests with maxed-out CPU hardware; most people aren't rocking a 13900K, and I doubt their GPU is pegged at 100% all the time.
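As a rough illustration of why partial-load efficiency matters, here's a minimal sketch comparing energy per frame at a 60 FPS cap; the wattages below are placeholders, not measured figures for any particular card:

```python
# Rough energy-per-frame comparison at a fixed frame cap.
# The power figures below are placeholders for illustration only,
# not measurements of any specific card.

def joules_per_frame(avg_power_w: float, fps: float) -> float:
    """Energy drawn per rendered frame (J) = average power (W) / frame rate (1/s)."""
    return avg_power_w / fps

def kwh_per_hour(avg_power_w: float) -> float:
    """Energy used per hour of play, in kWh."""
    return avg_power_w / 1000.0

cards = {
    "Card A (hypothetical)": 60.0,   # W at a 60 FPS cap
    "Card B (hypothetical)": 150.0,  # W at the same 60 FPS cap
}

for name, watts in cards.items():
    print(f"{name}: {joules_per_frame(watts, 60):.2f} J/frame, "
          f"{kwh_per_hour(watts):.3f} kWh per hour of play")
```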

It's a whole other factor if you want to start comparing RT efficiency too, due to the dedicated hardware used in NV cards.

The 4070 and 6900 XT are roughly comparable in both price and performance: the 6900 XT is 5-10% faster in raster but 20% slower in RT.

Yet one peaks at 636 W, 130 W higher than a 4090, while the other peaks at 235 W.
I was gonna say...
I really appreciated when you guys started adding these, I always felt running with uncapped fps is a waste of electricity.
Yup, saving some money on the GPU only to spend more on the PSU and the electricity bill sounds good to some people, I guess ;).
People are really, really bad at figuring costs over time. And this is not only about GPUs. Our brain somehow puts most of the weight on the upfront cost and seemingly refuses to go past that.
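To put "costs over time" in concrete numbers, here's a quick back-of-the-envelope sketch; the wattage gap, daily hours and electricity price are just assumptions to swap for your own figures:

```python
# Back-of-the-envelope running-cost difference between two GPUs.
# All inputs are assumptions; plug in your own wattage gap, usage and tariff.

def annual_cost_difference(power_gap_w: float, hours_per_day: float,
                           price_per_kwh: float) -> float:
    """Extra electricity cost per year for drawing `power_gap_w` more watts."""
    kwh_per_year = power_gap_w / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Example: 100 W higher average draw, 3 h of gaming a day, 0.30 currency units/kWh.
gap = annual_cost_difference(power_gap_w=100, hours_per_day=3, price_per_kwh=0.30)
print(f"~{gap:.0f} per year extra")  # roughly 33 with these assumptions
```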
 
The power consumption graph with V-Sync at 60 Hz is interesting, but we have to take into account the percentage of load at that frame rate.
A 4080 can hover at 20% load running a game at 60 fps, while a lower-tier card probably has to run at full load to achieve the same result.
I think the graph is most worthwhile when we compare GPUs with the exact same performance, like the 4080 and 7900 XTX.
 
The power consumption graph with V-Sync at 60 Hz is interesting, but we have to take into account the percentage of load at that frame rate.
A 4080 can hover at 20% load running a game at 60 fps, while a lower-tier card probably has to run at full load to achieve the same result.
I think the graph is most worthwhile when we compare GPUs with the exact same performance, like the 4080 and 7900 XTX.
That's debatable. If your card sits at 20% usage with an fps cap in place, chances are you'll just enable additional effects and push that load higher anyway.
This is worth more when you compare similar cards, like you correctly point out. But, in general, it can be useful in choosing the right PSU. E.g. my monitors are both 60 Hz and I always play with V-Sync on.
 
I play at 3440x1440, 100 Hz, with V-Sync on.
Most games do not have any more effects to enable, so my 4080 runs at 30, 40, 50, 60, 70, 80%... depends on the game.
So you can imagine if you cap the same GPU at 60 fps! Anyway...

The consistency of the Ada GPUs is impressive. All of them, no matter the tier, manage to consume nearly identical energy to achieve the 60 fps target.
[Attached chart: power consumption at a 60 fps cap]
 
I play at 3440x1440, 100 Hz, with V-Sync on.
Most games do not have any more effects to enable, so my 4080 runs at 30, 40, 50, 60, 70, 80%... depends on the game.
So you can imagine if you cap the same GPU at 60 fps! Anyway...
Yup, not much else you can add there...
The consistency of the Ada GPUs is impressive. All of them, no matter the tier, manage to consume nearly identical energy to achieve the 60 fps target.
Tbh, that's what I would expect to see: for the same amount of effort, the same power draw. It will vary a little because of different VRAM speeds and sizes, but the GPU itself has no reason to draw more power.
 
Well, it's what I've been trying to say lol, these OT results aren't exactly out of line with what we already know, it's just TPU doesn't test low load at framerates higher than 60 (yet).


More efficient than RDNA3 you mean ;).

I'd like to also point out that people frequently ask me "why do you always recommend the 4060/Ti over the 6700 XT" or bring up "muh 8 GB VRAM" (how's that working out now that the 16 GB variant shows no performance difference?). This kind of chart is a big reason.

Twice or even three times the energy efficiency in frame-capped games (of which there are many locked at 60 FPS, e.g. Bethesda games or certain console ports), or in other associated low-load situations, isn't irrelevant. People take 100% load efficiency and raw raster performance and compare the options, but there are a lot more states the GPU will be in, often more frequently. Another thing to remember is that TPU tests with maxed-out CPU hardware; most people aren't rocking a 13900K, and I doubt their GPU is pegged at 100% all the time.

It's a whole other factor if you want to start comparing RT efficiency too, due to the dedicated hardware used in NV cards.

The 4070 and 6900 XT are roughly comparable in both price and performance: the 6900 XT is 5-10% faster in raster but 20% slower in RT.

Yet one peaks at 636 W, 130 W higher than a 4090, while the other peaks at 235 W.
As far as power consumption under load is concerned, it looks like AMD has made progress with at least one RDNA3 SKU, the RX 7600, but its bigger brethren don't go as low, probably due to their chiplet-based design.

[Attached chart: power consumption comparison]


Yup, saving some money on the GPU only to spend more on the PSU and the electricity bill sounds good to some people, I guess ;).

I just built a mini PC for a friend with a 13900K + 4080, and the entire PC draws a maximum of 380 W at the wall (CP2077 and The Last of Us @ 4K Ultra settings). I'm pretty sure my friend wouldn't want to replace his aging AC just so he could use a 7900XTX in his PC :roll: .
Hyperbole doesn't serve the discussion. While the 4080 is definitely more efficient than the 7900 XTX, they are both power-hungry GPUs. TechSpot's review of the reference 7900 XTX showed a maximum power difference of 64 W under load for the entire system. Some games showed even smaller differences: 19 W in the case of Doom. Those differences aren't large enough to require an AC upgrade.
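For perspective, converting those gaps into cooling load is a one-liner (1 W of electrical draw ends up as roughly 1 W of heat, and 1 W ≈ 3.412 BTU/h). A quick sketch using the figures quoted above:

```python
# Convert an electrical power difference into the extra heat a room's
# cooling has to handle. 1 W of draw ~= 1 W of heat ~= 3.412 BTU/h.

def watts_to_btu_per_hour(watts: float) -> float:
    return watts * 3.412

for gap_w in (19, 64):  # system power gaps quoted from the TechSpot comparison above
    print(f"{gap_w} W difference ~= {watts_to_btu_per_hour(gap_w):.0f} BTU/h of extra heat")
# 64 W works out to ~218 BTU/h, a small fraction of even a 5000 BTU/h window AC unit.
```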
 

Speaking for myself, the power draw difference under partial load or in non-gaming situations like video playback is a bigger deal than the Furmark spikes.
Of course it is, because you'll spend much more time at partial load than in full load situations like Furmark, or certain games.

Still, more heat and more power draw at the extreme, as well as on average, isn't a good look.
 
the power draw difference under partial load or in non-gaming situations like video playback is a bigger deal than the Furmark spikes.
This, exactly this.
 
I am playing a 4K YouTube video right now and it is pulling between 22 and 30 watts. Using old data to justify arguments seems a little dishonest to me.
[Screenshot: GPU power draw during 4K YouTube playback]
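If anyone wants to log this themselves rather than screenshot an overlay, here's a minimal sketch. It assumes a Linux box with an amdgpu card exposing a hwmon "power1_average" sensor; the exact path varies per system, so treat it as a template rather than a finished tool:

```python
# Minimal GPU power logger for Linux + amdgpu (assumption: the card exposes
# a hwmon "power1_average" file, which reports power in microwatts).
import glob
import time

def find_power_file() -> str:
    # The hwmon index differs per system, so search for the sensor file.
    candidates = glob.glob("/sys/class/drm/card*/device/hwmon/hwmon*/power1_average")
    if not candidates:
        raise FileNotFoundError("no amdgpu power1_average sensor found")
    return candidates[0]

def read_watts(path: str) -> float:
    with open(path) as f:
        return int(f.read()) / 1_000_000  # microwatts -> watts

if __name__ == "__main__":
    sensor = find_power_file()
    for _ in range(30):  # sample roughly once a second for 30 seconds
        print(f"{read_watts(sensor):.1f} W")
        time.sleep(1)
```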
 
Still, more heat and more power draw at the extreme, as well as on average, isn't a good look.
Average is what matters; spikes do not influence heat output or power draw in a significant manner. Spikes happen when the card says "hey, I have plenty of room to stretch my legs" and a few milliseconds later it goes "oh cr@p, I don't".
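A quick numerical sketch of why millisecond spikes barely move the average; the spike size and duty cycle below are made-up illustrative numbers:

```python
# Why short transients barely affect average power/heat:
# the average is duty-cycle weighted, and spikes last milliseconds.
# Numbers below are illustrative, not measurements of any particular card.

steady_w = 300.0       # sustained draw
spike_w = 600.0        # transient peak
spike_ms_per_s = 10.0  # say 10 ms of every second spent spiking (1% duty cycle)

duty = spike_ms_per_s / 1000.0
avg_w = spike_w * duty + steady_w * (1 - duty)
print(f"Average draw: {avg_w:.1f} W")  # 303.0 W, only 1% above the steady figure
```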
 
Well, here is another well-known YouTuber talking about how inefficient the 7900 XTX is at idle with a high refresh rate monitor :)

 
Yes, the power draw in lower load situations has improved significantly since launch. However, there seem to be issues with 4K 144 Hz monitors.
What irks me is nobody seems to be able to fix this for good. Both AMD and Nvidia have fixed it in the past, yet a few years down the road, it manages to resurface.
Since it seems implausible they don't test for this with every driver release, the only conclusion is they do test, they notice the problem, but deem it not a blocker and release anyway :(
 
Well, here is another well-known YouTuber talking about how inefficient the 7900 XTX is at idle with a high refresh rate monitor :)

I suspect it's limited to a few models, because we have board members, e.g. @Makaveli, who have high refresh rate monitors, and report nominal power consumption.
 
I don't know what you could possibly mean by "better quality" in this context. Like I said, all of these new GPUs have better efficiency compared to previous generations and are similar to each other anyway.

I don't think anyone ever picks a 4080 over a 7900XTX because it's 10% more efficient or whatever and it's therefore a "better quality product". Sorry, I don't believe it and I won't ever believe it. Real consumers simply do not care about that; these are made-up reasons proposed by forum dwellers to argue which brand is better.
This. If you have GPU headroom you will generally crank up the visual quality. I don't buy that there is a noticeable difference in practice, though for academic purposes, sure, Ada is much better at clocking and powering down than RDNA3. Is it relevant? Highly debatable and use-case dependent. If you blow north of 600 bucks on a GPU, you will want to use it: in games where a Radeon would sit largely idle, you will/can raise the render resolution or frame rate cap, or you will enable RT. Boom, you are again north of 60% load and in the efficient zone of this GPU family.

Having run Nvidia for over a decade, I can say I even undervolted Pascal and still ran my 1080 at 190-200 W virtually all the time. If temps are in check, all is well; I don't care, and I count on the fact the GPU is pegged at max utilisation. Anything below that, who cares. Not once did I consider I saved a light bulb or two in the house.

What irks me is nobody seems to be able to fix this for good. Both AMD and Nvidia have fixed it in the past, yet a few years down the road, it manages to resurface.
Since it seems implausible they don't test for this with every driver release, the only conclusion is they do test, they notice the problem, but deem it not a blocker and release anyway :(
It's as little of a blocker as the fact a 4090 can use north of 450 W, or the fact all GPUs are clocking way beyond their optimal efficiency curve. It's not Nvidia's or AMD's power bill, and fuck climate anyway, right? Shareholders don't care.
 
Average is what matters; spikes do not influence heat output or power draw in a significant manner. Spikes happen when the card says "hey, I have plenty of room to stretch my legs" and a few milliseconds later it goes "oh cr@p, I don't".
I agree, in general, for heat reasons.

However, my original comment was responding to a discussion about power supply compatibility. With the exception of the still very new ATX 3.0/3.1 standard, many PSUs have OCP set not that much higher than their rating, e.g. 900 W for a 750 W unit. The 3090 Ti and some RDNA2 GPUs are known for tripping OCP on even high-end PSUs, like the Seasonic Prime series.

So, in general, I prefer cards that stay closer to their nominal power draw during spikes.
 
I agree, in general, for heat reasons.

However, my original comment was responding to a discussion about power supply compatibility. With the exception of the still very new ATX 3.0/3.1 standard, many PSUs have OCP set not that much higher than their rating, e.g. 900 W for a 750 W unit. The 3090 Ti and some RDNA2 GPUs are known for tripping OCP on even high-end PSUs, like the Seasonic Prime series.

So, in general, I prefer cards that stay closer to their nominal power draw during spikes.
The OCP threshold is not a fixed value. It's 30-50% over the rated current (or something like that). But yeah, if you don't want your system shutting down out of the blue, you should account for spikes too. Normally, you wouldn't want any spike at all, because even if your GPU and CPU have only mild spikes, there's always the possibility of them spiking at the same time...
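To make the spike/OCP point concrete, here's a rough headroom check. The 30% OCP margin and the spike figures are assumptions (real trip points and transient behaviour vary by PSU platform and by card), so treat it as a sanity-check sketch rather than a sizing tool:

```python
# Rough sanity check: do combined transient spikes clear a PSU's OCP trip point?
# The 1.3x OCP multiplier and the spike numbers are assumptions for illustration.

def ocp_trip_point(rated_w: float, ocp_multiplier: float = 1.3) -> float:
    return rated_w * ocp_multiplier

def worst_case_spike(gpu_spike_w: float, cpu_spike_w: float,
                     rest_of_system_w: float = 100) -> float:
    # Pessimistic: assume GPU and CPU spike at the same instant.
    return gpu_spike_w + cpu_spike_w + rest_of_system_w

psu_rated = 750
trip = ocp_trip_point(psu_rated)                            # ~975 W with the assumed 30% margin
spike = worst_case_spike(gpu_spike_w=636, cpu_spike_w=250)  # 636 W spike figure quoted earlier in the thread
print(f"Trip point ~{trip:.0f} W, worst-case spike ~{spike:.0f} W -> "
      f"{'risky' if spike >= trip else 'probably fine'}")
```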
 
That strongly suggests that it's a monitor-specific issue.
I don't think it's that either.

If the card is using power, it means something is running on it. All of these reported issues of high power draw at idle are misnomers; clearly the cards cannot be "idle" if there is high power draw. Something is running on the card; you can't just have it consume power doing nothing. Btw, I've never had any of these issues, and I use multiple monitors, one of which is high refresh rate.
 
That strongly suggests that it's a monitor-specific issue.
It might even be something as simple as the wrong cable. I have a theory that some cables are not up to their listed specs. Say you have a DP 1.4 connection on the GPU but are using a 1.2 cable (listed as 1.4). If the panel is also 1.4, it will force that speed through the cable, and in reality the signal will have to wait on the cable to transmit from one end to the other. It could also be the same for HDMI formats. As evidence I would point to the super hot cable issue: I have done experiments, and if you use an out-of-spec cable it can be hot to the touch after about 30 minutes of use. It could be that the GPU is using power to keep the integrity of the signal as it waits for the cable to finish. I know the HDMI cables are basically fibre, but that does not mean the connector on the end is up to snuff. I have an 8K DP 1.4 cable and it is never hot. It could also be monitors that are out of spec, like giving you 144 Hz over 1.4a, or how some 4K TVs will downgrade to 1440p for 120 Hz if they don't have the full 1.4 spec.
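For what it's worth, a rough bandwidth estimate shows how much each display mode needs versus what each link can actually carry; the ~1.2x blanking overhead below is a coarse approximation, so treat the numbers as ballpark only:

```python
# Rough check of whether a given display mode fits a given link.
# Effective (post-encoding) link rates in Gbit/s; the ~1.2x blanking overhead
# is a coarse approximation of real timings, so results are ballpark only.

LINKS_GBPS = {
    "DP 1.2 (HBR2)": 17.28,
    "DP 1.4 (HBR3)": 25.92,
    "HDMI 2.1 (FRL 48G)": 42.67,
}

def required_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24,
                  blanking_overhead: float = 1.2) -> float:
    return width * height * hz * bits_per_pixel * blanking_overhead / 1e9

need = required_gbps(3840, 2160, 144)  # 4K 144 Hz, 8-bit RGB, no DSC
print(f"4K 144 Hz 8-bit RGB needs ~{need:.1f} Gbit/s uncompressed")
for name, rate in LINKS_GBPS.items():
    print(f"  {name}: {'fits' if rate >= need else 'needs DSC or chroma subsampling'}")
```

With these assumed figures, 4K 144 Hz doesn't fit an older DP link uncompressed, which is why a downgraded or out-of-spec link generally forces a lower mode rather than the GPU simply "waiting" on the cable.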

It might even be something as simple as the wrong cable. I have a theory that some cables are not up to their listed specs. Say you have a DP 1.4 connection on the GPU but are using a 1.2 cable (listed as 1.4). If the panel is also 1.4, it will force that speed through the cable, and in reality the signal will have to wait on the cable to transmit from one end to the other. It could also be the same for HDMI formats. As evidence I would point to the super hot cable issue: I have done experiments, and if you use an out-of-spec cable it can be hot to the touch after about 30 minutes of use. It could be that the GPU is using power to keep the integrity of the signal as it waits for the cable to finish. I know the HDMI cables are basically fibre, but that does not mean the connector on the end is up to snuff. I have an 8K DP 1.4 cable and it is never hot. It could also be monitors that are out of spec, like giving you 144 Hz over 1.4a, or how some 4K TVs will downgrade to 1440p for 120 Hz if they don't have the full 1.4 spec.
Sorry I should have said HDMI 2.1. There are no TVs with DP connections.
 
I suspect it's limited to a few models, because we have board members, e.g. @Makaveli, who have high refresh rate monitors, and report nominal power consumption.
Agreed. From what I can see, the majority of single-monitor setups are fixed. The issue remains for certain combinations of dual-monitor configurations.

It might even be something as simple as the wrong cable. I have a theory that some cables are not up to their listed specs. Say you have a DP 1.4 connection on the GPU but are using a 1.2 cable (listed as 1.4). If the panel is also 1.4, it will force that speed through the cable, and in reality the signal will have to wait on the cable to transmit from one end to the other. It could also be the same for HDMI formats. As evidence I would point to the super hot cable issue: I have done experiments, and if you use an out-of-spec cable it can be hot to the touch after about 30 minutes of use. It could be that the GPU is using power to keep the integrity of the signal as it waits for the cable to finish. I know the HDMI cables are basically fibre, but that does not mean the connector on the end is up to snuff. I have an 8K DP 1.4 cable and it is never hot. It could also be monitors that are out of spec, like giving you 144 Hz over 1.4a, or how some 4K TVs will downgrade to 1440p for 120 Hz if they don't have the full 1.4 spec.


Sorry I should have said HDMI 2.1. There are no TVs with DP connections.
I also tried swapping cables, bought brand new ones from Amazon, and they made no difference prior to this driver update. I still saw 50 watts at idle with them.

These are the cables I picked up here.

[Attached image: the cables in question]
 