
4080 vs 7900XTX power consumption - Optimum Tech

Not sure how the RDNA cards or the 4080 behave, but my 4090 fluctuates between 2600-2800 MHz depending on the game; power means almost nothing. Yeah, I can blast the fans and set a 600 W limit and it'll stay close to 3000 MHz, but the performance difference is less than 5%, so it's not worth it... It's actually better in my experience to cap it around 350 W: you lose almost no performance and the card runs very cool and quiet at around 2550 MHz.
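
If you'd rather script the cap than drag the Afterburner slider, here's a minimal sketch using NVML through the pynvml package (my own assumption of tooling, not anything from the video; changing the limit needs admin rights, and the vBIOS range still applies):

```python
# Minimal sketch: cap an NVIDIA GPU's board power limit through NVML,
# the same mechanism Afterburner's power slider drives.
# Assumes the pynvml package (pip install nvidia-ml-py) and admin rights.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementLimitConstraints,
    nvmlDeviceSetPowerManagementLimit,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    # vBIOS-allowed range, in milliwatts
    lo, hi = nvmlDeviceGetPowerManagementLimitConstraints(gpu)
    target_mw = min(max(350_000, lo), hi)  # ~350 W, clamped to the range
    nvmlDeviceSetPowerManagementLimit(gpu, target_mw)
    print(f"Power limit set to {target_mw / 1000:.0f} W "
          f"(allowed: {lo / 1000:.0f}-{hi / 1000:.0f} W)")
finally:
    nvmlShutdown()
```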

The 40-series cards are more limited by voltage than by power, likely due to longevity concerns. Most of them are already pushed beyond the sweet spot of the efficiency curve.

^^ This ^^

A 70% power limit in Afterburner provides about 95% of full performance. Cool and quiet.
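
Taken at face value, those two figures work out to a sizable perf-per-watt gain; a quick back-of-the-envelope check (normalized numbers from the claim above, not measurements):

```python
# Rough perf-per-watt math for the quoted figures: ~95% of stock
# performance at a 70% power limit, relative to stock perf/W.
capped_perf, capped_power = 0.95, 0.70
gain = (capped_perf / capped_power) / (1.00 / 1.00)
print(f"Perf/W at the 70% limit: {gain:.2f}x stock")  # ~1.36x
```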
 

I've reduced my card to around an 1800 MHz target (some 150-200 MHz below its peak boost), and it sustains that speed effortlessly at low wattage. The higher power limit is there to ensure this flatlined, consistent performance rather than to push the card further. In my card's case, once I load a very heavy RT workload, it immediately hits the 375 W power limit and reduces performance accordingly ;)
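
If you want to see that behaviour in numbers rather than eyeball an overlay, a simple poll of core clock and board power shows whether the card holds its target or runs into the limit (again a pynvml sketch of my own, NVIDIA cards only):

```python
# Sketch: sample GPU core clock and board power once per second to see
# whether the card sustains its clock target or hits the power limit.
# Assumes the pynvml package (pip install nvidia-ml-py).
import time
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetClockInfo, nvmlDeviceGetPowerUsage, NVML_CLOCK_GRAPHICS,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    for _ in range(30):  # ~30 seconds of samples
        mhz = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)  # MHz
        watts = nvmlDeviceGetPowerUsage(gpu) / 1000             # mW -> W
        print(f"{mhz:4d} MHz  {watts:6.1f} W")
        time.sleep(1)
finally:
    nvmlShutdown()
```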

That 4080 Strix has a white version, and it's looking quite delicious... I can afford it too once I sell my 3090. We'll see. The 7900 XTX is much cheaper though, and also a good deal; I'll see when I have the money in hand. Consistency is important to me, and I hate GPU boost mechanisms.
 

Compared to my 3080 Ti, which has a similar power limit, the 4090 is way more straightforward to tweak to a sweet spot in power/performance. I'm guessing the 4080 should behave much better than a 3090 when trying to find that sweet spot.

I do kinda wish GPU pricing wasn't so crazy, because it would be nice to own both a 7900 XTX and a 4090 just to mess around with them and see how they behave in similar systems. I know you can read and watch stuff online, but until you actually have the hardware in hand, you don't get the full picture. I know they aren't comparable in price, but they're the best of what both companies offer.
 

Yeppers... the 4090 is out of the question for me. I figured I can still get a nice price for my card, so while I can do it, I'm torn between the 4080 and the 7900 XTX. I'm most concerned about value rot: if I lose the accumulated value to time, I can't sell the card and (inexpensively) upgrade. Not 5 years ago I had the Sapphire R9 280X Toxic and the MSI R9 290X Lightning, plus all three generations of AMD HBM cards (two Fury X, Vega Frontier and Radeon VII), and could easily afford a 1080 Ti if I wanted one (I didn't feel the need), but now I scrape to get a single high-end card... and this isn't me being poorer than I was; in fact, I'm better off today... it's just that all of those cards *combined* didn't cost what a 4090 costs :cry:

All of those GPUs I sold to pay for my 3090 and 5950X upgrades. The 5950X is already gone, so I guess I really should flip the GPU before it's a full 3 years old.
 

I think the real question you have to ask yourself is what you hope to gain by upgrading. A 3090 is still a very good card and is going to play all the latest games at maxed out settings.

I stuck with my 1080 Ti for a long time until I upgraded to a 4080, and while it's nice, it was nowhere near worth the price. It should have been half the cost, to be frank. The higher frame rate is good, but it's nothing I couldn't fix by lowering quality settings on my 1080 Ti. Medium quality settings today look pretty darn good, IMO.

At the end of the day, graphics improvements aren't nearly as large as they used to be. Most of what makes a game enjoyable is the gameplay itself. The vast majority of people are not going to stop and complain that the textures are 4K instead of 8K, or that the shadows have edges that are a bit too sharp because they're rasterized rather than ray traced.



Yes, it's very easy to find the sweet spot on the 4080 using Afterburner; it lowers power consumption by a good amount. That's the only reason I bought this generation, TBH: it's the only reasonable replacement for the 1080 Ti within the same power envelope when underclocked.

If the current pricing holds, this card will have to last at least as long as my 1080 Ti did, maybe longer if pricing goes up. It's not for a lack of funds but a lack of value.
 
My view as well. I've had a decent-paying career since I was 21. I can afford an Aqua 7900 XTX, a Nitro+/Toxic, or an equivalent AIB 4090, but I'm not putting half a month's pay into a GPU when I could build an entire machine for the price, or go get an XBX or a PS5 and already be playing.

I can buy a performance torque converter from Circle D Specialties to move this heavy truck for $756.67 (tax included). The prices for a GPU that will get you maybe 3-5 years at best are plain stupid.
 
It's not for a lack of funds but a lack of value.

Yes, it is - but look at it from my perspective: I've had this thing for 3 years, and it's beginning to lose market value. I looked it up, and 3090s are currently selling for about 5-6k BRL, half of what they cost new. I could very well keep my card, but it's the same reason I flipped the 5950X... I want to keep the resale value of the parts up. It sounds really weird to treat it as an asset of sorts when it's as easy as walking into a Micro Center or opening Amazon to order a first-party Founders card whenever you feel like it, but we don't have that luxury outside the US.
 
Flip it then, plenty of users out there would buy that card pretty quickly.
 

Still, that seems like a good reason to keep your 3090 instead. These days anything is a lottery (edge to hotspot temp delta, bad fan QC, coil whine), and being happy with your current GPU is a great reason not to play said lottery, especially since you said it yourself that it's not that easy just to get your hands on a GPU.
 
Flip it then, plenty of uninformed users out there would buy that card pretty quickly.

I would never rip anyone off, I would obviously state that it's a card I've had since the launch, was used only for gaming on my personal rig and I treated it exceptionally well. It was never opened and has the warranty seal intact, so that counts a lot around here.


Yeah, I get what you mean. It's always a gamble when you're acquiring new hardware; I'll give it a good thought before I settle on the idea. If it looks like I can get a 4080 Strix or a 7900 XTX of equal caliber (Sapphire Nitro+), I might do it; if not, I'll probably end up keeping the 3090 ;)
 
You can frame-cap or undervolt all you want; RDNA3 simply does not get down to the same efficiency levels as Ada under light loads or when frame-capped. All the "gains" that desperate YouTube commenters or forum users point to in order to justify "omg Optimum Tech doesn't understand how to run RDNA3!!!" are pretty moot, considering Ada pulls even farther ahead when you apply those same undervolts or frame caps to GeForce.
While I don't doubt that Ada is the more efficient architecture of the two, I'd love to see someone test the 4080 vs. the 7900 XTX (or the 4070 Ti vs. the 7900 XT) using the same frame rate cap across different resolutions. I would be very interested in seeing some hard data on power usage in fps-limited scenarios -- the kind I've been posting in the owners' thread -- rather than the same "trust me bro" comments.

TPU's own V-sync testing only uses Cyberpunk at 1080p60. It's one of the most GPU-heavy titles out right now, so it would be useful to learn how less demanding titles compare.
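
The sampling side of such a test is easy enough to script. A rough sketch of what I mean (my own assumed setup, not TPU's methodology; pynvml only covers the NVIDIA card, so the Radeon's draw would have to come from its own sensor logs, e.g. HWiNFO):

```python
# Sketch: average board power over a fixed window while the game runs
# under a frame cap; run once per title/cap/resolution and compare.
# Assumes the pynvml package (pip install nvidia-ml-py); NVIDIA only.
import time
from statistics import mean
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex, nvmlDeviceGetPowerUsage,
)

def average_power_watts(duration_s: float = 60.0, interval_s: float = 0.5) -> float:
    """Average board power over a sampling window, in watts."""
    nvmlInit()
    try:
        gpu = nvmlDeviceGetHandleByIndex(0)
        samples = []
        end = time.monotonic() + duration_s
        while time.monotonic() < end:
            samples.append(nvmlDeviceGetPowerUsage(gpu) / 1000)  # mW -> W
            time.sleep(interval_s)
        return mean(samples)
    finally:
        nvmlShutdown()

if __name__ == "__main__":
    # Start the fps-capped game scene first, then run this.
    print(f"avg board power: {average_power_watts():.1f} W")
```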
 
I think the overall finding here isn't exactly wrong, just exaggerated. I'd have preferred to see reference vs. reference cards, or AIB vs. AIB; I do think the use of the TUF 7900 XTX has exaggerated the result, though by how much I can't be certain. I'm not sure it can be fixed with drivers either; we know Navi 31 is an ambitious MCM design that inherently needs some extra juice. It all depends on whether it really matters to you. The community goes through wild swings on whether efficiency matters, depending on whether they care or whether their favorite team has the better efficiency at the moment. Personally, I'm not all that bothered by the outright wattage draw so much as by how much I get for it, and there I don't think the 7900 XTX is unacceptable, just not class-leading like Ada.

I do like OT's videos, and Ali generally has a pretty good take, so I think a couple of methodology tweaks and a more apples-to-apples comparison would have made this video a winner. As it stands, it's interesting but kind of academic, and not something to hang your hat on.
 
I look at it this way...

  • Is Ada Lovelace more power efficient than RDNA3? Hell yeah it is.
  • Does it make up for the difference in price? Hell no it doesn't.
None of this is really false information, even if it is kinda skewed. The only question that must be asked is this:

"Why aren't we seeing posts about the colossal difference between Raptor Lake and Zen?"

The answer is "Everybody already knows so it would be beating a dead horse."

The same applies to this. It's just beating a dead horse and isn't worth talking about.

It was never opened and has the warranty seal intact, so that counts a lot around here.
Absolutely it does, yes.
If it looks like I can get a 4080 Strix or a 7900 XTX of equal caliber (Sapphire Nitro+), I might do it; if not, I'll probably end up keeping the 3090 ;)
I don't know what "of equal calibre" means because I've used Nitro, Windforce, THICC-III and reference models. I've never really seen a significant difference between them. The most important thing is the GPU on the card, not the model of the card itself. Personally, I'd take a Pulse over a Nitro because the difference in performance will be far less than the difference in price. It will still be an RX 7900 XTX no matter what the AIB model is. :D
 
The point is that AMD pulls the same 512 W in the less demanding Overwatch 2 as in DOOM on Nightmare settings, while the 4080 shows a 120 W difference between the two.
So the 4080 could be shutting down portions of the chip that aren't needed (for example, the RT units that use about 40 W) without dropping its clock speed and voltage.
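
Spelled out, the reasoning is just about the per-title spread; only the numbers quoted above are used here:

```python
# If a card draws the same power in a light title as in a heavy one,
# it likely isn't gating unused blocks. Figures are the ones cited
# in the post above, not new measurements.
xtx_ow2, xtx_doom = 512, 512   # W, same draw in both titles
ada_spread = 120               # W difference quoted for the 4080

xtx_spread = abs(xtx_doom - xtx_ow2)
print(f"7900 XTX spread: {xtx_spread} W (no apparent gating)")
print(f"RTX 4080 spread: {ada_spread} W (scales with workload)")
```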
 
IIRC, the ambient temp sensor/daughterboard is only on one model (the reference XTX). Maybe they worked it into others later on; it definitely didn't factor into much at release. It shows up under HWiNFO.
This is why I prefer the reference XTX MBA model over the others. I think the Nitro+ has the sensors too (I could be wrong, since it's the same PCB), but my Pulse does not, and it kind of annoys me (especially when I compare it with the sensors shown by my RTX 3090 FE).

Anyway, I'll most likely return my Pulse and get a "fixed" (working vapor chamber) reference XTX next week. The 7000 series seems to have gotten better after the latest July Adrenalin update (idle power consumption is now like the 6000 series), although power consumption at not-so-heavy loads is clearly one of its pain points.
 

You'd return a Pulse for an MBA just to gain that sensor? Interesting. I'd only go the opposite way. If the Pulse had been out at the same price, I would've kept it even through all the problems. I did not keep my MBA.

Especially considering the MBAs use Delta Superflo fans like the RDNA2 reference cards did, which are just glorified sleeve-bearing fans, not too far removed from Gigabyte's "enhanced" sleeve-bearing fans. Poor quality control on 2 of my 3 MBA fans sealed the deal. No shot I'd take that over Sapphire's ball-bearing fare.

Not to mention the fuse protection that Sapphire integrates at the 8-pin.
 

I would. The only reasons I got the Pulse are:
- It's the only XTX that's as small as the RTX 4080/4090 FE and can fit in most ITX cases, just like the RTX 3000 cards and RX 6000 MBAs.
- The reference MBA was not available on AMD.com at the time (it is now), and I got the ASRock MBA brand new from Newegg at a discount, with a free copy of Starfield Premium :laugh:

I know the AIBs have better fans, PCB layout, etc., but fan noise doesn't really bother me, the MBA's PCB has the dimensions I want, and the stock MBA temps (after being rectified by AMD) seem to be okay now. As long as it doesn't push an 85+ °C hotspot at full load in my FD Terra/Corsair 5000X, I'll be happy.

EDIT: For those PMing me about the deal: ASRock Radeon RX 7900 XTX 24GB GDDR6 PCI Express 4.0 Video Card RX7900XTX 24G - use code ZIPTECH to cut US $100.00 off
 

It sold in less than 16 hours of being listed o_O

I went with the RTX 4080 in the end, a nice Strix OC White model (yes, the one that's pricier than your avg. 4090). It was on sale and I just couldn't resist it :p New monitor coming in too, it'll be a blast!
 
Maybe the OP needs to get back to proofreading those articles and leave the reviews to the professionals.
 

You almost sound offended by the points that @dgianstefani brought up :eek:

Nah, it's fine. The higher power consumption of the 7900 XTX isn't that bad; it's a larger GPU on a slightly less advanced process node, but it's still well manageable. Given the price difference, the question you must ask yourself is whether you're going to miss the Nvidia RTX ecosystem and the GPU features Nvidia offers that AMD doesn't, or whether power is really, really expensive in your area.

With the low prices they're commanding right now, the 7900 GPUs are a steal; people should really be buying them. I went with the 4080 myself because:

1. Regarding the RTX ecosystem and features, when push came to shove, I felt like it wasn't worth giving it up
2. Less power means less heat generated, and summer here can be quite harsh
3. It's my birthday! I like treating myself. And I like expensive things I can't justify otherwise :D

If you're fine with what AMD currently offers software-wise and don't mind the janky Windows drivers, the 7900 XTX is the GPU to get this generation, especially with the current prices.
 
Nothing entertains me more than threads about Radeon power consumption, guaranteed to get the red team fanboys apologising and deflecting in all directions:

* "it doesn't matter because the test is flawed"
* "it doesn't matter because NVIDIA is more expensive"
* "it doesn't matter because energy is cheap"
* "it doesn't matter because chiplets are a superior design"
* "it doesn't matter because some other bulls**t reason"
 
stock MBA temps (after being rectified by AMD) seem to be okay now. As long as it doesn't push an 85+ °C hotspot at full load in my FD Terra/Corsair 5000X, I'll be happy.
The highest I've seen on my MBA card was 82 °C. In most games it's below 80 °C when running uncapped, but you may see higher temps depending on your ambient.
 
Not to mention, last generation with Ampere vs. RDNA2, AMD had the efficiency advantage, and that was all you heard from AMD fanboys. Now that it's reversed, it suddenly doesn't matter again...

The 7900 XTX is a crazy good deal for anyone who can pick it up around 800 USD; no need to defend it. It has amazing raster performance and a ton of VRAM, and its power draw, while higher than the competing 4080's, is manageable.

Currently AMD occupies 3 of the top ten spots on the GPU best-seller lists on both Amazon and Newegg US, but they need to do better... only 1 of the 20 spots is a 7000-series GPU, smh.
 
The XT should have launched at $800, and the XTX at $900. They would have been compelling at those prices.

The efficiency advantage was solely due to AMD being on the vastly superior TSMC 7 nm node while NVIDIA was stuck on Samsung's 8 nm (really a 10 nm-class) node. The fact that they were even close was a testament to NVIDIA's architecture.

Now that they're both on TSMC 5 nm-class nodes, we see who actually has the better architecture.
 

I think Nvidia is still on a more advanced version of 5 nm called 4N, and the 6 nm MCDs that take up 225 mm² of the total die area aren't helping matters when it comes to power, I'm sure. My guess is that the power it would take is why they didn't do some crazy 400+ mm² GCD with 8 MCDs to compete with the 4090.
 