
AMD Radeon RX 7900 XTX Drops to $799, Pressure on RTX 4070 Ti

It's widely known that the 7900 series chugs power when the hardware is under relatively light loads
Can you post a source?
 
Can you post a source?
Optimum Tech posted a video a few days ago looking into the power draw of the 4080 versus the 7900 XTX. I have also observed higher power draw at low loads on my Red Devil 7900 XTX.

 
In Europe the RX 7900 XTX is still around 1050€ while the RTX 4070 Ti is around 850€, VAT included.
Same in my country: the 7900 XTX starts around 970€ and the 4070 Ti below 800€, though these are mostly grey imports. At least the cheapest AMD model is the Sapphire Pulse (the Nitro is more than 90€ pricier), while the cheapest Nvidia models are Inno3D and Zotac, although to be fair, Zotac's warranty can usually be extended from 3 to 5 years.
 
Today, playing Diablo 4 @ 1440p max settings will devour all 16 GB on my 6900 XT. The game does not run smoothly at max settings either...
Interesting, it seems like something else is going on there. For me it stays consistently in the mid-90s fps at max settings (no DLSS) with a 2080 Ti 11 GB, playing on a 4K/120 TV.
 
Limiting the fps seems to overcome this problem. Of course, Ada still does better, but it isn't as bad as that video shows:

[screenshot: power consumption with the fps limit applied]


7900 XTX power consumption in the same scenario

[screenshot]
 

Naturally, limiting the fps also limits power consumption, but that is true of any GPU.
You mentioned light load. Lightening the load, if the game is new enough, naturally means limiting fps.

As far as pricing is concerned, at least one model of the 7900 XTX is available below MSRP here too. 1199 CAD works out to 913 USD at today's exchange rate.

[screenshot: 7900 XTX listed at 1199 CAD]


The 4080 and 4070 Ti are also available below MSRP, with their lowest prices being 1395 CAD (1063 USD) and 950 CAD (724 USD) respectively.
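For anyone sanity-checking the conversions, here's the back-of-the-envelope math (the exchange rate is an assumption implied by the figures above, roughly 0.7615 USD per CAD):

```python
# Rough CAD -> USD conversion for the prices above.
# The rate is an assumption (~0.7615 USD/CAD, implied by 1199 CAD ~= 913 USD).
USD_PER_CAD = 0.7615

for model, cad in [("7900 XTX", 1199), ("RTX 4080", 1395), ("RTX 4070 Ti", 950)]:
    print(f"{model}: {cad} CAD ~= {cad * USD_PER_CAD:.0f} USD")
```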
 
Agreed, the Series X only has 10 GB of high-speed RAM and 2 GB of slower RAM; at worst, 12 GB will be the minimum this generation. My guess is 10 GB 3080s will continue to run fine.


Where are you getting 220 W more consumption from? TPU shows at worst a 52 W difference.
Quick correction: 6 GB of slower memory.

The difference is that with a PC, we run all these applications and many of them use VRAM. I can easily be using >4 GB of VRAM before I even start a game.
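If you want to see how much VRAM your desktop is already eating before a game launches, a quick sketch like this works on Nvidia cards (it just shells out to nvidia-smi, which has to be on your PATH; AMD users would need a different tool):

```python
# Rough sketch: check VRAM already in use before launching a game.
# Assumes an Nvidia card with nvidia-smi available on the PATH.
import subprocess

out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    text=True,
)
# First line = first GPU; values come back as "used, total" in MiB.
used_mib, total_mib = (int(x) for x in out.strip().splitlines()[0].split(", "))
print(f"VRAM in use before the game even starts: {used_mib} / {total_mib} MiB")
```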
 
Can you post a source?

It's a well known fact that the power management on AMD's 7000 Series is a complete fail (though most review shills won't report on it). ;) They guzzle power like crazy at 4K and/or high refresh rates (80 W more than a 4090 FE), with dual monitors, window movement & video playback. In gaming it's even worse, where in most games an FPS cap has almost zero effect on power consumption. AMD also crippled their underclocking capability to 10% (the 6000 Series allowed up to 50%). With their latest driver they claimed improvements; ComputerBase tested and found ZERO improvements. If they can't fix it more than half a year after release, it's very likely a hardware issue (which can't be fixed).

https://www.computerbase.de/2023-07...desktop-leistungsaufnahme-der-rx-7000-senken/
If energy costs or massive heat in your room are a concern for you, stay far away from AMD's 7000 Series! Right now I would go for the 4070 or 4080, or the 4090 if you want to burn cash. If you wanna go cheap, AMD's older 6000 Series (6800, 6800 XT or 6900 XT) would even be a better pick, as their power management is far superior to that of the rushed-out-the-door 7000 Series. Of course, if you don't care about power consumption or heat, the 7900 XTX delivers massive performance for its price, beating the 4080 out of its socks.
 
lol, I knew you were going to post this video, where he compares a Founders Edition card to an AIB ASUS model with 3x8-pin connectors and higher clocks, instead of the reference card.

Here are the numbers that W1zzard posted in his review of that ASUS card.

So reference to reference, you are looking at 304 W vs. 356 W, a 52 W gap.

ASUS Radeon RX 7900 XTX TUF OC Review - Amazing Overclocking - Power Consumption | TechPowerUp

[chart: TechPowerUp gaming power consumption results]


This is the list of games tested by W1zzard, a much larger sample:

Assassin's Creed Valhalla
Battlefield V
Borderlands 3
Civilization VI
Control
Cyberpunk 2077
Days Gone
Deathloop
Divinity Original Sin II
DOOM Eternal
Dying Light 2
Elden Ring
F1 22
Far Cry 6
Forza Horizon 5
God of War
Guardians of the Galaxy
Halo Infinite
Hitman III
Metro Exodus
Red Dead Redemption 2
Resident Evil Village
The Witcher 3: Wild Hunt
Total War: Warhammer III
Watch Dogs Legion

I trust his numbers more than the YouTube video you are using as evidence.
 
lol, I knew you were going to post this video... So reference to reference, you are looking at 304 W vs. 356 W... I trust his numbers more than the YouTube video you are using as evidence.

I think you're going a bit over the top here regardless; mind you, the Nvidia Founders Edition is NOT a reference design GPU, and it carries a significantly higher power limit than reference.

I don't see what the big deal is anyway. It's worse, so what? Navi 31 is a larger chip than AD103.
 
I think you're going a bit over the top here regardless; mind you, the Nvidia Founders Edition is NOT a reference design GPU, and it carries a significantly higher power limit than reference.

I don't see what the big deal is anyway. It's worse, so what? Navi 31 is a larger chip than AD103.
That didn't disprove anything in my post.

I expect the 4080 to use less power, but not what that YouTube video is showing. And W1zzard's own numbers clearly don't show that.

I don't have Overwatch 2, but I do have the first game, so I will test that myself and produce my own numbers later tonight.
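For anyone who wants to run the same kind of test, here's a rough logging sketch for Linux with the amdgpu driver (the hwmon path is an assumption and can vary per system and kernel; the sensor reports microwatts):

```python
# Rough sketch: log GPU board power once per second while gaming.
# Assumes Linux + amdgpu; the hwmon path below may vary per system/kernel.
import glob
import time

paths = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")
if not paths:
    raise SystemExit("No amdgpu power sensor found; adjust the path for your card.")

while True:
    with open(paths[0]) as f:
        microwatts = int(f.read())
    print(f"{time.strftime('%H:%M:%S')}  {microwatts / 1e6:.1f} W")
    time.sleep(1)
```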
 
I expect the 4080 to use less power, but not what that YouTube video is showing. And W1zzard's own numbers clearly don't show that.

W1zzard used a 60 Hz 4K display. ;) There's no info about the display on the test system page, but if I recall correctly he mentioned using a 60 Hz display. 144 Hz really drives up the power draw on AMD's 7000 Series.

Also, he didn't measure power consumption with undervolting + an FPS limit. That's where it becomes very clear why AMD's 7000 Series is a complete fail compared to Nvidia's 4000 Series.
 
Hell of a deal compared to the 4070 Ti or even the 4080.

People buying $800-$2000 GPUs aren't worried about $75 a year in electricity costs.
 
W1zzard used a 60 Hz 4K display. ;) There's no info about the display on the test system page, but if I recall correctly he mentioned using a 60 Hz display. 144 Hz really drives up the power draw on AMD's 7000 Series.

Also, he didn't measure power consumption with undervolting + an FPS limit. That's where it becomes very clear why AMD's 7000 Series is a complete fail compared to Nvidia's 4000 Series.
I have a 144 Hz display. The 23.7.1 driver only affected idle power draw; I don't notice any difference in power draw while gaming, which I've checked on both the old and the new driver.
 
It's widely known that the 7900 series chugs power when the hardware is under relatively light loads
The latest driver helped with this.

But as ever, it's also up to the end user and their use case, and there are exceptions; these cards can be run effectively and efficiently if you choose the right settings.

I personally see the 7900 XT ageing better.
 
By the time either needs 16 or 24 GB of memory, they will all be obsolete; let's not kid ourselves.

I'm looking at a 4080 Strix OC White right now; if I can't get one, I'm gonna buy a 7900 XTX TUF OC. That'd be about it, no hard feelings.

I want to hear more - when do you expect the RX 8900 series, and how much will they cost?
My bet is 2025 and 1500 US dollars.

Plenty of time for developers to release at least one game that will need 18, 20 or 22 GB of VRAM..
 
You trying to rile up the AMD fanboys again bruh.....

I mean, they rile themselves up for being so invested :laugh:

50-50 I end up with a 7900 XTX by the end of the week, so it's all good baby :rockout:


I want to hear more - when do you expect the RX 8900 series, and how much will they cost?
My bet is 2025 and 1500 US dollars.

Plenty of time for developers to release at least one game that will need 18, 20 or 22 GB of VRAM..

I dunno man, I had my RTX 3090 for 3 years and I never needed the 24 GB capacity for games. 16 is plenty right now, even for 4K gaming.

For AMD to be able to charge 1500 USD for a consumer-grade graphics card, they have much work to do. Their tech just isn't there.
 
Optimum Tech posted a video a few days ago looking into the power draw of the 4080 versus the 7900 XTX. I have also observed higher power draw at low loads on my Red Devil 7900 XTX.


Everyone who thinks about buying a 7900 XTX should watch this video first! :cool: And then do the Energy Cost Calculator math.

P.S.: If you've got air conditioning at home, you can double the cost, lol.

The latest driver helped with this.

But as ever, it's also up to the end user and their use case, and there are exceptions; these cards can be run effectively and efficiently if you choose the right settings.

The latest driver did nothing, as tested earlier. ;) Chances are it's a hardware issue which can't be fixed with new drivers.

Maybe with next-gen AMD cards, but I am sceptical. If it's an architectural problem, it may take years to iron out.
 
I dunno man, I had my RTX 3090 for 3 years and I never needed the 24 GB capacity for games. 16 is plenty right now, even for 4K gaming.

For AMD to be able to charge 1500 USD for a consumer-grade graphics card, they have much work to do. Their tech just isn't there.

Well, given the current rate of change/progress with the graphics cards, I think you will be ok for at least 3 more years with your card ;)
 
You'll want 12-16 GB to run 4K max settings? Today, playing Diablo 4 @ 1440p max settings will devour all 16 GB on my 6900 XT. The game does not run smoothly at max settings either... I found turning down a few settings helped gameplay tremendously. To be fair, you did say most games; however, when you're out to buy a new GPU, you aren't looking to buy for most games, you are looking for today's and tomorrow's games. If you don't mind turning down settings, 16 GB will be enough. 16 GB might be good once GDDR7 hits cards; I don't think our current VRAM is enough.
I play Diablo 4 at 3440x1440 max settings and have no perf issues, usually butting up against my 177 fps limit, sometimes down to 130-150 fps depending on the area. I have a 3080 Ti. You have something else going on.
 
Everyone who thinks about buying a 7900 XTX should watch this video first! :cool: And then do the Energy Cost Calculator math.

P.S.: If you've got air conditioning at home, you can double the cost, lol.



The latest driver did nothing, as tested earlier. ;) Chances are it's a hardware issue which can't be fixed with new drivers.

Maybe with next-gen AMD cards, but I am sceptical. If it's an architectural problem, it may take years to iron out.

Hmm, $131 a year, omg, what am I going to do now. And that's not even the average, because some days the desktop may only see 2-3 hours of use, and it won't always be fully loaded during that time either.

That is literally one night of drinks with my wife.
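For anyone who wants to redo that math, this is all the Energy Cost Calculator is doing (every input here is an assumption; plug in your own average draw, hours and tariff):

```python
# Back-of-the-envelope annual energy cost. All inputs are assumptions;
# swap in your own average draw, daily hours and electricity price.
def annual_cost(watts, hours_per_day, usd_per_kwh):
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# e.g. ~300 W average draw, 4 h/day at $0.30/kWh -> about $131/year
print(f"${annual_cost(300, 4, 0.30):.0f}/year")

# the 52 W reference-to-reference gap at the same usage -> about $23/year extra
print(f"${annual_cost(52, 4, 0.30):.0f}/year extra")
```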

[screenshot: energy cost calculator estimate, ~$131/year]


Damn, those drivers did nothing; I went from 50 W idle to this:

[screenshot: idle power draw after the 23.7.1 driver]


But I'm surely going to listen to the guy who doesn't even own the hardware.
 
50-50 I end up with a 7900 XTX by the end of the week, so it's all good baby :rockout:
Why are you upgrading? The 3090 is still a very good card; the only meaningful upgrade would be a 4090 and that's very expensive.
 