
AMD Radeon RX 7900 XTX

Just registered to say that I am still very disappointed in the idle power consumption. I'm fairly sure it doesn't only apply to multi-monitor setups; the same goes for high refresh rate displays as well. It was THE reason I sold my 6800 XT: it drew around 45 watts at 4K 120 Hz with no way to lower that (I sent AMD a bug report, but the response was along the lines of "tough cookies"), and now the 7900 series draws around 100 watts. I highly doubt AMD will fix that, just like the 6800 XT was left "as is" (and I'm pretty sure it still is; can anyone confirm what it's like nowadays with a 4K display at 120 Hz?). I really like AMD, but it's always some detail like this that kills it for me.
 
lol... how can you justify a $1200 GPU? BOTH the 4080 and the 7900 XTX are ripoffs. People thought the 7900 XTX was going to be well priced because everyone believed it would be 20%+ faster than the 4080.
I can't believe you thought I was being serious in the message you quoted
 
Way overhyped... Now I know why the 4080 is so terribly priced, and now we will likely get an $800-900 4070. Way to go, AMD.
Nvidia's prices have nothing to do with AMD failing to offer an RTX 4080 killer at $1000. Nvidia's prices have everything to do with people being willing to pay those prices. Nvidia's prices have everything to do with people blaming Nvidia's competition for the prices instead of Nvidia itself.

Let's say that AMD's RDNA3 was a total mess: buggy hardware that needed new revisions to work. Nvidia comes out with the prices we see today and NO ONE buys. I mean, let's say they had managed to sell 12,000 RTX 4090s instead of 120,000. The RTX 4090's price would be much lower today, and without AMD's help.

And please: if you want AMD to build better products, also be ready to buy them. Hoping for AMD to build better products just so you can keep paying Nvidia only secures one thing: more expensive GPUs in the future.
 
I wonder how multi monitor power consumption scales here.
I know from experience that while Nvidia has relatively low power consumption with 2 monitors, with 3 or 4 it is already a lot higher.
I personally use 4 so I would be quite interested to see how this scales.
 
I wonder how multi monitor power consumption scales here.
I know from experience that while Nvidia has relatively low power consumption with 2 monitors, with 3 or 4 it is already a lot higher.
I personally use 4 so I would be quite interested to see how this scales.
Multi monitor is terrible in TPU's more realistic scenario: monitors with different resolutions and refresh rates. If you have similar screens, then it's better, at least according to Computerbase:

[Chart: Computerbase multi-monitor power consumption comparison]
 
Nice review @W1zzard - must have taken ages to work through those benchmarks and write up the numbers.

It peaks at 360 W for a higher-end card. Far more efficient than an Nvidia if you ask me. The GPU running at less than 1 volt means there's still headroom.
 
Multi monitor is terrible in TPU's more realistic scenario: monitors with different resolutions and refresh rates. If you have similar screens, then it's better, at least according to Computerbase:

[Chart: Computerbase multi-monitor power consumption comparison]
Interesting. There really isn't one multi-monitor use case; there are a lot of them, so at the very least a good qualifier of what is actually being tested would help a lot here.
I personally run 2x 144 Hz 1440p, 1x 155 Hz 1440p, and 1x 60 Hz 1200x1920 (portrait).
 
Great review!

Exactly what I needed to see before launch tomorrow. :D
 
The multi-monitor/video playback power use is shockingly high. It must be the VRAM clock shooting up like on the RX 6000 series. I wonder if the huge increase relative to last gen is related to the MCDs? Hopefully they can fix it.

The reviews make me glad I pulled the trigger on a $500 6800 months ago rather than wait like people suggested. Pricing is only marginally better for the Navi 21 cards, and Navi 31 is just too much money at launch for what it's offering. I really do think they'll need to drop the price of both models by at least $100 in the coming months.
Navi 32 might be really interesting, though, for people on RX 5000/RTX 2000 and older. I can see a smaller GCD with 4 MCDs beating the 6800 XT and likely still being profitable at the same $650 launch price, which is easier for people to stomach. Nvidia will probably struggle to compete there with monolithic dies, judging by what they wanted for the 4070 Ti. And inb4 "AMD will be greedy": if the 7900 series doesn't sell well, they will have no choice but to recognize reality and be aggressive with pricing.
I doubt the 7800 XT will be a further cut-down N31 die; all signs point to good yields. Makes sense, since the GCD is so much smaller than recent high-end GPUs. I hope AMD focuses more on this strategy instead of going for a monumental chiplet GPU; for a future upgrade I'd prefer something more modest with modest power draw.
 
The other day, I almost bought a 6650 XT for $250 brand new off Amazon for my HTPC/driving sim rig, but decided to wait for the budget 7000-series lineup from AMD. I hear they have a $400 7000-series card with RX 6800 performance coming out soon.

AMD is currently dominating the budget-to-midrange market, with their 6650 XT and 6700 XT cards being much cheaper. Most budget users don't care about RT performance and probably turn it off anyway.
 
The other day, I almost bought a 6650 XT for $250 brand new off Amazon for my HTPC/driving sim rig, but decided to wait for the budget 7000-series lineup from AMD. I hear they have a $400 7000-series card with RX 6800 performance coming out soon.

AMD is currently dominating the budget-to-midrange market, with their 6650 XT and 6700 XT cards being much cheaper. Most budget users don't care about RT performance and probably turn it off anyway.

People also eat junk food because they think it is delicious and maybe because it's cheaper, but that doesn't make it right.
I would never touch 6650 or 6700. Let them rot on the shelves!
 
Thanks for the reviews @W1zzard

A special note to AMD and NVIDIA (OUR SAVIOURS), as always these days: great cards, but for those prices, NO THANK YOU VERY VERY MUCH... stick it where the sun don't shine!
 
The RT results AND raster results are pretty far apart in some cases, but if you consider that and still see a 3% overall advantage for AMD, that's meaningful. Some games push 20% more FPS, while the biggest loss is what, 15% in favor of Nvidia? The number of games where it scores better is also higher. Given the lacking optimization elsewhere in the product, it's safe to say that gap could widen further in AMD's favor. Not a given, but definitely plausible.
I mean, at those price points none of the GPUs are recommendable IMHO. If I had a gun pointed at me and had to choose, of course I would save the $200 and go for AMD, but those prices are stupid. And I repeat: the 7900 XTX is only that in name; in reality it is a 7800 XT competing with the 4080. Price: $700 max.
 
The mystery of high 4000-series prices is finally solved:
1. AMD trails by over 20% in raster performance (admittedly at a slightly lower power draw). The RTX 4090 is the only option for smooth 4K native or 4K RT upscaled.
2. The ray tracing performance gap is exactly the same as in the previous gen, as both companies made only minimal improvements in RT vs. non-RT performance, with the RTX 4090 being the most efficient. AMD's RT is again, depending on the game and resolution, either unplayable or a huge sacrifice to make.
3. The "power efficiency" crown that AMD's marketing tried to sell us turned out to be one huge BS. Radeons were more efficient in the previous gen, especially the ones competing with NVIDIA's high end; now the tables have turned. It's the most disappointing part of this release to me personally.
4. We still need to wait for custom models to confirm this, but based on the reference models, it looks like AMD cards will again be hotter/louder at the same TDP. It was always the same with Ryzen vs. Intel CPUs.

After NVIDIA solved its overhead issues in DX12, there seem to be no practical reasons to buy AMD over NVIDIA this gen apart from the price.
 
Historically Nvidia has run margins of 40-60%, and they're peaking above 60% today. So strangely enough, as nodes shrink, both revenue and margins shoot up, not one at the cost of the other.

$700 cost of production? Got a source, or are you just pulling it from what's been going around the internet in random blurbs? I'm interested if you have real info. And if AMD has to cut into its margins so much more than Nvidia does with its monolithic approach, what the fuck is AMD even on right now? Crack? Or is this the only way they can realistically move forward? Or is it an investment in the future? If so, they should damn well make sure to release it in a better state.

The story just doesn't make sense. If MCM is better wrt yields on smaller nodes, it should end up cheaper.

Note: here's AMD's gross margin. Note that the CPU side has been chasing the chiplet route for a while now too.
[Chart: AMD gross margin over time]
Of course, I'm not just throwing out random information for fun.

The cost per 5nm wafer is about $17k. Considering the yield (similar to 7nm) and the ~300 mm² die of the main GPU chip (GCD), we get about 137 usable chips, costing about $124 each. Add the 6x cache chips (MCDs) at about $12 each, and so far the cost is at $196.

It's kind of hard to know precisely the cost of 20 Gbps GDDR6 now, but 3 years ago, before the chaos and inflation, it was almost $12/GB for slow 14 Gbps memory, so let's say it's $14/GB.

24 GB × $14 = $336, + $196 = $532.

Now add the cost of all the other components and the logistical complexity of the modular design, and we get something closer to $700, ignoring the cost of development and driver support for years to come.

Anyway, selling CPUs is a much better deal for AMD: small chips, very high margins, low investment required in support, etc.
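
If anyone wants to poke at those assumptions, here's a minimal back-of-the-envelope sketch in Python. Every input is a guess taken from the post above (wafer price, good GCDs per wafer, MCD and GDDR6 cost); the $170 "everything else" line is my own placeholder just to land near the $700 total.

Code:
# Rough BOM estimate for a reference 7900 XTX using the guesses above.
# None of these numbers are official; change them to see how the total moves.
WAFER_COST_N5 = 17_000        # $ per 5 nm wafer (assumed)
GOOD_GCDS_PER_WAFER = 137     # usable ~300 mm^2 GCDs per wafer (assumed yield)
MCD_COST = 12                 # $ per 37.5 mm^2 cache die (assumed)
MCDS_PER_CARD = 6
GDDR6_PER_GB = 14             # $ per GB of 20 Gbps GDDR6 (assumed)
VRAM_GB = 24
OTHER_COSTS = 170             # PCB, VRM, cooler, packaging, logistics (placeholder)

gcd_cost = WAFER_COST_N5 / GOOD_GCDS_PER_WAFER       # ~ $124
silicon = gcd_cost + MCDS_PER_CARD * MCD_COST        # ~ $196
memory = VRAM_GB * GDDR6_PER_GB                      # ~ $336
board = silicon + memory + OTHER_COSTS               # ~ $700

print(f"GCD ${gcd_cost:.0f} | silicon ${silicon:.0f} | "
      f"memory ${memory:.0f} | est. board cost ${board:.0f}")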
 
It's still the most interesting product AMD has released in recent years; too bad there aren't decent games to effectively use so much computational power.
The 6800 XT and 6900 XT were MUCH more interesting and actually had excellent MSRPs.

I hear they have a $400 7000-series card with RX 6800 performance coming out soon.
Yeah, keep dreaming. Make that at least $500.
 
The 6800 XT and 6900 XT were MUCH more interesting and actually had excellent MSRPs.


Yeah, keep dreaming. Make that at least $500.
Of course, they are selling well below their original launch price, perhaps even at a loss. It's a great time to shop at a discount while supplies last.
 
Of course, they are selling well below their original launch price, perhaps even at a loss. It's a great time to shop at a discount while supplies last.

I doubt they are selling anything at a loss. But they should definitely lower their wages for yachts and private jets :D
 
Same performance as the 4080, 8 GB more VRAM, not sized like a brick, and $200 cheaper.
And:
Exceeds the 4080 in power consumption.
Weak in ray tracing (a 16% deficit at 4K can be the difference between satisfactory and choppy).
Possibly weak in productivity as well (we are waiting for the Puget tests).
Huge multi-monitor consumption (4x more (!!!), though I think it can be fixed with drivers).
They will not fix the ray tracing performance with drivers; it is clear that here they are a generation behind Nvidia.
So you save $200 only by giving something up.
 
Of course, I'm not just throwing out random information for fun.
GDDR6 should be cheaper now; IIRC, @W1zzard estimated about $55 for 8 GB of GDDR6 around the RDNA 2 launch. N5 wafer costs are probably an overestimate; they don't line up with AMD's own graphs:

[Chart: AMD's wafer cost per node graph]


Assuming around $10k for N7, N5 should be less than $15k per wafer. The oft-quoted costs are from the time of launch, when Apple was the only customer. The MCDs, in particular, should be very cheap: with a defect rate of 0.09 per square cm and a 37.5 mm² die size, the yield should be around 1,500 good dies per wafer, so all 6 shouldn't cost AMD more than $40. With 50% gross margins, the GCD and 6 MCDs should be sold to partners for around $300. The total board cost seems to be around the $500-550 mark. Shipping and retailer margins should still allow a profitable product even at $800 or so.
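
For anyone curious where a figure like ~1,500 good MCDs per wafer comes from, here's a quick sketch using the standard dies-per-wafer approximation and a simple Poisson yield model. The 300 mm wafer and the edge-loss term are my assumptions; the 0.09/cm² defect rate and die sizes come from this thread. It lands in the same ballpark: roughly 1,700 MCDs and about 150 GCDs per wafer, the latter close to the ~137 quoted earlier.

Code:
# Dies-per-wafer plus Poisson yield sketch (not official TSMC/AMD data).
from math import pi, sqrt, exp

def good_dies_per_wafer(die_area_mm2, defects_per_cm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    # classic gross dies-per-wafer approximation with an edge-loss correction
    gross = pi * r * r / die_area_mm2 - pi * wafer_diameter_mm / sqrt(2 * die_area_mm2)
    # Poisson model: probability that a die catches zero defects
    yield_fraction = exp(-defects_per_cm2 * die_area_mm2 / 100)
    return gross * yield_fraction

print(f"MCD (37.5 mm^2): ~{good_dies_per_wafer(37.5, 0.09):.0f} good dies per wafer")
print(f"GCD (300 mm^2):  ~{good_dies_per_wafer(300, 0.09):.0f} good dies per wafer")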
 
Assuming around $10k for N7, N5 should be less than $15k per wafer. The oft-quoted costs are from the time of launch, when Apple was the only customer. The MCDs, in particular, should be very cheap: with a defect rate of 0.09 per square cm and a 37.5 mm² die size, the yield should be around 1,500 good dies per wafer, so all 6 shouldn't cost AMD more than $40. With 50% gross margins, the GCD and 6 MCDs should be sold to partners for around $300. The total board cost seems to be around the $500-550 mark. Shipping and retailer margins should still allow a profitable product even at $800 or so.

$10k for an N7 wafer is too much. I bet it's $7-8k today.

[Chart: TSMC wafer prices by node]

The price of a 5nm wafer from TSMC is a whopping $16,988 - HardwarEsfera
 