@newsunrise0130
The 4070 Ti Super had an $800 MSRP, the same as the 4070 Ti it replaced.
The 7900 XT was a horrible deal at its MSRP; I don't know what AMD was smoking when they came up with that price.

Judged in a vacuum, yes, the bad MSRP was intended to upsell the 7900 XTX. But outside this vacuum there were also the competing RTX cards. The 7900 XTX was priced at $1000 compared to the 4080 at $1200, and the 7900 XT at $900 was competing with the 4070 Ti at $800.
The 7900 XT was better in raster than the 4070 Ti but worse in RT; it also had a worse upscaler, higher power draw, worse productivity features, etc. It did have 20 GB of VRAM compared to just 12 GB, though.
When the 4080 Super came along at $1000 and the 4070 Ti Super at $800, the market naturally discounted the 7900 XTX & 7900 XT to prices where they were competitive enough to sell, basically $900 and $700 respectively (sometimes even lower).
The improvements AMD made are real and substantial; the problem is the number of games that support FSR4 compared to the competition.
The inconsistency in productivity workloads is another issue, which, granted, affects a smaller userbase than gamers. That doesn't mean AMD should neglect it.
The AMD Radeon RX 9070 XT is here for $600 MSRP. How does it compare to the 5070 Ti and 5070 in content creation applications?
www.pugetsystems.com
Big AI and ray tracing improvements, but questions about retail supply remain.
www.tomshardware.com
Yes, it's still too early for a final verdict; software support is underway, and Blackwell also has issues in some apps.
In raster, yes, but in RT they are in different tiers, and the same goes for productivity features.
AMD went overboard trying to maximize performance, to the point of factory overclocking the 9070 XT past the efficiency curve and making the power draw skyrocket, all because they wanted to trade blows in raster with the 5070 Ti. Talk about goals.
To the people who think otherwise: RDNA4 is actually a power-efficient architecture. The proof is the gaming power draw of the 9070 (especially compared to the RTX 5070 and the 7800 XT), as well as the idle and V-sync numbers on both AMD cards.
AMD's new Radeon RX 9070 launches today. It comes at the same $550 price point as NVIDIA's RTX 5070, but offers a 16 GB VRAM size while the 5070 has only 12 GB. The new RX 9070 also managed to catch up with NVIDIA in ray tracing performance and efficiency.
www.techpowerup.com
AMD's new RDNA 4 GPUs are launching today. The Sapphire Radeon RX 9070 XT Nitro+ is a fantastic custom design with powerful cooling and whisper-quiet fans. In terms of FPS, the RX 9070 XT offers competitive performance with similarly priced NVIDIA options in both raster and ray tracing.
www.techpowerup.com
There's a ~400 MHz difference between the 9070 and the 9070 XT. I don't know at what point the power draw increases like crazy, whether it's after 100, 150 or 200 MHz, but it's somewhere along the way.
The gaming/maximum power draw doesn't bother me that much. The fact that it's higher than the more expensive 5070 Ti is bad karma, but even so it's still a good product. What actually bothers me are the spikes when going from idle to load.
AMD Radeon RX 9070 (XT) review: noise, temperature, power consumption and OC / Noise & cooling / Temperatures under load
www.computerbase.de
Getting back to the overpaying aspect: yeah, it's a very fine line that can be easily crossed. From a gamer's perspective it makes sense to overpay $50, maybe a bit more, on the 9070 XT compared to the 5070 Ti at MSRP (lol, good luck with that), as the price-to-performance ratio is still competitive. But for someone who wants to dabble in productivity/AI apps on top of gaming, the 9070 XT has less going for it when overpaying. And the more you want to do with the card outside of gaming, the less it has going for it, apparently.
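To make the overpaying math concrete, here's a minimal sketch of the price-to-performance comparison. The performance indices and the $650 street price are hypothetical placeholders, not benchmark data; the $750 figure assumes the 5070 Ti's MSRP, which (as noted) is rarely obtainable in practice.

```python
# Hypothetical price-to-performance comparison; the perf indices below are
# made-up placeholders, not measured benchmark results.

def perf_per_dollar(price: float, perf_index: float) -> float:
    """Relative performance points per dollar spent."""
    return perf_index / price

cards = {
    # Assumed: 9070 XT bought $50 over its $600 MSRP, raster index 100 as baseline.
    "RX 9070 XT @ $650": perf_per_dollar(650, 100),
    # Assumed: 5070 Ti at MSRP, ~5% faster overall (placeholder figure).
    "RTX 5070 Ti @ $750": perf_per_dollar(750, 105),
}

for name, value in cards.items():
    print(f"{name}: {value:.3f} perf/$")
```

Under these placeholder numbers the moderately overpaid 9070 XT still wins on perf per dollar, which is the gamer's-perspective argument above; change the assumptions (bigger markup, or weight in productivity performance) and the comparison flips.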
I'm okay with AMD cards being crap in Blender Cycles; not everything revolves (pun intended) around Cycles (if it's at least decent in Eevee, I'm okay with that). There are other productivity workloads where AMD can be competitive, or at least decent compared to the competition. And, as always, there are some relevant workloads here and there where AMD cards can punch above their price bracket.
All of these things amount to what's generally called the "feature set". AMD needs to milk every opportunity, no matter how small, because every good benchmark score can win them a customer. The hardware is there, it proved itself in games, and the software is also (sometimes) there, as demonstrated by FSR4 and a few productivity results. But they have to chase every relevant scenario like it matters, because NVIDIA does, and it has the market share percentage to back it up. Well, at least it did until the 9000 series launched.

In conclusion, I don't want to make it seem like I'm advocating overpaying. I'm not, but at the same time I understand that the current market conditions really facilitate it. It's the perfect storm that sweeps through people's wallets.