
AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance

Good for AMD, then, that you aren't leading it. You'd run it into the ground in 6 months lol

RT is the single biggest leap in photorealism in real-time rendering, EVER. Either you use AMD and can't enjoy the feature properly without FSR Performance, or you are simply trolling. It's here to stay. Games are starting to use RT features that can't even be disabled anymore. AMD is lagging behind sorely because of RT, upscaling, and other bells and whistles. RDNA2 was very competitive in raster and often faster, and yet it still got beaten, because the vast majority wants these new features; only a very tiny minority on the Internet moans about RT being a gimmick.

Upscaling has surpassed native in more and more ways ever since DLSS 2.0 launched. If you still get artifacting or shimmering, then you are an FSR user.

And anyone who calls it a gimmick has his brain shoved where the sun doesn't shine.

RT, upscaling, and AI are the future, and any company that does not implement better solutions will be left in the dust.
Delusional much
 
Using A Path Tracing Benchmark? What AMD card supports Path Tracing?
They all do. They're just not very good at it.
Path tracing isn't magic, it's mathematics, just like any other method of generating computer graphics. You can ray trace or path trace on a GPU with no RT cores, or even on a CPU. It won't run very well, but you can do it. The indie game "Teardown" is fully ray-traced (not rasterised - its use of voxels allows ray tracing to work at low ray count without looking like complete ass), doesn't use RT cores, and is playable (albeit only at relatively low resolutions and frame rates) on old GPUs like the RX 580 and GTX 1060. Nvidia's RT cores are just much better at path tracing than AMD's, and RDNA4 will hopefully change that.

I hate to be that guy, but literally nobody cares about path tracing when it comes with that much of a performance penalty. Not even the most die hard Nvidia zealots running 4090s. Ask anyone running one if they'd rather run this game at 1080p 60FPS path traced or 4K 60FPS with RT and DLSS Quality on their 4K monitors when actually playing the game and not benchmarking.
You don't need to hate anything, I completely agree.
I used Cyberpunk 2077 with PT as an example, because I wanted to find a situation which was as close to a performance of pure ray/path tracing performance as possible. The overwhelming majority of games which use RT or PT, are primarily rasterised, and only overlay the tracing on top for reflections, lighting, and shadows as an additional effect or embellishment on top of the rasterised image.
My choice of example was intended to show a situation where tracing performance is the primary factor in the performance result, and rasterisation isn't significant.
I fully understand that this isn't representative of the difference in performance in realistic gaming scenarios, and I apologise if my choice of example was misleading.

In a more realistic situation, of a primarily rasterised game which uses some traced effects, an RX 7900 GRE is much closer to the performance of an RTX 4070 Ti, and an RX 7900 XTX is often faster overall. The point I was trying to make is that the Nvidia GPU will lose much less performance when ray/path tracing is enabled compared to when it's disabled, and that RDNA4 having 3x the tracing performance of RDNA3 would allow them to close this gap. For example, rasterisation might be 85% of the frame time for an RTX 4070 Ti, with the remaining 15% being ray tracing, while an RX 7900 XTX might need to spend 45% of its frame time on ray tracing; so even though its rasterisation performance is much higher, it might not be much faster overall in games that use ray tracing.
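The frame-time argument above is essentially Amdahl's law: only the traced share of the frame gets faster, so the overall gain is capped by the rasterised share. A back-of-the-envelope sketch, using the illustrative percentages from the paragraph above (these are the post's example figures, not measured data):

```python
def overall_speedup(rt_share, rt_speedup):
    """Amdahl-style estimate of the whole-frame speedup when only the
    ray-tracing portion of the frame time gets rt_speedup times faster."""
    new_frame_time = (1 - rt_share) + rt_share / rt_speedup
    return 1 / new_frame_time

# A GPU spending 45% of its frame time on ray tracing, given
# hypothetical 3x faster RT hardware:
print(round(overall_speedup(0.45, 3.0), 2))  # → 1.43

# One spending only 15% on ray tracing gains far less from the same 3x:
print(round(overall_speedup(0.15, 3.0), 2))  # → 1.11
```

So tripling RT throughput on a GPU that spends 45% of its frame on tracing yields roughly a 43% higher frame rate, not 3x, which is why closing the gap with Nvidia matters more than the headline multiplier.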

And also, over the next few years, more games will make use of more intensive ray/path-traced effects, so tracing performance will become even more important over time. Even so, I don't expect that examples as extreme as Cyberpunk 2077 with PT will be directly relevant to the average gamer any time soon, but it is still indirectly relevant, as an indication of ray/path-tracing performance as a component of total gaming performance.

I was trying to highlight the point that RDNA4 having 3 times the ray tracing performance of RDNA3 would neither be impossible, nor would it give AMD a performance lead over competing Nvidia GPUs with similar rasterisation performance. It would merely be AMD catching up with Nvidia. 3 times the ray tracing performance is not equivalent to 3 times the performance in every game that uses ray tracing.
 
I was trying to highlight the point that RDNA4 having 3 times the ray tracing performance of RDNA3 would neither be impossible, nor would it give AMD a performance lead over competing Nvidia GPUs with similar rasterisation performance. It would merely be AMD catching up with Nvidia. 3 times the ray tracing performance is not equivalent to 3 times the performance in every game that uses ray tracing.
I do agree -- but in my opinion, ray tracing is not where AMD's priority should be...but rather the feature parity against CUDA, DLSS, RTX video and RTX HDR. They're the only features I've actually missed moving from my RTX 3060 to the 7800 XT -- and by tackling these features, they'll get a 10x better return on their money than investing in anything related to ray tracing, since it will also finally be a showcase of what their cards can do in AI workloads for professionals and this is still where the current investor gold rush is. Even if AMD beats Nvidia in ray tracing at every equivalent SKU by 20%, the market will still pick Nvidia over AMD's GPUs for DLSS or CUDA alone.

So long as AMD are in both of the higher-end consoles (relative to Nintendo, at least) from Microsoft and Sony, ray tracing is going nowhere for mainstream gaming and will remain an afterthought -- at least until next-gen consoles launch. That much is evident now, considering we've had GPUs with this capability for almost 7 years with barely any progress (compared to past generations like the GTX 400 series, for example, where mass feature adoption happened in about 3-5 years from launch and almost everyone got upgraded to hardware capable of the latest feature set, like decent tessellation performance). By fragmenting the market with multiple different versions of DLSS, selling SKUs without ray tracing capability like the GTX 16-series cards, the scalping/unavailability of GPUs for about 2 years, and the lack of VRAM progression, Nvidia have been their own worst enemy in slowing down mass adoption of ray-tracing-capable GPUs. And that's not to mention the huge number of people running old integrated graphics or GPUs several generations old with no ray tracing capability -- which is money no game developer is willing to turn down voluntarily, especially when so many games are being re-released/ported from last-gen consoles with not much besides some minor visual improvements.
 
Known this for about a year; the big move in late 2025 into '26 is UDNA.
IF they launched UDNA late this year I'd be shocked. I don't expect that until Fall of the following year.
That said, I HOPE I am wrong and that UDNA is imminent. It is so weird how they just plan to release a 70-tier card and call it 'good' for an entire GPU generation.
That shitz Whack AF, y'all.
 