
AMD Radeon RX 9070 XT Gains 9% Performance at 1440p with Latest Driver, Beats RTX 5070 Ti

They are doing that already; when new cards release, they rebench everything with the newest drivers, afaik. @W1zzard can shed some light

No immediate plans for retesting, though. Maybe later in summer; also waiting for new games. It takes about two weeks of full-time work, i.e. it effectively costs a lot of money.
 
Maybe, but then you can handwave any and all complaints about bugs and driver issues.
Not true; it depends on the level of evidence and authenticity of what is provided. What you have stated is a slippery slope, which does not necessarily apply.
 
I really dislike this clickbait stunt HUB pulled. Are we forgetting the 9070 XT was losing to the 4070 Super in CS?
It also had big trouble with Final Fantasy. And your point is... ?
 
I bought my 9070 XT (16 GB) on launch day for the £575 RRP.

Considering the 5070 (12 GB) is currently £495 and the 5070 Ti (16 GB) is currently £900, I am happy with my purchase.
 
I don't want to argue with anyone. Just saying I so far have had no problems with drivers on RX 5700 XT, RX 7800 XT, RX 9070 XT.

I guess I'm damn lucky.

It's time to accept the truth: RTX 5000 drivers are a mess; they even break working things on older RTX generations. AMD currently has better drivers in terms of stability (crashing, BSODing, outputting black screens and so on). Many people might have seen problems similar to mine, where the card was underperforming due to crappy OC software.

Last time, I was suspicious of new drivers affecting my RX 9070 XT's score in 3DMark. Later I found out the card was not boosting properly because of what a buggy piece of shit ASUS GPU Tweak III is. I wrote about it in another thread; that software is a mess.

Btw, you can get the RX 9070 XT for €707 incl. 20% VAT now, which is a much better fps/money ratio than the RTX 5070 Ti currently offers.
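To put a rough number on the fps/money point: only the €707 price comes from this thread; the 5070 Ti price and both fps figures below are hypothetical placeholders, just to show how the ratio works out.

```python
# Back-of-the-envelope fps-per-euro comparison.
# EUR 707 for the 9070 XT is quoted above; the 5070 Ti price and
# both average-fps numbers are hypothetical placeholders.
cards = {
    "RX 9070 XT": {"price_eur": 707, "avg_fps_1440p": 100},   # placeholder fps
    "RTX 5070 Ti": {"price_eur": 900, "avg_fps_1440p": 100},  # placeholder price/fps
}

for name, c in cards.items():
    ratio = c["avg_fps_1440p"] / c["price_eur"]
    print(f"{name}: {ratio:.3f} fps per euro")
```

With equal fps, the cheaper card wins the ratio by exactly the price gap, which is the whole argument.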
 
I don't want to argue with anyone. Just saying I so far have had no problems with drivers on RX 5700 XT, RX 7800 XT, RX 9070 XT.
And I'm sure person X will chime in and tell you they have no problems with their RTX 5000. And yet you just called them "a mess". Based on what? Do you have any actual study conducted on driver stability between AMD and Nvidia? Have you even used an RTX 5xxx?
 
And I'm sure person X will chime in and tell you they have no problems with their RTX 5000. And yet you just called them "a mess". Based on what? Do you have any actual study conducted on driver stability between AMD and Nvidia? Have you even used an RTX 5xxx?
Get it together, mate.

 
It also had big trouble with Final Fantasy. And your point is... ?

My point is that HUB is full of HUBris (;)), that's what. And that's never good when you're trying to make a point. It comes off as trying a little too hard; the 9070 XT isn't a meritless product. If anything, it turned out to be quite good, and it doesn't need this type of push.

The whole "FineWine" rhetoric for a 3-month-old product is baffling; it makes no sense whatsoever. It's just poor-quality drivers at launch, that's all it ever was.

Get it together, mate.


Do you really think that nothing was done to improve the performance and stability of 50 series cards in 2 months? Come on, now.



My main issue with AMD drivers is that we're playing the lottery with anything using OpenGL. AMD, as far as my testing goes, can break OpenGL compatibility on Windows within a single update, and we (mostly Minecraft players) already know they just don't care about OpenGL on Windows, preferring to focus on the more modern Vulkan and on Linux as a whole.

If you only knew how bad it used to be a couple of years back. AMD entirely rewrote their OpenGL stack, btw. Work on this started about 3 years ago and reached maturity only very recently. Like, from scratch, everything is new. The new PAL-based UMD is actually quite performant, and if I recall correctly, it's running Minecraft without mods at least on par with, if not a bit faster than, NV right now. Though nvidium is just crazy.

 
Do you really think that nothing was done to improve the performance and stability of 50 series cards in 2 months? Come on, now.
Oh yeah, a lot:
 
Oh yeah, a lot:

There have been two WHQL'd updates since then, with targeted fixes for the black screens that were being reported. One of them seems related to having a Ryzen CPU, which makes no sense to me, but it is what it is. I never had the issue, but then again I'm both on Intel and using HDMI.
 
I hope there is; since it has 10 times the market share, it would make sense.

What's the misinformation though?
The misinformation stems from the fact that you implied the 9070 XT matches the 5070 Ti in raster now but has worse drivers. Based on the two launches, it's overwhelmingly apparent whose drivers were worse. But let's say, for argument's sake, that they were equally bad. Even in that scenario, your comment holds zero value.
 
The misinformation stems from the fact that you implied the 9070 XT matches the 5070 Ti in raster now but has worse drivers. Based on the two launches, it's overwhelmingly apparent whose drivers were worse. But let's say, for argument's sake, that they were equally bad. Even in that scenario, your comment holds zero value.
I don't know how you can come to a conclusion about whose drivers are worse, but as you've said, let's assume they are equally bad. With that context, how does my post make no sense? I don't get it. I've seen a lot of people complain about RDNA4 drivers, so I'm wondering whether they fixed those issues or not. Why do you take offense at that?

Get it together, mate.

That video - besides the usual youtube clickbait nonsense - tells you absolutely nothing about AMD's drivers. How is that video any kind of proof or evidence that AMD doesn't have driver issues?

Guys, you really seem to be taking anything said about amd personally. Just stop it. Fanboying isn't good for your health, chill out, it's going to be okay, amd will survive this.
 
Generally unsurprising as Blackwell looks like a pretty clean extension of Ada with a few tweaks while RDNA4 appears to have more substantial changes under the hood with room for performance to be extracted from drivers.

At least this isn't the dumpster-fire of the GCN days where there was so much performance improvement left on the table that it made AMD's driver team look less like magicians and more like incompetents. 9% feels like a respectable amount, enough for someone to notice, not enough for anyone to really care.

I tend to buy my cards either one generation behind or used near the end of a generation and I've virtually never had the early adopter driver problems of either Nvidia or AMD. My systems tend to be rock solid either way as far as my use cases are concerned. It is quite nice to float above it all.
 
I mean, it's always been like this with ATI/AMD graphics cards. I've been noticing better performance with my 6700 XT in older and newer titles since I got it in early 2023. And no, I'm pretty sure I'm not suffering from the placebo effect.
 
And I'm sure person X will chime in and tell you they have no problems with their RTX 5000. And yet you just called them "a mess". Based on what? Do you have any actual study conducted on driver stability between AMD and Nvidia? Have you even used an RTX 5xxx?
And the drivers on the RTX 5000 series have been a mess, based on many news threads here, Nvidia forums, Reddit, and tech YouTubers talking about buggy drivers. Yet you refuse to admit Nvidia's drivers have issues.
Anyway, people are too focused on a YouTube title. In the other thread, I have seen people attack the person instead of even looking at the information presented, unsurprisingly from the same users who can't accept AMD having any good news and performance improvements.
 
And the drivers on the RTX 5000 series have been a mess, based on many news threads here, Nvidia forums, Reddit, and tech YouTubers talking about buggy drivers.
You personally contribute to 70% of the posts around the internet about Nvidia's drivers. Not kidding. You don't even have a card, and you've posted around 150 times about it. That's why I cannot for the life of me take any of this seriously. When there is so much hatred towards a company, I take anything I read about it with a pinch of salt.
 
I don't know how you can come to a conclusion about whose drivers are worse, but as you've said, let's assume they are equally bad. With that context, how does my post make no sense? I don't get it. I've seen a lot of people complain about RDNA4 drivers, so I'm wondering whether they fixed those issues or not. Why do you take offense at that?

I didn't take offense, lol; I merely pointed out the obvious. Your post read like this: it's good that it matches the 5070 Ti, but the stability isn't there, so what's the point? That implies the 5070 Ti is stable whereas the 9070 XT is not. That's not true now, is it? Hence the reason you got called out for misinformation.

If you were wondering whether AMD fixed some of the 9070 XT issues, all you needed to do was ask that. No reason to word it as if there are issues in one camp and not the other.
 
And I'm sure person X will chime in and tell you they have no problems with their RTX 5000. And yet you just called them "a mess". Based on what? Do you have any actual study conducted on driver stability between AMD and Nvidia? Have you even used an RTX 5xxx?
I will be one of those, I guess.
Got my 5070 on June 13, and I'm yet to have a single crash/BSOD or any serious driver issue to speak of, and that's after playing and benching a variety of games. On top of that, I'm crazy enough to use said card with a 500 W PSU on a single cable with the adapter. :laugh: This is a very efficient card, so apparently it's a non-issue.
Same with the 3060 Ti I had before this card for two and a half years: nada, zero driver-related crashes during those years.

The only weird issue I've had with the 5070 so far is that Unreal Engine games do not like having global anisotropic filtering enabled, so it's better to use it on a per-game basis, or the in-game setting if possible.
 
Not even close... AMD's drivers were stable as all hell while Nvidia's drivers were crashing left and right. Thankfully, Nvidia's most recent driver seems to work pretty well (it only took 5 months).
Yeah, maybe AMD chose slow but stable, which is probably better than Nvidia's early drivers.

If so, then Nvidia won by a country mile with their driver issues, which, going by their forums, still plague the 50 series after ten hotfixes.
AMD's drivers were and are very good, even at launch. If the main problem with a driver is performance (which no one even noticed), that's a much easier thing to add later than first having to resolve issues and stabilize the card.
Nvidia wants to be on top even for the wrong reasons.:slap:

Lol! But in reality, Nvidia's drivers were bad (black screens). It was really game developers optimizing their games for the RDNA4 architecture, along with AMD shifting to a more software-centric strategy, that improved performance over the launch numbers.
That was obvious, I hope. Nvidia probably still has a lot of work to do on their gaming drivers, but I'm afraid they are not in a hurry.
 
Ah, good old TPU. AMD in any kind of positive light? Slapfight.

Generally unsurprising as Blackwell looks like a pretty clean extension of Ada with a few tweaks while RDNA4 appears to have more substantial changes under the hood with room for performance to be extracted from drivers.

At least this isn't the dumpster-fire of the GCN days where there was so much performance improvement left on the table that it made AMD's driver team look less like magicians and more like incompetents. 9% feels like a respectable amount, enough for someone to notice, not enough for anyone to really care.

I tend to buy my cards either one generation behind or used near the end of a generation and I've virtually never had the early adopter driver problems of either Nvidia or AMD. My systems tend to be rock solid either way as far as my use cases are concerned. It is quite nice to float above it all.
I take issue with calling Blackwell a "clean" extension of Ada. Sure, Nvidia was hamstrung by TSMC's lack of progress in developing new nodes. Even then, normalized testing between Ada and Blackwell found no performance improvement, according to https://www.techpowerup.com/338264/...lackwell-vs-ada-lovelace-amd-rdna-4-vs-rdna-3 .

What was improved in Blackwell is better ray tracing (which I cheer, because there are effects that cannot feasibly be done without ray tracing), video encoding/decoding hardware improvements including support for 4:2:2 chroma video (great for pros), full DisplayPort 2.1a/b support (AMD, you need to catch up to Nvidia on this in your consumer lines!), and PCIe 5.0 support (get a non-Founders Edition for best results, because the ribbon inside the FE appears to degrade signal integrity). It also adds multi-frame generation, which is useful for turn-based RPGs but worse than useless for real-time action games, especially those that rely on instant prediction and reaction, because it is latency-dependent: think landing perfect parries in Street Fighter 6, which has a 2-frame window at 60 Hz. (Fighting games have standardized on counting frames at 60 Hz to describe motion and which frames do what.) Frame generation adds latency because it uses real frames to predict intermediate frames, so the real frames must be delayed to insert the AI-generated ones. The RTX 50 series also adds hardware flip metering.
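To put numbers on the latency point: at 60 Hz, one frame is about 16.7 ms, so a 2-frame parry window is roughly 33 ms of budget. A quick sketch of the arithmetic; the "one real frame of added delay" figure is a simplifying assumption for illustration, not a measured value for any specific frame-generation implementation.

```python
# Frame-timing arithmetic behind the parry-window argument.
# Assumption (hypothetical): frame generation delays each real frame by
# roughly one real-frame interval so generated frames can be inserted.
def frame_time_ms(hz: float) -> float:
    """Duration of one frame at the given rate, in milliseconds."""
    return 1000.0 / hz

base_hz = 60.0                                # fighting games count frames at 60 Hz
parry_window_ms = 2 * frame_time_ms(base_hz)  # 2-frame window, about 33.3 ms

added_latency_ms = frame_time_ms(base_hz)     # assumed ~1 real frame of delay
remaining_ms = parry_window_ms - added_latency_ms

print(f"frame time:       {frame_time_ms(base_hz):.1f} ms")
print(f"parry window:     {parry_window_ms:.1f} ms")
print(f"added latency:    {added_latency_ms:.1f} ms (assumed)")
print(f"effective window: {remaining_ms:.1f} ms")
```

Even one frame of added delay eats half of a 2-frame reaction budget, which is why latency-sensitive games feel frame generation much more than turn-based ones.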

However, there are some serious downgrades. First, the removal of 32-bit CUDA and everything that depends on it, like 32-bit PhysX and 32-bit OpenCL, both of which Nvidia translates into CUDA via thunks. Unless a thunk translating 32-bit CUDA to 64-bit CUDA is developed, all three are useless in 32-bit applications. This means old CUDA code that was intentionally compiled as 32-bit, because the smaller pointers gave greater code density and let the code fit better in caches, is now useless without a rebuild to 64-bit, which messes up several use cases. Second, the 12V-2x6 power connector is overrated and pushed well past its real safety margin by the RTX 5090 and RTX 5080, especially the Founders Edition cards, which draw current mostly from one side instead of more evenly like partner cards do. The RTX 4090 pushed the real limits; the RTX 5090 blew past them. The RTX 5080 is more frugal with power, so its partner cards are safer, but it lacks the VRAM to handle some of the most demanding but beautiful games the RTX 4090 handled at 4K native, like Indiana Jones and the Great Circle in ultra full path-tracing mode.
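The code-density point is simple arithmetic: halving pointer size doubles how many pointer-heavy structures fit per cache line. A toy illustration (the 64-byte line size and the two-pointer node are generic assumptions, not specific to any CUDA binary):

```python
# Toy illustration of why 32-bit builds were denser in cache:
# smaller pointers mean more pointer-heavy data fits per cache line.
CACHE_LINE = 64  # bytes, typical for x86 CPUs (assumed for illustration)

def nodes_per_line(ptr_size_bytes: int, ptrs_per_node: int = 2) -> float:
    """How many nodes of `ptrs_per_node` pointers fit in one cache line."""
    return CACHE_LINE / (ptr_size_bytes * ptrs_per_node)

print(f"32-bit pointers: {nodes_per_line(4):.0f} nodes per cache line")  # 8
print(f"64-bit pointers: {nodes_per_line(8):.0f} nodes per cache line")  # 4
```

Twice the nodes per line means fewer cache misses walking the same structure, which is exactly the advantage a forced 64-bit rebuild gives up.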

I haven't seen anything about losing the ability to run 32-bit applications under a 64-bit operating system on AMD or Intel GPUs, so AMD now holds the 32-bit legacy gaming crown. I don't know if Intel is good enough at it to share the crown with AMD.

The RTX 50 series is more of a side-grade than a clean extension at best, and a serious downgrade at worst, due to the loss of useful features.

EDIT: Added why multi-frame generation is useless for real-time action games.
 
That video - besides the usual youtube clickbait nonsense - tells you absolutely nothing about AMD's drivers. How is that video any kind of proof or evidence that AMD doesn't have driver issues?
This was, however, about Nvidia's drivers. Which were a bloody mess. Yet you stated:

And yet you just called them "a mess". Based on what?
They were a bloody mess. That's why they were being called a bloody mess. Based on sooooo much reporting around it.

Though, granted, things got quieter in that regard. Hopefully they fixed them well. Not for Nvidia, that company sucks, but for the people who bought their products.

And of course, I've had driver issues over the years, too. Probably everybody who games a bit more has known them. The fun part was that AMD was known for bad drivers, which wasn't really warranted anymore, and then Nvidia came out and had that launch. Pretty funny.
 
I take issue with calling Blackwell a "clean" extension of Ada. Sure, Nvidia was hamstrung by TSMC's lack of progress in developing new nodes. Even then, normalized testing between Ada and Blackwell found no performance improvement, according to https://www.techpowerup.com/338264/...lackwell-vs-ada-lovelace-amd-rdna-4-vs-rdna-3 .

What was improved in Blackwell is better ray tracing (which I cheer, because there are effects that cannot feasibly be done without ray tracing), video encoding/decoding hardware improvements including support for 4:2:2 chroma video (great for pros), full DisplayPort 2.1a/b support (AMD, you need to catch up to Nvidia on this in your consumer lines!), and PCIe 5.0 support (get a non-Founders Edition for best results, because the ribbon inside the FE appears to degrade signal integrity). It also adds multi-frame generation, which is useful for turn-based RPGs but worse than useless for real-time action games, especially those that rely on instant prediction and reaction, because it is latency-dependent: think landing perfect parries in Street Fighter 6, which has a 2-frame window at 60 Hz. (Fighting games have standardized on counting frames at 60 Hz to describe motion and which frames do what.) Frame generation adds latency because it uses real frames to predict intermediate frames, so the real frames must be delayed to insert the AI-generated ones. The RTX 50 series also adds hardware flip metering.

However, there are some serious downgrades. First, the removal of 32-bit CUDA and everything that depends on it, like 32-bit PhysX and 32-bit OpenCL, both of which Nvidia translates into CUDA via thunks. Unless a thunk translating 32-bit CUDA to 64-bit CUDA is developed, all three are useless in 32-bit applications. This means old CUDA code that was intentionally compiled as 32-bit, because the smaller pointers gave greater code density and let the code fit better in caches, is now useless without a rebuild to 64-bit, which messes up several use cases. Second, the 12V-2x6 power connector is overrated and pushed well past its real safety margin by the RTX 5090 and RTX 5080, especially the Founders Edition cards, which draw current mostly from one side instead of more evenly like partner cards do. The RTX 4090 pushed the real limits; the RTX 5090 blew past them. The RTX 5080 is more frugal with power, so its partner cards are safer, but it lacks the VRAM to handle some of the most demanding but beautiful games the RTX 4090 handled at 4K native, like Indiana Jones and the Great Circle in ultra full path-tracing mode.

I haven't seen anything about losing the ability to run 32-bit applications under a 64-bit operating system on AMD or Intel GPUs, so AMD now holds the 32-bit legacy gaming crown. I don't know if Intel is good enough at it to share the crown with AMD.

The RTX 50 series is more of a side-grade than a clean extension at best, and a serious downgrade at worst, due to the loss of useful features.

EDIT: Added why multi-frame generation is useless for real-time action games.

- Hear ya, but when I use the words "clean extension" I just mean that the fundamental building blocks of the GPU haven't really been heavily re-engineered vs. Ada (in a relative sense). Yeah, some features got tweaked, some throughput got cleaned up, transistors supporting older formats got removed, etc.

But SP for SP, RT core for RT core, Tensor for Tensor, Blackwell doesn't really do anything revolutionarily different from or better than Ada.

RDNA4, OTOH, seems to be much more efficient on a CU-to-CU basis vs. RDNA 3/2. Not only are there substantial IPC improvements in raster, but also multi-generational improvements in RT handling.

I think we're kind of saying the same thing, just drawing different semantic lines at the end of the day.
 
Ah, good old TPU. AMD in any kind of positive light? Slapfight.

Generally unsurprising as Blackwell looks like a pretty clean extension of Ada with a few tweaks while RDNA4 appears to have more substantial changes under the hood with room for performance to be extracted from drivers.

At least this isn't the dumpster-fire of the GCN days where there was so much performance improvement left on the table that it made AMD's driver team look less like magicians and more like incompetents. 9% feels like a respectable amount, enough for someone to notice, not enough for anyone to really care.

I tend to buy my cards either one generation behind or used near the end of a generation and I've virtually never had the early adopter driver problems of either Nvidia or AMD. My systems tend to be rock solid either way as far as my use cases are concerned. It is quite nice to float above it all.

In all fairness, this one is kind of justified. Once one reputable outlet makes a statement, all the others are quick to copy it, no fact-checking involved. This is something that, honestly, TPU editorial needed to start filtering out and filing under rumors, speculation, etc. It doesn't even stop there: once the other news outlets start spreading the same statement, you'll find the low-quality junk publications, and then the regional "tech sites" that just steal and translate posts and reviews into other languages, often with mistranslations and all sorts of bias insertion; you know how it goes. Add fan engagement and enthusiasm, and it's no wonder the scene looks a lot like a street fight :laugh:

The fact of the matter is, while I don't think HUB is necessarily being malicious with their statement (which is truthful), it's obviously an enhanced truth, meant to attract clicks and generate hubbub around the subject. There's no change in standing: the 9070 XT is not really "faster than the 5070 Ti" now; they've always been pretty much identical to one another, and both options have their charms. These "gains" aren't some black magic giving away free performance, and I think pretty much anyone level-headed knows this. This also isn't any sort of "FineWine"; this product is 3 (three), not 30, months old. These percentages as improvements (not corrections or bug fixes) at the 30-month mark, that would be fine wine.

What happened is that, in addition to the usual few percent you get here and there from the first wave of optimizations (the RTX 50 series also received these, as you can see in HUB's own reviews), there were fixes for the games that were clearly not performing as intended (as seen in every review, theirs and W1zzard's here at TPU included). For example, Spider-Man Remastered and CS2, the games showing the yuuuge percentages everyone is talking about, were performing inexplicably like crap on RDNA 4. CS2 was bad enough to fall below 4070 Super level. The expected performance for the 9070 XT is somewhere between the RTX 4080 and 5080, usually around RTX 4080 Super or 5070 Ti level.

It's nice that issues are being ironed out; I've always believed the 9070 XT is a winner. But it's clear as day that the reason the 5070 Ti didn't see "up to 27% gains" in games like CS is that it didn't need them to begin with. It was performing as intended in that game, and the 9070 XT was not.
 
it doesn't need this type of push.
I disagree; AMD absolutely does need this sort of push after years of people whining that the drivers are bad, and many still are. When the leather jacket man has a near-complete monopoly, anything that can help AMD punch upwards is a good thing.
Meanwhile, Nvidia gets DLSS articles spammed constantly, as well as an overhyped Steam hardware survey.
That video - besides the usual youtube clickbait nonsense - tells you absolutely nothing about AMD's drivers. How is that video any kind of proof or evidence that AMD doesn't have driver issues?

Guys, you really seem to be taking anything said about amd personally. Just stop it. Fanboying isn't good for your health, chill out, it's going to be okay, amd will survive this.
Did you even watch the video? Because it's anything but clickbait; it's disappointing when people disagree with something that doesn't fit their bias and just throw out clickbait accusations.
The video is about Nvidia's driver mess; it doesn't need to be about AMD's drivers. This is a thread about the performance gains for RDNA4, and perhaps you should take your own advice and stop being so defensive of Nvidia in every AMD thread.
You personally contribute to 70% of the posts around the internet about Nvidia's drivers. Not kidding. You don't even have a card, and you've posted around 150 times about it. That's why I cannot for the life of me take any of this seriously. When there is so much hatred towards a company, I take anything I read about it with a pinch of salt.
Your taking a jab at me says everything, and I can't take your comments seriously when you're constantly bashing AMD in every AMD thread. I'll admit I haven't owned an Nvidia card since the RTX 3000 series, but I have my own reasons not to. I bought an AMD card after always owning Nvidia cards for 15 years, and the drivers are fine. But Nvidia has been deserving of all the criticism they've been getting lately, with buggy drivers, melting power connectors, missing ROPs, removal of 32-bit PhysX, and other things like no hotspot sensor or fake MSRPs.
 
I disagree; AMD absolutely does need this sort of push after years of people whining that the drivers are bad, and many still are. When the leather jacket man has a near-complete monopoly, anything that can help AMD punch upwards is a good thing.
Meanwhile, Nvidia gets DLSS articles spammed constantly, as well as an overhyped Steam hardware survey.

That push should be done by showcasing its real merits, though. Which, for once, they actually have: it's a competitively priced product with great performance and good power consumption, and there really isn't anything not to love about it. Everyone has noticed the upward trend in AMD's driver quality. They may still not have the same level of technical functionality as the NV drivers, but they are getting there slowly but surely, and everyone expected some rough seas with an all-new architecture's launch. The fact that they aren't crash-happy for once is a major win. Showing that they have fixed lower-than-expected performance in certain games and requesting feedback would earn a massive amount of trust, instead of spinning this as a "massive gain". I know HUB doesn't speak for AMD PR, so hopefully they'll handle this accordingly.
 