
AMD Radeon RX 9070 XT Gains 9% Performance at 1440p with Latest Driver, Beats RTX 5070 Ti

That's actually quite impressive just from driver updates. Spider-Man especially, a whopping 27%.
 
The one flaw I noticed in this driver-improvement test is that not enough of the games have high ray tracing or path tracing modes turned on. It would be good to see how much ray tracing performance has improved on both cards.

https://chipsandcheese.com/p/rdna-4s-raytracing-improvements shows that RDNA 4 has new shader instructions designed to accelerate ray tracing operations. Because ray tracing on RDNA 2 and later relies heavily on shader instructions, driving the ray accelerators from the shader pipeline rather than from RT cores separate from the shaders, I could see AMD's driver affecting ray tracing performance much more than Nvidia's driver could.
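To illustrate why that matters for drivers: on RDNA the BVH traversal loop runs as driver-compiled shader code, so shader-compiler improvements can speed it up, while fixed-function traversal hardware would be untouched by such changes. Here's an illustrative toy sketch in Python, not actual RDNA or driver code, with made-up node data:

```python
# Toy sketch of the BVH traversal loop that, on RDNA, lives in
# driver-compiled shader code (per the Chips and Cheese article).
# All names and data here are illustrative, not real driver code.

def ray_hits_aabb(origin, inv_dir, lo, hi):
    """Slab test: does a ray starting at t=0 hit the box [lo, hi]?"""
    tmin, tmax = 0.0, float("inf")
    for o, d, l, h in zip(origin, inv_dir, lo, hi):
        t1, t2 = (l - o) * d, (h - o) * d
        tmin, tmax = max(tmin, min(t1, t2)), min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(nodes, root, origin, inv_dir):
    """Stack-based traversal; returns indices of leaf boxes the ray hits."""
    hits, stack = [], [root]
    while stack:                      # this inner loop is what the shader runs
        idx = stack.pop()
        lo, hi, children = nodes[idx]
        if not ray_hits_aabb(origin, inv_dir, lo, hi):
            continue                  # prune this subtree
        if children:
            stack.extend(children)    # interior node: visit children
        else:
            hits.append(idx)          # leaf: candidate intersection
    return hits

# Tiny two-leaf BVH: a root box split into two adjacent children along x.
NODES = {
    0: ((0, 0, 0), (4, 4, 4), [1, 2]),
    1: ((0, 0, 0), (2, 2, 2), []),
    2: ((2, 0, 0), (4, 2, 2), []),
}
# Near-axis ray along +x at y=z=1 passes through both leaves.
print(sorted(traverse(NODES, 0, (-1.0, 1.0, 1.0), (1.0, 1e4, 1e4))))  # [1, 2]
```

Since the whole loop is shader code the driver compiles, a smarter compiler (better instruction selection, scheduling, or the new RDNA 4 instructions) can shave cycles off every ray without any hardware change.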
 
Whoops, looks like HUB's data are flawed.

HUB compared old data from when the cards were reviewed against a re-bench with new drivers. Since the test system itself had received some updates, it's no longer an apples-to-apples comparison between the old and new drivers.

Kinda funny that a seasoned reviewer would have variables other than the driver version in play when testing old vs. new drivers :kookoo:
 
HUB was wrong about the source of the 9% gain, but not about the gain itself. 9070 XT did gain 9%.
The reason Tech Yes did not find any uplift was because he already had those updates applied.

Whether it was Windows updates, chipset drivers, or something else, we don't know.

And PC Games Hardware tested before HUB and also found an uplift compared to their day 1 data.
 

What if the new Windows update or chipset driver (on a Ryzen 9800X3D system) somehow negatively affects Nvidia performance? Too many variables to account for.

The only variable should be the driver version.

BTW, HUB is the only one seeing odd data in Space Marine 2 on RTX; no other channel is seeing it. If anything, his test system is doing something weird.
 
Why would AMD chipset drivers make Nvidia performance worse? That's nonsense, and this isn't about Nvidia performance, you couldn't make it any more obvious you're looking to take a dig at AMD, as usual. It's interesting how Nvidia fans want the 9070XT performance increase to be wrong.
I don't see the video as some sort of gotcha aimed at HUB, there is still a performance increase.
 

LOL, this is about the testing methodology. No sane reviewer uses that many different variables (different game settings, and now even test systems) when doing reviews.

ComputerBase also complained about HUB's testing methodology:
ComputerBase plans to take an isolated look at the benefits of new drivers for the Radeon RX 9000 and GeForce RTX 5000 in the near future. The tests will be carried out on a system with the same Windows version and the same game patch level. This is the only way to determine the effect of changing the driver, and even then, the results will likely depend on the game selection, the graphics card, or even the resolution.

I couldn't care less about 9070XT performance, just that HUB is no longer a trusted review source with all that flawed testing.
 
Was he a trusted reviewer before? I've said it 50 times in the past: he keeps changing settings between reviews, which makes no sense.
 
As pointed out in the YouTube comments, he conveniently forgot to mention he underclocked the 5070 Ti.
 
Wow, and I thought their testing was already ham-fisted and contrived. How much was it underclocked? Given he's measuring % gains from drivers, basically any percentage matters here.

lol he pinned a comment 15 hours ago, damage control mode.
I don't want to directly or individually address the videos discussing the 9070 XT testing I did recently. Those videos did misunderstand the test, but unfortunately I didn’t explain the video as well as I should have, and did suggest that the bulk of the gains in at least one instance were a result of display drivers when really it could have been any number of things.

To help avoid confusion I should have also labelled the graphs “Review Data” and “Latest Data” rather than review drivers and latest drivers.

The point of the video was to take the review data for the 9070 XT and 5070 Ti, and compare that to what we’re getting now, using the same games and settings featured in the review. This gives us a look at how things have progressed after 4 months, are the margins still the same or has one card improved more than the other?

I wasn’t the first to notice that for whatever reason the 9070 XT had improved in a number of games. @pcgameshardware first reported a 13% increase in performance for the 9070 XT in Hunt Showdown 1896, 13% in Space Marine 2, 24% for Forza Motorsport, 17% for Indiana Jones and the Great Circle, 23% for Half-Life Ray Traced and 14% for Portal RTX.

Outside of that they tested a range of games I didn’t, meaning many of the 16 games I tested weren’t featured in their content, so I was keen to see what changes might be seen in those titles.

The gains for the 9070 XT and RDNA 4 series in general aren’t just a result of newer drivers, they’re likely a combination of things, such as game updates, platform updates and perhaps even Windows updates. So a simple driver test where you compare the release driver with the latest driver won’t necessarily show the same gains for all games, as all other updates have already taken place.

Just to be clear that wasn’t the purpose of this content, and I think I explained the idea behind the video quite well. As stated in the video the idea was to take the outdated review data for the 9070 XT and 5070 Ti, and compare that to what we’re seeing today. My mistake was focusing on drivers as being the leading cause for the performance gains, which was an assumption on my part.
 

I mean, after all, they seem to have backed out after no one could replicate their findings :laugh:
 
Yep, I hadn't realised it, but beyond my own criticisms of his testing methodology, other tech press saw it, and more, and decided to put his results to the test, and it fell apart. He even talks about his "idea for the video" and focusing on drivers as the leading cause of gains, which sounds an awful lot like choosing a conclusion and creating tests to demonstrate it, only it was even worse than I thought. Feels good to get the vindication lol.
 
I have been searching for a while and could not find any YouTube comments claiming this so far.
 

What an absolute clown. Utterly embarrassing. Instead of damage control he should have come clean. That statement is downright embarrassing and pathetic. He has utterly beclowned himself.

Did he fake it to generate some ad impressions? Or did AMD pay him to make a shill video to coincide with the recent price drops so they can boost sales a bit? Sure seems like the latter, and I sure hope they did because if he did it for free that adds another layer of embarrassment. And I say this as someone who runs all AMD and who likes Radeon products.

Not the first time this guy shot himself in the foot. I vaguely remember some comparison shenanigans when DDR5 came out, where he used a shit-tier DDR4 kit with hotdog-down-a-hallway loose timings, and magically DDR5 was a massive gain in performance. Wow! Upgrade to Intel 12th gen with DDR5 immediately!

He has sunk beneath Linus. It's almost impressive.
 
This doesn't count as a surprise for me, because the move from RDNA 3 to RDNA 4 is more substantial than just a number change; it's good that this architecture gets better as it matures, with AMD fine-tuning the driver. You can read about the major changes in RDNA 4 in the Chips and Cheese article:



I found it quite surprising because I was always under the impression that RDNA 4 was essentially a "bug fix" of RDNA 3. I wonder if the fact they went back to a monolithic die over chiplets is part of the reason they were able to pull out such a performance bump?

I would like AMD to go back to chiplets with UDNA for no other reason than it's cool, but maybe monolithic is the way to go for now?
 

lol pfp

Personally I'm mostly OK with HUB, though I know every outlet does have some sort of bias. TPU, I reckon, is probably the most impartial of them because W1zz is a seasoned old pro and he personally oversees the GPU review work. He sure has detractors, especially when things don't go their way, but if I had to pick someone to get behind in the industry it'd be W1zz. I've stuck by the forum for a reason haha. I also trust Hilbert over at Guru3D a lot. :toast:

The problem this episode shows is that people have a tendency to take the words of those they trust as law, especially if it bolsters their favorite brand or team, or perhaps a weaker or losing position; it's never a good idea to do so. The multitude of review outlets ensures that you have a wide variety of perspectives, so you can make an informed purchasing decision.

What has never escaped me is that Steve seems to have a very soft spot for certain bad products in his heart. May or may not be a coincidence they're from AMD haha. He is the one guy who will defend the RX 5700 XT, and you don't defend the 5700 XT. You don't even acknowledge that tripe exists, same goes for the FX processor line. Best left buried in the past. :laugh::fear:

The concept of chiplets is great, I'm just not sure if it works all that well. The 7900 XTX has clear performance scaling problems, on paper it shouldn't owe much or anything to the RTX 4090. But it barely manages to keep up with the 4080 overall, and the 4080 is a much simpler, leaner design. In any case, I want UDNA to kick butt, perform like a 6090 pls. Sick and tired of buying Jensen new leather jackets.
 
Zero chance he is getting paid by AMD. Please, let's not delve into that kind of conspiracy theory. It's quite well known in the tech industry that you'll get more clicks by crapping on Intel and Nvidia and praising AMD; that's why most of the clickbait content revolves around that. You don't need AMD to be paying you; AMD's fans are paying you by buying into your clickbait nonsense.

Take a walk through the YouTube tech sphere and you'll notice a trend: most people use anti-Nvidia thumbnails and titles like "AMD is faster than Nvidia in RT" to drive clicks. The thumbnails below are all from a single channel; is he getting paid by AMD? Of course not, he is being paid by the AMD hive mind. I don't think AMD wants anything to do with these people, and especially its fandom, which is absolutely toxic and keeps people away. If Lisa Su had a magic wand she'd instantly delete 75% of her own company's fans and turn them into Nvidia fans. Even in this very forum, even though W1zzard is more numbers-driven instead of being opinionated like most YouTube reviews, you still see people who are fans of AMD just drive people away. After reading "ngreedia", "leather jacket", and "idiots buying nvidia", that's enough for me to stay away from any and all AMD products.


This is funny. HUB said "Nvidia confirmed" that there is a bug with the 5xxx series reducing performance by 65% in Warhammer. TYS contacted Nvidia asking if it's true, and they replied, "nope, never happened :D".
 
I tend to buy my cards either one generation behind or used near the end of a generation and I've virtually never had the early adopter driver problems of either Nvidia or AMD. My systems tend to be rock solid either way as far as my use cases are concerned. It is quite nice to float above it all.

100% this. Added to that, I hardly ever play a game until it's a year or two old and most of the bugs have been ironed out. I acknowledge that this is not a solution for most people, and it must be frustrating being excited about a game/product launch and having issues.
 

-Patient gamers untie!
 
Hardware Unboxed released a podcast responding to this controversy. They admit their testing mistakes: failing to control for factors outside the drivers that could have sped things up, such as operating system or game updates; mislabeling the results as "Release Drivers" and "Latest Drivers" instead of "Release Data" and "Latest Data"; and drawing a bad conclusion that the speedup is due to the drivers without better data to back it up. They admit they should have tested on a system where everything was up to date except the driver under test, to control for all of the other possible factors, and had that data in hand before drawing any conclusions. Basically, all that can be concluded from the initial tests is that the RX 9070 XT is faster than it was during pre-release testing; the reason why cannot be inferred from that data.
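The controlled setup they describe boils down to one rule: the two benchmark configurations must differ in exactly one field, the GPU driver. A minimal sketch of that check (all version strings are hypothetical placeholders, not real build numbers):

```python
# Sketch of a controlled driver A/B test: verify the two configs differ
# only in the GPU driver before trusting any comparison between them.
# All version strings below are hypothetical placeholders.

def config_diff(cfg_a, cfg_b):
    """Return the keys whose values differ between two test configs."""
    return {k for k in cfg_a.keys() | cfg_b.keys() if cfg_a.get(k) != cfg_b.get(k)}

baseline = {
    "os_build": "placeholder-os-1",
    "game_patch": "placeholder-patch-1",
    "chipset_driver": "placeholder-chipset-1",
    "gpu_driver": "release-driver",
}
retest = {**baseline, "gpu_driver": "latest-driver"}

changed = config_diff(baseline, retest)
assert changed == {"gpu_driver"}, f"confounded comparison, extra variables: {changed}"
print("only variable:", changed)  # only variable: {'gpu_driver'}
```

If the assertion fires, the comparison is confounded, which is exactly the criticism being made of the original test: OS, game, and platform updates changed alongside the driver.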
 
Drivers or not, the end result is the same: the RX 9070 XT is now as fast as the RTX 5070 Ti, and that's the most important thing.

AMD has less budget for game optimization, etc. Looking at transistor count, with good optimization AMD would probably be slightly ahead of the RTX 5070 Ti in raster.
 
Exactly. Regardless of the source of this gain, the end result is the same.

The 9070 XT can easily be made even faster by undervolting and overclocking the VRAM. Mine constantly runs at ~3300 MHz in games with a stable undervolt, and at ~730 GB/s memory bandwidth with fast timings (644 GB/s is the default). Based on TPU's testing, this results in an extra ~11% of performance, possibly more in my case, because TPU's numbers are more conservative (3244 MHz and 711 GB/s). This means it will surpass the stock 5070 Ti, 7900 XTX, 4080, and 4080S.

Obviously it won't reach 5080 level, and I very much doubt it ever will during its lifetime, but it's still good enough, as everything including the 5070 Ti costs more than the 9070 XT. The 5070 Ti is +130. The 7900 XTX was and is selling at 1000+, and so were the 4080 and 4080S models for much of their shelf life. I can get a 9070 XT for roughly 700.

The 7900 XTX may have value for those who need 24GB, but all the rest are also 16GB GPUs, and unless you're really into heavy RT, CUDA, or pro-app performance, there's little reason to spend more.
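For what it's worth, the bandwidth arithmetic behind those figures (numbers as quoted from the post and TPU; game performance generally scales sub-linearly with memory bandwidth, so these are raw bandwidth deltas, not performance predictions):

```python
# Raw memory-bandwidth deltas for the overclock figures quoted above.

def pct_gain(new: float, old: float) -> float:
    """Percentage increase from old to new."""
    return (new - old) / old * 100.0

STOCK_BW = 644.0   # GB/s, default effective bandwidth cited in the post
TPU_OC_BW = 711.0  # GB/s, TPU's more conservative overclock result
POST_OC_BW = 730.0 # GB/s, the poster's fast-timings overclock

print(f"TPU OC:  +{pct_gain(TPU_OC_BW, STOCK_BW):.1f}% bandwidth")   # +10.4%
print(f"Post OC: +{pct_gain(POST_OC_BW, STOCK_BW):.1f}% bandwidth")  # +13.4%
```

A ~10% bandwidth increase lining up with TPU's ~11% performance gain suggests these titles are heavily bandwidth-bound at those settings; in less bandwidth-bound games the uplift would be smaller.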
 
It's not really the same, because all other publications that use the latest drivers don't actually have the same results; TYS, PCGH, and ComputerBase all have the 5070 Ti faster than the 9070 XT. Even HUB's tests don't seem to translate into 4K gains anyway. Nothing has really changed since launch: the 5070 Ti is a bit faster in raster and a lot faster in RT, just like it was at launch.
 
HWUB lost all credibility. Pathetic jerks.
 
 